NASA Astrophysics Data System (ADS)
Long, Nicholas James
This thesis develops a preliminary foundational methodology for evaluating the static complexity of future lunar oxygen production systems when extensive information is not yet available about the various systems under consideration. Evaluating static complexity, as part of an overall system complexity analysis, is an important consideration in ultimately selecting a process to be used in a lunar base. When system complexity is higher, there is generally an overall increase in risk, which could impact the safety of astronauts and the economic performance of the mission. To evaluate static complexity in lunar oxygen production, static complexity is first decomposed into its essential components. Three essential dimensions of static complexity are investigated: interconnective complexity, strength of connections, and complexity in variety. A set of methods is then developed to evaluate each dimension separately. Q-connectivity analysis is proposed as a means to evaluate interconnective complexity and strength of connections. The law of requisite variety, originating in cybernetic theory, is suggested to interpret complexity in variety. Second, a means to aggregate the results of each analysis into a holistic measurement of static complexity is proposed, using the Simple Multi-Attribute Rating Technique (SMART). Each method of static complexity analysis, and the aggregation technique, is demonstrated using notional data for four lunar oxygen production processes.
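The SMART aggregation step described above amounts to a weighted sum of normalized per-dimension complexity scores. A minimal sketch follows; the process names, scores, and weights are invented for illustration, not the thesis's notional data:

```python
# Hedged sketch: SMART-style aggregation of static-complexity dimension
# scores for hypothetical lunar oxygen production processes. All names,
# scores (0 = simple, 1 = complex), and weights are assumptions.

def smart_score(scores, weights):
    """Weighted average of normalized attribute scores (lower = less complex)."""
    total_w = sum(weights.values())
    return sum(weights[k] * scores[k] for k in scores) / total_w

processes = {
    "hydrogen_reduction": {"interconnective": 0.4, "strength": 0.5, "variety": 0.3},
    "carbothermal":       {"interconnective": 0.7, "strength": 0.6, "variety": 0.5},
}
weights = {"interconnective": 3, "strength": 2, "variety": 1}  # assumed priorities

ranked = sorted(processes, key=lambda p: smart_score(processes[p], weights))
print(ranked[0])  # process with the lowest aggregate static complexity
```

Ranking on the aggregate score is what lets the separately evaluated dimensions feed one selection decision.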
Galas, David J; Sakhanenko, Nikita A; Skupin, Alexander; Ignac, Tomasz
2014-02-01
Context dependence is central to the description of complexity. Keying on the pairwise definition of "set complexity," we use an information theory approach to formulate general measures of systems complexity. We examine the properties of multivariable dependency starting with the concept of interaction information. We then present a new measure for unbiased detection of multivariable dependency, "differential interaction information." This quantity for two variables reduces to the pairwise "set complexity" previously proposed as a context-dependent measure of information in biological systems. We generalize it here to an arbitrary number of variables. Critical limiting properties of the "differential interaction information" are key to the generalization. This measure extends previous ideas about biological information and provides a more sophisticated basis for the study of complexity. The properties of "differential interaction information" also suggest new approaches to data analysis. Given a data set of system measurements, differential interaction information can provide a measure of collective dependence, which can be represented in hypergraphs describing complex system interaction patterns. We investigate this kind of analysis using simulated data sets. The conjoining of a generalized set complexity measure, multivariable dependency analysis, and hypergraphs is our central result. While our focus is on complex biological systems, our results are applicable to any complex system.
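The multivariable measures discussed above build on interaction information, which for discrete variables is computable directly from marginal entropies. A small sketch, using the inclusion-exclusion form in McGill's sign convention (sign conventions differ across the literature, and the paper's "differential interaction information" generalizes beyond this):

```python
# Hedged sketch: three-variable interaction information from a joint
# distribution, via I(X;Y;Z) = H(X)+H(Y)+H(Z) - H(X,Y)-H(X,Z)-H(Y,Z) + H(X,Y,Z).

from math import log2
from itertools import product
from collections import defaultdict

def entropy(dist):
    """Shannon entropy in bits of a {outcome: probability} mapping."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

def marginal(joint, axes):
    """Marginalize a {tuple-outcome: probability} joint onto the given axes."""
    m = defaultdict(float)
    for outcome, p in joint.items():
        m[tuple(outcome[i] for i in axes)] += p
    return m

def interaction_information(joint):
    h = lambda axes: entropy(marginal(joint, axes))
    return (h((0,)) + h((1,)) + h((2,))
            - h((0, 1)) - h((0, 2)) - h((1, 2))
            + h((0, 1, 2)))

# XOR example: Z = X xor Y with X, Y independent fair bits. Every pair of
# variables is independent, yet the triple is fully dependent.
xor_joint = {(x, y, x ^ y): 0.25 for x, y in product((0, 1), repeat=2)}
print(interaction_information(xor_joint))  # -1.0 (purely synergistic triple)
```

The XOR triple is the standard example of collective dependence invisible to pairwise analysis, the situation the paper's hypergraph representation targets.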
Reliability analysis in interdependent smart grid systems
NASA Astrophysics Data System (ADS)
Peng, Hao; Kan, Zhe; Zhao, Dandan; Han, Jianmin; Lu, Jianfeng; Hu, Zhaolong
2018-06-01
Complex network theory is a useful way to study many real complex systems. In this paper, a reliability analysis model based on complex network theory is introduced for interdependent smart grid systems. We focus on understanding the structure of smart grid systems, studying the underlying network model, the systems' interactions and relationships, and how cascading failures occur in interdependent smart grid systems. We propose a practical model for interdependent smart grid systems using complex network theory. Based on percolation theory, we also study the effect of cascading failures and give a detailed mathematical analysis of failure propagation in such systems. We analyze the reliability of the proposed model under random attacks or failures by calculating the size of the giant functioning component in interdependent smart grid systems. Our simulation results show that there exists a threshold for the proportion of faulty nodes beyond which the smart grid systems collapse, and we determine the critical values for different system parameters. In this way, the reliability analysis model based on complex network theory can be effectively utilized for anti-attack and protection purposes in interdependent smart grid systems.
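The threshold behavior reported above can be illustrated with a minimal site-percolation experiment. This sketch uses a single Erdos-Renyi layer, whereas the paper couples two interdependent layers and iterates cascading failures between them; all parameters are illustrative assumptions:

```python
# Hedged sketch: random node removal on an Erdos-Renyi graph. With mean
# degree k = 4, a giant component survives while the removed fraction q
# stays below roughly 1 - 1/k = 0.75, and collapses beyond it.

import random
random.seed(42)

def er_graph(n, mean_degree):
    """Random graph as adjacency sets; edge probability mean_degree/(n-1)."""
    p = mean_degree / (n - 1)
    adj = {v: set() for v in range(n)}
    for u in range(n):
        for v in range(u + 1, n):
            if random.random() < p:
                adj[u].add(v)
                adj[v].add(u)
    return adj

def giant_fraction(adj, removed):
    """Largest connected component among survivors, as a fraction of all nodes."""
    alive = set(adj) - removed
    seen, best = set(), 0
    for s in alive:
        if s in seen:
            continue
        stack, comp = [s], 0
        seen.add(s)
        while stack:
            u = stack.pop()
            comp += 1
            for w in adj[u]:
                if w in alive and w not in seen:
                    seen.add(w)
                    stack.append(w)
        best = max(best, comp)
    return best / len(adj)

adj = er_graph(2000, 4.0)
nodes = list(adj)
results = {}
for q in (0.2, 0.9):  # below vs above the critical removal fraction
    removed = set(random.sample(nodes, int(q * len(nodes))))
    results[q] = giant_fraction(adj, removed)
    print(f"removed {q:.0%}: giant component spans {results[q]:.2f} of nodes")
```

In the interdependent case studied by the paper, failures additionally propagate to the partner layer, which makes the transition far more abrupt than this single-layer picture.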
Situational Analysis for Complex Systems: Methodological Development in Public Health Research.
Martin, Wanda; Pauly, Bernie; MacDonald, Marjorie
2016-01-01
Public health systems have suffered infrastructure losses worldwide. Strengthening public health systems requires not only good policies and programs, but also development of new research methodologies to support public health systems renewal. Our research team considers public health systems to be complex adaptive systems and as such new methods are necessary to generate knowledge about the process of implementing public health programs and services. Within our program of research, we have employed situational analysis as a method for studying complex adaptive systems in four distinct research studies on public health program implementation. The purpose of this paper is to demonstrate the use of situational analysis as a method for studying complex systems and highlight the need for further methodological development.
Demonstration of a Safety Analysis on a Complex System
NASA Technical Reports Server (NTRS)
Leveson, Nancy; Alfaro, Liliana; Alvarado, Christine; Brown, Molly; Hunt, Earl B.; Jaffe, Matt; Joslyn, Susan; Pinnell, Denise; Reese, Jon; Samarziya, Jeffrey;
1997-01-01
For the past 17 years, Professor Leveson and her graduate students have been developing a theoretical foundation for safety in complex systems and building a methodology upon that foundation. The methodology includes special management structures and procedures, system hazard analyses, software hazard analysis, requirements modeling and analysis for completeness and safety, special software design techniques including the design of human-machine interaction, verification, operational feedback, and change analysis. The Safeware methodology is based on system safety techniques that are extended to deal with software and human error. Automation is used to enhance our ability to cope with complex systems. Identification, classification, and evaluation of hazards are done using modeling and analysis. To be effective, the models and analysis tools must consider the hardware, software, and human components in these systems. They also need to include a variety of analysis techniques and orthogonal approaches: there exists no single safety analysis or evaluation technique that can handle all aspects of complex systems. Applying only one or two may make us feel satisfied, but will produce limited results. We report here on a demonstration, performed as part of a contract with NASA Langley Research Center, of the Safeware methodology on the Center-TRACON Automation System (CTAS) portion of the air traffic control (ATC) system and procedures currently employed at the Dallas/Fort Worth (DFW) TRACON (Terminal Radar Approach CONtrol). CTAS is an automated system to assist controllers in handling arrival traffic in the DFW area. Safety is a system property, not a component property, so our safety analysis considers the entire system and not simply the automated components. Because safety analysis of a complex system is an interdisciplinary effort, our team included system engineers, software engineers, human factors experts, and cognitive psychologists.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hertz, P.R.
Fluorescence spectroscopy is a highly sensitive and selective tool for the analysis of complex systems. In order to investigate the efficacy of several steady state and dynamic techniques for the analysis of complex systems, this work focuses on two types of complex, multicomponent samples: petrolatums and coal liquids. It is shown in these studies that dynamic, fluorescence lifetime-based measurements provide enhanced discrimination between complex petrolatum samples. Additionally, improved quantitative analysis of multicomponent systems is demonstrated via incorporation of organized media in coal liquid samples. This research provides the first systematic studies of (1) multifrequency phase-resolved fluorescence spectroscopy for dynamic fluorescence spectral fingerprinting of complex samples, and (2) the incorporation of bile salt micellar media to improve accuracy and sensitivity for characterization of complex systems. In the petroleum studies, phase-resolved fluorescence spectroscopy is used to combine spectral and lifetime information through the measurement of phase-resolved fluorescence intensity. The intensity is collected as a function of excitation and emission wavelengths, angular modulation frequency, and detector phase angle. This multidimensional information enhances the ability to distinguish between complex samples with similar spectral characteristics. Examination of the eigenvalues and eigenvectors from factor analysis of phase-resolved and steady state excitation-emission matrices, using chemometric methods of data analysis, confirms that phase-resolved fluorescence techniques offer improved discrimination between complex samples as compared with conventional steady state methods.
Complex systems and the technology of variability analysis
Seely, Andrew JE; Macklem, Peter T
2004-01-01
Characteristic patterns of variation over time, namely rhythms, represent a defining feature of complex systems, one that is synonymous with life. Despite the intrinsic dynamic, interdependent and nonlinear relationships of their parts, complex biological systems exhibit robust systemic stability. Applied to critical care, it is the systemic properties of the host response to a physiological insult that manifest as health or illness and determine outcome in our patients. Variability analysis provides a novel technology with which to evaluate the overall properties of a complex system. This review highlights the means by which we scientifically measure variation, including analyses of overall variation (time domain analysis, frequency distribution, spectral power), frequency contribution (spectral analysis), scale invariant (fractal) behaviour (detrended fluctuation and power law analysis) and regularity (approximate and multiscale entropy). Each technique is presented with a definition, interpretation, clinical application, advantages, limitations and summary of its calculation. The ubiquitous association between altered variability and illness is highlighted, followed by an analysis of how variability analysis may significantly improve prognostication of severity of illness and guide therapeutic intervention in critically ill patients. PMID:15566580
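Of the regularity measures surveyed above, approximate entropy is compact enough to sketch directly. The implementation below follows the standard Pincus formulation with the usual parameter choices (m = 2, r = 0.2 times the signal's standard deviation); the test signals are invented, not clinical data:

```python
# Hedged sketch: approximate entropy (ApEn). Lower values indicate a more
# regular signal; altered (often reduced) variability is the clinical
# signature the review associates with illness.

import math, random

def apen(series, m=2, r=None):
    n = len(series)
    if r is None:
        mean = sum(series) / n
        r = 0.2 * math.sqrt(sum((x - mean) ** 2 for x in series) / n)

    def phi(m):
        # Average log-frequency with which length-m templates match
        # (pointwise within tolerance r), including self-matches.
        templates = [series[i:i + m] for i in range(n - m + 1)]
        total = 0.0
        for t in templates:
            matches = sum(
                all(abs(a - b) <= r for a, b in zip(t, u)) for u in templates)
            total += math.log(matches / len(templates))
        return total / len(templates)

    return phi(m) - phi(m + 1)

random.seed(0)
regular = [math.sin(0.5 * i) for i in range(200)]   # highly regular signal
noisy = [random.gauss(0, 1) for _ in range(200)]    # uncorrelated noise
print(f"ApEn(sine)  = {apen(regular):.3f}")
print(f"ApEn(noise) = {apen(noisy):.3f}")
```

The sine wave scores far lower than the noise, matching the interpretation given in the review: regularity statistics quantify the predictability of successive patterns in the series.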
Risk Modeling of Interdependent Complex Systems of Systems: Theory and Practice.
Haimes, Yacov Y
2018-01-01
The emergence of the complexity characterizing our systems of systems (SoS) requires a reevaluation of the way we model, assess, manage, communicate, and analyze the risk thereto. Current models for risk analysis of emergent complex SoS are insufficient because too often they rely on the same risk functions and models used for single systems. These models commonly fail to incorporate the complexity derived from the networks of interdependencies and interconnectedness (I-I) characterizing SoS. There is a need to reevaluate currently practiced risk analysis to respond to this reality by examining, and thus comprehending, what makes emergent SoS complex. The key to evaluating the risk to SoS lies in understanding the genesis of characterizing I-I of systems manifested through shared states and other essential entities within and among the systems that constitute SoS. The term "essential entities" includes shared decisions, resources, functions, policies, decisionmakers, stakeholders, organizational setups, and others. This undertaking can be accomplished by building on state-space theory, which is fundamental to systems engineering and process control. This article presents a theoretical and analytical framework for modeling the risk to SoS with two case studies performed with the MITRE Corporation and demonstrates the pivotal contributions made by shared states and other essential entities to modeling and analysis of the risk to complex SoS. A third case study highlights the multifarious representations of SoS, which require harmonizing the risk analysis process currently applied to single systems when applied to complex SoS. © 2017 Society for Risk Analysis.
Complex adaptive systems and their relevance for nursing: An evolutionary concept analysis.
Notarnicola, Ippolito; Petrucci, Cristina; De Jesus Barbosa, Maria Rosimar; Giorgi, Fabio; Stievano, Alessandro; Rocco, Gennaro; Lancia, Loreto
2017-06-01
This study aimed to analyse the concept of "complex adaptive systems." The construct is still nebulous in the literature, and further explication of the idea is needed to establish a shared understanding of it. A concept analysis was conducted using Rodgers' evolutionary method. The bibliographic search covered the years 2005 to 2015 and was conducted in PubMed, CINAHL (EBSCOhost), Scopus, Web of Science, and Academic Search Premier. Retrieved papers were critically analysed to explore the attributes, antecedents, and consequences of the concept. Moreover, surrogates, related terms, and a pattern recognition scheme were identified. The concept analysis showed that complex systems are adaptive and have the ability to process information. They can adapt to the environment and consequently evolve. Nursing is a complex adaptive system, and the nursing profession in practice exhibits complex adaptive system characteristics. Complexity science, through complex adaptive systems, provides new ways of seeing and understanding the mechanisms that underpin the nursing profession. © 2017 John Wiley & Sons Australia, Ltd.
Tools and techniques for developing policies for complex and uncertain systems.
Bankes, Steven C
2002-05-14
Agent-based models (ABM) are examples of complex adaptive systems, which can be characterized as those systems for which no model less complex than the system itself can accurately predict in detail how the system will behave at future times. Consequently, the standard tools of policy analysis, based as they are on devising policies that perform well on some best estimate model of the system, cannot be reliably used for ABM. This paper argues that policy analysis by using ABM requires an alternative approach to decision theory. The general characteristics of such an approach are described, and examples are provided of its application to policy analysis.
Supporting Space Systems Design via Systems Dependency Analysis Methodology
NASA Astrophysics Data System (ADS)
Guariniello, Cesare
The increasing size and complexity of space systems and space missions pose severe challenges to space systems engineers. When complex systems and Systems-of-Systems are involved, the behavior of the whole entity is due not only to that of the individual systems involved but also to the interactions and dependencies between the systems. Dependencies can be varied and complex, and designers usually do not analyze the impact of dependencies at the level of complex systems; when they do, the analysis either involves excessive computational cost or occurs at a later stage of the design process, after designers have already set detailed requirements following a bottom-up approach. While classical systems engineering attempts to integrate the perspectives involved across the variety of engineering disciplines and the objectives of multiple stakeholders, there is still a need for more effective tools and methods capable of identifying, analyzing, and quantifying properties of the complex system as a whole and of explicitly modeling the effect of some of the features that characterize complex systems. This research describes the development and usage of Systems Operational Dependency Analysis and Systems Developmental Dependency Analysis, two methods based on parametric models of the behavior of complex systems, one in the operational domain and one in the developmental domain. The parameters of the developed models have intuitive meaning, are usable with subjective and quantitative data alike, and give direct insight into the causes of observed, and possibly emergent, behavior. The approach proposed in this dissertation combines models of one-to-one dependencies among systems and between systems and capabilities to analyze and evaluate the impact of failures or delays on the outcome of the whole complex system. The analysis accounts for cascading effects, partial operational failures, multiple failures or delays, and partial developmental dependencies.
The user of these methods can assess the behavior of each system based on its internal status and on the topology of its dependencies on systems connected to it. Designers and decision makers can therefore quickly analyze and explore the behavior of complex systems and evaluate different architectures under various working conditions. The methods support educated decision making both in the design and in the update process of systems architecture, reducing the need to execute extensive simulations. In particular, in the phase of concept generation and selection, the information given by the methods can be used to identify promising architectures to be further tested and improved, while discarding architectures that do not show the required level of global features. The methods, when used in conjunction with appropriate metrics, also allow for improved reliability and risk analysis, as well as for automatic scheduling and re-scheduling based on the features of the dependencies and on the accepted level of risk. This dissertation illustrates the use of the two methods in sample aerospace applications, both in the operational and in the developmental domain. The applications show how to use the developed methodology to evaluate the impact of failures, assess the criticality of systems, quantify metrics of interest, quantify the impact of delays, support informed decision making when scheduling the development of systems and evaluate the achievement of partial capabilities. A larger, well-framed case study illustrates how the Systems Operational Dependency Analysis method and the Systems Developmental Dependency Analysis method can support analysis and decision making, at the mid and high level, in the design process of architectures for the exploration of Mars. The case study also shows how the methods do not replace the classical systems engineering methodologies, but support and improve them.
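The operational side of this analysis can be sketched as attenuating each system's operability through its one-to-one dependency graph, so that partial failures cascade downstream. The attenuation rule and the notional architecture fragment below are generic stand-ins for illustration, not the parametric model defined in the dissertation:

```python
# Hedged sketch: propagating partial operational failures through a
# dependency graph, in the spirit of Systems Operational Dependency
# Analysis. Each dependency has a strength in [0, 1]: at strength 1 the
# downstream system inherits the upstream degradation fully; at 0 it is
# unaffected. Rule and numbers are illustrative assumptions.

def operability(node, status, deps, memo=None):
    """status: intrinsic health in [0,1]; deps: node -> [(upstream, strength)]."""
    if memo is None:
        memo = {}
    if node not in memo:
        factor = 1.0
        for upstream, strength in deps.get(node, ()):
            up = operability(upstream, status, deps, memo)
            factor = min(factor, 1 - strength * (1 - up))  # worst dependency wins
        memo[node] = status[node] * factor
    return memo[node]

# Notional fragment: a degraded power system feeds comms and an ISRU plant.
status = {"power": 0.6, "comms": 1.0, "isru": 1.0}
deps = {"comms": [("power", 0.8)], "isru": [("power", 0.5)]}
for n in ("power", "comms", "isru"):
    print(n, round(operability(n, status, deps), 2))  # 0.6, 0.68, 0.8
```

Even this toy version shows the design use case: comparing dependency strengths reveals which downstream capabilities are most exposed to a given failure, without running a full simulation.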
Stanton, Neville A; Bessell, Kevin
2014-01-01
This paper presents the application of Cognitive Work Analysis to the description of the functions, situations, activities, decisions, strategies, and competencies of a Trafalgar class submarine when performing the function of returning to periscope depth. All five phases of Cognitive Work Analysis are presented, namely: Work Domain Analysis, Control Task Analysis, Strategies Analysis, Social Organisation and Cooperation Analysis, and Worker Competencies Analysis. Complex socio-technical systems are difficult to analyse but Cognitive Work Analysis offers an integrated way of analysing complex systems with the core of functional means-ends analysis underlying all of the other representations. The joined-up analysis offers a coherent framework for understanding how socio-technical systems work. Data were collected through observation and interviews at different sites across the UK. The resultant representations present a statement of how the work domain and current activities are configured in this complex socio-technical system. This is intended to provide a baseline, from which all future conceptions of the domain may be compared. The strength of the analysis is in the multiple representations from which the constraints acting on the work may be analysed. Future research needs to challenge the assumptions behind these constraints in order to develop new ways of working. Copyright © 2013 Elsevier Ltd and The Ergonomics Society. All rights reserved.
Analysis and design of algorithm-based fault-tolerant systems
NASA Technical Reports Server (NTRS)
Nair, V. S. Sukumaran
1990-01-01
An important consideration in the design of high performance multiprocessor systems is to ensure the correctness of the results computed in the presence of transient and intermittent failures. Concurrent error detection and correction have been applied to such systems in order to achieve reliability. Algorithm Based Fault Tolerance (ABFT) was suggested as a cost-effective concurrent error detection scheme. The research was motivated by the complexity involved in the analysis and design of ABFT systems. To that end, a matrix-based model was developed and, based on that, algorithms for both the design and analysis of ABFT systems are formulated. These algorithms are less complex than the existing ones. In order to reduce the complexity further, a hierarchical approach is developed for the analysis of large systems.
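The matrix-based flavor of ABFT can be illustrated with the classic checksum-encoded matrix multiplication of Huang and Abraham: a column-checksum operand times a row-checksum operand yields a full-checksum product, in which a single corrupted element is located by the disagreeing row and column sums. A minimal sketch (this illustrates the encoding idea, not the matrix model developed in this research):

```python
# Hedged sketch: checksum-encoded matrix multiply for algorithm-based
# fault tolerance. A single transient fault in the product is detected
# and located concurrently, with no recomputation of the whole product.

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def with_column_checksum(A):
    """Append a row holding each column's sum."""
    return A + [[sum(col) for col in zip(*A)]]

def with_row_checksum(B):
    """Append to each row an element holding that row's sum."""
    return [row + [sum(row)] for row in B]

def locate_error(C):
    """Return (row, col) of a single faulty element, or None if consistent."""
    bad_row = next((i for i, row in enumerate(C[:-1])
                    if sum(row[:-1]) != row[-1]), None)
    bad_col = next((j for j in range(len(C[0]) - 1)
                    if sum(C[i][j] for i in range(len(C) - 1)) != C[-1][j]), None)
    if bad_row is None and bad_col is None:
        return None
    return (bad_row, bad_col)

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
C = matmul(with_column_checksum(A), with_row_checksum(B))
assert locate_error(C) is None   # fault-free product is consistent
C[0][1] += 9                     # inject a transient fault
print(locate_error(C))           # → (0, 1)
```

The detection cost is one extra row and column of arithmetic, which is the "cost-effective concurrent error detection" property the abstract refers to.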
Probabilistic Analysis Techniques Applied to Complex Spacecraft Power System Modeling
NASA Technical Reports Server (NTRS)
Hojnicki, Jeffrey S.; Rusick, Jeffrey J.
2005-01-01
Electric power system performance predictions are critical to spacecraft, such as the International Space Station (ISS), to ensure that sufficient power is available to support all the spacecraft's power needs. In the case of the ISS power system, analyses to date have been deterministic, meaning that each analysis produces a single-valued result for power capability because of the complexity and large size of the model. As a result, the deterministic ISS analyses did not account for the sensitivity of the power capability to uncertainties in model input variables. Over the last 10 years, the NASA Glenn Research Center has developed advanced, computationally fast, probabilistic analysis techniques and successfully applied them to large (thousands of nodes) complex structural analysis models. These same techniques were recently applied to large, complex ISS power system models. This new application enables probabilistic power analyses that account for input uncertainties and produce results that include variations caused by these uncertainties. Specifically, N&R Engineering, under contract to NASA, integrated these advanced probabilistic techniques with Glenn's internationally recognized ISS power system model, System Power Analysis for Capability Evaluation (SPACE).
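The shift from a deterministic to a probabilistic power analysis can be sketched as Monte Carlo propagation of input uncertainty through a capability model. The model form, parameter values, and distributions below are invented for illustration and are unrelated to the actual SPACE code:

```python
# Hedged sketch: Monte Carlo uncertainty propagation through a toy solar
# power capability model. Instead of one deterministic number, the
# analysis yields a distribution from which percentiles can be read.

import random
import statistics

random.seed(7)

def power_capability(array_area, cell_eff, degradation, insolation=1361.0):
    """Toy model: delivered power in kW (area m^2, solar constant W/m^2)."""
    return array_area * cell_eff * (1 - degradation) * insolation / 1000

# Assumed input uncertainties: Gaussian area and efficiency, uniform degradation.
samples = [power_capability(random.gauss(375, 5),
                            random.gauss(0.145, 0.005),
                            random.uniform(0.00, 0.10))
           for _ in range(10000)]

samples.sort()
print(f"mean {statistics.mean(samples):.1f} kW, "
      f"5th percentile {samples[500]:.1f} kW")
```

The 5th-percentile capability, rather than the single-valued deterministic estimate, is the kind of result that lets planners budget power with an explicit confidence level.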
2006-12-01
[Fragmentary search-result excerpt from a CTR report] … G. Iaccarino and Q. Wang … Strain and stress analysis of uncertain engineering systems, D. Ghosh, C. Farhat and P. Avery, p. 17 … Separated flow in a three-… research in predictive science in complex systems; CTR has strived to maintain a critical mass in numerical analysis, computer science and physics-based … analysis for a linear problem: heat conduction. The design and analysis of complex engineering systems is challenging not only because of the physical…
1983-09-01
General Electromagnetic Model for the Analysis of Complex Systems (GEMACS) Computer Code Documentation, Version 3. The BDM Corporation. Final technical report, February 1981 - July 1983. … [OCR fragments: contract and report numbers illegible] … the t1 and t2 directions on the source patch. METHOD: the electric field at a segment observation point due to source patch j is given by … [equation illegible in source].
How do precision medicine and systems biology respond to the human body's complex adaptability?
Yuan, Bing
2016-12-01
In the field of life sciences, although systems biology and "precision medicine" introduce some complex scientific methods and techniques, they are still based, as a whole, on the "analysis-reconstruction" approach of reductionist theory. The adaptability of a complex system increases the uncertainty of system behaviour as well as the difficulty of precise identification and control, and it also puts systems biology research into difficulty. To grasp the behaviour and characteristics of an organism fundamentally, systems biology has to abandon the "analysis-reconstruction" concept. In accordance with the guidelines of complexity science, systems biology should build organism models at the holistic level, much as Chinese medicine does in dealing with the human body and disease. When we study the living body at the holistic level, we find that the adaptability of a complex system is not an obstacle that increases the difficulty of problem solving; it is an exceptional "right-hand man" that helps us deal with the complexity of life more effectively.
Functional complexity and ecosystem stability: an experimental approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Van Voris, P.; O'Neill, R.V.; Shugart, H.H.
1978-01-01
The complexity-stability hypothesis was experimentally tested using intact terrestrial microcosms. Functional complexity was defined as the number and significance of component interactions (i.e., population interactions, physical-chemical reactions, biological turnover rates) influenced by nonlinearities, feedbacks, and time delays. It was postulated that functional complexity could be nondestructively measured through analysis of a signal generated from the system. Power spectral analysis of hourly CO2 efflux from eleven old-field microcosms was used to count low-frequency peaks and thereby rank the functional complexity of each system. Ranking of ecosystem stability was based on the capacity of the system to retain essential nutrients and was measured by net loss of Ca after the system was stressed. Rank correlation supported the hypothesis that increasing ecosystem functional complexity leads to increasing ecosystem stability. The results indicated that complex functional dynamics can serve to stabilize the system. The results also demonstrated that microcosms are useful tools for system-level investigations.
Complex adaptive systems: concept analysis.
Holden, Lela M
2005-12-01
The aim of this paper is to explicate the concept of complex adaptive systems through an analysis that provides a description, antecedents, consequences, and a model case from the nursing and health care literature. Life is more than atoms and molecules--it is patterns of organization. Complexity science is the latest generation of systems thinking that investigates patterns and has emerged from the exploration of the subatomic world and quantum physics. A key component of complexity science is the concept of complex adaptive systems, and active research is found in many disciplines--from biology to economics to health care. However, the research and literature related to these appealing topics have generated confusion. A thorough explication of complex adaptive systems is needed. A modified application of the methods recommended by Walker and Avant for concept analysis was used. A complex adaptive system is a collection of individual agents with freedom to act in ways that are not always totally predictable and whose actions are interconnected. Examples include a colony of termites, the financial market, and a surgical team. It is often referred to as chaos theory, but the two are not the same. Chaos theory is actually a subset of complexity science. Complexity science offers a powerful new approach--beyond merely looking at clinical processes and the skills of healthcare professionals. The use of complex adaptive systems as a framework is increasing for a wide range of scientific applications, including nursing and healthcare management research. When nursing and other healthcare managers focus on increasing connections, diversity, and interactions they increase information flow and promote creative adaptation referred to as self-organization. Complexity science builds on the rich tradition in nursing that views patients and nursing care from a systems perspective.
Complex multidisciplinary system composition for aerospace vehicle conceptual design
NASA Astrophysics Data System (ADS)
Gonzalez, Lex
Although there exists a vast amount of work concerning the analysis, design, and integration of aerospace vehicle systems, there is no standard for how this data and knowledge should be combined to create a synthesis system. Each institution creating a synthesis system has in-house vehicle and hardware components it is attempting to model and proprietary methods with which to model them. As a result, synthesis systems begin as one-off creations meant to answer a specific problem. As the scope of a synthesis system grows to encompass more and more problems, so do its size and complexity; for a single synthesis system to answer multiple questions, the number of methods and method interfaces must increase. As a means to curtail the requirement that increasing an aircraft synthesis system's capability also increases its size and complexity, this research effort focuses on the idea that each problem in aerospace requires its own analysis framework. By centering the methodology on matching an analysis framework to the problem being solved, the complexity of the analysis framework is decoupled from the complexity of the system that creates it. The derived methodology allows for the composition of complex multidisciplinary systems (CMDS) through the automatic creation and implementation of system and disciplinary method interfaces. The CMDS Composition process follows a four-step methodology meant to take a problem definition and progress toward the creation of an analysis framework meant to answer said problem. The implementation of the CMDS Composition process takes user-selected disciplinary analysis methods and automatically integrates them to create a syntactically composable analysis framework. As a means of assessing the validity of the CMDS Composition process, a prototype system (AVDDBMS) has been developed.
AVDDBMS has been used to model the Generic Hypersonic Vehicle (GHV), an open-source family of hypersonic vehicles originating from the Air Force Research Laboratory. AVDDBMS has been applied in three different ways in order to assess its validity: verification using GHV disciplinary data, validation using selected disciplinary analysis methods, and application of the CMDS Composition process to assess the design solution space for the GHV hardware. The research demonstrates the holistic effect that the selection of individual disciplinary analysis methods has on the structure and integration of the analysis framework.
Designing novel cellulase systems through agent-based modeling and global sensitivity analysis.
Apte, Advait A; Senger, Ryan S; Fong, Stephen S
2014-01-01
Experimental techniques allow engineering of biological systems to modify functionality; however, there still remains a need to develop tools to prioritize targets for modification. In this study, agent-based modeling (ABM) was used to build stochastic models of complexed and non-complexed cellulose hydrolysis, including enzymatic mechanisms for endoglucanase, exoglucanase, and β-glucosidase activity. Modeling results were consistent with experimental observations of higher efficiency in complexed systems than non-complexed systems and established relationships between specific cellulolytic mechanisms and overall efficiency. Global sensitivity analysis (GSA) of model results identified key parameters for improving overall cellulose hydrolysis efficiency including: (1) the cellulase half-life, (2) the exoglucanase activity, and (3) the cellulase composition. Overall, the following parameters were found to significantly influence cellulose consumption in a consolidated bioprocess (CBP): (1) the glucose uptake rate of the culture, (2) the bacterial cell concentration, and (3) the nature of the cellulase enzyme system (complexed or non-complexed). Broadly, these results demonstrate the utility of combining modeling and sensitivity analysis to identify key parameters and/or targets for experimental improvement.
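The sensitivity-analysis step described above can be sketched with a one-at-a-time variance screen on a toy surrogate. The closed-form "efficiency" model, parameter names, and ranges below are all invented for illustration; the paper used agent-based simulation with a full global sensitivity analysis, not this stand-in:

```python
# Hedged sketch: one-at-a-time sensitivity screen. Each parameter is
# varied over its assumed range while the others are held at midpoints;
# the resulting output variance ranks the parameters' influence.

import random
random.seed(1)

def efficiency(half_life, exo_activity, complexed):
    # Toy surrogate: longer-lived, more active, complexed systems do better.
    return (1 - 2 ** (-half_life / 24)) * exo_activity * (1.5 if complexed else 1.0)

ranges = {
    "half_life": (6.0, 72.0),     # hours (assumed)
    "exo_activity": (0.1, 1.0),   # relative units (assumed)
    "complexed": (0, 1),          # binary: non-complexed vs complexed
}

def sensitivity(param, samples=2000):
    """Output variance when only `param` varies; others held at midpoints."""
    mid = {k: (lo + hi) / 2 for k, (lo, hi) in ranges.items()}
    outs = []
    for _ in range(samples):
        x = dict(mid)
        lo, hi = ranges[param]
        x[param] = (random.choice((0, 1)) if param == "complexed"
                    else random.uniform(lo, hi))
        outs.append(efficiency(x["half_life"], x["exo_activity"], x["complexed"]))
    mean = sum(outs) / len(outs)
    return sum((o - mean) ** 2 for o in outs) / len(outs)

for p in ranges:
    print(f"{p:>12}: output variance {sensitivity(p):.4f}")
```

Ranking parameters by the variance they induce is the prioritization idea the study applies: experimental effort goes to the parameters whose uncertainty dominates the output.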
NASA Technical Reports Server (NTRS)
Consoli, Robert David; Sobieszczanski-Sobieski, Jaroslaw
1990-01-01
Advanced multidisciplinary analysis and optimization methods, namely system sensitivity analysis and non-hierarchical system decomposition, are applied to reduce the cost and improve the visibility of an automated vehicle design synthesis process. This process is inherently complex due to the large number of functional disciplines and associated interdisciplinary couplings. Recent developments in system sensitivity analysis as applied to complex non-hierarchic multidisciplinary design optimization problems enable the decomposition of these complex interactions into sub-processes that can be evaluated in parallel. The application of these techniques results in significant cost, accuracy, and visibility benefits for the entire design synthesis process.
AN ADVANCED SYSTEM FOR POLLUTION PREVENTION IN CHEMICAL COMPLEXES
One important accomplishment is that the system gives process engineers interactive and simultaneous use of programs for total cost analysis, life cycle assessment, and sustainability metrics to provide direction for the optimal chemical complex analysis pro...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mitra, Vramori; Sarma, Bornali; Sarma, Arun
Recurrence is a ubiquitous feature which provides deep insights into the dynamics of real dynamical systems. A suitable tool for investigating recurrences is recurrence quantification analysis (RQA). It allows, e.g., the detection of regime transitions with respect to varying control parameters. We investigate the complexity of different coexisting nonlinear dynamical regimes of the plasma floating potential fluctuations at different magnetic fields and discharge voltages by using recurrence quantification variables, in particular DET, L_max, and entropy. The recurrence analysis reveals that the predictability of the system strongly depends on discharge voltage. Furthermore, the persistent behaviour of the plasma time series is characterized by the detrended fluctuation analysis technique to explore the complexity in terms of long-range correlation. The enhancement of the discharge voltage at constant magnetic field increases the nonlinear correlations; hence, the complexity of the system decreases, which corroborates the RQA analysis.
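The RQA variables named above (DET and L_max) can be computed from a recurrence matrix in a few lines; this is a generic sketch of the standard definitions, not the authors' code:

```python
def recurrence_matrix(x, eps):
    """R[i][j] = 1 when states i and j are closer than the threshold eps."""
    n = len(x)
    return [[1 if abs(x[i] - x[j]) < eps else 0 for j in range(n)] for i in range(n)]

def diagonal_lines(R):
    """Lengths of diagonal line segments above the main diagonal.

    The matrix is symmetric, so the upper half suffices for the DET ratio;
    the main diagonal (line of identity) is excluded as usual.
    """
    n = len(R)
    lines = []
    for k in range(1, n):
        run = 0
        for i in range(n - k):
            if R[i][i + k]:
                run += 1
            else:
                if run:
                    lines.append(run)
                run = 0
        if run:
            lines.append(run)
    return lines

def det_and_lmax(R, lmin=2):
    """DET = fraction of recurrence points on diagonal lines of length >= lmin."""
    lines = diagonal_lines(R)
    recurrent = sum(lines)
    determin = sum(l for l in lines if l >= lmin)
    det = determin / recurrent if recurrent else 0.0
    lmax = max(lines) if lines else 0
    return det, lmax
```

For a strictly periodic signal every recurrence point lies on a long diagonal, so DET approaches 1; stochastic signals scatter isolated points and drive DET down, which is what makes DET a predictability measure.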
ADAM: analysis of discrete models of biological systems using computer algebra.
Hinkelmann, Franziska; Brandon, Madison; Guang, Bonny; McNeill, Rustin; Blekherman, Grigoriy; Veliz-Cuba, Alan; Laubenbacher, Reinhard
2011-07-20
Many biological systems are modeled qualitatively with discrete models, such as probabilistic Boolean networks, logical models, Petri nets, and agent-based models, to gain a better understanding of them. The computational complexity to analyze the complete dynamics of these models grows exponentially in the number of variables, which impedes working with complex models. There exist software tools to analyze discrete models, but they either lack the algorithmic functionality to analyze complex models deterministically or they are inaccessible to many users as they require understanding the underlying algorithm and implementation, do not have a graphical user interface, or are hard to install. Efficient analysis methods that are accessible to modelers and easy to use are needed. We propose a method for efficiently identifying attractors and introduce the web-based tool Analysis of Dynamic Algebraic Models (ADAM), which provides this and other analysis methods for discrete models. ADAM converts several discrete model types automatically into polynomial dynamical systems and analyzes their dynamics using tools from computer algebra. Specifically, we propose a method to identify attractors of a discrete model that is equivalent to solving a system of polynomial equations, a long-studied problem in computer algebra. Based on extensive experimentation with both discrete models arising in systems biology and randomly generated networks, we found that the algebraic algorithms presented in this manuscript are fast for systems with the structure maintained by most biological systems, namely sparseness and robustness. For a large set of published complex discrete models, ADAM identified the attractors in less than one second. Discrete modeling techniques are a useful tool for analyzing complex biological systems and there is a need in the biological community for accessible efficient analysis tools. 
ADAM provides analysis methods based on mathematical algorithms as a web-based tool for several different input formats, and it makes analysis of complex models accessible to a larger community, as it is platform independent as a web-service and does not require understanding of the underlying mathematics.
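For intuition, attractor identification can be done by brute force on a small synchronous Boolean network; this exhaustive sketch is exactly the exponential-cost approach that ADAM's algebraic methods are designed to avoid (the two-node network is invented for illustration):

```python
from itertools import product

def attractors(update, n):
    """Find all attractors of a synchronous Boolean network exhaustively.

    `update` maps a state tuple to its successor; `n` is the number of
    nodes. Cost grows as 2^n, which is why algebraic methods are needed
    for large models.
    """
    found = set()
    for state in product((0, 1), repeat=n):
        seen = set()
        s = state
        while s not in seen:          # follow the trajectory until it repeats
            seen.add(s)
            s = update(s)
        cycle = []                    # s is on the attractor: walk the cycle
        c = s
        while True:
            cycle.append(c)
            c = update(c)
            if c == s:
                break
        found.add(frozenset(cycle))
    return found

# Toy 2-node network: x' = y, y' = x (two fixed points and one 2-cycle).
net = lambda s: (s[1], s[0])
atts = attractors(net, 2)
```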
Multivariate analysis: greater insights into complex systems
USDA-ARS?s Scientific Manuscript database
Many agronomic researchers measure and collect multiple response variables in an effort to understand the more complex nature of the system being studied. Multivariate (MV) statistical methods encompass the simultaneous analysis of all random variables (RV) measured on each experimental or sampling ...
State analysis requirements database for engineering complex embedded systems
NASA Technical Reports Server (NTRS)
Bennett, Matthew B.; Rasmussen, Robert D.; Ingham, Michel D.
2004-01-01
It has become clear that spacecraft system complexity is reaching a threshold where customary methods of control are no longer affordable or sufficiently reliable. At the heart of this problem are the conventional approaches to systems and software engineering based on subsystem-level functional decomposition, which fail to scale in the tangled web of interactions typically encountered in complex spacecraft designs. Furthermore, there is a fundamental gap between the requirements on software specified by systems engineers and the implementation of these requirements by software engineers. Software engineers must perform the translation of requirements into software code, hoping to accurately capture the systems engineer's understanding of the system behavior, which is not always explicitly specified. This gap opens up the possibility for misinterpretation of the systems engineer's intent, potentially leading to software errors. This problem is addressed by a systems engineering tool called the State Analysis Database, which provides a means of capturing system and software requirements in the form of explicit models. This paper describes how requirements for complex aerospace systems can be developed using the State Analysis Database.
Analysis of Multilayered Printed Circuit Boards using Computed Tomography
2014-05-01
complex PCBs that present a challenge for any testing or fault analysis. Set-to-work testing and fault analysis of any electronic circuit require... Electronic Warfare and Radar Division in December 2010. He is currently in the Electro-Optic Countermeasures Group. Samuel works on embedded system design... and software optimisation of complex electro-optical systems, including the set to work and characterisation of these systems. He has a Bachelor of
Flavel, Richard J; Guppy, Chris N; Rabbi, Sheikh M R; Young, Iain M
2017-01-01
The objective of this study was to develop a flexible and free image processing and analysis solution, based on the Public Domain ImageJ platform, for the segmentation and analysis of complex biological plant root systems in soil from x-ray tomography 3D images. Contrasting root architectures from wheat, barley and chickpea root systems were grown in soil and scanned using a high resolution micro-tomography system. A macro (Root1) was developed that reliably identified complex root systems with good to high accuracy (10% overestimation for chickpea, 1% underestimation for wheat, 8% underestimation for barley) and provided analysis of root length and angle. In-built flexibility allowed the user to (a) amend any aspect of the macro to account for specific user preferences, and (b) take account of computational limitations of the platform. The platform is free, flexible and accurate in analysing root system metrics.
SEU System Analysis: Not Just the Sum of All Parts
NASA Technical Reports Server (NTRS)
Berg, Melanie D.; Label, Kenneth
2014-01-01
Single event upset (SEU) analysis of complex systems is challenging. Currently, system SEU analysis is performed by component level partitioning and then either: the most dominant SEU cross-sections (SEUs) are used in system error rate calculations; or the partition SEUs are summed to eventually obtain a system error rate. In many cases, system error rates are overestimated because these methods generally overlook system level derating factors. The problem with overestimating is that it can cause overdesign and consequently negatively affect the following: cost, schedule, functionality, and validation/verification. The scope of this presentation is to discuss the risks involved with our current scheme of SEU analysis for complex systems; and to provide alternative methods for improvement.
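The difference between summing raw partition cross sections and applying system-level derating can be sketched in a few lines (the flux, cross-section, and derating values are made up for illustration, not taken from the presentation):

```python
def system_error_rate(partitions, flux):
    """System SEU rate from per-partition cross sections.

    Each partition carries a static cross section (cm^2/device) and a
    derating factor in [0, 1]: the fraction of upsets that actually
    propagate to a system-level error. Summing raw cross sections
    (derating = 1 everywhere) yields the conservative overestimate
    that can drive overdesign.
    """
    return flux * sum(sigma * derate for sigma, derate in partitions)

FLUX = 1e5  # particles / cm^2 / s (hypothetical environment)
raw = system_error_rate([(1e-8, 1.0), (5e-9, 1.0)], FLUX)
derated = system_error_rate([(1e-8, 0.2), (5e-9, 0.5)], FLUX)
```

The gap between `raw` and `derated` is the margin that partition-level summation overbooks when system derating factors are ignored.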
Mattei, Tobias A
2014-12-01
In self-adapting dynamical systems, a significant improvement in the signaling flow among agents constitutes one of the most powerful triggering events for the emergence of new complex behaviors. Ackermann and colleagues' comprehensive phylogenetic analysis of the brain structures involved in acoustic communication provides further evidence of the essential role which speech, as a breakthrough signaling resource, has played in the evolutionary development of human cognition viewed from the standpoint of complex adaptive system analysis.
Engineering Complex Embedded Systems with State Analysis and the Mission Data System
NASA Technical Reports Server (NTRS)
Ingham, Michel D.; Rasmussen, Robert D.; Bennett, Matthew B.; Moncada, Alex C.
2004-01-01
It has become clear that spacecraft system complexity is reaching a threshold where customary methods of control are no longer affordable or sufficiently reliable. At the heart of this problem are the conventional approaches to systems and software engineering based on subsystem-level functional decomposition, which fail to scale in the tangled web of interactions typically encountered in complex spacecraft designs. Furthermore, there is a fundamental gap between the requirements on software specified by systems engineers and the implementation of these requirements by software engineers. Software engineers must perform the translation of requirements into software code, hoping to accurately capture the systems engineer's understanding of the system behavior, which is not always explicitly specified. This gap opens up the possibility for misinterpretation of the systems engineer's intent, potentially leading to software errors. This problem is addressed by a systems engineering methodology called State Analysis, which provides a process for capturing system and software requirements in the form of explicit models. This paper describes how requirements for complex aerospace systems can be developed using State Analysis and how these requirements inform the design of the system software, using representative spacecraft examples.
Manufacturing complexity analysis
NASA Technical Reports Server (NTRS)
Delionback, L. M.
1977-01-01
The analysis of the complexity of a typical system is presented. Starting with the subsystems of an example system, the step-by-step procedure for analysis of the complexity of an overall system is given. The learning curves for the various subsystems are determined as well as the concurrent numbers of relevant design parameters. Then trend curves are plotted for the learning curve slopes versus the various design-oriented parameters, e.g. number of parts versus slope of learning curve, or number of fasteners versus slope of learning curve, etc. Representative cuts are taken from each trend curve, and a figure-of-merit analysis is made for each of the subsystems. Based on these values, a characteristic curve is plotted which is indicative of the complexity of the particular subsystem. Each such characteristic curve is based on a universe of trend curve data taken from data points observed for the subsystem in question. Thus, a characteristic curve is developed for each of the subsystems in the overall system.
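The learning curves discussed above follow the standard log-linear form, where an s% slope means each doubling of quantity reduces unit cost to s% of its previous value (a textbook relation, not the report's specific data):

```python
import math

def unit_cost(t_first, n, slope):
    """Cost of the n-th unit on a learning curve with the given slope.

    T_n = T_1 * n^b with b = log2(slope); e.g. an 85% slope (slope=0.85)
    cuts unit cost to 85% at every doubling of cumulative quantity.
    """
    b = math.log(slope) / math.log(2)
    return t_first * n ** b
```

Fitting `slope` against design parameters (part count, fastener count, and so on) is what the trend curves in the analysis capture: a more complex subsystem tends to have a flatter (higher-percentage) slope.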
General description and understanding of the nonlinear dynamics of mode-locked fiber lasers.
Wei, Huai; Li, Bin; Shi, Wei; Zhu, Xiushan; Norwood, Robert A; Peyghambarian, Nasser; Jian, Shuisheng
2017-05-02
As a type of nonlinear system with complexity, mode-locked fiber lasers are known for their complex behaviour. It is a challenging task to understand the fundamental physics behind such complex behaviour, and a unified description for the nonlinear behaviour and the systematic and quantitative analysis of the underlying mechanisms of these lasers have not been developed. Here, we present a complexity science-based theoretical framework for understanding the behaviour of mode-locked fiber lasers by going beyond reductionism. This hierarchically structured framework provides a model with variable dimensionality, resulting in a simple view that can be used to systematically describe complex states. Moreover, research into the attractors' basins reveals the origin of stochasticity, hysteresis and multistability in these systems and presents a new method for quantitative analysis of these nonlinear phenomena. These findings pave the way for dynamics analysis and system designs of mode-locked fiber lasers. We expect that this paradigm will also enable potential applications in diverse research fields related to complex nonlinear phenomena.
Knowledge Management for the Analysis of Complex Experimentation.
ERIC Educational Resources Information Center
Maule, R.; Schacher, G.; Gallup, S.
2002-01-01
Describes a knowledge management system that was developed to help provide structure for dynamic and static data and to aid in the analysis of complex experimentation. Topics include quantitative and qualitative data; mining operations using artificial intelligence techniques; information architecture of the system; and transforming data into…
Young, David; Borland, Ron; Coghill, Ken
2010-07-01
Complex, transnational issues like the tobacco epidemic are major challenges that defy analysis and management by conventional methods, as are other public health issues, such as those associated with global food distribution and climate change. We examined the evolution of indoor smoke-free regulations, a tobacco control policy innovation, and identified the key attributes of those jurisdictions that successfully pursued this innovation and those that to date have not. In doing so, we employed the actor-network theory, a comprehensive framework for the analysis of fundamental system change. Through our analysis, we identified approaches to help overcome some systemic barriers to the solution of the tobacco problem and comment on other complex transnational problems.
Analysis and Perspective from the Complex Aerospace Systems Exchange (CASE) 2013
NASA Technical Reports Server (NTRS)
Jones, Kennie H.; Parker, Peter A.; Detweiler, Kurt N.; McGowan, Anna-Maria R.; Dress, David A.; Kimmel, William M.
2014-01-01
NASA Langley Research Center embedded four rapporteurs at the Complex Aerospace Systems Exchange (CASE) held in August 2013 with the objective to capture the essence of the conference presentations and discussions. CASE was established to provide a discussion forum among chief engineers, program managers, and systems engineers on challenges in the engineering of complex aerospace systems. The meeting consists of invited presentations and panels from industry, academia, and government followed by discussions among attendees. This report presents the major and reoccurring themes captured throughout the meeting and provides analysis and insights to further the CASE mission.
Fault management for the Space Station Freedom control center
NASA Technical Reports Server (NTRS)
Clark, Colin; Jowers, Steven; Mcnenny, Robert; Culbert, Chris; Kirby, Sarah; Lauritsen, Janet
1992-01-01
This paper describes model-based reasoning fault isolation in complex systems using automated digraph analysis. It discusses the use of the digraph representation as the paradigm for modeling physical systems and a method for executing these failure models to provide real-time failure analysis. It also discusses the generality, ease of development and maintenance, complexity management, and susceptibility to verification and validation of digraph failure models. It specifically describes how a NASA-developed digraph evaluation tool and an automated process working with that tool can identify failures in a monitored system when supplied with one or more fault indications. This approach is well suited to commercial applications of real-time failure analysis in complex systems because it is both powerful and cost effective.
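The digraph fault-isolation idea can be sketched as reverse reachability: a node is a candidate root cause if every observed fault indication is reachable from it in the failure-propagation digraph (the model and node names below are hypothetical, not drawn from NASA's tool):

```python
def candidate_causes(digraph, observed):
    """Model-based fault isolation on a failure-propagation digraph.

    `digraph` maps each failure node to the set of nodes it can cause.
    A candidate root cause is any node from which every observed fault
    indication is reachable.
    """
    def reachable(src):
        seen, stack = {src}, [src]
        while stack:
            for nxt in digraph.get(stack.pop(), ()):
                if nxt not in seen:
                    seen.add(nxt)
                    stack.append(nxt)
        return seen

    return {n for n in digraph if observed <= reachable(n)}

# Hypothetical failure model: a pump failure propagates to low flow,
# which propagates to high temperature; a sensor fault can also raise
# a (spurious) high-temperature indication.
model = {
    "pump_fail": {"low_flow"},
    "low_flow": {"high_temp"},
    "sensor_fail": {"high_temp"},
    "high_temp": set(),
}
```

Observing both `low_flow` and `high_temp` exonerates `sensor_fail`, illustrating how multiple fault indications narrow the candidate set.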
Time Factor in the Theory of Anthropogenic Risk Prediction in Complex Dynamic Systems
NASA Astrophysics Data System (ADS)
Ostreikovsky, V. A.; Shevchenko, Ye N.; Yurkov, N. K.; Kochegarov, I. I.; Grishko, A. K.
2018-01-01
The article reviews anthropogenic risk models that take into account the development over time of different factors that influence a complex system. Three classes of mathematical models have been analyzed for use in assessing the anthropogenic risk of complex dynamic systems. These models take the time factor into consideration in determining the prospect of safety change of critical systems. The originality of the study is in the analysis of five time postulates in the theory of anthropogenic risk and the safety of highly important objects. It has to be stressed that the given postulates are still rarely used in practical assessment of the equipment service life of critically important systems. That is why the results of the study presented in the article can be used in safety engineering and analysis of critically important complex technical systems.
A new decision sciences for complex systems.
Lempert, Robert J
2002-05-14
Models of complex systems can capture much useful information but can be difficult to apply to real-world decision-making because the type of information they contain is often inconsistent with that required for traditional decision analysis. New approaches, which use inductive reasoning over large ensembles of computational experiments, now make possible systematic comparison of alternative policy options using models of complex systems. This article describes Computer-Assisted Reasoning, an approach to decision-making under conditions of deep uncertainty that is ideally suited to applying complex systems to policy analysis. The article demonstrates the approach on the policy problem of global climate change, with a particular focus on the role of technology policies in a robust, adaptive strategy for greenhouse gas abatement.
Scope Complexity Options Risks Excursions (SCORE) Factor Mathematical Description.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gearhart, Jared Lee; Samberson, Jonell Nicole; Shettigar, Subhasini
The purpose of the Scope, Complexity, Options, Risks, Excursions (SCORE) model is to estimate the relative complexity of design variants of future warhead options, resulting in scores. SCORE factors extend this capability by providing estimates of complexity relative to a base system (i.e., all design options are normalized to one weapon system). First, a clearly defined set of scope elements for a warhead option is established. The complexity of each scope element is estimated by Subject Matter Experts (SMEs), including a level of uncertainty, relative to a specific reference system. When determining factors, complexity estimates for a scope element can be directly tied to the base system or chained together via comparable scope elements in a string of reference systems that ends with the base system. The SCORE analysis process is a growing multi-organizational Nuclear Security Enterprise (NSE) effort, under the management of the NA-12 led Enterprise Modeling and Analysis Consortium (EMAC). Historically, it has provided the data elicitation, integration, and computation needed to support the out-year Life Extension Program (LEP) cost estimates included in the Stockpile Stewardship Management Plan (SSMP).
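Chaining complexity estimates through reference systems amounts to multiplying pairwise ratios until the base system is reached; a minimal sketch (the ratio values are invented for illustration):

```python
def chained_factor(ratios):
    """Complexity factor relative to the base system via a reference chain.

    Each entry is an SME estimate of a scope element's complexity relative
    to the next system in the chain; the last entry is relative to the
    base system. Multiplying along the chain normalizes the estimate to
    the base system, as SCORE factors require.
    """
    factor = 1.0
    for r in ratios:
        factor *= r
    return factor

# Option A is 1.2x system X, which is 0.9x system Y, which is 1.5x the base.
factor_vs_base = chained_factor([1.2, 0.9, 1.5])
```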
Visual analysis and exploration of complex corporate shareholder networks
NASA Astrophysics Data System (ADS)
Tekušová, Tatiana; Kohlhammer, Jörn
2008-01-01
The analysis of large corporate shareholder network structures is an important task in corporate governance, in financing, and in financial investment domains. In a modern economy, large structures of cross-corporation, cross-border shareholder relationships exist, forming complex networks. These networks are often difficult to analyze with traditional approaches. An efficient visualization of the networks helps to reveal the interdependent shareholding formations and the controlling patterns. In this paper, we propose an effective visualization tool that supports the financial analyst in understanding complex shareholding networks. We develop an interactive visual analysis system by combining state-of-the-art visualization technologies with economic analysis methods. Our system can reveal patterns in large corporate shareholder networks, allows the visual identification of the ultimate shareholders, and supports the visual analysis of integrated cash flow and control rights. We apply our system to an extensive real-world database of shareholder relationships, showing its usefulness for effective visual analysis.
Translations on USSR Science and Technology, Biomedical and Behavioral Sciences, Number 15
1977-11-16
processed. By applying systems theory to synthesis of complex man-machine systems we form ergatic organisms which not only have external and internal... without exception (and this is extremely important to emphasize) as a complex, integral formation, which through various traditions has acquired a... and outputs of the whole, which has a complex internal organization and structure, which we can no longer ignore in our analysis. Thus analysis and
Social networks as embedded complex adaptive systems.
Benham-Hutchins, Marge; Clancy, Thomas R
2010-09-01
As systems evolve over time, their natural tendency is to become increasingly more complex. Studies in the field of complex systems have generated new perspectives on management in social organizations such as hospitals. Much of this research appears as a natural extension of the cross-disciplinary field of systems theory. This is the 15th in a series of articles applying complex systems science to the traditional management concepts of planning, organizing, directing, coordinating, and controlling. In this article, the authors discuss healthcare social networks as a hierarchy of embedded complex adaptive systems. The authors further examine the use of social network analysis tools as a means to understand complex communication patterns and reduce medical errors.
Reliability/safety analysis of a fly-by-wire system
NASA Technical Reports Server (NTRS)
Brock, L. D.; Goddman, H. A.
1980-01-01
An analysis technique has been developed to estimate the reliability of a very complex, safety-critical system by constructing a diagram of the reliability equations for the total system. This diagram has many of the characteristics of a fault-tree or success-path diagram, but is much easier to construct for complex redundant systems. The diagram provides insight into system failure characteristics and identifies the most likely failure modes. A computer program aids in the construction of the diagram and the computation of reliability. Analysis of the NASA F-8 Digital Fly-by-Wire Flight Control System is used to illustrate the technique.
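The reliability-equation diagram described above bottoms out in the standard series/parallel composition rules, which a short sketch makes concrete (the block reliabilities are illustrative, not the F-8 system's values):

```python
def series(rels):
    """All blocks must work: reliabilities multiply."""
    r = 1.0
    for x in rels:
        r *= x
    return r

def parallel(rels):
    """Redundant blocks: the group fails only if every block fails."""
    q = 1.0
    for x in rels:
        q *= (1.0 - x)
    return 1.0 - q

# Hypothetical triplex-redundant channel feeding a single actuator.
system_rel = series([parallel([0.99] * 3), 0.999])
```

Nesting `series` and `parallel` calls reproduces the structure of a reliability diagram directly, which is why such diagrams are easier to construct for redundant systems than full fault trees.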
Systemic Analysis Approaches for Air Transportation
NASA Technical Reports Server (NTRS)
Conway, Sheila
2005-01-01
Air transportation system designers have had only limited success using traditional operations research and parametric modeling approaches in their analyses of innovations. They need a systemic methodology for modeling of safety-critical infrastructure that is comprehensive, objective, and sufficiently concrete, yet simple enough to be used with reasonable investment. The methodology must also be amenable to quantitative analysis so issues of system safety and stability can be rigorously addressed. However, air transportation has proven itself an extensive, complex system whose behavior is difficult to describe, no less predict. There is a wide range of system analysis techniques available, but some are more appropriate for certain applications than others. Specifically in the area of complex system analysis, the literature suggests that both agent-based models and network analysis techniques may be useful. This paper discusses the theoretical basis for each approach in these applications, and explores their historic and potential further use for air transportation analysis.
Complex Physical, Biophysical and Econophysical Systems
NASA Astrophysics Data System (ADS)
Dewar, Robert L.; Detering, Frank
1. Introduction to complex and econophysics systems: a navigation map / T. Aste and T. Di Matteo -- 2. An introduction to fractional diffusion / B. I. Henry, T.A.M. Langlands and P. Straka -- 3. Space plasmas and fusion plasmas as complex systems / R. O. Dendy -- 4. Bayesian data analysis / M. S. Wheatland -- 5. Inverse problems and complexity in earth system science / I. G. Enting -- 6. Applied fluid chaos: designing advection with periodically reoriented flows for micro to geophysical mixing and transport enhancement / G. Metcalfe -- 7. Approaches to modelling the dynamical activity of brain function based on the electroencephalogram / D. T. J. Liley and F. Frascoli -- 8. Jaynes' maximum entropy principle, Riemannian metrics and generalised least action bound / R. K. Niven and B. Andresen -- 9. Complexity, post-genomic biology and gene expression programs / R. B. H. Williams and O. J.-H. Luo -- 10. Tutorials on agent-based modelling with NetLogo and network analysis with Pajek / M. J. Berryman and S. D. Angus.
Guyon, Hervé; Falissard, Bruno; Kop, Jean-Luc
2017-01-01
Network Analysis is considered as a new method that challenges Latent Variable models in inferring psychological attributes. With Network Analysis, psychological attributes are derived from a complex system of components without the need to call on any latent variables. But the ontological status of psychological attributes is not adequately defined with Network Analysis, because a psychological attribute is both a complex system and a property emerging from this complex system. The aim of this article is to reappraise the legitimacy of latent variable models by engaging in an ontological and epistemological discussion on psychological attributes. Psychological attributes relate to the mental equilibrium of individuals embedded in their social interactions, as robust attractors within complex dynamic processes with emergent properties, distinct from physical entities located in precise areas of the brain. Latent variables thus possess legitimacy, because the emergent properties can be conceptualized and analyzed on the sole basis of their manifestations, without exploring the upstream complex system. However, in opposition to the usual Latent Variable models, this article is in favor of the integration of a dynamic system of manifestations. Latent Variable models and Network Analysis thus appear as complementary approaches. New approaches combining Latent Network Models and Network Residuals are certainly a promising new way to infer psychological attributes, placing psychological attributes in an inter-subjective dynamic approach. Pragmatism-realism appears as the epistemological framework required if we are to use latent variables as representations of psychological attributes.
Ross, Amy M; Ilic, Kelley; Kiyoshi-Teo, Hiroko; Lee, Christopher S
2017-12-26
The purpose of this study was to establish the psychometric properties of the new 16-item leadership environment scale. The leadership environment scale was based on complexity science concepts relevant to complex adaptive health care systems. A workforce survey of direct-care nurses was conducted (n = 1,443) in Oregon. Confirmatory factor analysis, exploratory factor analysis, concordant validity test and reliability tests were conducted to establish the structure and internal consistency of the leadership environment scale. Confirmatory factor analysis indices approached acceptable thresholds of fit with a single factor solution. Exploratory factor analysis showed improved fit with a two-factor model solution; the factors were labelled 'influencing relationships' and 'interdependent system supports'. Moderate to strong convergent validity was observed between the leadership environment scale/subscales and both the nursing workforce index and the safety organising scale. Reliability of the leadership environment scale and subscales was strong, with all alphas ≥.85. The leadership environment scale is structurally sound and reliable. Nursing management can employ adaptive complexity leadership attributes, measure their influence on the leadership environment, subsequently modify system supports and relationships and improve the quality of health care systems. The leadership environment scale is an innovative fit to complex adaptive systems and how nurses act as leaders within these systems. © 2017 John Wiley & Sons Ltd.
System for decision analysis support on complex waste management issues
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shropshire, D.E.
1997-10-01
A software system called the Waste Flow Analysis has been developed and applied to complex environmental management processes for the United States Department of Energy (US DOE). The system can evaluate proposed methods of waste retrieval, treatment, storage, transportation, and disposal. Analysts can evaluate various scenarios to see the impacts to waste flows and schedules, costs, and health and safety risks. Decision analysis capabilities have been integrated into the system to help identify preferred alternatives based on specific objectives. Objectives may be to maximize the waste moved to final disposition during a given time period, minimize health risks, minimize costs, or combinations of objectives. The decision analysis capabilities can support evaluation of large and complex problems rapidly, and under conditions of variable uncertainty. The system is being used to evaluate environmental management strategies to safely disposition wastes in the next ten years and reduce the environmental legacy resulting from nuclear material production over the past forty years.
PeTTSy: a computational tool for perturbation analysis of complex systems biology models.
Domijan, Mirela; Brown, Paul E; Shulgin, Boris V; Rand, David A
2016-03-10
Over the last decade sensitivity analysis techniques have been shown to be very useful to analyse complex and high dimensional Systems Biology models. However, many of the currently available toolboxes have either used parameter sampling, been focused on a restricted set of model observables of interest, studied optimisation of an objective function, or have not dealt with multiple simultaneous model parameter changes where the changes can be permanent or temporary. Here we introduce our new, freely downloadable toolbox, PeTTSy (Perturbation Theory Toolbox for Systems). PeTTSy is a package for MATLAB which implements a wide array of techniques for the perturbation theory and sensitivity analysis of large and complex ordinary differential equation (ODE) based models. PeTTSy is a comprehensive modelling framework that introduces a number of new approaches and that fully addresses analysis of oscillatory systems. It examines sensitivity analysis of the models to perturbations of parameters, where the perturbation timing, strength, length and overall shape can be controlled by the user. This can be done in a system-global setting, namely, the user can determine how many parameters to perturb, by how much and for how long. PeTTSy also offers the user the ability to explore the effect of the parameter perturbations on many different types of outputs: period, phase (timing of peak) and model solutions. PeTTSy can be employed on a wide range of mathematical models including free-running and forced oscillators and signalling systems. To enable experimental optimisation using the Fisher Information Matrix it efficiently allows one to combine multiple variants of a model (i.e. a model with multiple experimental conditions) in order to determine the value of new experiments. It is especially useful in the analysis of large and complex models involving many variables and parameters.
PeTTSy is a comprehensive tool for analysing large and complex models of regulatory and signalling systems. It allows for simulation and analysis of models under a variety of environmental conditions and for experimental optimisation of complex combined experiments. With its unique set of tools it makes a valuable addition to the current library of sensitivity analysis toolboxes. We believe that this software will be of great use to the wider biological, systems biology and modelling communities.
SMART: A Propositional Logic-Based Trade Analysis and Risk Assessment Tool for a Complex Mission
NASA Technical Reports Server (NTRS)
Ono, Masahiro; Nicholas, Austin; Alibay, Farah; Parrish, Joseph
2015-01-01
This paper introduces a new trade analysis software called the Space Mission Architecture and Risk Analysis Tool (SMART). This tool supports a high-level system trade study on a complex mission, such as a potential Mars Sample Return (MSR) mission, in an intuitive and quantitative manner. In a complex mission, a common approach to increase the probability of success is to have redundancy and prepare backups. Quantitatively evaluating the utility of adding redundancy to a system is important but not straightforward, particularly when the failures of parallel subsystems are correlated.
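The effect of correlated failures on the value of redundancy can be illustrated with a small probability sketch. This is not SMART's propositional-logic machinery; the common-cause model and all probabilities here are invented for illustration:

```python
# Failure probability of n parallel subsystems under a common-cause model.
# All probabilities below are notional assumptions.

def p_system_failure(p_independent, p_common, n_parallel):
    """System fails if the shared cause strikes (p_common) or, otherwise,
    if every one of the n parallel subsystems fails independently."""
    return p_common + (1.0 - p_common) * p_independent ** n_parallel

# Independent backups: redundancy drives failure probability down fast.
print(p_system_failure(0.1, 0.0, 2))    # ~0.01
# Correlated failures: the common cause puts a floor under the benefit.
print(p_system_failure(0.1, 0.05, 2))   # ~0.06
```

With the common-cause term included, adding further backups barely improves the system: the shared failure mode, not the number of strings, dominates the risk.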
Recurrence quantity analysis based on matrix eigenvalues
NASA Astrophysics Data System (ADS)
Yang, Pengbo; Shang, Pengjian
2018-06-01
The recurrence plot is a powerful tool for visualization and analysis of dynamical systems. Recurrence quantification analysis (RQA), based on point density and diagonal and vertical line structures in recurrence plots, provides alternative measures to quantify the complexity of dynamical systems. In this paper, we present a new measure based on the recurrence matrix to quantify the dynamical properties of a given system. Matrix eigenvalues can reflect the basic characteristics of complex systems, so we characterize the properties of the system by exploring the eigenvalues of the recurrence matrix. Considering that Shannon entropy has been defined as a complexity measure, we propose the entropy of matrix eigenvalues (EOME) as a new RQA measure. We confirm that EOME can be used as a metric to quantify the behavior changes of the system. As a given dynamical system changes from a non-chaotic to a chaotic regime, the EOME increases as well. Bigger EOME values imply higher complexity and lower predictability. We also study the effect of some factors on EOME, including data length, recurrence threshold, embedding dimension, and additional noise. Finally, we demonstrate an application in physiology. The advantage of this measure lies in its high sensitivity and simple computation.
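A minimal sketch of the EOME idea, assuming a simple distance-threshold recurrence matrix and Shannon entropy over the normalized absolute eigenvalue spectrum (the paper's exact normalization, embedding, and parameter choices may differ):

```python
import numpy as np

def recurrence_matrix(x, threshold):
    """Binary recurrence matrix: R[i, j] = 1 if |x_i - x_j| < threshold."""
    d = np.abs(x[:, None] - x[None, :])
    return (d < threshold).astype(float)

def eome(x, threshold):
    """Entropy of matrix eigenvalues: Shannon entropy of the normalized
    absolute eigenvalue spectrum of the recurrence matrix."""
    r = recurrence_matrix(np.asarray(x, dtype=float), threshold)
    lam = np.abs(np.linalg.eigvalsh(r))   # recurrence matrix is symmetric
    p = lam / lam.sum()
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

rng = np.random.default_rng(0)
periodic = np.sin(np.linspace(0, 8 * np.pi, 300))
noisy = rng.standard_normal(300)
# The paper reports higher EOME for more chaotic, less predictable regimes.
print(eome(periodic, 0.2), eome(noisy, 0.2))
```

Because the recurrence matrix is symmetric, `eigvalsh` suffices and the spectrum is real; the entropy is bounded by log of the series length.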
Fuel cell on-site integrated energy system parametric analysis of a residential complex
NASA Technical Reports Server (NTRS)
Simons, S. N.
1977-01-01
A parametric energy-use analysis was performed for a large apartment complex served by a fuel cell on-site integrated energy system (OS/IES). The variables parameterized include operating characteristics for four phosphoric acid fuel cells, eight OS/IES energy recovery systems, and four climatic locations. The annual fuel consumption for selected parametric combinations is presented, and a breakeven economic analysis is presented for one parametric combination. The results show fuel cell electrical efficiency and system component choice have the greatest effect on annual fuel consumption; fuel cell thermal efficiency and geographic location have less of an effect.
A Multifaceted Mathematical Approach for Complex Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alexander, F.; Anitescu, M.; Bell, J.
2012-03-07
Applied mathematics has an important role to play in developing the tools needed for the analysis, simulation, and optimization of complex problems. These efforts require the development of the mathematical foundations for scientific discovery, engineering design, and risk analysis based on a sound integrated approach for the understanding of complex systems. However, maximizing the impact of applied mathematics on these challenges requires a novel perspective on approaching the mathematical enterprise. Previous reports that have surveyed the DOE's research needs in applied mathematics have played a key role in defining research directions with the community. Although these reports have had significant impact, accurately assessing current research needs requires an evaluation of today's challenges against the backdrop of recent advances in applied mathematics and computing. To address these needs, the DOE Applied Mathematics Program sponsored a Workshop for Mathematics for the Analysis, Simulation and Optimization of Complex Systems on September 13-14, 2011. The workshop had approximately 50 participants from both the national labs and academia. The goal of the workshop was to identify new research areas in applied mathematics that will complement and enhance the existing DOE ASCR Applied Mathematics Program efforts that are needed to address problems associated with complex systems. This report describes recommendations from the workshop and subsequent analysis of the workshop findings by the organizing committee.
Health care organizations as complex systems: new perspectives on design and management.
McDaniel, Reuben R; Driebe, Dean J; Lanham, Holly Jordan
2013-01-01
We discuss the impact of complexity science on the design and management of health care organizations over the past decade. We provide an overview of complexity science issues and their impact on thinking about health care systems, particularly with the rising importance of information systems. We also present a complexity science perspective on current issues in today's health care organizations and suggest ways that this perspective might help in approaching these issues. We review selected research, focusing on work in which we participated, to identify specific examples of applications of complexity science. We then take a look at information systems in health care organizations from a complexity viewpoint. Complexity science is a fundamentally different way of understanding nature and has influenced the thinking of scholars and practitioners as they have attempted to understand health care organizations. Many scholars study health care organizations as complex adaptive systems and through this perspective develop new management strategies. Most important, perhaps, is the understanding that attention to relationships and interdependencies is critical for developing effective management strategies. Increased understanding of complexity science can enhance the ability of researchers and practitioners to develop new ways of understanding and improving health care organizations. This analysis opens new vistas for scholars and practitioners attempting to understand health care organizations as complex adaptive systems. The analysis holds value for those already familiar with this approach as well as those who may not be as familiar.
Using multi-criteria analysis of simulation models to understand complex biological systems
Maureen C. Kennedy; E. David Ford
2011-01-01
Scientists frequently use computer-simulation models to help solve complex biological problems. Typically, such models are highly integrated, they produce multiple outputs, and standard methods of model analysis are ill suited for evaluating them. We show how multi-criteria optimization with Pareto optimality allows for model outputs to be compared to multiple system...
Methodological aspects of fuel performance system analysis at raw hydrocarbon processing plants
NASA Astrophysics Data System (ADS)
Kulbjakina, A. V.; Dolotovskij, I. V.
2018-01-01
The article discusses the methodological aspects of fuel performance system analysis at raw hydrocarbon (RH) processing plants. Modern RH processing facilities are major consumers of energy resources (ER) for their own needs. Reducing ER consumption, including fuel, and developing a rational fuel system structure are complex and relevant scientific tasks that can only be accomplished using system analysis and complex system synthesis. In accordance with the principles of system analysis, the hierarchical structure of the fuel system, the block scheme for the synthesis of the most efficient alternative of the fuel system using mathematical models, and the set of performance criteria have been developed at the main stages of the study. Results from the introduction of specific engineering solutions to develop their own energy supply sources for RH processing facilities are provided.
Challenges in the analysis of complex systems: introduction and overview
NASA Astrophysics Data System (ADS)
Hastings, Harold M.; Davidsen, Jörn; Leung, Henry
2017-12-01
One of the main challenges of modern physics is to provide a systematic understanding of systems far from equilibrium exhibiting emergent behavior. Prominent examples of such complex systems include, but are not limited to the cardiac electrical system, the brain, the power grid, social systems, material failure and earthquakes, and the climate system. Due to the technological advances over the last decade, the amount of observations and data available to characterize complex systems and their dynamics, as well as the capability to process that data, has increased substantially. The present issue discusses a cross section of the current research on complex systems, with a focus on novel experimental and data-driven approaches to complex systems that provide the necessary platform to model the behavior of such systems.
Deconstructing the core dynamics from a complex time-lagged regulatory biological circuit.
Eriksson, O; Brinne, B; Zhou, Y; Björkegren, J; Tegnér, J
2009-03-01
Complex regulatory dynamics is ubiquitous in molecular networks composed of genes and proteins. Recent progress in computational biology and its application to molecular data generate a growing number of complex networks. Yet, it has been difficult to understand the governing principles of these networks beyond graphical analysis or extensive numerical simulations. Here the authors exploit several simplifying biological circumstances that make it possible to directly detect the underlying dynamical regularities driving periodic oscillations in a dynamical nonlinear computational model of a protein-protein network. System analysis is performed using the cell cycle, a mathematically well-described complex regulatory circuit driven by external signals. By introducing an explicit time delay and using a 'tearing-and-zooming' approach the authors reduce the system to a piecewise linear system with two variables that captures the dynamics of this complex network. A key step in the analysis is the identification of functional subsystems by identifying the relations between state variables within the model. These functional subsystems are referred to as dynamical modules operating as sensitive switches in the original complex model. By using reduced mathematical representations of the subsystems the authors derive explicit conditions on how the cell cycle dynamics depends on system parameters, and can, for the first time, analyse and prove global conditions for system stability. The approach, which includes utilising simplifying biological conditions, identification of dynamical modules, and mathematical reduction of the model complexity, may be applicable to other well-characterised biological regulatory circuits. [Includes supplementary material].
NASA Astrophysics Data System (ADS)
Donges, Jonathan; Heitzig, Jobst; Beronov, Boyan; Wiedermann, Marc; Runge, Jakob; Feng, Qing Yi; Tupikina, Liubov; Stolbova, Veronika; Donner, Reik; Marwan, Norbert; Dijkstra, Henk; Kurths, Jürgen
2016-04-01
We introduce the pyunicorn (Pythonic unified complex network and recurrence analysis toolbox) open source software package for applying and combining modern methods of data analysis and modeling from complex network theory and nonlinear time series analysis. pyunicorn is a fully object-oriented and easily parallelizable package written in the language Python. It allows for the construction of functional networks such as climate networks in climatology or functional brain networks in neuroscience representing the structure of statistical interrelationships in large data sets of time series and, subsequently, investigating this structure using advanced methods of complex network theory such as measures and models for spatial networks, networks of interacting networks, node-weighted statistics, or network surrogates. Additionally, pyunicorn provides insights into the nonlinear dynamics of complex systems as recorded in uni- and multivariate time series from a non-traditional perspective by means of recurrence quantification analysis, recurrence networks, visibility graphs, and construction of surrogate time series. The range of possible applications of the library is outlined, drawing on several examples mainly from the field of climatology. pyunicorn is available online at https://github.com/pik-copan/pyunicorn. Reference: J.F. Donges, J. Heitzig, B. Beronov, M. Wiedermann, J. Runge, Q.-Y. Feng, L. Tupikina, V. Stolbova, R.V. Donner, N. Marwan, H.A. Dijkstra, and J. Kurths, Unified functional network and nonlinear time series analysis for complex systems science: The pyunicorn package, Chaos 25, 113101 (2015), DOI: 10.1063/1.4934554, Preprint: arxiv.org:1507.01571 [physics.data-an].
Sensitivity Analysis of Multidisciplinary Rotorcraft Simulations
NASA Technical Reports Server (NTRS)
Wang, Li; Diskin, Boris; Biedron, Robert T.; Nielsen, Eric J.; Bauchau, Olivier A.
2017-01-01
A multidisciplinary sensitivity analysis of rotorcraft simulations involving tightly coupled high-fidelity computational fluid dynamics and comprehensive analysis solvers is presented and evaluated. An unstructured sensitivity-enabled Navier-Stokes solver, FUN3D, and a nonlinear flexible multibody dynamics solver, DYMORE, are coupled to predict the aerodynamic loads and structural responses of helicopter rotor blades. A discretely-consistent adjoint-based sensitivity analysis available in FUN3D provides sensitivities arising from unsteady turbulent flows and unstructured dynamic overset meshes, while a complex-variable approach is used to compute DYMORE structural sensitivities with respect to aerodynamic loads. The multidisciplinary sensitivity analysis is conducted through integrating the sensitivity components from each discipline of the coupled system. Numerical results verify accuracy of the FUN3D/DYMORE system by conducting simulations for a benchmark rotorcraft test model and comparing solutions with established analyses and experimental data. Complex-variable implementation of sensitivity analysis of DYMORE and the coupled FUN3D/DYMORE system is verified by comparing with real-valued analysis and sensitivities. Correctness of adjoint formulations for FUN3D/DYMORE interfaces is verified by comparing adjoint-based and complex-variable sensitivities. Finally, sensitivities of the lift and drag functions obtained by complex-variable FUN3D/DYMORE simulations are compared with sensitivities computed by the multidisciplinary sensitivity analysis, which couples adjoint-based flow and grid sensitivities of FUN3D and FUN3D/DYMORE interfaces with complex-variable sensitivities of DYMORE structural responses.
A Simplified Approach to Risk Assessment Based on System Dynamics: An Industrial Case Study.
Garbolino, Emmanuel; Chery, Jean-Pierre; Guarnieri, Franck
2016-01-01
Seveso plants are complex sociotechnical systems, which makes it appropriate to support any risk assessment with a model of the system. However, more often than not, this step is only partially addressed, simplified, or avoided in safety reports. At the same time, investigations have shown that the complexity of industrial systems is frequently a factor in accidents, due to interactions between their technical, human, and organizational dimensions. In order to handle both this complexity and changes in the system over time, this article proposes an original and simplified qualitative risk evaluation method based on the system dynamics theory developed by Forrester in the early 1960s. The methodology supports the development of a dynamic risk assessment framework dedicated to industrial activities. It consists of 10 complementary steps grouped into two main activities: system dynamics modeling of the sociotechnical system and risk analysis. This system dynamics risk analysis is applied to a case study of a chemical plant and provides a way to assess the technological and organizational components of safety. © 2016 Society for Risk Analysis.
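The stock-and-flow core of a Forrester-style system dynamics model can be sketched in a few lines. This is purely illustrative; the stock, flows, and rates below are invented, not taken from the article's chemical-plant case study:

```python
# One stock with an inflow and a stock-dependent outflow, integrated by
# simple Euler steps. Names and rates are invented for illustration.
def simulate(steps=50, dt=1.0):
    stock = 100.0                       # e.g. tonnes of material on site
    history = [stock]
    for _ in range(steps):
        inflow = 10.0                   # constant production rate
        outflow = 0.12 * stock          # treatment proportional to stock
        stock += dt * (inflow - outflow)
        history.append(stock)
    return history

h = simulate()
print(h[-1])   # approaches the equilibrium inflow / rate = 10 / 0.12 ≈ 83.3
```

The negative feedback loop (outflow grows with the stock) is what produces the stable attractor; risk scenarios are then explored by perturbing rates or adding delays.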
Verification of Space Station Secondary Power System Stability Using Design of Experiment
NASA Technical Reports Server (NTRS)
Karimi, Kamiar J.; Booker, Andrew J.; Mong, Alvin C.; Manners, Bruce
1998-01-01
This paper describes analytical methods used in verification of large DC power systems with applications to the International Space Station (ISS). Large DC power systems contain many switching power converters with negative resistor characteristics. The ISS power system presents numerous challenges with respect to system stability such as complex sources and undefined loads. The Space Station program has developed impedance specifications for sources and loads. The overall approach to system stability consists of specific hardware requirements coupled with extensive system analysis and testing. Testing of large complex distributed power systems is not practical due to the size and complexity of the system. Computer modeling has been extensively used to develop hardware specifications as well as to identify system configurations for lab testing. The statistical method of Design of Experiments (DoE) is used as an analysis tool for verification of these large systems. DoE reduces the number of computer runs which are necessary to analyze the performance of a complex power system consisting of hundreds of DC/DC converters. DoE also provides valuable information about the effect of changes in system parameters on the performance of the system. DoE provides information about various operating scenarios and identifies those with potential for instability. In this paper we describe how we have used computer modeling to analyze a large DC power system. A brief description of DoE is given. Examples applying DoE to analysis and verification of the ISS power system are provided.
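The main-effect calculation at the heart of DoE can be illustrated with a toy two-level full-factorial design. The response function and factor names are invented stand-ins, not the ISS converter models:

```python
from itertools import product

# Two-level full-factorial design with three factors. The response is an
# invented linear stand-in for a simulated stability margin.
def response(load, line_impedance, converter_gain):
    return 10.0 - 2.0 * load - 1.5 * line_impedance + 0.5 * converter_gain

factors = ["load", "line_impedance", "converter_gain"]
runs = list(product([-1, +1], repeat=len(factors)))   # 2^3 = 8 runs
y = [response(*r) for r in runs]

# Main effect = mean response at the +1 level minus mean at the -1 level.
# For a linear response this is twice the coefficient:
# load -4.0, line_impedance -3.0, converter_gain +1.0
for i, name in enumerate(factors):
    hi = [yi for r, yi in zip(runs, y) if r[i] == +1]
    lo = [yi for r, yi in zip(runs, y) if r[i] == -1]
    print(f"{name}: main effect = {sum(hi) / len(hi) - sum(lo) / len(lo):+.1f}")
```

Screening designs of this kind reveal which parameters dominate stability, so only those combinations need full simulation or lab testing.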
Understanding health system reform - a complex adaptive systems perspective.
Sturmberg, Joachim P; O'Halloran, Di M; Martin, Carmel M
2012-02-01
Everyone wants a sustainable, well-functioning health system. However, this notion has different meanings for policy makers and funders than for clinicians and patients. The former perceive public policy and economic constraints, the latter clinical or patient-centred strategies, as the means to achieving a desired outcome. Theoretical development and critical analysis of a complex health system model. We introduce the concept of the health care vortex as a metaphor by which to understand the complex adaptive nature of health systems, and the degree to which their behaviour is predetermined by their 'shared values' or attractors. We contrast the likely functions and outcomes of a health system with a people-centred attractor and one with a financial attractor. This analysis suggests a shift in the system's attractor is fundamental to advancing health reform thinking. © 2012 Blackwell Publishing Ltd.
ERIC Educational Resources Information Center
FRIEDMAN, BURTON DEAN; AND OTHERS
THIS DOCUMENT IS THE SECOND PART OF A REPORT, PROGRAM-ORIENTED INFORMATION--A MANAGEMENT SYSTEMS COMPLEX FOR STATE EDUCATION AGENCIES. PART 1, EA 001 170, SUBTITLED "ANALYSIS AND PROPOSALS," CONTAINS AN OUTLINE OF THE NEED FOR A MANAGEMENT SYSTEMS COMPLEX WITHIN EACH STATE EDUCATION AGENCY. THIS DOCUMENT IS A MANUAL PRESENTING THE…
NASA Astrophysics Data System (ADS)
Donges, Jonathan F.; Heitzig, Jobst; Beronov, Boyan; Wiedermann, Marc; Runge, Jakob; Feng, Qing Yi; Tupikina, Liubov; Stolbova, Veronika; Donner, Reik V.; Marwan, Norbert; Dijkstra, Henk A.; Kurths, Jürgen
2015-11-01
We introduce the pyunicorn (Pythonic unified complex network and recurrence analysis toolbox) open source software package for applying and combining modern methods of data analysis and modeling from complex network theory and nonlinear time series analysis. pyunicorn is a fully object-oriented and easily parallelizable package written in the language Python. It allows for the construction of functional networks such as climate networks in climatology or functional brain networks in neuroscience representing the structure of statistical interrelationships in large data sets of time series and, subsequently, investigating this structure using advanced methods of complex network theory such as measures and models for spatial networks, networks of interacting networks, node-weighted statistics, or network surrogates. Additionally, pyunicorn provides insights into the nonlinear dynamics of complex systems as recorded in uni- and multivariate time series from a non-traditional perspective by means of recurrence quantification analysis, recurrence networks, visibility graphs, and construction of surrogate time series. The range of possible applications of the library is outlined, drawing on several examples mainly from the field of climatology.
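The functional-network construction that pyunicorn automates can be sketched directly with NumPy: correlate all pairs of time series and link those above a threshold. The data and threshold here are toy assumptions; pyunicorn's climate-network classes add spatial embedding, significance testing, surrogates, and much more:

```python
import numpy as np

# Link two series when |Pearson correlation| exceeds a threshold.
# Toy data: six "nodes", two engineered couplings. Threshold is arbitrary.
rng = np.random.default_rng(42)
n_nodes, n_times = 6, 500
data = rng.standard_normal((n_nodes, n_times))
data[1] += 0.8 * data[0]    # couple node 1 to node 0
data[4] += 0.8 * data[3]    # couple node 4 to node 3

corr = np.corrcoef(data)
adjacency = (np.abs(corr) > 0.5).astype(int)
np.fill_diagonal(adjacency, 0)

print(adjacency.sum(axis=0))   # node degrees; the coupled pairs link up
```

The resulting adjacency matrix is then analysed with standard complex-network measures (degree, clustering, betweenness), which is exactly the pipeline the package unifies with recurrence-based time series analysis.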
Epidemic modeling in complex realities.
Colizza, Vittoria; Barthélemy, Marc; Barrat, Alain; Vespignani, Alessandro
2007-04-01
In our global world, the increasing complexity of social relations and transport infrastructures are key factors in the spread of epidemics. In recent years, the increasing availability of computer power has made it possible both to obtain reliable data quantifying the complexity of the networks on which epidemics may propagate and to envision computational tools able to tackle the analysis of such propagation phenomena. These advances have exposed the limits of homogeneous assumptions and simple spatial diffusion approaches, and stimulated the inclusion of complex features and heterogeneities relevant to the description of epidemic diffusion. In this paper, we review recent progress that integrates complex systems and network analysis with epidemic modelling and focus on the impact of the various complex features of real systems on the dynamics of epidemic spreading.
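A minimal discrete-time SIR simulation on a random contact network illustrates the network effects reviewed above. Network size, edge probability, and the transmission and recovery rates are all illustrative assumptions:

```python
import random

# Discrete-time SIR on an Erdos-Renyi-style contact network.
random.seed(3)
n, p_edge, beta, gamma = 200, 0.03, 0.3, 0.1

neighbors = {i: set() for i in range(n)}
for i in range(n):
    for j in range(i + 1, n):
        if random.random() < p_edge:
            neighbors[i].add(j)
            neighbors[j].add(i)

state = ["S"] * n
state[0] = "I"                      # seed the epidemic at one node
for _ in range(100):
    nxt = state[:]
    for i in range(n):
        if state[i] == "I":
            for j in neighbors[i]:  # infect susceptible neighbors
                if state[j] == "S" and random.random() < beta:
                    nxt[j] = "I"
            if random.random() < gamma:
                nxt[i] = "R"
    state = nxt

print("final recovered fraction:", state.count("R") / n)
```

Replacing the homogeneous random graph with a heterogeneous (e.g. scale-free) contact structure changes the outbreak threshold and final size, which is precisely the point the review makes against homogeneous-mixing assumptions.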
Multifractality and heteroscedastic dynamics: An application to time series analysis
NASA Astrophysics Data System (ADS)
Nascimento, C. M.; Júnior, H. B. N.; Jennings, H. D.; Serva, M.; Gleria, Iram; Viswanathan, G. M.
2008-01-01
An increasingly important problem in physics concerns scale invariance symmetry in diverse complex systems, often characterized by heteroscedastic dynamics. We investigate the nature of the relationship between the heteroscedastic and fractal aspects of the dynamics of complex systems, by analyzing the sensitivity to heteroscedasticity of the scaling properties of weakly nonstationary time series. By using multifractal detrended fluctuation analysis, we study the singularity spectra of currency exchange rate fluctuations, after partially or completely eliminating n-point correlations via data shuffling techniques. We conclude that heteroscedasticity can significantly increase multifractality and interpret these findings in the context of self-organizing and adaptive complex systems.
Generalized sample entropy analysis for traffic signals based on similarity measure
NASA Astrophysics Data System (ADS)
Shang, Du; Xu, Mengjia; Shang, Pengjian
2017-05-01
Sample entropy is a prevailing method used to quantify the complexity of a time series. In this paper a modified method of generalized sample entropy and surrogate data analysis is proposed as a new measure to assess the complexity of a complex dynamical system such as traffic signals. The method, based on a similarity distance, presents a different way of matching signal patterns that reveals distinct complexity behaviors. Simulations are conducted on synthetic data and traffic signals to provide a comparative study demonstrating the power of the new method. Compared with previous sample entropy and surrogate data analysis, the new method has two main advantages. The first is that it overcomes the limitation concerning the relationship between the dimension parameter and the length of the series. The second is that the modified sample entropy functions can be used to quantitatively distinguish time series from different complex systems through the similarity measure.
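For reference, the classic sample entropy that the proposed similarity-based variant modifies can be sketched as follows. This is a simplified baseline implementation; the paper's generalized version replaces the Chebyshev-tolerance pattern match with a similarity distance:

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """Classic sample entropy: -log of the ratio of (m+1)- to m-length
    template matches under a Chebyshev tolerance of r * std(x)."""
    x = np.asarray(x, dtype=float)
    tol = r * x.std()

    def count_matches(length):
        templates = np.array([x[i:i + length] for i in range(len(x) - length)])
        # Chebyshev distance between every pair of templates.
        d = np.max(np.abs(templates[:, None] - templates[None, :]), axis=2)
        return (d < tol).sum() - len(templates)   # exclude self-matches

    b, a = count_matches(m), count_matches(m + 1)
    return -np.log(a / b)

rng = np.random.default_rng(1)
regular = np.sin(np.linspace(0, 20 * np.pi, 400))
irregular = rng.standard_normal(400)
# A regular signal repeats its patterns, so its sample entropy is low.
print(sample_entropy(regular), sample_entropy(irregular))
```

The dependence of the match counts on `m` and the series length is exactly the limitation the generalized, similarity-based functions are designed to relax.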
Probabilistic structural analysis methods for select space propulsion system components
NASA Technical Reports Server (NTRS)
Millwater, H. R.; Cruse, T. A.
1989-01-01
The Probabilistic Structural Analysis Methods (PSAM) project developed at the Southwest Research Institute integrates state-of-the-art structural analysis techniques with probability theory for the design and analysis of complex large-scale engineering structures. An advanced efficient software system (NESSUS) capable of performing complex probabilistic analysis has been developed. NESSUS contains a number of software components to perform probabilistic analysis of structures. These components include: an expert system, a probabilistic finite element code, a probabilistic boundary element code and a fast probability integrator. The NESSUS software system is shown. An expert system is included to capture and utilize PSAM knowledge and experience. NESSUS/EXPERT is an interactive menu-driven expert system that provides information to assist in the use of the probabilistic finite element code NESSUS/FEM and the fast probability integrator (FPI). The expert system menu structure is summarized. The NESSUS system contains a state-of-the-art nonlinear probabilistic finite element code, NESSUS/FEM, to determine the structural response and sensitivities. A broad range of analysis capabilities and an extensive element library is present.
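The kind of calculation a probabilistic structural code performs can be illustrated with a plain Monte Carlo estimate of a limit-state failure probability. The distributions and parameters are notional, and this brute-force sampling is not NESSUS's fast probability integration, which achieves the same goal far more efficiently:

```python
import random

# Monte Carlo estimate of P(stress > strength) for a toy limit state.
random.seed(0)

def failure_probability(n_samples=200_000):
    failures = 0
    for _ in range(n_samples):
        stress = random.gauss(300.0, 40.0)    # applied load (notional, MPa)
        strength = random.gauss(450.0, 30.0)  # capacity (notional, MPa)
        if stress > strength:
            failures += 1
    return failures / n_samples

# Analytically, stress - strength ~ N(-150, 50), so the true value is
# Phi(-3) ~ 1.35e-3; the estimate should land close to that.
print(failure_probability())
```

Because rare failures need enormous sample counts, fast probability integrators approximate the limit-state surface near its most probable failure point instead of sampling blindly.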
Chen, Jin-Long; Chen, Pin-Fan; Wang, Hung-Ming
2014-07-15
Parameters of glucose dynamics recorded by the continuous glucose monitoring system (CGMS) could help in the control of glycemic fluctuations, which is important in diabetes management. Multiscale entropy (MSE) analysis has recently been developed to measure the complexity of physical and physiological time sequences. A reduced MSE complexity index indicates increased repetition patterns in the time sequence, and, thus, a decreased complexity of this system. No study has investigated the MSE analysis of glucose dynamics in diabetes. This study was designed to compare the complexity of glucose dynamics between the diabetic patients (n = 17) and the control subjects (n = 13), who were matched for sex, age, and body mass index via MSE analysis using the CGMS data. Compared with the control subjects, the diabetic patients revealed a significant increase (P < 0.001) in the mean (diabetic patients 166.0 ± 10.4 vs. control subjects 93.3 ± 1.5 mg/dl), the standard deviation (51.7 ± 4.3 vs. 11.1 ± 0.5 mg/dl), and the mean amplitude of glycemic excursions (127.0 ± 9.2 vs. 27.7 ± 1.3 mg/dl) of the glucose levels; and a significant decrease (P < 0.001) in the MSE complexity index (5.09 ± 0.23 vs. 7.38 ± 0.28). In conclusion, the complexity of glucose dynamics is decreased in diabetes. This finding implies the reactivity of glucoregulation is impaired in the diabetic patients. Such impairment, presenting as an increased regularity of the glycemic fluctuation pattern, could be detected by MSE analysis. Thus, the MSE complexity index could potentially be used as a biomarker in the monitoring of diabetes.
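The coarse-graining step that gives MSE its multiple scales can be sketched as follows. The glucose trace below is synthetic, not CGMS data; the MSE complexity index is then the sum of sample entropies of each coarse-grained series (entropy function omitted here):

```python
import numpy as np

def coarse_grain(x, scale):
    """MSE step 1: average consecutive non-overlapping windows of length `scale`."""
    x = np.asarray(x, dtype=float)
    n = len(x) // scale
    return x[:n * scale].reshape(n, scale).mean(axis=1)

# Notional 24 h CGMS-like trace sampled every 5 minutes (288 points).
t = np.linspace(0, 2 * np.pi, 288)
glucose = 120 + 30 * np.sin(3 * t)

# Sample entropy is computed on each coarse-grained series and summed
# over scales to form the MSE complexity index.
for scale in (1, 2, 4, 8):
    print(scale, len(coarse_grain(glucose, scale)))
```

Averaging over longer windows progressively removes fast fluctuations, so comparing entropies across scales distinguishes genuinely complex dynamics from mere noise.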
Applicability of Complexity Theory to Martian Fluvial Systems: A Preliminary Analysis
NASA Technical Reports Server (NTRS)
Rosenshein, E. B.
2003-01-01
In the last 15 years, terrestrial geomorphology has been revolutionized by the theories of chaotic systems, fractals, self-organization, and self-organized criticality. Except for the application of fractal theory to the analysis of lava flows and rampart craters on Mars, these theories have not yet been applied to problems of Martian landscape evolution. These complexity theories are elucidated below, along with the methods used to relate them to the realities of Martian fluvial systems.
Gladys, Granero; Claudia, Garnero; Marcela, Longhi
2003-11-01
A novel complexation of sulfisoxazole with hydroxypropyl-beta-cyclodextrin (HP-beta-CD) was studied. Two systems were used: binary complexes prepared with HP-beta-CD, and a multicomponent system (HP-beta-CD with the basic compound triethanolamine (TEA)). Inclusion complex formation in aqueous solution and in the solid state was investigated by the solubility method, thermal analysis (differential scanning calorimetry (DSC) and thermogravimetric analysis (TGA)), Fourier-transform infrared spectroscopy (FT-IR) and dissolution studies. The solid complexes of sulfisoxazole were prepared by freeze-drying the homogeneous concentrated aqueous solutions in molar ratios of sulfisoxazole:HP-beta-CD 1:1 and 1:2, and sulfisoxazole:TEA:HP-beta-CD 1:1:2. FT-IR and thermal analysis showed differences among sulfisoxazole:HP-beta-CD and sulfisoxazole:TEA:HP-beta-CD and their corresponding physical mixtures and individual components. The HP-beta-CD solubilization of sulfisoxazole could be improved by ionization of the drug molecule through pH adjustments. However, larger improvements in HP-beta-CD solubilization are obtained when multicomponent systems are used, making it possible to reduce the amount of CD necessary to prepare the target formulation.
Analyzing SystemC Designs: SystemC Analysis Approaches for Varying Applications
Stoppe, Jannis; Drechsler, Rolf
2015-01-01
The complexity of hardware designs is still increasing according to Moore's law. With embedded systems being more and more intertwined and working together not only with each other, but also with their environments as cyber physical systems (CPSs), more streamlined development workflows are employed to handle the increasing complexity during a system's design phase. SystemC is a C++ library for the design of hardware/software systems, enabling the designer to quickly prototype, e.g., a distributed CPS without having to decide about particular implementation details (such as whether to implement a feature in hardware or in software) early in the design process. Thereby, this approach reduces the initial implementation's complexity by offering an abstract layer with which to build a working prototype. However, as SystemC is based on C++, analyzing designs becomes a difficult task due to the complex language features that are available to the designer. Several fundamentally different approaches for analyzing SystemC designs have been suggested. This work illustrates several different SystemC analysis approaches, including their specific advantages and shortcomings, allowing designers to pick the right tools to assist them with a specific problem during the design of a system using SystemC.
Structured analysis and modeling of complex systems
NASA Technical Reports Server (NTRS)
Strome, David R.; Dalrymple, Mathieu A.
1992-01-01
The Aircrew Evaluation Sustained Operations Performance (AESOP) facility at Brooks AFB, Texas, combines the realism of an operational environment with the control of a research laboratory. In recent studies we collected extensive data from the Airborne Warning and Control Systems (AWACS) Weapons Directors subjected to high and low workload Defensive Counter Air Scenarios. A critical and complex task in this environment involves committing a friendly fighter against a hostile fighter. Structured Analysis and Design techniques and computer modeling systems were applied to this task as tools for analyzing subject performance and workload. This technology is being transferred to the Man-Systems Division of NASA Johnson Space Center for application to complex mission related tasks, such as manipulating the Shuttle grappler arm.
Uncertainties in building a strategic defense.
Zraket, C A
1987-03-27
Building a strategic defense against nuclear ballistic missiles involves complex and uncertain functional, spatial, and temporal relations. Such a defensive system would evolve and grow over decades. It is too complex, dynamic, and interactive to be fully understood initially by design, analysis, and experiments. Uncertainties exist in the formulation of requirements and in the research and design of a defense architecture that can be implemented incrementally and be fully tested to operate reliably. The analysis and measurement of system survivability, performance, and cost-effectiveness are critical to this process. Similar complexities exist for an adversary's system that would suppress or use countermeasures against a missile defense. Problems and opportunities posed by these relations are described, with emphasis on the unique characteristics and vulnerabilities of space-based systems.
Development of a structured approach for decomposition of complex systems on a functional basis
NASA Astrophysics Data System (ADS)
Yildirim, Unal; Felician Campean, I.
2014-07-01
The purpose of this paper is to present the System State Flow Diagram (SSFD) as a structured and coherent methodology for decomposing a complex system on a solution-independent functional basis. The paper starts by reviewing common function modelling frameworks in the literature and discusses practical requirements of the SSFD in the context of the current literature and current approaches in industry. The proposed methodology is illustrated through the analysis of a case study: design analysis of a generic Bread Toasting System (BTS).
NASA Astrophysics Data System (ADS)
Aggarwal, Anil Kr.; Kumar, Sanjeev; Singh, Vikram
2017-03-01
The binary-state (i.e., success or failed state) assumptions used in conventional reliability analysis are inappropriate for reliability analysis of complex industrial systems due to the lack of sufficient probabilistic information. For large complex systems, the uncertainty of each individual parameter enhances the uncertainty of the system reliability. In this paper, the concept of fuzzy reliability has been used for reliability analysis of the system, and the effect of the coverage factor and of the failure and repair rates of subsystems on fuzzy availability is analyzed for a fault-tolerant crystallization system of a sugar plant. Mathematical modeling of the system is carried out using the mnemonic rule to derive the Chapman-Kolmogorov differential equations. These governing differential equations are solved with the fourth-order Runge-Kutta method.
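The solution pipeline (Chapman-Kolmogorov state equations integrated with fourth-order Runge-Kutta) can be sketched for a toy two-state repairable unit. The rates and the crisp, non-fuzzy treatment below are purely illustrative, not the paper's crystallization-system model:

```python
def rk4_step(f, t, y, h):
    """One fourth-order Runge-Kutta step for the system y' = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + h / 2, [yi + h / 2 * ki for yi, ki in zip(y, k1)])
    k3 = f(t + h / 2, [yi + h / 2 * ki for yi, ki in zip(y, k2)])
    k4 = f(t + h, [yi + h * ki for yi, ki in zip(y, k3)])
    return [yi + h / 6 * (a + 2 * b + 2 * c + d)
            for yi, a, b, c, d in zip(y, k1, k2, k3, k4)]

# Hypothetical two-state repairable unit: state 0 = up, state 1 = down,
# failure rate lam, repair rate mu (illustrative numbers, not plant data).
lam, mu = 0.01, 0.5

def ck(t, p):
    """Chapman-Kolmogorov equations for the state probabilities."""
    p0, p1 = p
    return [-lam * p0 + mu * p1, lam * p0 - mu * p1]

p = [1.0, 0.0]                  # start in the up state
t, h = 0.0, 0.1
for _ in range(2000):           # integrate out to t = 200
    p = rk4_step(ck, t, p, h)
    t += h

steady = mu / (lam + mu)        # analytic steady-state availability
```

By the end of the run, p[0] has converged to the analytic availability mu/(lam + mu); a fuzzy treatment would repeat this with interval-valued rates.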
Ares I Integrated Vehicle System Safety Team
NASA Technical Reports Server (NTRS)
Wetherholt, Jon; McNairy, Lisa; Shackelford, Carla
2009-01-01
Complex systems require integrated analysis teams, which sometimes are divided into subsystem teams. Proper division of the analysis into subsystem teams is important. Safety analysis is one of the most difficult aspects of integration.
Complexity and dynamics of topological and community structure in complex networks
NASA Astrophysics Data System (ADS)
Berec, Vesna
2017-07-01
Complexity is highly susceptible to variations in the network dynamics, reflected in its underlying architecture, where the topological organization of cohesive subsets into clusters, the system's modular structure and the resulting hierarchical patterns are cross-linked with the functional dynamics of the system. Here we study the connection between hierarchical topological scales of simplicial complexes and the organization of functional clusters - communities - in complex networks. The analysis reveals the full dynamics of different combinatorial structures of q-th-dimensional simplicial complexes and their Laplacian spectra, presenting spectral properties of the resulting symmetric and positive semidefinite matrices. The emergence of the system's collective behavior from an inhomogeneous statistical distribution is induced by the hierarchically ordered topological structure, which is mapped to a simplicial complex where local interactions between the nodes clustered into subcomplexes generate a flow of information that characterizes the complexity and dynamics of the full system.
Visualizing Parallel Computer System Performance
NASA Technical Reports Server (NTRS)
Malony, Allen D.; Reed, Daniel A.
1988-01-01
Parallel computer systems are among the most complex of man's creations, making satisfactory performance characterization difficult. Despite this complexity, there are strong, indeed, almost irresistible, incentives to quantify parallel system performance using a single metric. The fallacy lies in succumbing to such temptations. A complete performance characterization requires not only an analysis of the system's constituent levels; it also requires both static and dynamic characterizations. Static or average behavior analysis may mask transients that dramatically alter system performance. Although the human visual system is remarkably adept at interpreting and identifying anomalies in false color data, the importance of dynamic, visual scientific data presentation has only recently been recognized. Large, complex parallel systems pose equally vexing performance interpretation problems. Data from hardware and software performance monitors must be presented in ways that emphasize important events while eliding irrelevant details. Design approaches and tools for performance visualization are the subject of this paper.
Sensitivity analysis of dynamic biological systems with time-delays.
Wu, Wu Hsiung; Wang, Feng Sheng; Chang, Maw Shang
2010-10-15
Mathematical modeling has long been applied to the study and analysis of complex biological systems. Some processes in biological systems, such as gene expression and feedback control in signal transduction networks, involve a time delay. These systems are represented as delay differential equation (DDE) models. Numerical sensitivity analysis of a DDE model by the direct method requires the solutions of model and sensitivity equations with time-delays. The major effort is the computation of the Jacobian matrix when computing the solution of the sensitivity equations. The computation of partial derivatives of complex equations, either by the analytic method or by symbolic manipulation, is time consuming, inconvenient, and prone to human error. To address this problem, an automatic approach to obtain the derivatives of complex functions efficiently and accurately is necessary. We previously proposed an efficient algorithm with adaptive step size control to compute the solution and dynamic sensitivities of biological systems described by ordinary differential equations (ODEs). Here, the adaptive direct-decoupled algorithm is extended to compute the solution and dynamic sensitivities of time-delay systems described by DDEs. To save human effort and avoid errors in the computation of partial derivatives, an automatic differentiation technique is embedded in the extended algorithm to evaluate the Jacobian matrix. The extended algorithm is implemented and applied to two realistic models with time-delays: the cardiovascular control system and the TNF-α signal transduction network. The results show that the extended algorithm is a good tool for dynamic sensitivity analysis of DDE models with less user intervention. Compared with direct-coupled methods in theory, the extended algorithm is efficient, accurate, and easy to use for end users without a programming background to perform dynamic sensitivity analysis on complex biological systems with time-delays.
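The automatic-differentiation idea behind the Jacobian evaluation can be sketched with forward-mode dual numbers. This is a generic illustration of the technique, not the authors' implementation, and the two-variable right-hand side below is a made-up example:

```python
class Dual:
    """Forward-mode AD value a + b*eps with eps**2 = 0; b carries the derivative."""
    def __init__(self, a, b=0.0):
        self.a, self.b = a, b
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.a + o.a, self.b + o.b)
    __radd__ = __add__
    def __sub__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.a - o.a, self.b - o.b)
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.a * o.a, self.a * o.b + self.b * o.a)  # product rule
    __rmul__ = __mul__

def jacobian(f, x):
    """Jacobian of f: R^n -> R^m, one dual-number sweep per input variable."""
    n = len(x)
    cols = []
    for j in range(n):
        duals = [Dual(xi, 1.0 if i == j else 0.0) for i, xi in enumerate(x)]
        cols.append([out.b for out in f(duals)])       # column j of J
    return [[cols[j][i] for j in range(n)] for i in range(len(cols[0]))]

# Toy right-hand side resembling a feedback system (hypothetical rates):
def rhs(y):
    y1, y2 = y
    return [2.0 * y1 - y1 * y2, y1 * y2 - 3.0 * y2]

# Analytic Jacobian is [[2 - y2, -y1], [y2, y1 - 3]].
J = jacobian(rhs, [1.0, 2.0])
```

Because derivatives propagate through the arithmetic itself, no hand-written or symbolic partial derivatives are needed, which is the point the abstract makes about avoiding human error.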
NASA Astrophysics Data System (ADS)
Eduardo Virgilio Silva, Luiz; Otavio Murta, Luiz
2012-12-01
Complexity in time series is an intriguing feature of living dynamical systems, with potential use for identification of system state. Although various methods have been proposed for measuring physiologic complexity, uncorrelated time series are often assigned high values of complexity, erroneously classifying them as complex physiological signals. Here, we propose and discuss a method for complex system analysis based on generalized statistical formalism and surrogate time series. Sample entropy (SampEn) was rewritten, inspired by the Tsallis generalized entropy, as a function of the q parameter (qSampEn). qSDiff curves were calculated, which consist of the differences between the qSampEn of the original and surrogate series. We evaluated qSDiff for 125 real heart rate variability (HRV) dynamics, divided into groups of 70 healthy, 44 congestive heart failure (CHF), and 11 atrial fibrillation (AF) subjects, and for simulated series from stochastic and chaotic processes. The evaluations showed that, for nonperiodic signals, qSDiff curves have a maximum point (qSDiffmax) for q ≠ 1. Values of q where the maximum point occurs and where qSDiff is zero were also evaluated. Only qSDiffmax values were capable of distinguishing the HRV groups (p-values 5.10×10⁻³, 1.11×10⁻⁷, and 5.50×10⁻⁷ for healthy vs. CHF, healthy vs. AF, and CHF vs. AF, respectively), consistently with the concept of physiologic complexity, which suggests a potential use for chaotic system analysis.
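The q-generalization replaces the natural logarithm in SampEn = -ln(A/B) with the Tsallis q-logarithm. A minimal sketch of that building block, and of a qSDiff-style curve, follows; the template-match counts are invented for illustration, and the exact functional form used by the authors may differ:

```python
import math

def q_log(x, q):
    """Tsallis q-logarithm: ln_q(x) = (x**(1 - q) - 1) / (1 - q).
    Recovers the natural logarithm in the limit q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return math.log(x)
    return (x ** (1.0 - q) - 1.0) / (1.0 - q)

def q_sampen(a, b, q):
    """q-generalized sample entropy from the usual template-match
    counts (B for length m, A for length m + 1): -ln_q(A/B)."""
    return -q_log(a / b, q)

# qSDiff-style curve: surrogate-minus-original entropy as q varies.
# Counts are hypothetical; a shuffled surrogate is less regular, so it
# retains fewer of its length-m matches at length m + 1.
a_orig, b_orig = 120, 400
a_surr, b_surr = 60, 400
qs = [0.2 * k for k in range(1, 16)]   # q from 0.2 to 3.0
qsdiff = [q_sampen(a_surr, b_surr, q) - q_sampen(a_orig, b_orig, q)
          for q in qs]
```

Since ln_q is increasing in its argument for every q, the surrogate entropy exceeds the original at all q here; locating where the difference peaks over q is the qSDiffmax statistic the paper uses to separate the HRV groups.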
An R Package for Open, Reproducible Analysis of Urban Water Systems, With Application to Chicago
Urban water systems consist of natural and engineered flows of water interacting in complex ways. System complexity can be understood via mass conservative models that account for the interrelationships among all major flows and storages. We have developed a generic urban water s...
The application of sensitivity analysis to models of large scale physiological systems
NASA Technical Reports Server (NTRS)
Leonard, J. I.
1974-01-01
A survey of the literature on sensitivity analysis as it applies to biological systems is reported, along with a brief development of sensitivity theory. A simple population model and a more complex thermoregulatory model illustrate the investigatory techniques and the interpretation of parameter sensitivity analysis. The role of sensitivity analysis in validating and verifying models, in identifying relative parameter influence, and in estimating errors in model behavior due to uncertainty in input data is presented. This analysis is valuable to the simulationist and the experimentalist in allocating resources for data collection. A method is presented for reducing highly complex, nonlinear models to simple linear algebraic models that can be useful for making rapid, first-order calculations of system behavior.
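The flavor of the approach (perturb a parameter, rerun the model, normalize the response) can be sketched on a logistic population model. All numbers are hypothetical, and the report's models and sensitivity methods are considerably more elaborate:

```python
def simulate(r, K, n0=10.0, t_end=5.0, h=0.01):
    """Logistic growth dN/dt = r*N*(1 - N/K), forward Euler (illustrative)."""
    n = n0
    for _ in range(int(t_end / h)):
        n += h * r * n * (1.0 - n / K)
    return n

def rel_sensitivity(param, base):
    """Normalized sensitivity S = (dN/dp) * (p/N), by central differences."""
    p = base[param]
    dp = 1e-4 * p
    hi = dict(base); hi[param] = p + dp
    lo = dict(base); lo[param] = p - dp
    n0 = simulate(**base)
    return (simulate(**hi) - simulate(**lo)) / (2.0 * dp) * (p / n0)

base = {"r": 0.8, "K": 100.0}   # growth rate and carrying capacity
s_r = rel_sensitivity("r", base)
s_K = rel_sensitivity("K", base)
```

Comparing s_r and s_K ranks parameter influence at the chosen time point, which is exactly the kind of ranking used to decide where measurement effort is best spent.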
Sulis, William H
2017-10-01
Walter Freeman III pioneered the application of nonlinear dynamical systems theories and methodologies in his work on mesoscopic brain dynamics. Sadly, mainstream psychology and psychiatry still cling to linear, correlation-based data analysis techniques, which threaten to subvert the process of experimentation and theory building. In order to progress, it is necessary to develop tools capable of managing the stochastic complexity of complex biopsychosocial systems, which includes multilevel feedback relationships, nonlinear interactions, chaotic dynamics and adaptability. In addition, however, these systems exhibit intrinsic randomness, non-Gaussian probability distributions, non-stationarity, contextuality, and non-Kolmogorov probabilities, as well as the absence of mean and/or variance and conditional probabilities. These properties and their implications for statistical analysis are discussed. An alternative approach, the Process Algebra approach, is described. It is a generative model, capable of generating non-Kolmogorov probabilities. It has proven useful in addressing fundamental problems in quantum mechanics and in the modeling of developing psychosocial systems.
Model-Based Engineering for Supply Chain Risk Management
2015-09-30
Expanded use of commercial components has increased the complexity of system assurance verification. Model-based engineering (MBE) offers a means to design, develop, analyze, and maintain a complex system architecture.
Knightbridge, Stephen M; King, Robert; Rolfe, Timothy J
2006-04-01
This paper describes the first phase of a larger project that utilizes participatory action research to examine complex mental health needs across an extensive group of stakeholders in the community. Within an objective qualitative analysis of focus group discussions, the social ecological model is utilized to explore how integrative activities can be informed, planned and implemented across multiple elements and levels of a system. Seventy-one primary care workers, managers, policy-makers, consumers and carers from across the southern metropolitan and Gippsland regions of Victoria, Australia, took part in seven focus groups. All groups responded to an identical set of focusing questions. Participants produced an explanatory model describing the service system, as it relates to people with complex needs, across the levels of social ecological analysis. Qualitative theme analysis identified four priority areas to be addressed in order to improve the system's capacity for working with complexity: (i) system fragmentation; (ii) integrative case management practices; (iii) community attitudes; and (iv) money and resources. The emergent themes provide clues as to how complexity is constructed and interpreted across the system of involved agencies and interest groups. The implications of these findings for the development and evaluation of this community capacity-building project were examined from the perspective of constructing interventions that address both top-down and bottom-up processes.
Methods Used to Support a Life Cycle of Complex Engineering Products
NASA Astrophysics Data System (ADS)
Zakharova, Alexandra A.; Kolegova, Olga A.; Nekrasova, Maria E.; Eremenko, Andrey O.
2016-08-01
The management of companies involved in the design, development and operation of complex engineering products recognizes the relevance of creating systems for product lifecycle management. A system of methods is proposed to support the life cycles of complex engineering products, based on fuzzy set theory and hierarchical analysis. The system of methods demonstrates the grounds for making strategic decisions in an environment of uncertainty, allows the use of expert knowledge, and provides interconnection of decisions at all phases of strategic management and all stages of a complex engineering product's lifecycle.
Transportation Network Topologies
NASA Technical Reports Server (NTRS)
Alexandrov, Natalia (Editor)
2004-01-01
The existing U.S. hub-and-spoke air transportation system is reaching saturation. Major aspects of the current system, such as capacity, safety, mobility, customer satisfaction, security, communications, and ecological effects, require improvements. The changing dynamics - the increased presence of general aviation, unmanned autonomous vehicles, and military aircraft in civil airspace as part of homeland defense - contribute to the growing complexity of airspace. The system has proven remarkably resistant to change. NASA Langley Research Center and the National Institute of Aerospace conducted a workshop on Transportation Network Topologies on 9-10 December 2003 in Williamsburg, Virginia. The workshop aimed to examine the feasibility of traditional methods for complex system analysis and design as well as potential novel alternatives in application to transportation systems, to identify state-of-the-art models and methods, and to conduct gap analysis, and thus to lay a foundation for establishing a focused research program in complex systems applied to air transportation.
Analysis of Complex Valve and Feed Systems
NASA Technical Reports Server (NTRS)
Ahuja, Vineet; Hosangadi, Ashvin; Shipman, Jeremy; Cavallo, Peter; Dash, Sanford
2007-01-01
A numerical framework for analysis of complex valve systems supports testing of propulsive systems by simulating key valve and control system components in the test loop. In particular, it is designed to enhance the analysis capability in terms of identifying system transients and quantifying the valve response to these transients. This system can simulate valve motion in complex systems operating in diverse flow regimes ranging from compressible gases to cryogenic liquids. A key feature is the hybrid, unstructured framework with sub-models for grid movement and phase change, including cryogenic cavitation. The multi-element unstructured framework offers improved predictions of valve performance characteristics under steady conditions for structurally complex valves such as the pressure regulator valve. Unsteady simulations of valve motion using this computational approach have been carried out for various valves in operation at Stennis Space Center, such as the split-body valve, the 10-in. (approx. 25.4 cm) LOX (liquid oxygen) valve, and the 4-in. (approx. 10 cm) Y-pattern valve (liquid nitrogen). Such simulations make use of variable grid topologies, thereby permitting solution accuracy and resolving important flow physics in the seat region of the moving valve. Advantages of this software include a possible reduction in testing costs incurred due to disruptions relating to unexpected flow transients or the functioning of valve/flow control systems. Prediction of the flow anomalies leading to system vibrations, flow resonance, and valve stall can help in valve scheduling and significantly reduce the need for activation tests. This framework has been evaluated for its ability to predict performance metrics like the flow coefficient for cavitating venturis and valve coefficient curves, and could be a valuable tool in predicting and understanding anomalous behavior of system components at rocket propulsion testing and design sites.
Mutual influence between triel bond and cation-π interactions: an ab initio study
NASA Astrophysics Data System (ADS)
Esrafili, Mehdi D.; Mousavian, Parisasadat
2017-12-01
Using ab initio calculations, the cooperative and solvent effects on cation-π and B···N interactions are studied in some model ternary complexes where these interactions coexist. The nature of the interactions and the mechanism of cooperativity are investigated by means of the quantum theory of atoms in molecules (QTAIM), the noncovalent interaction (NCI) index and natural bond orbital analysis. The results indicate that all cation-π and B···N binding distances in the ternary complexes are shorter than those of the corresponding binary systems. The QTAIM analysis reveals that the ternary complexes have higher electron density at their bond critical points relative to the corresponding binary complexes. In addition, according to the QTAIM analysis, the formation of the cation-π interaction increases the covalency of the B···N bonds. The NCI analysis indicates that the cooperative effects in the ternary complexes shift the location of the spike associated with each interaction, which can be regarded as evidence for the reinforcement of both cation-π and B···N interactions in these systems. Solvent effects on the cooperativity of cation-π and B···N interactions are also investigated.
NASA Astrophysics Data System (ADS)
Box, Paul W.
GIS and spatial analysis are suited mainly to static pictures of the landscape, but many of the processes that need exploring are dynamic in nature. Dynamic processes can be complex when put in a spatial context; our ability to study such processes will probably come with advances in understanding complex systems in general. Cellular automata and agent-based models are two prime candidates for exploring complex spatial systems, but are difficult to implement. Innovative tools that help build complex simulations will create larger user communities, who will probably find novel solutions for understanding complexity. A significant source of such innovations is likely to be the collective efforts of hobbyists and part-time programmers, who have been dubbed "garage-band scientists" in the popular press.
Approaching human language with complex networks
NASA Astrophysics Data System (ADS)
Cong, Jin; Liu, Haitao
2014-12-01
The interest in modeling and analyzing human language with complex networks has been on the rise in recent years, and a considerable body of research in this area has already accumulated. We survey three major lines of linguistic research from the complex network approach: 1) characterization of human language as a multi-level system with complex network analysis; 2) linguistic typological research with the application of linguistic networks and their quantitative measures; and 3) relationships between the system-level complexity of human language (determined by the topology of linguistic networks) and microscopic linguistic (e.g., syntactic) features (as the traditional concern of linguistics). We show that the models and quantitative tools of complex networks, when exploited properly, can constitute an operational methodology for linguistic inquiry, which contributes to the understanding of human language and the development of linguistics. We conclude our review with suggestions for future linguistic research from the complex network approach: 1) relationships between the system-level complexity of human language and microscopic linguistic features; 2) expansion of research scope from the global properties to other levels of granularity of linguistic networks; and 3) combination of linguistic network analysis with other quantitative studies of language (such as quantitative linguistics).
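As a toy illustration of treating language as a network, here is a minimal word co-occurrence network built from adjacent tokens. The sentence and the adjacency criterion are arbitrary choices for the sketch; real linguistic networks are built from large corpora with richer link definitions:

```python
from collections import defaultdict

def cooccurrence_network(tokens):
    """Undirected co-occurrence network: adjacent tokens share an edge."""
    adj = defaultdict(set)
    for a, b in zip(tokens, tokens[1:]):
        if a != b:
            adj[a].add(b)
            adj[b].add(a)
    return adj

text = "the quick brown fox jumps over the lazy dog and the quick dog barks"
net = cooccurrence_network(text.split())

# Degree (number of distinct neighbors) per word; function words such
# as "the" tend to emerge as hubs even in tiny samples.
degrees = {w: len(nbrs) for w, nbrs in net.items()}
hubs = sorted(degrees, key=degrees.get, reverse=True)[:2]
```

Degree distributions, clustering, and path lengths computed on such networks are the quantitative measures the survey discusses at the system level.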
Near infrared spectroscopy and chemometrics analysis of complex traits in animal physiology
USDA-ARS?s Scientific Manuscript database
Near infrared reflectance (NIR) applications have been expanding from the traditional framework of small molecule chemical purity and composition (as defined by spectral libraries) to complex system analysis and holistic exploratory approaches to questions in biochemistry, biophysics and environment...
The methodology of multi-viewpoint clustering analysis
NASA Technical Reports Server (NTRS)
Mehrotra, Mala; Wild, Chris
1993-01-01
One of the greatest challenges facing the software engineering community is the ability to produce large and complex computer systems, such as ground support systems for unmanned scientific missions, that are reliable and cost effective. In order to build and maintain these systems, it is important that the knowledge in the system be suitably abstracted, structured, and otherwise clustered in a manner which facilitates its understanding, manipulation, testing, and utilization. Development of complex mission-critical systems will require the ability to abstract overall concepts in the system at various levels of detail and to consider the system from different points of view. The Multi-ViewPoint Clustering Analysis (MVP-CA) methodology has been developed to provide multiple views of large, complicated systems. MVP-CA provides an ability to discover significant structures by providing an automated mechanism to structure both hierarchically (from detail to abstract) and orthogonally (from different perspectives). We propose to integrate MVP-CA into an overall software engineering life cycle to support the development and evolution of complex mission-critical systems.
NASA Technical Reports Server (NTRS)
Evans, Emory; Young, Steven D.; Daniels, Taumi; Santiago-Espada, Yamira; Etherington, Tim
2016-01-01
A flight simulation study was conducted at NASA Langley Research Center to evaluate flight deck systems that (1) predict aircraft energy state and/or autoflight configuration, (2) present the current state and expected future state of automated systems, and/or (3) show the state of flight-critical data systems in use by automated systems and primary flight instruments. Four new technology concepts were evaluated vis-à-vis current state-of-the-art flight deck systems and indicators. This human-in-the-loop study was conducted using commercial airline crews. Scenarios spanned a range of complex conditions and emulated several causal factors and complexities seen in recent accidents involving loss of state awareness by pilots (e.g. energy state, automation state, and/or system state). Data were collected via questionnaires administered after each flight, audio/video recordings, physiological data, head and eye tracking data, pilot control inputs, and researcher observations. This paper focuses strictly on findings derived from the questionnaire responses. It includes analysis of pilot subjective measures of complexity, decision making, workload, situation awareness, usability, and acceptability.
Held, Jürgen; Manser, Tanja
2005-02-01
This article outlines how a Palm- or Newton-based PDA (personal digital assistant) system for online event recording was used to record and analyze concurrent events. We describe the features of this PDA-based system, called the FIT-System (flexible interface technique), and its application to the analysis of concurrent events in complex behavioral processes--in this case, anesthesia work processes. The patented FIT-System has a unique user interface design allowing the user to design an interface template with a pencil and paper or using a transparency film. The template usually consists of a drawing or sketch that includes icons or symbols that depict the observer's representation of the situation to be observed. In this study, the FIT-System allowed us to create a design for fast, intuitive online recording of concurrent events using a set of 41 observation codes. An analysis of concurrent events leads to a description of action density, and our results revealed a characteristic distribution of action density during the administration of anesthesia in the operating room. This distribution indicated the central role of the overlapping operations in the action sequences of medical professionals as they deal with the varying requirements of this complex task. We believe that the FIT-System for online recording of concurrent events in complex behavioral processes has the potential to be useful across a broad spectrum of research areas.
76 FR 64330 - Advanced Scientific Computing Advisory Committee
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-18
... talks on HPC Reliability, Diffusion on Complex Networks, and Reversible Software Execution Systems Report from Applied Math Workshop on Mathematics for the Analysis, Simulation, and Optimization of Complex Systems Report from ASCR-BES Workshop on Data Challenges from Next Generation Facilities Public...
Qualitative analysis of a discrete thermostatted kinetic framework modeling complex adaptive systems
NASA Astrophysics Data System (ADS)
Bianca, Carlo; Mogno, Caterina
2018-01-01
This paper deals with the derivation of a new discrete thermostatted kinetic framework for the modeling of complex adaptive systems subjected to external force fields (nonequilibrium system). Specifically, in order to model nonequilibrium stationary states of the system, the external force field is coupled to a dissipative term (thermostat). The well-posedness of the related Cauchy problem is investigated thus allowing the new discrete thermostatted framework to be suitable for the derivation of specific models and the related computational analysis. Applications to crowd dynamics and future research directions are also discussed within the paper.
Air Traffic Complexity Measurement Environment (ACME): Software User's Guide
NASA Technical Reports Server (NTRS)
1996-01-01
A user's guide for the Air Traffic Complexity Measurement Environment (ACME) software is presented. The ACME consists of two major components, a complexity analysis tool and user interface. The Complexity Analysis Tool (CAT) analyzes complexity off-line, producing data files which may be examined interactively via the Complexity Data Analysis Tool (CDAT). The Complexity Analysis Tool is composed of three independently executing processes that communicate via PVM (Parallel Virtual Machine) and Unix sockets. The Runtime Data Management and Control process (RUNDMC) extracts flight plan and track information from a SAR input file, and sends the information to GARP (Generate Aircraft Routes Process) and CAT (Complexity Analysis Task). GARP in turn generates aircraft trajectories, which are utilized by CAT to calculate sector complexity. CAT writes flight plan, track and complexity data to an output file, which can be examined interactively. The Complexity Data Analysis Tool (CDAT) provides an interactive graphic environment for examining the complexity data produced by the Complexity Analysis Tool (CAT). CDAT can also play back track data extracted from System Analysis Recording (SAR) tapes. The CDAT user interface consists of a primary window, a controls window, and miscellaneous pop-ups. Aircraft track and position data is displayed in the main viewing area of the primary window. The controls window contains miscellaneous control and display items. Complexity data is displayed in pop-up windows. CDAT plays back sector complexity and aircraft track and position data as a function of time. Controls are provided to start and stop playback, adjust the playback rate, and reposition the display to a specified time.
Marshall, Najja; Timme, Nicholas M; Bennett, Nicholas; Ripp, Monica; Lautzenhiser, Edward; Beggs, John M
2016-01-01
Neural systems include interactions that occur across many scales. Two divergent methods for characterizing such interactions have drawn on the physical analysis of critical phenomena and the mathematical study of information. Inferring criticality in neural systems has traditionally rested on fitting power laws to the property distributions of "neural avalanches" (contiguous bursts of activity), but the fractal nature of avalanche shapes has recently emerged as another signature of criticality. On the other hand, neural complexity, an information theoretic measure, has been used to capture the interplay between the functional localization of brain regions and their integration for higher cognitive functions. Unfortunately, treatments of all three methods (power-law fitting, avalanche shape collapse, and neural complexity) have suffered from shortcomings. Empirical data often contain biases that introduce deviations from a true power law in the tail and head of the distribution, but deviations in the tail have often gone unaddressed; avalanche shape collapse has required manual parameter tuning; and the estimation of neural complexity has relied on small data sets or statistical assumptions for the sake of computational efficiency. In this paper we present technical advancements in the analysis of criticality and complexity in neural systems. We use maximum-likelihood estimation to automatically fit power laws with left and right cutoffs, present the first automated shape collapse algorithm, and describe new techniques to account for large numbers of neural variables and small data sets in the calculation of neural complexity. In order to facilitate future research in criticality and complexity, we have made the software utilized in this analysis freely available online in the MATLAB NCC (Neural Complexity and Criticality) Toolbox.
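The maximum-likelihood power-law fit with cutoffs mentioned above can be sketched in a few lines. This is a minimal continuous-distribution version with a left cutoff only (the NCC Toolbox fits discrete laws with both left and right cutoffs), and the helper name is hypothetical:

```python
import numpy as np

def fit_power_law(data, xmin):
    """Continuous power-law MLE with a left cutoff: alpha = 1 + n / sum(ln(x/xmin)).
    Hypothetical helper; a discrete fit with a right cutoff needs a numerical search."""
    x = np.asarray(data, dtype=float)
    x = x[x >= xmin]
    return 1.0 + len(x) / np.sum(np.log(x / xmin))

# Synthetic "avalanche sizes" drawn from P(x) ~ x^{-1.5}, x >= 1,
# via inverse-CDF sampling: x = xmin * (1 - u)^{-1/(alpha - 1)}.
rng = np.random.default_rng(0)
u = rng.random(50_000)
samples = (1.0 - u) ** (-1.0 / 0.5)
alpha_hat = fit_power_law(samples, xmin=1.0)  # should recover ~1.5
```

With 50,000 samples the standard error of this estimator is about (alpha - 1)/sqrt(n), so the recovered exponent lands very close to the true 1.5.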
Transdisciplinary Application of Cross-Scale Resilience ...
The cross-scale resilience model was developed in ecology to explain the emergence of resilience from the distribution of ecological functions within and across scales, and as a tool to assess resilience. We propose that the model and the underlying discontinuity hypothesis are relevant to other complex adaptive systems, and can be used to identify and track changes in system parameters related to resilience. We explain the theory behind the cross-scale resilience model, review the cases where it has been applied to non-ecological systems, and discuss some examples of social-ecological, archaeological/anthropological, and economic systems where a cross-scale resilience analysis could add a quantitative dimension to our current understanding of system dynamics and resilience. We argue that the scaling and diversity parameters suitable for a resilience analysis of ecological systems are appropriate for a broad suite of systems where non-normative quantitative assessments of resilience are desired. Our planet is currently characterized by fast environmental and social change, and the cross-scale resilience model has the potential to quantify resilience across many types of complex adaptive systems. Comparative analyses of complex systems have, in fact, demonstrated commonalities among distinctly different types of systems (Schneider & Kay 1994; Holling 2001; Lansing 2003; Foster 2005; Bullmore et al. 2009). Both biological and non-biological complex systems appear t…
Statistical Field Estimation for Complex Coastal Regions and Archipelagos (PREPRINT)
2011-04-09
…and study the computational properties of these schemes. Specifically, we extend a multiscale Objective Analysis (OA) approach to complex coastal regions and… The multiscale free-surface code builds on the primitive-equation model of the Harvard Ocean Prediction System (HOPS, Haley et al. (2009)). Additionally…
Fault Identification Based on NLPCA in Complex Electrical Engineering
NASA Astrophysics Data System (ADS)
Zhang, Yagang; Wang, Zengping; Zhang, Jinfang
2012-07-01
Faults are inevitable in any complex systems engineering. The electric power system is an essentially nonlinear system and one of the most complex artificial systems in the world. In our research, based on real-time measurements from phasor measurement units, and under the influence of white Gaussian noise (standard deviation 0.01, zero mean error), we used nonlinear principal component analysis (NLPCA) to resolve the fault identification problem in complex electrical engineering. The simulation results show that a fault in complex electrical engineering usually corresponds to the variable with the maximum absolute coefficient in the first principal component. This research has significant theoretical value and practical engineering significance.
A Pedagogical Software for the Analysis of Loudspeaker Systems
ERIC Educational Resources Information Center
Pueo, B.; Roma, M.; Escolano, J.; Lopez, J. J.
2009-01-01
In this paper, a pedagogical software package for the design and analysis of loudspeaker systems is presented, with emphasis on training students in the interaction between system parameters. Loudspeakers are complex electromechanical systems, whose behavior is neither intuitive nor easy to understand for inexperienced students. Although commercial…
The ASSIST: Bringing Information and Software Together for Scientists
NASA Technical Reports Server (NTRS)
Mandel, Eric
1997-01-01
The ASSIST was developed as a step toward overcoming the problems faced by researchers when trying to utilize complex and often conflicting astronomical data analysis systems. It implements a uniform graphical interface to analysis systems, documentation, data, and organizational memory. It is layered on top of the Answer Garden Substrate (AGS), a system specially designed to facilitate the collection and dissemination of organizational memory. Under the AISRP program, we further developed the ASSIST to make it even easier for researchers to overcome the difficulties of accessing software and information in a complex computer environment.
Managing Complex Dynamical Systems
ERIC Educational Resources Information Center
Cox, John C.; Webster, Robert L.; Curry, Jeanie A.; Hammond, Kevin L.
2011-01-01
Management commonly engages in a variety of research designed to provide insight into the motivation and relationships of individuals, departments, organizations, etc. This paper demonstrates how the application of concepts associated with the analysis of complex systems applied to such data sets can yield enhanced insights for managerial action.
PANORAMA: An approach to performance modeling and diagnosis of extreme-scale workflows
DOE Office of Scientific and Technical Information (OSTI.GOV)
Deelman, Ewa; Carothers, Christopher; Mandal, Anirban
2015-07-14
Here we report that computational science is well established as the third pillar of scientific discovery and is on par with experimentation and theory. However, as we move closer toward the ability to execute exascale calculations and process the ensuing extreme-scale amounts of data produced by both experiments and computations alike, the complexity of managing the compute and data analysis tasks has grown beyond the capabilities of domain scientists. Therefore, workflow management systems are absolutely necessary to ensure current and future scientific discoveries. A key research question for these workflow management systems concerns the performance optimization of complex calculation and data analysis tasks. The central contribution of this article is a description of the PANORAMA approach for modeling and diagnosing the run-time performance of complex scientific workflows. This approach integrates extreme-scale systems testbed experimentation, structured analytical modeling, and parallel systems simulation into a comprehensive workflow framework called Pegasus for understanding and improving the overall performance of complex scientific workflows.
Archetypes for Organisational Safety
NASA Technical Reports Server (NTRS)
Marais, Karen; Leveson, Nancy G.
2003-01-01
We propose a framework using system dynamics to model the dynamic behavior of organizations in accident analysis. Most current accident analysis techniques are event-based and do not adequately capture the dynamic complexity and non-linear interactions that characterize accidents in complex systems. In this paper we propose a set of system safety archetypes that model common safety culture flaws in organizations, i.e., the dynamic behaviour of organizations that often leads to accidents. As accident analysis and investigation tools, the archetypes can be used to develop dynamic models that describe the systemic and organizational factors contributing to the accident. The archetypes help clarify why safety-related decisions do not always result in the desired behavior, and how independent decisions in different parts of the organization can combine to impact safety.
State Analysis: A Control Architecture View of Systems Engineering
NASA Technical Reports Server (NTRS)
Rasmussen, Robert D.
2005-01-01
A viewgraph presentation on the state analysis process is shown. The topics include: 1) Issues with growing complexity; 2) Limits of common practice; 3) Exploiting a control point of view; 4) A glimpse at the State Analysis process; 5) Synergy with model-based systems engineering; and 6) Bridging the systems to software gap.
Hettinger, Lawrence J.; Kirlik, Alex; Goh, Yang Miang; Buckle, Peter
2015-01-01
Accurate comprehension and analysis of complex sociotechnical systems is a daunting task. Empirically examining, or simply envisioning the structure and behaviour of such systems challenges traditional analytic and experimental approaches as well as our everyday cognitive capabilities. Computer-based models and simulations afford potentially useful means of accomplishing sociotechnical system design and analysis objectives. From a design perspective, they can provide a basis for a common mental model among stakeholders, thereby facilitating accurate comprehension of factors impacting system performance and potential effects of system modifications. From a research perspective, models and simulations afford the means to study aspects of sociotechnical system design and operation, including the potential impact of modifications to structural and dynamic system properties, in ways not feasible with traditional experimental approaches. This paper describes issues involved in the design and use of such models and simulations and describes a proposed path forward to their development and implementation. Practitioner Summary: The size and complexity of real-world sociotechnical systems can present significant barriers to their design, comprehension and empirical analysis. This article describes the potential advantages of computer-based models and simulations for understanding factors that impact sociotechnical system design and operation, particularly with respect to process and occupational safety. PMID:25761227
Difficult Decisions Made Easier
NASA Technical Reports Server (NTRS)
2006-01-01
NASA missions are extremely complex and prone to sudden, catastrophic failure if equipment falters or if an unforeseen event occurs. For these reasons, NASA trains to expect the unexpected. It tests its equipment and systems in extreme conditions, and it develops risk-analysis tests to foresee any possible problems. The Space Agency recently worked with an industry partner to develop reliability analysis software capable of modeling complex, highly dynamic systems, taking into account variations in input parameters and the evolution of the system over the course of a mission. This research had several goals: performance and risk analyses of complex, multiphase missions, like the insertion of the Mars Reconnaissance Orbiter; reliability analyses of systems with redundant and/or repairable components; optimization analyses of system configurations with respect to cost and reliability; and sensitivity analyses to identify optimal areas for uncertainty reduction or performance enhancement.
Hay, L.; Knapp, L.
1996-01-01
Investigating natural, potential, and man-induced impacts on hydrological systems commonly requires complex modelling with overlapping data requirements, and massive amounts of one- to four-dimensional data at multiple scales and formats. Given the complexity of most hydrological studies, the requisite software infrastructure must incorporate many components including simulation modelling, spatial analysis and flexible, intuitive displays. There is a general requirement for a set of capabilities to support scientific analysis which, at this time, can only come from an integration of several software components. Integration of geographic information systems (GISs) and scientific visualization systems (SVSs) is a powerful technique for developing and analysing complex models. This paper describes the integration of an orographic precipitation model, a GIS and a SVS. The combination of these individual components provides a robust infrastructure which allows the scientist to work with the full dimensionality of the data and to examine the data in a more intuitive manner.
From globally coupled maps to complex-systems biology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kaneko, Kunihiko, E-mail: kaneko@complex.c.u-tokyo.ac.jp
Studies of globally coupled maps, introduced as a network of chaotic dynamics, are briefly reviewed with an emphasis on novel concepts therein, which are universal in high-dimensional dynamical systems. They include clustering of synchronized oscillations, hierarchical clustering, chimera states of synchronization and desynchronization, partition complexity, prevalence of Milnor attractors, chaotic itinerancy, and collective chaos. The degrees of freedom necessary for high dimensionality are proposed to equal the number at which the combinatorial exceeds the exponential. Future analysis of high-dimensional dynamical systems with regard to complex-systems biology is briefly discussed.
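As a minimal illustration (not Kaneko's full analysis), a globally coupled logistic-map lattice can be simulated in a few lines; the parameter values are illustrative, and with the strong coupling chosen here the elements collapse into a single synchronized cluster:

```python
import numpy as np

# Globally coupled logistic maps: x_i(t+1) = (1-eps)*f(x_i(t)) + (eps/N)*sum_j f(x_j(t)),
# with f(x) = 1 - a*x^2. Strong coupling contracts differences between elements.
def gcm_step(x, a=1.8, eps=0.9):
    fx = 1.0 - a * x**2
    return (1.0 - eps) * fx + eps * fx.mean()

rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, 32)   # 32 map elements, random initial conditions
for _ in range(500):
    x = gcm_step(x)
spread = x.max() - x.min()        # approaches 0 as the lattice synchronizes
```

At these settings the per-step contraction of inter-element differences is bounded by (1 - eps) * max|f'| < 1, so the spread shrinks geometrically to zero; weaker coupling instead produces the clustered and desynchronized regimes the abstract describes.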
Studying the HIT-Complexity Interchange.
Kuziemsky, Craig E; Borycki, Elizabeth M; Kushniruk, Andre W
2016-01-01
The design and implementation of health information technology (HIT) is challenging, particularly when it is being introduced into complex settings. While complex adaptive systems (CASs) can be a valuable means of understanding relationships between users, HIT and tasks, much of the existing work using CASs is descriptive in nature. This paper addresses that issue by integrating a model for analyzing task complexity with approaches for HIT evaluation and systems analysis. The resulting framework classifies HIT-user tasks and issues as simple, complicated or complex, and provides insight on how to study them.
Principal process analysis of biological models.
Casagranda, Stefano; Touzeau, Suzanne; Ropers, Delphine; Gouzé, Jean-Luc
2018-06-14
Understanding the dynamical behaviour of biological systems is challenged by their large number of components and interactions. While efforts have been made in this direction to reduce model complexity, they often prove insufficient to grasp which and when model processes play a crucial role. Answering these questions is fundamental to unravel the functioning of living organisms. We design a method for dealing with model complexity, based on the analysis of dynamical models by means of Principal Process Analysis. We apply the method to a well-known model of circadian rhythms in mammals. The knowledge of the system trajectories allows us to decompose the system dynamics into processes that are active or inactive with respect to a certain threshold value. Process activities are graphically represented by Boolean and Dynamical Process Maps. We detect model processes that are always inactive, or inactive on some time interval. Eliminating these processes reduces the complex dynamics of the original model to the much simpler dynamics of the core processes, in a succession of sub-models that are easier to analyse. We quantify by means of global relative errors the extent to which the simplified models reproduce the main features of the original system dynamics and apply global sensitivity analysis to test the influence of model parameters on the errors. The results obtained prove the robustness of the method. The analysis of the sub-model dynamics allows us to identify the source of circadian oscillations. We find that the negative feedback loop involving proteins PER, CRY, CLOCK-BMAL1 is the main oscillator, in agreement with previous modelling and experimental studies. In conclusion, Principal Process Analysis is a simple-to-use method, which constitutes an additional and useful tool for analysing the complex dynamical behaviour of biological systems.
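Assuming a toy one-variable model (a constant production process minus a linear degradation process), the activity criterion behind a Boolean process map might be sketched as below; the function name and threshold rule are illustrative, not the paper's exact formulation:

```python
import numpy as np

# Toy model dx/dt = k_prod - k_deg * x, decomposed into two "processes".
# A process is marked inactive at time t when its magnitude falls below
# delta times the largest process magnitude at that time (hypothetical rule).
def principal_process_activity(traj_x, k_prod=1.0, k_deg=2.0, delta=0.1):
    p1 = np.full_like(traj_x, k_prod)   # production process
    p2 = k_deg * traj_x                  # degradation process
    mags = np.vstack([np.abs(p1), np.abs(p2)])
    thresh = delta * mags.max(axis=0)
    return mags >= thresh                # Boolean process map (2 x len(traj_x))

t = np.linspace(0.0, 3.0, 100)
x = 0.5 * (1.0 - np.exp(-2.0 * t))       # analytic trajectory from x(0) = 0
activity = principal_process_activity(x)
```

Early in the trajectory only production is active (degradation is negligible at small x); near steady state both processes are active, which is the kind of time-resolved decomposition a Dynamical Process Map conveys.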
Complex Digital Visual Systems
ERIC Educational Resources Information Center
Sweeny, Robert W.
2013-01-01
This article identifies possibilities for data visualization as art educational research practice. The author presents an analysis of the relationship between works of art and digital visual culture, employing aspects of network analysis drawn from the work of Barabási, Newman, and Watts (2006) and Castells (1994). Describing complex network…
ERIC Educational Resources Information Center
Coggshall, Elizabeth Learn
2017-01-01
The study of short-"a" (e.g., the vowel in words such as "bat," "bad," "bang," "ban") in New York City English (NYCE) has a long history and many different descriptions of this complex system (e.g., Babbitt 1896; Trager 1930; Labov 1966/2006; Cohen 1970; Labov 2007). It is complex due to the…
Large-scale systems: Complexity, stability, reliability
NASA Technical Reports Server (NTRS)
Siljak, D. D.
1975-01-01
After showing that a complex dynamic system with a competitive structure has highly reliable stability, a class of noncompetitive dynamic systems for which competitive models can be constructed is defined. It is shown that such a construction is possible in the context of the hierarchic stability analysis. The scheme is based on the comparison principle and vector Liapunov functions.
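The comparison-principle test behind vector Liapunov functions reduces, in the linear aggregate case, to checking a small matrix; all numbers below are illustrative:

```python
import numpy as np

# Subsystem Liapunov estimates satisfy dv/dt <= W v, where the diagonal of the
# aggregate matrix W reflects subsystem decay rates and the off-diagonal entries
# bound interconnection strengths. The interconnection is deemed stable when
# every eigenvalue of W has a negative real part.
W = np.array([[-2.0, 0.5],
              [0.3, -1.0]])
eigs = np.linalg.eigvals(W)
stable = bool(np.all(eigs.real < 0))
```

Here both eigenvalues have negative real parts (trace -3, determinant 1.85), so the two coupled subsystems pass the aggregate stability test despite the off-diagonal coupling.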
NASA Technical Reports Server (NTRS)
Dill, Evan T.; Young, Steven D.
2015-01-01
In the constant drive to further the safety and efficiency of air travel, the complexity of avionics-related systems, and the procedures for interacting with these systems, appear to be on an ever-increasing trend. While this growing complexity often yields productive results with respect to system capabilities and flight efficiency, it can place a larger burden on pilots to manage increasing amounts of information and to understand intricate system designs. Evidence supporting this observation is becoming widespread, yet has been largely anecdotal or the result of subjective analysis. One way to gain more insight into this issue is through experimentation using more objective measures or indicators. This study utilizes and analyzes eye-tracking data obtained during a high-fidelity flight simulation study wherein many of the complexities of current flight decks, as well as those planned for the next generation air transportation system (NextGen), were emulated. The following paper presents the findings of this study with a focus on electronic flight bag (EFB) usage, system state awareness (SSA) and events involving suspected inattentional blindness (IB).
Self-conscious robotic system design process--from analysis to implementation.
Chella, Antonio; Cossentino, Massimo; Seidita, Valeria
2011-01-01
Developing robotic systems endowed with self-conscious capabilities means realizing complex sub-systems that need ad-hoc software engineering techniques for their modelling, analysis and implementation. In this chapter, the whole process (from analysis to implementation) used to model the development of self-conscious robotic systems is presented, and the newly created design process, PASSIC, which supports each part of it, is fully illustrated.
NASA Technical Reports Server (NTRS)
Franck, Bruno M.
1990-01-01
The research is focused on automating the evaluation of complex structural systems, whether for the design of a new system or the analysis of an existing one, by developing new structural analysis techniques based on qualitative reasoning. The problem is to identify and better understand: (1) the requirements for the automation of design, and (2) the qualitative reasoning associated with the conceptual development of a complex system. The long-term objective is to develop an integrated design-risk assessment environment for the evaluation of complex structural systems. The scope of this short presentation is to describe the design and cognition components of the research. Design has received special attention in cognitive science because it is now identified as a problem solving activity that is different from other information processing tasks (1). Before an attempt can be made to automate design, a thorough understanding of the underlying design theory and methodology is needed, since the design process is, in many cases, multi-disciplinary, complex in size and motivation, and uses various reasoning processes involving different kinds of knowledge in ways which vary from one context to another. The objective is to unify all the various types of knowledge under one framework of cognition. This presentation focuses on the cognitive science framework that we are using to represent the knowledge aspects associated with the human mind's abstraction abilities and how we apply it to the engineering knowledge and engineering reasoning in design.
Interpreting Popov criteria in Lur'e systems with complex scaling stability analysis
NASA Astrophysics Data System (ADS)
Zhou, J.
2018-06-01
The paper presents a novel frequency-domain interpretation of the Popov criteria for absolute stability in Lur'e systems by means of what we call complex scaling stability analysis. The complex scaling technique is developed for exponential/asymptotic stability in LTI feedback systems, and dispenses with open-loop pole distribution, contour/locus orientation and prior frequency sweeping. Exploiting the technique as an alternative way of revealing positive realness of transfer functions, the re-interpretation of the Popov criteria is explicated. More specifically, the suggested frequency-domain stability conditions are conformable in both the scalar and multivariable cases, and can be implemented either graphically with locus plotting or numerically without it; in particular, the latter is suitable as a design tool with auxiliary parameter freedom. The interpretation also reveals further frequency-domain facts about Lur'e systems. Numerical examples are included to illustrate the main results.
Adjoint equations and analysis of complex systems: Application to virus infection modelling
NASA Astrophysics Data System (ADS)
Marchuk, G. I.; Shutyaev, V.; Bocharov, G.
2005-12-01
Recent development of applied mathematics is characterized by ever increasing attempts to apply modelling and computational approaches across various areas of the life sciences. The need for a rigorous analysis of complex system dynamics in immunology has been recognized for more than three decades. The aim of the present paper is to draw attention to the method of adjoint equations. The methodology makes it possible to obtain information about physical processes and examine the sensitivity of complex dynamical systems. This provides a basis for a better understanding of the causal relationships between the immune system's performance and its parameters, and helps to improve experimental design in the solution of applied problems. We show how the adjoint equations can be used to explain the changes in hepatitis B virus infection dynamics between individual patients.
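For a linear model dx/dt = Ax, the adjoint machinery can be sketched concretely: the gradient of an output J = c·x(T) with respect to the initial state equals the adjoint state p(0), obtained by integrating dp/dt = -Aᵀp backwards from p(T) = c, which for constant A is p(0) = exp(AᵀT)c. The model and values below are illustrative, not the paper's virus model:

```python
import numpy as np

def expm_series(M, terms=30):
    """Matrix exponential via truncated Taylor series (fine for small matrices)."""
    out = np.eye(M.shape[0])
    term = np.eye(M.shape[0])
    for k in range(1, terms):
        term = term @ M / k
        out = out + term
    return out

A = np.array([[-1.0, 0.2], [0.0, -0.5]])   # hypothetical 2-state linear model
c = np.array([1.0, 0.0])                    # observe the first state at time T
T = 1.0

# Adjoint sensitivity: dJ/dx0 = exp(A^T T) c
grad_adjoint = expm_series(A.T * T) @ c

# Finite-difference check on J(x0) = c . exp(A T) x0
Phi = expm_series(A * T)
J = lambda x: c @ (Phi @ x)
x0, eps = np.array([1.0, 1.0]), 1e-6
fd = np.array([(J(x0 + eps * e) - J(x0)) / eps for e in np.eye(2)])
```

One backward adjoint solve yields the sensitivity to every initial condition (or parameter) at once, which is why the method scales to complex dynamical systems where forward finite differences would require one model run per parameter.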
A Fault Tree Approach to Analysis of Organizational Communication Systems.
ERIC Educational Resources Information Center
Witkin, Belle Ruth; Stephens, Kent G.
Fault Tree Analysis (FTA) is a method of examining communication in an organization by focusing on: (1) the complex interrelationships in human systems, particularly in communication systems; (2) interactions across subsystems and system boundaries; and (3) the need to select and "prioritize" channels which will eliminate noise in the…
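The gate logic at the heart of FTA is easy to sketch. The events and probabilities below are hypothetical (and assume independent basic events), chosen only to show how a top-event probability is composed:

```python
# OR gate: the event occurs if any input occurs -> 1 - prod(1 - p_i).
def or_gate(*ps):
    out = 1.0
    for p in ps:
        out *= (1.0 - p)
    return 1.0 - out

# AND gate: the event occurs only if all inputs occur -> prod(p_i).
def and_gate(*ps):
    out = 1.0
    for p in ps:
        out *= p
    return out

# Hypothetical top event: "message lost" =
#   (channel is noisy AND no feedback loop exists) OR sender omits the message
p_top = or_gate(and_gate(0.2, 0.5), 0.05)   # = 1 - (0.9 * 0.95) = 0.145
```

Walking the tree bottom-up like this is what lets an analyst see which branch (here, the noisy-channel-without-feedback path) contributes most to the top-event probability.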
Micro-Macro Analysis of Complex Networks
Marchiori, Massimo; Possamai, Lino
2015-01-01
Complex systems have attracted considerable interest because of their wide range of applications, and are often studied via a “classic” approach: study a specific system, find a complex network behind it, and analyze the corresponding properties. This simple methodology has produced a great deal of interesting results, but relies on an often implicit underlying assumption: the level of detail on which the system is observed. However, in many situations, physical or abstract, the level of detail can be one out of many, and might also depend on intrinsic limitations in viewing the data with a different level of abstraction or precision. So, a fundamental question arises: do properties of a network depend on its level of observability, or are they invariant? If there is a dependence, then an apparently correct network modeling could in fact just be a bad approximation of the true behavior of a complex system. In order to answer this question, we propose a novel micro-macro analysis of complex systems that quantitatively describes how the structure of complex networks varies as a function of the detail level. To this end, we have developed a new telescopic algorithm that abstracts from the local properties of a system and reconstructs the original structure according to a fuzziness level. This way we can study what happens when passing from a fine level of detail (“micro”) to a different scale level (“macro”), and analyze the corresponding behavior in this transition, obtaining a deeper spectrum analysis. The obtained results show that many important properties are not universally invariant with respect to the level of detail, but instead strongly depend on the specific level on which a network is observed. Therefore, caution should be taken in every situation where a complex network is considered, if its context allows for different levels of observability. PMID:25635812
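A crude version of the level-of-detail experiment can be sketched by merging node groups into super-nodes and comparing a network property across scales. The paper's telescopic algorithm is more elaborate; this block is only illustrative, with a random graph standing in for real data:

```python
import numpy as np

def coarse_grain(adj, group_size):
    """Merge consecutive groups of nodes into super-nodes; a super-edge exists
    if any micro-edge connects the two groups (hypothetical merging rule)."""
    n = adj.shape[0]
    m = n // group_size
    coarse = np.zeros((m, m), dtype=int)
    for i in range(m):
        for j in range(m):
            block = adj[i*group_size:(i+1)*group_size, j*group_size:(j+1)*group_size]
            if i != j and block.any():
                coarse[i, j] = 1
    return coarse

def density(adj):
    n = adj.shape[0]
    return adj.sum() / (n * (n - 1))

rng = np.random.default_rng(2)
micro = (rng.random((40, 40)) < 0.05).astype(int)
micro = np.triu(micro, 1)
micro = micro + micro.T                       # undirected, no self-loops
macro = coarse_grain(micro, group_size=4)     # 40 nodes -> 10 super-nodes
```

Even this toy merge shows the point of the paper: edge density at the macro level differs sharply from the micro level, so a property measured at one observability level need not hold at another.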
Alam, Maksudul; Deng, Xinwei; Philipson, Casandra; Bassaganya-Riera, Josep; Bisset, Keith; Carbo, Adria; Eubank, Stephen; Hontecillas, Raquel; Hoops, Stefan; Mei, Yongguo; Abedi, Vida; Marathe, Madhav
2015-01-01
Agent-based models (ABM) are widely used to study immune systems, providing a procedural and interactive view of the underlying system. The interaction of components and the behavior of individual objects is described procedurally as a function of the internal states and the local interactions, which are often stochastic in nature. Such models typically have complex structures and consist of a large number of modeling parameters. Determining the key modeling parameters which govern the outcomes of the system is very challenging. Sensitivity analysis plays a vital role in quantifying the impact of modeling parameters in massively interacting systems, including large complex ABM. The high computational cost of executing simulations impedes running experiments with exhaustive parameter settings. Existing techniques of analyzing such a complex system typically focus on local sensitivity analysis, i.e. one parameter at a time, or a close “neighborhood” of particular parameter settings. However, such methods are not adequate to measure the uncertainty and sensitivity of parameters accurately because they overlook the global impacts of parameters on the system. In this article, we develop novel experimental design and analysis techniques to perform both global and local sensitivity analysis of large-scale ABMs. The proposed method can efficiently identify the most significant parameters and quantify their contributions to outcomes of the system. We demonstrate the proposed methodology for ENteric Immune SImulator (ENISI), a large-scale ABM environment, using a computational model of immune responses to Helicobacter pylori colonization of the gastric mucosa. PMID:26327290
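A minimal global-screening sketch (far simpler than the ENISI design, with a hypothetical three-parameter toy model in place of an ABM run) illustrates the idea of ranking parameters by their global influence rather than perturbing one at a time:

```python
import numpy as np

# Sample the whole parameter space, run the model once per sample, and rank
# parameters by the magnitude of their standardized regression coefficients.
rng = np.random.default_rng(1)
n = 2000
params = rng.uniform(0.0, 1.0, size=(n, 3))   # three hypothetical parameters

# Toy "model output": parameter 0 dominates, parameter 1 is weak, 2 is inert.
y = 5.0 * params[:, 0] + 0.5 * params[:, 1] + rng.normal(0.0, 0.1, n)

X = (params - params.mean(axis=0)) / params.std(axis=0)
Y = (y - y.mean()) / y.std()
src = np.abs(X.T @ Y / n)          # standardized regression coefficients
ranking = np.argsort(src)[::-1]    # most influential parameter first
```

Because every sample varies all parameters simultaneously, the ranking reflects global influence across the design space, which is exactly what local one-at-a-time sweeps miss.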
Impulse Response Operators for Structural Complexes
1990-05-12
Fragment: … systems of the complex. Statistical energy analysis (SEA) is one such device [13, 14]; the rendering of SEA from equation (21) and/or (25) lies … References cited include L. Cremer, M. Heckl and E. E. Ungar, Structure-Borne Sound (Springer-Verlag, 1973) and R. H. Lyon, Statistical Energy Analysis of … (1975).
A Chemical Engineer's Perspective on Health and Disease
Androulakis, Ioannis P.
2014-01-01
Chemical process systems engineering considers complex supply chains, which are coupled networks of dynamically interacting systems. The quest to optimize the supply chain while meeting robustness and flexibility constraints in the face of ever-changing environments necessitated the development of theoretical and computational tools for the analysis, synthesis and design of such complex engineered architectures. However, it was realized early on that optimality is a complex characteristic, required to achieve a proper balance between multiple, often competing, objectives. As we begin to unravel life's intricate complexities, we realize that living systems share similar structural and dynamic characteristics; hence much can be learned about biological complexity from engineered systems. In this article, we draw analogies between concepts in process systems engineering and conceptual models of health and disease; establish connections between these concepts and physiologic modeling; and describe how these mirror onto the physiological counterparts of engineered systems. PMID:25506103
Analysis of space vehicle structures using the transfer-function concept
NASA Technical Reports Server (NTRS)
Heer, E.; Trubert, M. R.
1969-01-01
Analysis of a large complex system is accomplished by dividing it into suitable subsystems and determining the individual dynamical and vibrational responses. Frequency transfer functions then determine the vibrational response of the whole system.
Approaching human language with complex networks.
Cong, Jin; Liu, Haitao
2014-12-01
The interest in modeling and analyzing human language with complex networks is on the rise in recent years and a considerable body of research in this area has already been accumulated. We survey three major lines of linguistic research from the complex network approach: 1) characterization of human language as a multi-level system with complex network analysis; 2) linguistic typological research with the application of linguistic networks and their quantitative measures; and 3) relationships between the system-level complexity of human language (determined by the topology of linguistic networks) and microscopic linguistic (e.g., syntactic) features (as the traditional concern of linguistics). We show that the models and quantitative tools of complex networks, when exploited properly, can constitute an operational methodology for linguistic inquiry, which contributes to the understanding of human language and the development of linguistics. We conclude our review with suggestions for future linguistic research from the complex network approach: 1) relationships between the system-level complexity of human language and microscopic linguistic features; 2) expansion of research scope from the global properties to other levels of granularity of linguistic networks; and 3) combination of linguistic network analysis with other quantitative studies of language (such as quantitative linguistics). Copyright © 2014 Elsevier B.V. All rights reserved.
Righi, Angela Weber; Wachs, Priscila; Saurin, Tarcísio Abreu
2012-01-01
Complexity theory has been adopted by a number of studies as a benchmark to investigate the performance of socio-technical systems, especially those characterized by relevant cognitive work. However, there is little guidance on how to assess, systematically, the extent to which a system is complex. The main objective of this study is to carry out a systematic analysis of a SAMU (Mobile Emergency Medical Service) Medical Regulation Center in Brazil, based on the core characteristics of complex systems presented by previous studies. The assessment was based on direct observations and nine interviews: three with the medical doctors who regulate emergencies, three with radio operators and three with telephone attendants. The results indicated that, to a great extent, the core characteristics of complexity are magnified due to basic shortcomings in the design of the work system. Thus, some recommendations are put forward with a view to reducing the unnecessary complexity that hinders the performance of the socio-technical system.
2012-03-01
Empirical Analysis of Optical Attenuator Performance in Quantum Key Distribution Systems Using a … (AFIT/GCS/ENG/12-01; approved for public release, distribution unlimited). Abstract fragment: … challenging as the complexity of actual implementation specifics are considered. Two components common to most quantum key distribution …
2009-06-01
Automated Geospatial Tools: Agility in Complex Planning (Track 5 – Experimentation and Analysis; Walter A. Powell [student], GMU). Report title: Results of an Experimental Exploration of Advanced Automated Geospatial Tools: Agility in Complex Planning. Abstract fragment: Typically, the development of tools and systems for the military is requirement driven; systems are developed to meet …
Refined two-index entropy and multiscale analysis for complex system
NASA Astrophysics Data System (ADS)
Bian, Songhan; Shang, Pengjian
2016-10-01
As a fundamental concept for describing complex systems, entropy has been proposed in various forms, such as Boltzmann-Gibbs (BG) entropy, one-index entropy, two-index entropy, sample entropy, and permutation entropy. This paper proposes a new two-index entropy Sq,δ, which we find applicable to measuring the complexity of a wide range of systems in terms of randomness and fluctuation range. For more complex systems, the value of the two-index entropy is smaller and the correlation between the parameter δ and the entropy Sq,δ is weaker. By combining the refined two-index entropy Sq,δ with the scaling exponent h(δ), this paper analyzes the complexity of simulated series and effectively classifies several financial markets in various regions of the world.
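The abstract names several entropy variants without defining them, and the refined two-index entropy Sq,δ cannot be reconstructed from the abstract alone. As a hedged illustration of the simplest measure it mentions, here is a minimal sketch of normalized permutation entropy (Bandt-Pompe style); the series and order are illustrative:

```python
import math
import random
from itertools import permutations

def permutation_entropy(series, order=3):
    """Normalized permutation entropy of a 1-D series.

    Counts ordinal patterns of length `order` and returns the Shannon
    entropy of their distribution, normalized to [0, 1] (1 = maximally
    random). Illustrative only; not the paper's S_{q,delta} measure.
    """
    counts = {p: 0 for p in permutations(range(order))}
    n = 0
    for i in range(len(series) - order + 1):
        window = series[i:i + order]
        # Ordinal pattern: argsort of the window values.
        pattern = tuple(sorted(range(order), key=lambda k: window[k]))
        counts[pattern] += 1
        n += 1
    probs = [c / n for c in counts.values() if c > 0]
    h = -sum(p * math.log(p) for p in probs)
    return h / math.log(math.factorial(order))

# A periodic signal scores lower than uncorrelated noise:
random.seed(0)
periodic = [i % 4 for i in range(500)]
noisy = [random.random() for _ in range(500)]
print(permutation_entropy(periodic), "<", permutation_entropy(noisy))
```

A strictly monotonic series produces a single ordinal pattern and hence entropy 0, which is a quick sanity check on the implementation.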
Marshall, Najja; Timme, Nicholas M.; Bennett, Nicholas; Ripp, Monica; Lautzenhiser, Edward; Beggs, John M.
2016-01-01
Neural systems include interactions that occur across many scales. Two divergent methods for characterizing such interactions have drawn on the physical analysis of critical phenomena and the mathematical study of information. Inferring criticality in neural systems has traditionally rested on fitting power laws to the property distributions of “neural avalanches” (contiguous bursts of activity), but the fractal nature of avalanche shapes has recently emerged as another signature of criticality. On the other hand, neural complexity, an information theoretic measure, has been used to capture the interplay between the functional localization of brain regions and their integration for higher cognitive functions. Unfortunately, treatments of all three methods—power-law fitting, avalanche shape collapse, and neural complexity—have suffered from shortcomings. Empirical data often contain biases that introduce deviations from a true power law in the tail and head of the distribution, but deviations in the tail have often gone unconsidered; avalanche shape collapse has required manual parameter tuning; and the estimation of neural complexity has relied on small data sets or statistical assumptions for the sake of computational efficiency. In this paper we present technical advancements in the analysis of criticality and complexity in neural systems. We use maximum-likelihood estimation to automatically fit power laws with left and right cutoffs, present the first automated shape collapse algorithm, and describe new techniques to account for large numbers of neural variables and small data sets in the calculation of neural complexity. In order to facilitate future research in criticality and complexity, we have made the software utilized in this analysis freely available online in the MATLAB NCC (Neural Complexity and Criticality) Toolbox. PMID:27445842
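The maximum-likelihood power-law fitting described above can be illustrated in miniature. This sketch uses the standard continuous MLE for the exponent only; the left/right cutoff fitting and shape-collapse machinery of the toolbox are omitted, and the sampling scheme and parameter values are purely illustrative:

```python
import math
import random

def powerlaw_alpha_mle(samples, xmin):
    """Continuous power-law exponent MLE over the tail x >= xmin:
    alpha_hat = 1 + n / sum(ln(x_i / xmin)).
    Simplified sketch; no right cutoff or goodness-of-fit test.
    """
    tail = [x for x in samples if x >= xmin]
    return 1.0 + len(tail) / sum(math.log(x / xmin) for x in tail)

# Draw from a pure power law with known exponent via inverse-CDF sampling,
# then check that the estimator recovers it.
random.seed(1)
alpha_true, xmin = 2.5, 1.0
draws = [xmin * (1 - random.random()) ** (-1 / (alpha_true - 1))
         for _ in range(20000)]
est = powerlaw_alpha_mle(draws, xmin)
print("estimated alpha:", est)
```

With n = 20000 samples the standard error of the estimate is roughly (alpha - 1) / sqrt(n) ≈ 0.01, so the recovered exponent lands very close to 2.5.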
ERIC Educational Resources Information Center
Forsman, Jonas; van den Bogaard, Maartje; Linder, Cedric; Fraser, Duncan
2015-01-01
This study uses multilayer minimum spanning tree analysis to develop a model for student retention from a complex system perspective, using data obtained from first-year engineering students at a large well-regarded institution in the European Union. The results show that the elements of the system of student retention are related to one another…
The new challenges of multiplex networks: Measures and models
NASA Astrophysics Data System (ADS)
Battiston, Federico; Nicosia, Vincenzo; Latora, Vito
2017-02-01
What do societies, the Internet, and the human brain have in common? They are all examples of complex relational systems, whose emerging behaviours are largely determined by the non-trivial networks of interactions among their constituents, namely individuals, computers, or neurons, rather than only by the properties of the units themselves. In the last two decades, network scientists have proposed models of increasing complexity to better understand real-world systems. Only recently have we realised that multiplexity, i.e. the coexistence of several types of interactions among the constituents of a complex system, is responsible for substantial qualitative and quantitative differences in the type and variety of behaviours that a complex system can exhibit. As a consequence, multilayer and multiplex networks have become a hot topic in complexity science. Here we provide an overview of some of the measures proposed so far to characterise the structure of multiplex networks, and a selection of models aiming at reproducing those structural properties and quantifying their statistical significance. Focusing on a subset of relevant topics, this brief review is a quite comprehensive introduction to the most basic tools for the analysis of multiplex networks observed in the real world. The wide applicability of multiplex networks as a framework to model complex systems in different fields, from biology to social sciences, and the colloquial tone of the paper will make it an interesting read for researchers working on both theoretical and experimental analysis of networked systems.
Parametric Analysis of a Hover Test Vehicle using Advanced Test Generation and Data Analysis
NASA Technical Reports Server (NTRS)
Gundy-Burlet, Karen; Schumann, Johann; Menzies, Tim; Barrett, Tony
2009-01-01
Large complex aerospace systems are generally validated in regions local to anticipated operating points rather than through characterization of the entire feasible operational envelope of the system. This is due to the large parameter space and the complex, highly coupled, nonlinear nature of the different systems that contribute to the performance of the aerospace system. We have addressed the factors deterring such an analysis by applying a combination of technologies to the area of flight envelope assessment. We utilize n-factor (2,3) combinatorial parameter variations to limit the number of cases, but still explore important interactions in the parameter space in a systematic fashion. The data generated is automatically analyzed through a combination of unsupervised learning using a Bayesian multivariate clustering technique (AutoBayes) and supervised learning of critical parameter ranges using the machine-learning tool TAR3, a treatment learner. Covariance analysis with scatter plots and likelihood contours is used to visualize correlations between simulation parameters and simulation results, a task that requires tool support, especially for large and complex models. We present results of simulation experiments for a cold-gas-powered hover test vehicle.
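The n-factor combinatorial variation described above (here with n = 2) can be sketched as a greedy pairwise covering suite. The parameter names and values below are hypothetical, and dedicated covering-array generators are considerably more sophisticated than this greedy selection:

```python
from itertools import combinations, product

def pairwise_suite(params):
    """Greedy 2-way (pairwise) covering suite over a parameter dict.

    Repeatedly picks the full-factorial case covering the most not-yet-
    covered value pairs. A sketch of n-factor testing with n = 2.
    """
    names = sorted(params)

    def pairs_of(case):
        return {((a, case[a]), (b, case[b]))
                for a, b in combinations(names, 2)}

    all_cases = [dict(zip(names, vals))
                 for vals in product(*(params[n] for n in names))]
    uncovered = set().union(*(pairs_of(c) for c in all_cases))
    suite = []
    while uncovered:
        best = max(all_cases, key=lambda c: len(pairs_of(c) & uncovered))
        suite.append(best)
        uncovered -= pairs_of(best)
    return suite

# Hypothetical hover-vehicle parameters (names are illustrative only):
params = {"thrust": [0.5, 1.0], "mass": [100, 200], "wind": [0, 5, 10]}
suite = pairwise_suite(params)
print(len(suite), "cases cover all pairs vs", 2 * 2 * 3, "exhaustive")
```

Even on this toy space the suite is smaller than the full factorial while still exercising every two-way interaction, which is the point of the technique at aerospace-scale parameter counts.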
Recurrence quantity analysis based on singular value decomposition
NASA Astrophysics Data System (ADS)
Bian, Songhan; Shang, Pengjian
2017-05-01
The recurrence plot (RP) has become a powerful tool in many different sciences over the last three decades. To quantify the complexity and structure of an RP, recurrence quantification analysis (RQA) has been developed based on measures of recurrence density, diagonal lines, vertical lines and horizontal lines. This paper studies the RP based on singular value decomposition, which is a new perspective for RP study. The principal singular value proportion (PSVP) is proposed as a new RQA measure: a larger PSVP indicates higher complexity for a system, while a smaller PSVP reflects a regular and stable system. Considering the advantage of this method in detecting the complexity and periodicity of systems, several simulated and real-data experiments are chosen to examine the performance of this new RQA measure.
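As a rough sketch of the pipeline: build a recurrence matrix, take its singular values, and report the principal singular value's share of the spectrum. The recurrence construction used here (no embedding, a simple amplitude threshold) and the normalization sigma_1 / sum(sigma) are assumptions, since the abstract does not specify the paper's exact definitions:

```python
import numpy as np

def psvp(series, eps=None):
    """Principal singular value proportion of a thresholded
    recurrence matrix R_ij = 1 if |x_i - x_j| < eps.

    Assumed construction; the paper's embedding and normalization
    may differ. Returns a value in (0, 1].
    """
    x = np.asarray(series, dtype=float)
    if eps is None:
        eps = 0.2 * x.std()  # heuristic threshold, an assumption
    R = (np.abs(x[:, None] - x[None, :]) < eps).astype(float)
    s = np.linalg.svd(R, compute_uv=False)
    return float(s[0] / s.sum())

rng = np.random.default_rng(0)
t = np.arange(400)
periodic = np.sin(2 * np.pi * t / 25)
noise = rng.standard_normal(400)
print("periodic:", psvp(periodic), "noise:", psvp(noise))
```

How the ratio separates regular from irregular dynamics depends on the exact recurrence construction, so no direction is asserted here; the sketch only shows where the singular spectrum enters the measure.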
D'Suze, Gina; Sandoval, Moisés; Sevcik, Carlos
2015-12-15
A characteristic of venom elution patterns, shared with many other complex systems, is that many of their features cannot be properly described with statistical or Euclidean concepts. The understanding of such systems became possible with Mandelbrot's fractal analysis. Venom elution patterns were produced using reversed-phase high-performance liquid chromatography (HPLC) with 1 mg of venom. One reason for the lack of quantitative analyses of the sources of venom variability is the difficulty of parametrizing the complexity of venom chromatograms. We quantify this complexity by means of an algorithm which estimates the contortedness (Q) of a waveform. Fractal analysis was used to compare venoms and to measure inter- and intra-specific venom variability. We studied variations in venom complexity derived from gender, seasonal and environmental factors, duration of captivity in the laboratory, and the technique used to milk venom. Copyright © 2015 Elsevier Ltd. All rights reserved.
Javorka, M; Turianikova, Z; Tonhajzerova, I; Javorka, K; Baumert, M
2009-01-01
The purpose of this paper is to investigate the effect of orthostatic challenge on recurrence plot based complexity measures of heart rate and blood pressure variability (HRV and BPV). HRV and BPV complexities were assessed in 28 healthy subjects over 15 min in the supine and standing positions. The complexity of HRV and BPV was assessed based on recurrence quantification analysis. HRV complexity was reduced along with the HRV magnitude after changing from the supine to the standing position. In contrast, the BPV magnitude increased and BPV complexity decreased upon standing. Recurrence quantification analysis (RQA) of HRV and BPV is sensitive to orthostatic challenge and might therefore be suited to assess changes in autonomic neural outflow to the cardiovascular system.
The pyramid system for multiscale raster analysis
De Cola, L.; Montagne, N.
1993-01-01
Geographical research requires the management and analysis of spatial data at multiple scales. As part of the U.S. Geological Survey's global change research program, a software system has been developed that reads raster data (such as an image or digital elevation model) and produces a pyramid of aggregated lattices as well as various measurements of spatial complexity. For a given raster dataset the system uses the pyramid to report: (1) mean, (2) variance, (3) a spatial autocorrelation parameter based on multiscale analysis of variance, and (4) a monofractal scaling parameter based on the analysis of isoline lengths. The system is applied to 1-km digital elevation model (DEM) data for a 256 km² region of central California, as well as to 64 partitions of the region. PYRAMID, which offers robust descriptions of data complexity, also is used to describe the behavior of topographic aspect with scale. © 1993.
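The pyramid of aggregated lattices can be sketched as repeated 2×2 block averaging, reporting the first two statistics the system computes (mean and variance) at each level; the autocorrelation and monofractal parameters are omitted, and the random grid stands in for a DEM tile:

```python
import numpy as np

def pyramid_stats(raster):
    """Aggregate a square power-of-two raster into a pyramid of 2x2
    block means, reporting (size, mean, variance) at each level.
    A sketch of the aggregation step only.
    """
    levels = []
    grid = np.asarray(raster, dtype=float)
    while True:
        levels.append((grid.shape[0], grid.mean(), grid.var()))
        if grid.shape[0] == 1:
            break
        n = grid.shape[0] // 2
        # Each new cell is the mean of a 2x2 block at the finer level.
        grid = grid.reshape(n, 2, n, 2).mean(axis=(1, 3))
    return levels

rng = np.random.default_rng(42)
dem = rng.random((16, 16))  # hypothetical stand-in for a DEM tile
levels = pyramid_stats(dem)
for size, mean, var in levels:
    print(f"{size:>2}x{size}: mean={mean:.3f} var={var:.4f}")
```

Block averaging preserves the global mean exactly at every level, while the variance shrinks as detail is averaged away; that decay with scale is what the system's autocorrelation parameter summarizes.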
Vernon, Ian; Liu, Junli; Goldstein, Michael; Rowe, James; Topping, Jen; Lindsey, Keith
2018-01-02
Many mathematical models have now been employed across every area of systems biology. These models increasingly involve large numbers of unknown parameters, have complex structure which can result in substantial evaluation time relative to the needs of the analysis, and need to be compared to observed data of various forms. The correct analysis of such models usually requires a global parameter search, over a high dimensional parameter space, that incorporates and respects the most important sources of uncertainty. This can be an extremely difficult task, but it is essential for any meaningful inference or prediction to be made about any biological system. It hence represents a fundamental challenge for the whole of systems biology. Bayesian statistical methodology for the uncertainty analysis of complex models is introduced, which is designed to address the high dimensional global parameter search problem. Bayesian emulators that mimic the systems biology model but which are extremely fast to evaluate are embedded within an iterative history match: an efficient method to search high dimensional spaces within a more formal statistical setting, while incorporating major sources of uncertainty. The approach is demonstrated via application to a model of hormonal crosstalk in Arabidopsis root development, which has 32 rate parameters, for which we identify the sets of rate parameter values that lead to acceptable matches between model output and observed trend data. The multiple insights into the model's structure that this analysis provides are discussed. The methodology is applied to a second related model, and the biological consequences of the resulting comparison, including the evaluation of gene functions, are described. Bayesian uncertainty analysis for complex models using both emulators and history matching is shown to be a powerful technique that can greatly aid the study of a large class of systems biology models. It both provides insight into model behaviour and identifies the sets of rate parameters of interest.
An empirical comparison of a dynamic software testability metric to static cyclomatic complexity
NASA Technical Reports Server (NTRS)
Voas, Jeffrey M.; Miller, Keith W.; Payne, Jeffrey E.
1993-01-01
This paper compares the dynamic testability prediction technique termed 'sensitivity analysis' to the static testability technique termed cyclomatic complexity. The application that we chose in this empirical study is a CASE generated version of a B-737 autoland system. For the B-737 system we analyzed, we isolated those functions that we predict are more prone to hide errors during system/reliability testing. We also analyzed the code with several other well-known static metrics. This paper compares and contrasts the results of sensitivity analysis to the results of the static metrics.
NASA Astrophysics Data System (ADS)
Karmazikov, Y. V.; Fainberg, E. M.
2005-06-01
This paper considers work with DICOM-compatible equipment integrated into hardware and software systems for medical purposes. The structure of the data acquisition and transformation process is presented using the example of the digital radiography and angiography systems included in the DIMOL-IK hardware-software complex. Algorithms for data acquisition and analysis are proposed, and issues of further processing and storage of the acquired data are considered.
Kushniruk, Andre W; Borycki, Elizabeth M
2015-01-01
Innovations in healthcare information systems promise to revolutionize and streamline healthcare processes worldwide. However, the complexity of these systems and the need to better understand issues related to human-computer interaction have slowed progress in this area. In this chapter the authors describe their work in using methods adapted from usability engineering, video ethnography and analysis of digital log files for improving our understanding of complex real-world healthcare interactions between humans and technology. The approaches taken are cost-effective and practical and can provide detailed ethnographic data on issues health professionals and consumers encounter while using systems as well as potential safety problems. The work is important in that it can be used in techno-anthropology to characterize complex user interactions with technologies and also to provide feedback into redesign and optimization of improved healthcare information systems.
Sensitivity based coupling strengths in complex engineering systems
NASA Technical Reports Server (NTRS)
Bloebaum, C. L.; Sobieszczanski-Sobieski, J.
1993-01-01
The iterative design scheme necessary for complex engineering systems is generally time consuming and difficult to implement. Although a decomposition approach results in a more tractable problem, the inherent couplings make establishing the interdependencies of the various subsystems difficult. Another difficulty lies in identifying the most efficient order of execution for the subsystem analyses. The paper describes an approach for determining the dependencies that could be suspended during the system analysis with minimal accuracy losses, thereby reducing the system complexity. A new multidisciplinary testbed is presented, involving the interaction of structures, aerodynamics, and performance disciplines. Results are presented to demonstrate the effectiveness of the system reduction scheme.
Hioki, Yusaku; Tanimura, Ritsuko; Iwamoto, Shinichi; Tanaka, Koichi
2014-03-04
Nanoflow liquid chromatography (nano-LC) is an essential technique for highly sensitive analysis of complex biological samples, and matrix-assisted laser desorption/ionization mass spectrometry (MALDI-MS) is advantageous for rapid identification of proteins and in-depth analysis of post-translational modifications (PTMs). A combination of nano-LC and MALDI-MS (nano-LC/MALDI-MS) is useful for highly sensitive and detailed analysis in life sciences. However, the existing system does not fully utilize the advantages of each technique, especially in the interface of eluate transfer from nano-LC to a MALDI plate. To effectively combine nano-LC with MALDI-MS, we integrated a nano-LC column and a deposition probe for the first time (column probe) and incorporated it into a nano-LC/MALDI-MS system. Spotting nanoliter eluate droplets directly from the column onto the MALDI plate prevents postcolumn diffusion and preserves the chromatographic resolution. A DHB prespotted plate was prepared to suit the fabricated column probe to concentrate the droplets of nano-LC eluate. The performance of the advanced nano-LC/MALDI-MS system was substantiated by analyzing protein digests. When the system was coupled with multidimensional liquid chromatography (MDLC), trace amounts of glycopeptides spiked into complex samples were successfully detected. Thus, a nano-LC/MALDI-MS direct-spotting system that eliminates postcolumn diffusion was constructed, and the efficacy of the system was demonstrated through highly sensitive analysis of the protein digests or spiked glycopeptides.
Integrating complex business processes for knowledge-driven clinical decision support systems.
Kamaleswaran, Rishikesan; McGregor, Carolyn
2012-01-01
This paper presents in detail the component of the Complex Business Process for Stream Processing framework that is responsible for integrating complex business processes to enable knowledge-driven Clinical Decision Support System (CDSS) recommendations. CDSSs aid the clinician in supporting the care of patients by providing accurate data analysis and evidence-based recommendations. However, the incorporation of a dynamic knowledge-management system that supports the definition and enactment of complex business processes and real-time data streams has not been researched. In this paper we discuss the process web service as an innovative method of providing contextual information to a real-time data stream processing CDSS.
NASA Astrophysics Data System (ADS)
Wang, Rongxi; Gao, Xu; Gao, Jianmin; Gao, Zhiyong; Kang, Jiani
2018-02-01
As one of the most important approaches for analyzing the mechanism of fault pervasion, fault root cause tracing is a powerful and useful tool for detecting the fundamental causes of faults so as to prevent any further propagation and amplification. To address the problems arising from the lack of systematic and comprehensive integration, a novel information transfer-based, data-driven framework for fault root cause tracing of complex electromechanical systems in the processing industry was proposed, taking into consideration the experience and qualitative analysis of conventional fault root cause tracing methods. Firstly, an improved symbolic transfer entropy method was presented to construct a directed-weighted information model for a specific complex electromechanical system based on the information flow. Secondly, considering the feedback mechanisms in complex electromechanical systems, a method for determining the threshold values of weights was developed to explore the disciplines of fault propagation. Lastly, an iterative method was introduced to identify the fault development process. The fault root cause was traced by analyzing the changes in information transfer between the nodes along the fault propagation pathway. An actual fault root cause tracing application of a complex electromechanical system is used to verify the effectiveness of the proposed framework. A unique fault root cause is obtained regardless of the choice of the initial variable. Thus, the proposed framework can be flexibly and effectively used in fault root cause tracing for complex electromechanical systems in the processing industry, and forms the foundation of system vulnerability analysis and condition prediction, as well as other engineering applications.
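The symbolic transfer entropy underlying such frameworks can be illustrated with a plain (unimproved) estimator: symbolize each series by ordinal patterns, then estimate T(X→Y) from empirical pattern frequencies. The ordinal order, series, and coupling below are illustrative, not the paper's improved method:

```python
import math
import random
from collections import Counter

def symbolize(series, order=2):
    """Map each length-`order` window to its ordinal pattern."""
    return [tuple(sorted(range(order), key=lambda k: series[i:i + order][k]))
            for i in range(len(series) - order + 1)]

def transfer_entropy(src, dst, order=2):
    """Plain symbolic transfer entropy T(src -> dst) in bits:
    sum over states of p(d_{t+1}, d_t, s_t) *
    log2[ p(d_{t+1} | d_t, s_t) / p(d_{t+1} | d_t) ].
    """
    s, d = symbolize(src, order), symbolize(dst, order)
    n = min(len(s), len(d)) - 1
    triples = Counter((d[t + 1], d[t], s[t]) for t in range(n))
    pairs_ds = Counter((d[t], s[t]) for t in range(n))
    pairs_dd = Counter((d[t + 1], d[t]) for t in range(n))
    singles = Counter(d[t] for t in range(n))
    te = 0.0
    for (dn, dt, st), c in triples.items():
        te += (c / n) * math.log2(
            (c / pairs_ds[(dt, st)]) / (pairs_dd[(dn, dt)] / singles[dt]))
    return te

# dst copies src with a one-step lag, so information flows src -> dst:
random.seed(3)
src = [random.random() for _ in range(3000)]
dst = [0.0] + src[:-1]
print(transfer_entropy(src, dst), ">", transfer_entropy(dst, src))
```

The asymmetry of the estimate is what lets a directed-weighted information model be built: each edge weight is the estimated transfer entropy from one monitored variable to another.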
Power-rate-distortion analysis for wireless video communication under energy constraint
NASA Astrophysics Data System (ADS)
He, Zhihai; Liang, Yongfang; Ahmad, Ishfaq
2004-01-01
In video coding and streaming over wireless communication networks, the power-demanding video encoding operates on mobile devices with limited energy supply. To analyze, control, and optimize the rate-distortion (R-D) behavior of the wireless video communication system under the energy constraint, we need to develop a power-rate-distortion (P-R-D) analysis framework, which extends the traditional R-D analysis by including another dimension, the power consumption. Specifically, in this paper, we analyze the encoding mechanism of typical video encoding systems and develop a parametric video encoding architecture which is fully scalable in computational complexity. Using dynamic voltage scaling (DVS), a hardware technology recently developed in CMOS circuit design, the complexity scalability can be translated into power consumption scalability of the video encoder. We investigate the rate-distortion behaviors of the complexity control parameters and establish an analytic framework to explore the P-R-D behavior of the video encoding system. Both theoretically and experimentally, we show that, using this P-R-D model, the encoding system is able to automatically adjust its complexity control parameters to match the available energy supply of the mobile device while maximizing picture quality. The P-R-D model provides a theoretical guideline for system design and performance optimization in wireless video communication under energy constraints, especially over wireless video sensor networks.
Hybrid modeling and empirical analysis of automobile supply chain network
NASA Astrophysics Data System (ADS)
Sun, Jun-yan; Tang, Jian-ming; Fu, Wei-ping; Wu, Bing-ying
2017-05-01
Based on a connection mechanism in which nodes automatically select upstream and downstream agents, a simulation model for the dynamic evolutionary process of a consumer-driven automobile supply chain is established by integrating ABM and discrete modeling in a GIS-based map. Firstly, the model's rationality is proved by analyzing the consistency of sales and changes in various agent parameters between the simulation model and a real automobile supply chain. Secondly, through complex network theory, hierarchical structures of the model and relationships of networks at different levels are analyzed to calculate various characteristic parameters such as mean distance, mean clustering coefficients, and degree distributions. By doing so, it is verified that the model is a typical scale-free and small-world network. Finally, the motion law of this model is analyzed from the perspective of complex self-adaptive systems. The chaotic state of the simulation system is verified, which suggests that this system has typical nonlinear characteristics. This model not only macroscopically illustrates the dynamic evolution of complex networks of the automobile supply chain but also microcosmically reflects the business process of each agent. Moreover, the construction and simulation of the system by combining CAS theory and complex networks supplies a novel method for supply chain analysis, as well as theoretical bases and experience for the supply chain analysis of auto companies.
New method for estimation of fluence complexity in IMRT fields and correlation with gamma analysis
NASA Astrophysics Data System (ADS)
Hanušová, T.; Vondráček, V.; Badraoui-Čuprová, K.; Horáková, I.; Koniarová, I.
2015-01-01
A new method for estimation of fluence complexity in Intensity Modulated Radiation Therapy (IMRT) fields is proposed. Unlike other previously published works, it is based on portal images calculated by the Portal Dose Calculation algorithm in Eclipse (version 8.6, Varian Medical Systems) in the plane of the EPID aS500 detector (Varian Medical Systems). Fluence complexity is given by the number and the amplitudes of dose gradients in these matrices. Our method is validated using a set of clinical plans where fluence has been smoothed manually so that each plan has a different level of complexity. Fluence complexity calculated with our tool is in accordance with the different levels of smoothing as well as results of gamma analysis, when calculated and measured dose matrices are compared. Thus, it is possible to estimate plan complexity before carrying out the measurement. If appropriate thresholds are determined which would distinguish between acceptably and overly modulated plans, this might save time in the re-planning and re-measuring process.
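A minimal reading of "the number and the amplitudes of dose gradients" can be sketched as thresholded gradient statistics on a dose matrix. The threshold, the normalization by the field maximum, and the test fields below are assumptions for illustration, not the paper's calibrated method:

```python
import numpy as np

def fluence_complexity(dose, threshold=0.05):
    """Count and sum dose-gradient magnitudes exceeding a fraction of
    the field maximum. Assumed definition: complexity ~ (how many
    steep gradients, how large they are).
    """
    d = np.asarray(dose, dtype=float)
    gy, gx = np.gradient(d)
    mag = np.hypot(gx, gy)          # gradient magnitude per pixel
    steep = mag[mag > threshold * d.max()]
    return int(steep.size), float(steep.sum())

# A heavily modulated field shows more and larger gradients than a
# flat (unmodulated) one:
y, x = np.mgrid[0:64, 0:64]
flat = np.ones((64, 64))
modulated = 1.0 + 0.5 * np.sin(x / 3.0) * np.sin(y / 3.0)
print("flat:", fluence_complexity(flat))
print("modulated:", fluence_complexity(modulated))
```

If thresholds separating acceptably from overly modulated plans were established, such a score could be computed from the predicted portal dose before any measurement, which is the workflow the abstract proposes.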
A modeling framework for exposing risks in complex systems.
Sharit, J
2000-08-01
This article introduces and develops a modeling framework for exposing risks in the form of human errors and adverse consequences in high-risk systems. The modeling framework is based on two components: a two-dimensional theory of accidents in systems developed by Perrow in 1984, and the concept of multiple system perspectives. The theory of accidents differentiates systems on the basis of two sets of attributes. One set characterizes the degree to which systems are interactively complex; the other emphasizes the extent to which systems are tightly coupled. The concept of multiple perspectives provides alternative descriptions of the entire system that serve to enhance insight into system processes. The usefulness of these two model components derives from a modeling framework that cross-links them, enabling a variety of work contexts to be exposed and understood that would otherwise be very difficult or impossible to identify. The model components and the modeling framework are illustrated in the case of a large and comprehensive trauma care system. In addition to its general utility in the area of risk analysis, this methodology may be valuable in applications of current methods of human and system reliability analysis in complex and continually evolving high-risk systems.
Hasson, Uri; Skipper, Jeremy I; Wilde, Michael J; Nusbaum, Howard C; Small, Steven L
2008-01-15
The increasingly complex research questions addressed by neuroimaging research impose substantial demands on computational infrastructures. These infrastructures need to support management of massive amounts of data in a way that affords rapid and precise data analysis, to allow collaborative research, and to achieve these aims securely and with minimum management overhead. Here we present an approach that overcomes many current limitations in data analysis and data sharing. This approach is based on open source database management systems that support complex data queries as an integral part of data analysis, flexible data sharing, and parallel and distributed data processing using cluster computing and Grid computing resources. We assess the strengths of these approaches as compared to current frameworks based on storage of binary or text files. We then describe in detail the implementation of such a system and provide a concrete description of how it was used to enable a complex analysis of fMRI time series data.
Integrated Formulation of Beacon-Based Exception Analysis for Multimissions
NASA Technical Reports Server (NTRS)
Mackey, Ryan; James, Mark; Park, Han; Zak, Mickail
2003-01-01
Further work on beacon-based exception analysis for multimissions (BEAM), a method for real-time, automated diagnosis of complex electromechanical systems, has greatly expanded its capability and range of application. This expanded formulation, which fully integrates physical models and symbolic analysis, is described. The new formulation of BEAM builds upon previous advanced techniques for the analysis of signal data by incorporating mathematical modeling of the system physics and expert-system reasoning.
Designing to Support Command and Control in Urban Firefighting
2008-06-01
complex human-machine systems. Keywords: Command and control, firefighting, cognitive systems engineering, cognitive task analysis.
Availability Analysis of Dual Mode Systems
DOT National Transportation Integrated Search
1974-04-01
The analytical procedures presented define a method of evaluating the effects of failures in a complex dual-mode system based on a worst case steady-state analysis. The computed result is an availability figure of merit and not an absolute prediction...
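The abstract does not give the formula behind its availability figure of merit; a common steady-state convention, shown here as an assumed illustration with invented numbers, is A = MTBF / (MTBF + MTTR) per component, with series components multiplying.

```python
# Hedged sketch of a steady-state availability figure of merit. The
# per-component formula and the series-product rule are standard
# reliability conventions, not taken from this report; numbers are invented.
def availability(mtbf, mttr):
    """Steady-state availability of one component."""
    return mtbf / (mtbf + mttr)

def series_availability(components):
    """System availability when every component must be up (series)."""
    a = 1.0
    for mtbf, mttr in components:
        a *= availability(mtbf, mttr)
    return a

subsystems = [(900.0, 100.0), (950.0, 50.0)]  # (MTBF, MTTR) in hours
```

A worst-case analysis of the kind described would use pessimistic MTBF/MTTR estimates for each failure mode.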
Analysis of Software Systems for Specialized Computers,
computer) with given computer hardware and software . The object of study is the software system of a computer, designed for solving a fixed complex of...purpose of the analysis is to find parameters that characterize the system and its elements during operation, i.e., when servicing the given requirement flow. (Author)
Drewes, Rich; Zou, Quan; Goodman, Philip H
2009-01-01
Neuroscience modeling experiments often involve multiple complex neural network and cell model variants, complex input stimuli and input protocols, followed by complex data analysis. Coordinating all this complexity becomes a central difficulty for the experimenter. The Python programming language, along with its extensive library packages, has emerged as a leading "glue" tool for managing all sorts of complex programmatic tasks. This paper describes a toolkit called Brainlab, written in Python, that leverages Python's strengths for the task of managing the general complexity of neuroscience modeling experiments. Brainlab was also designed to overcome the major difficulties of working with the NCS (NeoCortical Simulator) environment in particular. Brainlab is an integrated model-building, experimentation, and data analysis environment for the powerful parallel spiking neural network simulator system NCS.
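The kind of "glue" workflow the abstract describes, coordinating many model variants, stimuli, and analyses, can be sketched as a parameter sweep. This is illustrative only: it does not use the actual Brainlab or NCS API, and `run_model` is a hypothetical stand-in for launching one simulation.

```python
# Illustrative parameter-sweep "glue" script; run_model is a hypothetical
# placeholder for a real simulator call (e.g. submitting an NCS job).
from itertools import product

def run_model(weight, stim_rate):
    # fake "mean firing rate" so the sketch is runnable end to end
    return weight * stim_rate

grid = {"weight": [0.5, 1.0], "stim_rate": [10, 20]}
results = {
    (w, r): run_model(w, r)
    for w, r in product(grid["weight"], grid["stim_rate"])
}
best = max(results, key=results.get)   # pick the best-scoring variant
```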
Detailed requirements document for the integrated structural analysis system, phase B
NASA Technical Reports Server (NTRS)
Rainey, J. A.
1976-01-01
The requirements are defined for a software system entitled Integrated Structural Analysis System (ISAS) Phase B, which is being developed to provide the user with a tool for performing a complete and detailed analysis of a complex structural system. The software system will provide automated interfaces with numerous structural analysis batch programs and support user interaction in the creation, selection, and validation of data. The system will include modifications to the 4 functions developed for ISAS and the development of 25 new functions. The new functions are described.
ERIC Educational Resources Information Center
Gilstrap, Donald L.
2009-01-01
This article provides a historiographical analysis of major leadership and organizational development theories that have shaped our thinking about how we lead and administrate academic libraries. Drawing from behavioral, cognitive, systems, and complexity theories, this article discusses major theorists and research studies appearing over the past…
Interpolation problem for the solutions of linear elasticity equations based on monogenic functions
NASA Astrophysics Data System (ADS)
Grigor'ev, Yuri; Gürlebeck, Klaus; Legatiuk, Dmitrii
2017-11-01
Interpolation is an important tool for many practical applications, and very often it is beneficial to interpolate not only with a simple basis system, but rather with solutions of a certain differential equation, e.g. the elasticity equation. A typical example of this type of interpolation is the collocation method, widely used in practice. It is known that interpolation theory is fully developed in the framework of classical complex analysis. However, in quaternionic analysis, which shows many analogies to complex analysis, the situation is more complicated due to the non-commutative multiplication. Thus, a fundamental theorem of algebra is not available, and standard tools from linear algebra cannot be applied in the usual way. To overcome these problems, a special system of monogenic polynomials, the so-called Pseudo Complex Polynomials, which share some properties of complex powers, is used. In this paper, we present an approach to deal with the interpolation problem, where solutions of elasticity equations in three dimensions are used as an interpolation basis.
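A concrete illustration, not drawn from the paper itself, of why the fundamental theorem of algebra fails over the quaternions: the polynomial x^2 + 1 has not two roots but a whole sphere of them.

```latex
% Every pure unit quaternion squares to -1:
\[
  q = b\,i + c\,j + d\,k, \qquad b^2 + c^2 + d^2 = 1
  \;\Longrightarrow\; q^2 = -(b^2 + c^2 + d^2) = -1 ,
\]
% so $x^2 + 1 = 0$ has infinitely many quaternionic solutions, and
% root-counting arguments from classical complex analysis break down.
```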
Outline of a new approach to the analysis of complex systems and decision processes.
NASA Technical Reports Server (NTRS)
Zadeh, L. A.
1973-01-01
Development of a conceptual framework for dealing with systems which are too complex or too ill-defined to admit of precise quantitative analysis. The approach outlined is based on the premise that the key elements in human thinking are not numbers, but labels of fuzzy sets - i.e., classes of objects in which the transition from membership to nonmembership is gradual rather than abrupt. The approach in question has three main distinguishing features - namely, the use of so-called 'linguistic' variables in place of or in addition to numerical variables, the characterization of simple relations between variables by conditional fuzzy statements, and the characterization of complex relations by fuzzy algorithms.
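Zadeh's gradual (rather than abrupt) membership can be made concrete with a small sketch; the linguistic variable, the fuzzy set, and its breakpoints below are invented for illustration, not taken from the paper.

```python
# Minimal sketch of gradual fuzzy membership for a linguistic variable
# "temperature"; the triangular shape and breakpoints are hypothetical.
def triangular(x, a, b, c):
    """Membership of x in a triangular fuzzy set rising from a, peaking
    at b, and falling to zero at c."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

def warm(t):
    return triangular(t, 15.0, 25.0, 35.0)

# A conditional fuzzy statement such as "if warm then open the window
# slightly" fires with strength warm(t) instead of a crisp true/false.
```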
Modeling And Simulation Of Bar Code Scanners Using Computer Aided Design Software
NASA Astrophysics Data System (ADS)
Hellekson, Ron; Campbell, Scott
1988-06-01
Many optical systems have demanding requirements to package the system in a small 3 dimensional space. The use of computer graphic tools can be a tremendous aid to the designer in analyzing the optical problems created by smaller and less costly systems. The Spectra Physics grocery store bar code scanner employs an especially complex 3 dimensional scan pattern to read bar code labels. By using a specially written program which interfaces with a computer aided design system, we have simulated many of the functions of this complex optical system. In this paper we will illustrate how a recent version of the scanner has been designed. We will discuss the use of computer graphics in the design process including interactive tweaking of the scan pattern, analysis of collected light, analysis of the scan pattern density, and analysis of the manufacturing tolerances used to build the scanner.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hramov, Alexander E. (Saratov State Technical University, Politechnicheskaja str. 77, Saratov 410054); Koronovskii, Alexey A.
2012-08-15
The spectrum of Lyapunov exponents is a powerful tool for the analysis of complex system dynamics. In the general framework of nonlinear dynamics, a number of numerical techniques have been developed to obtain the spectrum of Lyapunov exponents for the complex temporal behavior of systems with a few degrees of freedom. Unfortunately, these methods cannot be applied directly to the analysis of the complex spatio-temporal dynamics of plasma devices, which are characterized by an infinite phase space, since they are spatially extended active media. In the present paper, we propose a method for the calculation of the spectrum of spatial Lyapunov exponents (SLEs) for spatially extended beam-plasma systems. The calculation technique is applied to the analysis of chaotic spatio-temporal oscillations in three different beam-plasma models: (1) a simple plasma Pierce diode, (2) coupled Pierce diodes, and (3) an electron-wave system with a backward electromagnetic wave. We find an excellent agreement between the system dynamics and the behavior of the spectrum of the spatial Lyapunov exponents. Along with the proposed method, possible problems of SLE calculation are also discussed. It is shown that for a wide class of spatially extended systems, the set of quantities included in the system state for SLE calculation can be reduced using an appropriate feature of the plasma systems.
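For contrast with the spatial case, the standard temporal Lyapunov exponent for a one-dimensional map, the few-degree-of-freedom quantity this paper generalizes, can be sketched in a few lines. This is the textbook logistic-map calculation, not the paper's SLE method.

```python
# Temporal Lyapunov exponent of the logistic map x -> r*x*(1-x),
# estimated as the orbit average of log|f'(x)| = log|r*(1-2x)|.
import math

def lyapunov_logistic(r, x0=0.1, n_transient=1000, n=100_000):
    x = x0
    for _ in range(n_transient):        # discard the transient
        x = r * x * (1 - x)
    s = 0.0
    for _ in range(n):                  # average along the orbit
        x = r * x * (1 - x)
        s += math.log(abs(r * (1 - 2 * x)))
    return s / n

# r = 4.0 is fully chaotic (exponent approaches ln 2); r = 3.2 has a
# stable 2-cycle, so its exponent is negative.
```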
Integrated Safety Analysis Teams
NASA Technical Reports Server (NTRS)
Wetherholt, Jonathan C.
2008-01-01
Today's complex systems require understanding beyond one person's capability to comprehend. Each system requires a team to divide the system into understandable subsystems, which can then be analyzed with an integrated hazard analysis. The team must have both specific experience and diversity of experience; safety experience and system understanding are not always manifested in one individual. Group dynamics make the difference between success and failure, as well as the difference between a difficult task and a rewarding experience. There are examples in the news which demonstrate the need to connect the pieces of a system into a complete picture. The Columbia disaster is now a standard example: a hazard of low consequence to one part of the system, the External Tank, was a catastrophic hazard cause for a companion subsystem, the Space Shuttle Orbiter. The interaction between the hardware, the manufacturing process, the handling, and the operations contributed to the problem. Each of these had analysis performed, but who constituted the team which integrated this analysis together? This paper will explore some of the methods used for dividing up a complex system and how one integration team has analyzed the parts. How this analysis has been documented in one particular launch vehicle case will also be discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, Chan-Joong; Kim, Jimin; Hong, Taehoon
Climate change has become one of the most significant environmental issues, with about 40% of the associated emissions coming from the building sector. In particular, complex building projects with various functions have increased, and these should be managed from a program-level perspective. Therefore, this study aimed to develop a program-level management system for the life-cycle environmental and economic assessment of complex building projects. The developed system consists of three parts: (i) input part: database server and input data; (ii) analysis part: life cycle assessment and life cycle cost; and (iii) result part: microscopic analysis and macroscopic analysis. To analyze the applicability of the developed system, this study selected 'U' University, a complex building project consisting of a research facility and a residential facility. Through value engineering with experts, a total of 137 design alternatives were established. Based on these alternatives, the macroscopic analysis results were as follows: (i) at the program level, the life-cycle environmental and economic costs of 'U' University were reduced by 6.22% and 2.11%, respectively; (ii) at the project level, the life-cycle environmental and economic costs of the research facility were reduced by 6.01% and 1.87%, respectively, and those of the residential facility by 12.01% and 3.83%, respectively; and (iii) for the mechanical work at the work-type level, the initial cost was increased by 2.9%, but the cost in the operation and maintenance phase was reduced by 20.0%. As a result, the developed system can allow facility managers to establish operation and maintenance strategies for the environmental and economic aspects from a program-level perspective. - Highlights: • A program-level management system for complex building projects was developed. • Life-cycle environmental and economic assessment can be conducted using the system. • The design alternatives can be analyzed from the microscopic perspective. • The system can be used to establish the optimal O&M strategy at the program level. • It can be applied to any other country or sector in the global environment.
Ribesse, Nathalie; Bossyns, Paul; Marchal, Bruno; Karemere, Hermes; Burman, Christopher J; Macq, Jean
2017-03-01
In the field of development cooperation, interest in systems thinking and complex systems theories as a methodological approach is increasingly recognised, and so it is in health systems research, which informs health development aid interventions. However, practical applications remain scarce to date. The objective of this article is to contribute to the body of knowledge by presenting tools inspired by systems thinking and complexity theories, and the methodological lessons learned from their application. These tools were used in a case study; detailed results of that study are being prepared for publication in additional articles. Applying a complexity 'lens', the case study examines the role of long-term international technical assistance in supporting health administration reform at the provincial level in the Democratic Republic of Congo. The Methods section presents the guiding principles of systems thinking and complex systems, their relevance and implications for the subject under study, and the existing tools associated with those theories which inspired the design of the data collection and analysis process. The tools and their application processes are presented in the Results section, followed in the Discussion section by a critical analysis of their innovative potential and emergent challenges. The overall methodology forms a coherent whole, with each tool bringing a different and complementary perspective on the system.
Exploratory Climate Data Visualization and Analysis Using DV3D and UVCDAT
NASA Technical Reports Server (NTRS)
Maxwell, Thomas
2012-01-01
Earth system scientists are being inundated by an explosion of data generated by ever-increasing resolution in both global models and remote sensors. Advanced tools for accessing, analyzing, and visualizing very large and complex climate data are required to maintain rapid progress in Earth system research. To meet this need, NASA, in collaboration with the Ultra-scale Visualization Climate Data Analysis Tools (UVCDAT) consortium, is developing exploratory climate data analysis and visualization tools which provide data analysis capabilities for the Earth System Grid (ESG). This paper describes DV3D, a UVCDAT package that enables exploratory analysis of climate simulation and observation datasets. DV3D provides user-friendly interfaces for visualization and analysis of climate data at a level appropriate for scientists. It features workflow interfaces, interactive 4D data exploration, hyperwall and stereo visualization, automated provenance generation, and parallel task execution. DV3D's integration with CDAT's climate data management system (CDMS) and other climate data analysis tools provides a wide range of high-performance climate data analysis operations. DV3D expands the scientists' toolbox by incorporating a suite of rich new exploratory visualization and analysis methods for addressing the complexity of climate datasets.
Gonzalez Bernaldo de Quiros, Fernan; Dawidowski, Adriana R; Figar, Silvana
2017-02-01
In this study, we aimed: 1) to conceptualize the theoretical challenges facing health information systems (HIS) to represent patients' decisions about health and medical treatments in everyday life; 2) to suggest approaches for modeling these processes. The conceptualization of the theoretical and methodological challenges was discussed in 2015 during a series of interdisciplinary meetings attended by health informatics staff, epidemiologists and health professionals working in quality management and primary and secondary prevention of chronic diseases of the Hospital Italiano de Buenos Aires, together with sociologists, anthropologists and e-health stakeholders. HIS are facing the need and challenge to represent social human processes based on constructivist and complexity theories, which are the current frameworks of human sciences for understanding human learning and socio-cultural changes. Computer systems based on these theories can model processes of social construction of concrete and subjective entities and the interrelationships between them. These theories could be implemented, among other ways, through the mapping of health assets, analysis of social impact through community trials and modeling of complexity with system simulation tools. This analysis suggested the need to complement the traditional linear causal explanations of disease onset (and treatments) that are the bases for models of analysis of HIS with constructivist and complexity frameworks. Both may enlighten the complex interrelationships among patients, health services and the health system. The aim of this strategy is to clarify people's decision making processes to improve the efficiency, quality and equity of the health services and the health system.
Chang, Le; Baseggio, Oscar; Sementa, Luca; Cheng, Daojian; Fronzoni, Giovanna; Toffoli, Daniele; Aprà, Edoardo; Stener, Mauro; Fortunelli, Alessandro
2018-06-13
We introduce Individual Component Maps of Rotatory Strength (ICM-RS) and Rotatory Strength Density (RSD) plots as analysis tools of chiro-optical linear response spectra deriving from time-dependent density functional theory (TDDFT) simulations. ICM-RS and RSD allow one to visualize the origin of chiro-optical response in momentum or real space, including signed contributions and therefore highlighting cancellation terms that are ubiquitous in chirality phenomena, and should be especially useful in analyzing the spectra of complex systems. As test cases, we use ICM-RS and RSD to analyze circular dichroism spectra of selected (Ag-Au)30(SR)18 monolayer-protected metal nanoclusters, showing the potential of the proposed tools to derive insight and understanding, and eventually rational design, in chiro-optical studies of complex systems.
Venkataramanan, Natarajan Sathiyamoorthy; Suvitha, Ambigapathy; Kawazoe, Yoshiyuki
2017-11-01
This study aims to cast light on the physico-chemical nature and energetics of the interactions between nucleobases and water/DMSO molecules, which occur through non-conventional C-H⋯O/N-H bonds, using a comprehensive quantum-chemical approach. The computed interaction energies do not show any appreciable change across the nucleobase-solvent complexes, confirming the experimental findings on hydration enthalpies. Compared to water, DMSO forms complexes with higher interaction energies. The quantitative molecular electrostatic potentials display a charge transfer during complexation. NBO analysis shows that the nucleobase-DMSO complexes have higher stabilization energies than the nucleobase-water complexes. AIM analysis illustrates that in the nucleobase-DMSO complexes, the S=O⋯H-N type interaction has the strongest hydrogen-bond strength, with high E_HB values. Furthermore, the Laplacian of the electron density and the total electron density are negative for these bonds, indicating their partially covalent nature, while the other bonds are classified as noncovalent interactions. EDA analysis indicates that the electrostatic interaction is more pronounced in the nucleobase-water complexes, while the dispersion contribution is more dominant in the nucleobase-DMSO complexes. NCI-RDG analysis proves the existence of strong hydrogen bonding in the nucleobase-DMSO complex, which supports the AIM results. Copyright © 2017 Elsevier Inc. All rights reserved.
Johnston, Lee M; Matteson, Carrie L; Finegood, Diane T
2014-07-01
We demonstrate the use of a systems-based framework to assess solutions to complex health problems such as obesity. We coded 12 documents published between 2004 and 2013 aimed at influencing obesity planning for complex systems design (9 reports from US and Canadian governmental or health authorities, 1 Cochrane review, and 2 Institute of Medicine reports). We sorted data using the intervention-level framework (ILF), a novel solutions-oriented approach to complex problems. An in-depth comparison of 3 documents provides further insight into complexity and systems design in obesity policy. The majority of strategies focused mainly on changing the determinants of energy imbalance (food intake and physical activity). ILF analysis brings to the surface actions aimed at higher levels of system function and points to a need for more innovative policy design. Although many policymakers acknowledge obesity as a complex problem, many strategies stem from the paradigm of individual choice and are limited in scope. The ILF provides a template to encourage natural systems thinking and more strategic policy design grounded in complexity science.
[Data mining analysis of Professor Li Fa-zhi's medical records on AIDS-related herpes zoster].
Wang, Dan-Ni; Li, Zhen; Xu, Li-Ran; Guo, Hui-Jun
2013-08-01
To analyze Professor Li Fa-zhi's medication patterns in treating AIDS-related herpes zoster and postherpetic neuralgia, and to provide a reference for the use of Chinese medicine in treating these conditions. Clinical information from patients with AIDS-related herpes zoster and postherpetic neuralgia treated by Professor Li Fa-zhi in Weishi county, Henan, from October 2007 to July 2011 was entered into a structured clinical information collection system, and the data were analyzed using a traditional Chinese medicine compatibility algorithm and complex network analysis. Multi-dimensional query analysis showed that the core drugs in the treatment of AIDS-related herpes zoster and postherpetic neuralgia are Scutellariae Radix, Glycyrrhizae Radix, Carthami Flos, Plantaginis Semen, Trichosanthis Fructus, Angelicae Sinensis Radix, and Gentianae Radix; the core prescriptions are Longdan Xiegan decoction and Trichosanthes red liquorice decoction. Professor Li Fa-zhi treats AIDS-related herpes zoster and postherpetic neuralgia chiefly by clearing heat, removing dampness, and activating blood circulation.
Towards an integral computer environment supporting system operations analysis and conceptual design
NASA Technical Reports Server (NTRS)
Barro, E.; Delbufalo, A.; Rossi, F.
1994-01-01
VITROCISET has developed in house a prototype tool named the System Dynamic Analysis Environment (SDAE) to support system engineering activities in the initial definition phase of a complex space system. The SDAE goal is to provide powerful means for the definition, analysis, and trade-off of operations and design concepts for the space and ground elements involved in a mission. For this purpose, SDAE implements a dedicated modeling methodology based on the integration of different modern (static and dynamic) analysis and simulation techniques. The resulting 'system model' is capable of representing all the operational, functional, and behavioral aspects of the system elements which are part of a mission. The execution of customized model simulations enables: the validation of selected concepts with respect to mission requirements; the in-depth investigation of mission-specific operational and/or architectural aspects; and the early assessment of the performance required of the system elements to cope with mission constraints and objectives. Due to these characteristics, SDAE is particularly tailored to nonconventional or highly complex systems, which require a great analysis effort in their early definition stages. SDAE runs under PC-Windows and is currently used by the VITROCISET system engineering group. This paper describes the SDAE main features, showing some tool output examples.
Biological and Clinical Aspects of Lanthanide Coordination Compounds
Misra, Sudhindra N.; M., Indira Devi; Shukla, Ram S.
2004-01-01
The coordination chemistry of lanthanides, relevant to the biological, biochemical and medical aspects, makes a significant contribution to understanding the basis of the application of lanthanides, particularly in biological and medical systems. The importance of the applications of lanthanides, as excellent diagnostic and prognostic probes in clinical diagnostics and as anticancer materials, is remarkably increasing. Lanthanide-complex-based X-ray contrast imaging and lanthanide-chelate-based contrast-enhancing agents for magnetic resonance imaging (MRI) are being extensively used in radiological analysis of our body systems. The most important property of the chelating agent in a lanthanide chelate complex is its ability to alter the behaviour of the lanthanide ion with which it binds in biological systems; chelation markedly modifies the biodistribution and excretion profile of the lanthanide ions. The chelating agents, especially aminopolycarboxylic acids, being hydrophilic, increase the proportion of the complexed lanthanide ion excreted from biological systems. Lanthanide polyaminocarboxylate chelate complexes are used as contrast-enhancing agents for magnetic resonance imaging. Conjugation of antibodies and other tissue-specific molecules to lanthanide chelates has led to a new type of specific MRI contrast agent with improved relaxivity, functioning in the body similarly to drugs. Many specific features of contrast-agent-assisted MRI make it particularly effective for musculoskeletal and cerebrospinal imaging. Lanthanide-chelate contrast agents are effectively used in clinical diagnostic investigations involving cerebrospinal diseases and in the evaluation of the central nervous system. Shift-reagent-aided 23Na NMR spectroscopic analysis with chelated lanthanide complexes is used in cellular, tissue and whole-organ systems. PMID:18365075
Investigation of design considerations for a complex demodulation filter
NASA Technical Reports Server (NTRS)
Stoughton, J. W.
1984-01-01
The digital design of an adaptive digital filter to be employed in the processing of microwave remote-sensor data was developed. In particular, a complex demodulation approach was developed to provide narrow-band power estimation for a proposed Doppler scatterometer system. This scatterometer was considered for application in the proposed National Oceanographic survey satellite, an improvement on SEASAT features. A generalized analysis of complex demodulation designs is presented for the digital architecture component of the proposed system.
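The core idea of complex demodulation for narrow-band power estimation can be sketched briefly: shift the band of interest to baseband by multiplying with a complex exponential, low-pass filter, and take the squared magnitude. The sample rate, filter (a crude moving average), and tone below are illustrative, not the scatterometer design from the abstract.

```python
# Sketch of complex demodulation: multiply by exp(-j*2*pi*f0*n/fs) to
# shift frequency f0 to DC, low-pass with a moving average, square.
import cmath
import math

def narrowband_power(signal, f0, fs, avg_len=64):
    base = [s * cmath.exp(-2j * math.pi * f0 * n / fs)
            for n, s in enumerate(signal)]
    tail = base[-avg_len:]              # crude moving-average low-pass
    mean = sum(tail) / len(tail)
    return abs(mean) ** 2

fs = 1000.0
tone = [math.cos(2 * math.pi * 100.0 * n / fs) for n in range(1024)]
# Demodulating at the tone frequency yields ~0.25 (amplitude 0.5 squared);
# demodulating off-band yields nearly zero.
```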
Informational analysis involving application of complex information system
NASA Astrophysics Data System (ADS)
Ciupak, Clébia; Vanti, Adolfo Alberto; Balloni, Antonio José; Espin, Rafael
The aim of the present research is to perform an informational analysis for internal audit involving the application of a complex information system based on fuzzy logic. The approach has been applied in internal audit, integrating the accounting field with the information systems field. Technological advancements can improve the work performed by internal audit; thus we aim to find, in complex information systems, priorities for the internal audit work of a highly important private institution of higher education. The method applied is quali-quantitative: from the definition of strategic linguistic variables, it was possible to transform them into quantitative values through matrix intersection. By means of a case study, in which data were collected via an interview with the Administrative Pro-Rector, who takes part in the elaboration of the institution's strategic planning, it was possible to draw conclusions concerning the points that must be prioritized in the internal audit work. We emphasize that the priorities were identified when processed in a system (of academic use). From the study we conclude that, starting from these information systems, audit can identify priorities for its work program. Together with the plans and strategic objectives of the enterprise, the internal auditor can define operational procedures to work toward the attainment of the organization's objectives.
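One way to read "transforming linguistic variables into quantitative values with matrix intersection" is a min-based fuzzy conjunction followed by ranking. The sketch below is an invented illustration of that reading; the variables, scores, audit areas, and aggregation rule are all hypothetical, not the paper's actual model.

```python
# Illustrative fuzzy "matrix intersection": map linguistic labels to
# membership degrees, combine criteria with min (fuzzy AND), then rank.
linguistic = {"low": 0.2, "medium": 0.5, "high": 0.9}  # assumed mapping

# hypothetical audit areas scored on two strategic criteria
areas = {
    "tuition billing": ("high", "medium"),
    "library acquisitions": ("low", "medium"),
    "IT access control": ("high", "high"),
}

priority = {
    name: min(linguistic[a], linguistic[b])   # fuzzy intersection
    for name, (a, b) in areas.items()
}
ranking = sorted(priority, key=priority.get, reverse=True)
```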
Analysis of a Plant Transcriptional Regulatory Network Using Transient Expression Systems.
Díaz-Triviño, Sara; Long, Yuchen; Scheres, Ben; Blilou, Ikram
2017-01-01
In plant biology, transient expression systems have become valuable approaches used routinely to rapidly study protein expression, subcellular localization, protein-protein interactions, and transcriptional activity prior to in vivo studies. When studying transcriptional regulation, luciferase reporter assays offer a sensitive readout for assaying promoter behavior in response to different regulators or environmental contexts, and for confirming and assessing the functional relevance of predicted binding sites in target promoters. This chapter provides detailed methods for using the luciferase reporter system as a rapid, efficient, and versatile assay to analyze transcriptional regulation of target genes by transcriptional regulators. We describe a series of optimized transient expression systems consisting of Arabidopsis thaliana protoplasts, infiltrated Nicotiana benthamiana leaves, and human HeLa cells to study the transcriptional regulation by two well-characterized transcriptional regulators, SCARECROW (SCR) and SHORT-ROOT (SHR), of one of their targets, CYCLIN D6 (CYCD6). Here, we illustrate similarities and differences in outcomes when using different systems. The plant-based systems revealed that the SCR-SHR complex enhances CYCD6 transcription, while analysis in HeLa cells showed that the complex is not sufficient to strongly induce CYCD6 transcription, suggesting that additional, plant-specific regulators are required for full activation. These results highlight the importance of the system and suggest that including heterologous systems, such as HeLa cells, can provide a more comprehensive analysis of a complex gene regulatory network.
Egri-Nagy, Attila; Nehaniv, Chrystopher L
2008-01-01
Beyond complexity measures, sometimes it is worthwhile in addition to investigate how complexity changes structurally, especially in artificial systems where we have complete knowledge about the evolutionary process. Hierarchical decomposition is a useful way of assessing structural complexity changes of organisms modeled as automata, and we show how recently developed computational tools can be used for this purpose, by computing holonomy decompositions and holonomy complexity. To gain insight into the evolution of complexity, we investigate the smoothness of the landscape structure of complexity under minimal transitions. As a proof of concept, we illustrate how the hierarchical complexity analysis reveals symmetries and irreversible structure in biological networks by applying the methods to the lac operon mechanism in the genetic regulatory network of Escherichia coli.
Takecian, Pedro L.; Oikawa, Marcio K.; Braghetto, Kelly R.; Rocha, Paulo; Lucena, Fred; Kavounis, Katherine; Schlumpf, Karen S.; Acker, Susan; Carneiro-Proietti, Anna B. F.; Sabino, Ester C.; Custer, Brian; Busch, Michael P.; Ferreira, João E.
2013-01-01
Over time, data warehouse (DW) systems have become more difficult to develop because of the growing heterogeneity of data sources. Despite advances in research and technology, DW projects are still too slow for pragmatic results to be generated. Here, we address the following question: how can the complexity of DW development for integration of heterogeneous transactional information systems be reduced? To answer this, we proposed methodological guidelines based on cycles of conceptual modeling and data analysis, to drive construction of a modular DW system. These guidelines were applied to the blood donation domain, successfully reducing the complexity of DW development. PMID:23729945
Analysis and Design of Complex Network Environments
2012-03-01
Human performance cognitive-behavioral modeling: a benefit for occupational safety.
Gore, Brian F
2002-01-01
Human Performance Modeling (HPM) is a computer-aided job analysis software methodology used to generate predictions of complex human-automation integration and system flow patterns with the goal of improving operator and system safety. The use of HPM tools has recently been increasing due to reductions in computational cost, augmentations in the tools' fidelity, and usefulness in the generated output. An examination of an Air Man-machine Integration Design and Analysis System (Air MIDAS) model evaluating complex human-automation integration currently underway at NASA Ames Research Center will highlight the importance to occupational safety of considering both cognitive and physical aspects of performance when researching human error.
Artificial intelligence applied to process signal analysis
NASA Technical Reports Server (NTRS)
Corsberg, Dan
1988-01-01
Many space station processes are highly complex systems subject to sudden, major transients. In any complex process control system, a critical aspect of the human/machine interface is the analysis and display of process information. Human operators can be overwhelmed by large clusters of alarms that inhibit their ability to diagnose and respond to a disturbance. Using artificial intelligence techniques and a knowledge base approach to this problem, the power of the computer can be used to filter and analyze plant sensor data. This will provide operators with a better description of the process state. Once a process state is recognized, automatic action could be initiated and proper system response monitored.
Axisa, F; Gehin, C; Delhomme, G; Collet, C; Robin, O; Dittmar, A
2004-01-01
Improving the quality and efficiency of health care, in medicine, at home, and in hospital, is becoming more and more important. Designed to be user-friendly, smart clothes and gloves are well suited to such citizen use and health monitoring. Analysis of the autonomic nervous system using non-invasive sensors provides information for emotional, sensorial, cognitive, and physiological analysis. MARSIAN (modular autonomous recorder system for the measurement of the autonomic nervous system) is a wrist-worn ambulatory monitoring and recording system with a smart glove carrying sensors for detecting the activity of the autonomic nervous system. It is composed of a "smart tee shirt", a "smart glove", a wrist device, and a PC which records the data. The smart glove is one of the key points of MARSIAN: complex movements, complex geometry, and sensation make designing the smart glove a challenge. MARSIAN has a large field of applications and research (vigilance, behaviour, sensorial analysis, thermal environment for humans, cognition science, sport, etc.) in various fields such as neurophysiology, affective computing, and health monitoring.
Complexity and Hopf Bifurcation Analysis on a Kind of Fractional-Order IS-LM Macroeconomic System
NASA Astrophysics Data System (ADS)
Ma, Junhai; Ren, Wenbo
On the basis of our previous research, we deepen and complete a macroeconomic IS-LM model with fractional-order calculus, which better reflects the memory characteristics of economic variables. We also focus on the influence of the variables on the real system, improving the analysis capabilities of traditional economic models to suit the actual macroeconomic environment. The conditions for Hopf bifurcation in fractional-order system models are briefly demonstrated, and the fractional order at which Hopf bifurcation occurs is calculated, showing the inherently complex dynamic characteristics of the system. Numerical simulation yields bifurcations, strange attractors, limit cycles, waveforms, and other complex dynamic characteristics, and the order condition is obtained with respect to time. We find that the system order has an important influence on the running state of the system. The system exhibits periodic motion when the order meets the conditions for Hopf bifurcation; the fractional-order system gradually stabilizes as the order and parameters change, while the corresponding integer-order system diverges. This study has certain significance for policy-making on macroeconomic regulation and control.
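The fractional-order dynamics above can be explored numerically. The sketch below (our own illustration, not the authors' model or code) implements an explicit Grünwald-Letnikov scheme for a system D^alpha x = f(x) with zero initial history; all function names are ours.

```python
import numpy as np

def gl_weights(alpha, n):
    """Grunwald-Letnikov weights c_k = (-1)^k * C(alpha, k), via recurrence."""
    c = np.empty(n + 1)
    c[0] = 1.0
    for k in range(1, n + 1):
        c[k] = c[k - 1] * (1.0 - (1.0 + alpha) / k)
    return c

def solve_fractional(f, x0, alpha, h, steps):
    """Explicit GL scheme for D^alpha x = f(x), x(0) = x0.

    Discretization: sum_{k=0}^{n} c_k x_{n-k} = h^alpha * f(x_{n-1}),
    solved for x_n at each step (full memory, no short-memory cutoff).
    """
    x0 = np.atleast_1d(np.asarray(x0, dtype=float))
    c = gl_weights(alpha, steps)
    xs = [x0]
    for n in range(1, steps + 1):
        # Weighted sum over the full state history.
        history = sum(c[k] * xs[n - k] for k in range(1, n + 1))
        xs.append(h ** alpha * f(xs[-1]) - history)
    return np.array(xs)
```

For alpha = 1 the weights collapse to (1, -1, 0, ...) and the update is exactly forward Euler, which gives a quick sanity check of the scheme.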
NASA Astrophysics Data System (ADS)
Ganiev, R. F.; Reviznikov, D. L.; Rogoza, A. N.; Slastushenskiy, Yu. V.; Ukrainskiy, L. E.
2017-03-01
A complex approach to the investigation of nonlinear wave processes in the human cardiovascular system is described, based on a combination of high-precision methods for measuring the pulse wave, mathematical methods for processing the empirical data, and direct numerical modeling of hemodynamic processes in an arterial tree.
A universal indicator of critical state transitions in noisy complex networked systems
Liang, Junhao; Hu, Yanqing; Chen, Guanrong; Zhou, Tianshou
2017-01-01
Critical transition, a phenomenon that a system shifts suddenly from one state to another, occurs in many real-world complex networks. We propose an analytical framework for exactly predicting the critical transition in a complex networked system subjected to noise effects. Our prediction is based on the characteristic return time of a simple one-dimensional system derived from the original higher-dimensional system. This characteristic time, which can be easily calculated using network data, allows us to systematically separate the respective roles of dynamics, noise and topology of the underlying networked system. We find that the noise can either prevent or enhance critical transitions, playing a key role in compensating the network structural defect which suffers from either internal failures or environmental changes, or both. Our analysis of realistic or artificial examples reveals that the characteristic return time is an effective indicator for forecasting the sudden deterioration of complex networks. PMID:28230166
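The return-time idea can be illustrated with a generic critical-slowing-down toy model (our own sketch, not the paper's network framework; all names are assumptions): as the recovery rate k of a noisy one-dimensional relaxation process decreases, fluctuations decay more slowly and their lag-1 autocorrelation rises, a standard early-warning signature.

```python
import numpy as np

def lag1_autocorr(x):
    """Lag-1 autocorrelation of a time series."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    return np.dot(x[:-1], x[1:]) / np.dot(x, x)

def simulate_relaxation(k, steps=20000, h=0.01, sigma=0.1, seed=1):
    """Euler-Maruyama for dx = -k*x dt + sigma dW; k is the recovery rate."""
    rng = np.random.default_rng(seed)
    x = np.empty(steps)
    x[0] = 0.0
    for t in range(1, steps):
        x[t] = x[t - 1] - k * x[t - 1] * h + sigma * np.sqrt(h) * rng.standard_normal()
    return x
```

A slow system (small k, long characteristic return time) shows autocorrelation close to 1, whereas a fast-recovering system does not.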
An application of sample entropy to precipitation in Paraíba State, Brazil
NASA Astrophysics Data System (ADS)
Xavier, Sílvio Fernando Alves; da Silva Jale, Jader; Stosic, Tatijana; dos Santos, Carlos Antonio Costa; Singh, Vijay P.
2018-05-01
The climate system is a complex nonlinear system. In order to describe the complex characteristics of precipitation series in Paraíba State, Brazil, we use sample entropy, an entropy-based algorithm, to evaluate the complexity of precipitation series. Sixty-nine meteorological stations are distributed over four macroregions: Zona da Mata, Agreste, Borborema, and Sertão. The results of the analysis show that the complexity of monthly average precipitation differs among the macroregions. Sample entropy is able to reflect the dynamic change of precipitation series, providing a new way to investigate the complexity of hydrological series. The complexity exhibits areal variation across local water resource systems, which can inform the utilization and development of water resources in dry areas.
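Sample entropy can be sketched compactly. The following minimal O(N^2) implementation is our own illustration (not the study's code), with parameters m and r following the usual SampEn convention:

```python
import numpy as np

def sample_entropy(x, m=2, r_factor=0.2):
    """Sample entropy SampEn(m, r) of a 1-D series.

    Counts pairs of template vectors of length m (and m+1) whose
    Chebyshev distance is below r = r_factor * std(x); SampEn is
    -ln(A/B), where A and B are the match counts for m+1 and m.
    Self-matches are excluded.
    """
    x = np.asarray(x, dtype=float)
    r = r_factor * x.std()
    n = len(x)

    def count_matches(m):
        # All overlapping templates of length m.
        templates = np.array([x[i:i + m] for i in range(n - m)])
        count = 0
        for i in range(len(templates)):
            # Chebyshev distance to all later templates (no self-matches).
            d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += np.sum(d < r)
        return count

    b = count_matches(m)
    a = count_matches(m + 1)
    return -np.log(a / b)
```

A regular series (e.g. a sinusoid) yields low sample entropy; an irregular series such as white noise yields a markedly higher value, which is the contrast the complexity analysis exploits.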
NASA Technical Reports Server (NTRS)
Lindvall, Mikael; Godfrey, Sally; Ackermann, Chris; Ray, Arnab; Yonkwa, Lyly; Ganesan, Dharma; Stratton, William C.; Sibol, Deane E.
2008-01-01
The approach analyzes, visualizes, and evaluates structure and behavior using static and dynamic information, for individual systems as well as systems of systems. Next steps: refine software tool support; apply to other systems; apply earlier in the system life cycle.
T-MATS Toolbox for the Modeling and Analysis of Thermodynamic Systems
NASA Technical Reports Server (NTRS)
Chapman, Jeffryes W.
2014-01-01
The Toolbox for the Modeling and Analysis of Thermodynamic Systems (T-MATS) is a MATLAB/Simulink (The MathWorks Inc.) plug-in for creating and simulating thermodynamic systems and controls. The package contains generic parameterized components that can be combined with a variable input iterative solver and optimization algorithm to create complex system models, such as gas turbines.
Information Theory Applied to Animal Communication Systems and Its Possible Application to SETI
NASA Astrophysics Data System (ADS)
Hanser, Sean F.; Doyle, Laurance R.; McCowan, Brenda; Jenkins, Jon M.
2004-06-01
Information theory, as first introduced by Claude Shannon (Shannon & Weaver 1949), quantitatively evaluates the organizational complexity of communication systems. At the same time, George Zipf was examining linguistic structure in a way that was mathematically similar to the components of the Shannon first-order entropy (Zipf 1949). Both Shannon's and Zipf's mathematical procedures have been applied to animal communication and have recently been providing insightful results. The Zipf plot is a useful tool for a first estimate of the characterization of a communication system's complexity (which can later be examined for complex structure at deeper levels using Shannon entropic analysis). In this paper we shall discuss some of the applications and pitfalls of using the Zipf distribution as a preliminary evaluator of the communication complexity of a signaling system.
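Both quantities discussed can be computed from any sequence of discrete signal labels. This sketch is our own illustration (not the authors' pipeline): a least-squares Zipf rank-frequency slope, and a first-order Shannon entropy in bits per signal.

```python
from collections import Counter
import math

def zipf_slope(signals):
    """Least-squares slope of log(frequency) vs log(rank).

    A slope near -1 is often taken as a first hint of Zipf-like
    structure in a repertoire; it is suggestive, not conclusive.
    """
    freqs = sorted(Counter(signals).values(), reverse=True)
    xs = [math.log(rank) for rank in range(1, len(freqs) + 1)]
    ys = [math.log(f) for f in freqs]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

def first_order_entropy(signals):
    """Shannon first-order entropy H1 = -sum p_i log2 p_i (bits/signal)."""
    counts = Counter(signals)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())
```

For a repertoire of k equiprobable signals, H1 is log2(k) and the Zipf slope is zero, which makes the two measures easy to cross-check on synthetic data.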
System Theoretic Frameworks for Mitigating Risk Complexity in the Nuclear Fuel Cycle
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, Adam David; Mohagheghi, Amir H.; Cohn, Brian
In response to the expansion of nuclear fuel cycle (NFC) activities, and the associated suite of risks, around the world, this project evaluated systems-based solutions for managing such risk complexity in multimodal and multi-jurisdictional international spent nuclear fuel (SNF) transportation. By better understanding systemic risks in SNF transportation, developing SNF transportation risk assessment frameworks, and evaluating these systems-based risk assessment frameworks, this research illustrated that interdependency between safety, security, and safeguards risks is inherent in NFC activities and can go unidentified when each "S" is independently evaluated. Two novel system-theoretic analysis techniques, dynamic probabilistic risk assessment (DPRA) and system-theoretic process analysis (STPA), provide integrated "3S" analysis to address these interdependencies, and the research results suggest a need, and provide a way, to reprioritize United States engagement efforts to reduce global nuclear risks. Lastly, this research identifies areas where Sandia National Laboratories can spearhead technical advances to reduce global nuclear dangers.
Cairoli, Andrea; Piovani, Duccio; Jensen, Henrik Jeldtoft
2014-12-31
We propose a new procedure to monitor and forecast the onset of transitions in high-dimensional complex systems. We describe our procedure by an application to the tangled nature model of evolutionary ecology. The quasistable configurations of the full stochastic dynamics are taken as input for a stability analysis by means of the deterministic mean-field equations. Numerical analysis of the high-dimensional stability matrix allows us to identify unstable directions associated with eigenvalues with a positive real part. The overlap of the instantaneous configuration vector of the full stochastic system with the eigenvectors of the unstable directions of the deterministic mean-field approximation is found to be a good early warning of the transitions occurring intermittently.
NASA Technical Reports Server (NTRS)
Johnson, S. C.
1982-01-01
An interface system for passing data between a relational information management (RIM) data base complex and the Engineering Analysis Language (EAL), a finite element structural analysis program, is documented. The interface system, implemented on a CDC Cyber computer, is composed of two FORTRAN programs called RIM2EAL and EAL2RIM. RIM2EAL reads model definition data from RIM and creates a file of EAL commands to define the model. EAL2RIM reads model definition and EAL-generated analysis data from EAL's data library and stores these data directly in a RIM data base. These two interface programs and the format for the RIM data complex are described.
Complexity analysis of human physiological signals based on case studies
NASA Astrophysics Data System (ADS)
Angelova, Maia; Holloway, Philip; Ellis, Jason
2015-04-01
This work focuses on methods for the investigation of physiological time series based on complexity analysis. It is part of a wider programme to determine non-invasive markers for healthy ageing. We consider two case studies investigated with actigraphy: (a) sleep and its alterations with insomnia, and (b) ageing effects on mobility patterns. We illustrate, using these case studies, the application of fractal analysis to the investigation of regulation patterns and control, and change of physiological function. In the first case study, fractal analysis techniques were implemented to study the correlations present in sleep actigraphy for individuals suffering from acute insomnia in comparison with healthy controls. The aim was to investigate whether complexity analysis can detect the onset of adverse health-related events. The subjects with acute insomnia displayed significantly higher levels of complexity, possibly a result of too much activity in the underlying regulatory systems. The second case study considered mobility patterns during night time and their variations with age. It showed that complexity metrics can identify change in physiological function with ageing. Both studies demonstrated that complexity analysis can be used to investigate markers of health, disease and healthy ageing.
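A common fractal-analysis technique for actigraphy of this kind is detrended fluctuation analysis (DFA). The compact sketch below is our own (not the study's code) and estimates the scaling exponent alpha from a time series:

```python
import numpy as np

def dfa_alpha(x, scales=None):
    """Detrended fluctuation analysis (DFA) scaling exponent alpha.

    Integrates the mean-subtracted series, detrends it linearly in
    non-overlapping windows at each scale, and fits log F(s) vs log s.
    alpha ~ 0.5 for white noise, ~ 1.0 for 1/f-like fluctuations.
    """
    x = np.asarray(x, dtype=float)
    y = np.cumsum(x - x.mean())          # integrated profile
    if scales is None:
        scales = np.unique(np.logspace(2, 6, 12, base=2).astype(int))
    flucts = []
    for s in scales:
        n_win = len(y) // s
        f2 = []
        for w in range(n_win):
            seg = y[w * s:(w + 1) * s]
            t = np.arange(s)
            coef = np.polyfit(t, seg, 1)          # linear trend in window
            f2.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        flucts.append(np.sqrt(np.mean(f2)))
    alpha, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
    return alpha
```

Departures of alpha from the uncorrelated-noise value of 0.5 indicate long-range correlations, which is the kind of signature such studies compare between patient groups.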
Theory of reliable systems. [systems analysis and design
NASA Technical Reports Server (NTRS)
Meyer, J. F.
1973-01-01
The analysis and design of reliable systems are discussed. The attributes of system reliability studied are fault tolerance, diagnosability, and reconfigurability. Objectives of the study include: to determine properties of system structure that are conducive to a particular attribute; to determine methods for obtaining reliable realizations of a given system; and to determine how properties of system behavior relate to the complexity of fault tolerant realizations. A list of 34 references is included.
Single-Molecule Analysis for RISC Assembly and Target Cleavage.
Sasaki, Hiroshi M; Tadakuma, Hisashi; Tomari, Yukihide
2018-01-01
RNA-induced silencing complex (RISC) is a small RNA-protein complex that mediates silencing of complementary target RNAs. Biochemistry has been successfully used to characterize the molecular mechanism of RISC assembly and function for nearly two decades. However, further dissection of intermediate states during the reactions has been warranted to fill in the gaps in our understanding of RNA silencing mechanisms. Single-molecule analysis with total internal reflection fluorescence (TIRF) microscopy is a powerful imaging-based approach to interrogate complex formation and dynamics at the individual molecule level with high sensitivity. Combining this technique with our recently established in vitro reconstitution system of fly Ago2-RISC, we have developed a single-molecule observation system for RISC assembly. In this chapter, we summarize the detailed protocol for single-molecule analysis of chaperone-assisted assembly of fly Ago2-RISC as well as its target cleavage reaction.
A Framework for Reliability and Safety Analysis of Complex Space Missions
NASA Technical Reports Server (NTRS)
Evans, John W.; Groen, Frank; Wang, Lui; Austin, Rebekah; Witulski, Art; Mahadevan, Nagabhushan; Cornford, Steven L.; Feather, Martin S.; Lindsey, Nancy
2017-01-01
Long duration and complex mission scenarios are characteristics of NASA's human exploration of Mars, and will provide unprecedented challenges. Systems reliability and safety will become increasingly demanding and management of uncertainty will be increasingly important. NASA's current pioneering strategy recognizes and relies upon assurance of crew and asset safety. In this regard, flexibility to develop and innovate in the emergence of new design environments and methodologies, encompassing modeling of complex systems, is essential to meet the challenges.
NASA Astrophysics Data System (ADS)
Nagy, Julia; Eilert, Tobias; Michaelis, Jens
2018-03-01
Modern hybrid structural analysis methods have opened new possibilities to analyze and resolve flexible protein complexes where conventional crystallographic methods have reached their limits. Here, the Fast-Nano-Positioning System (Fast-NPS), a Bayesian parameter estimation-based analysis method and software, is an interesting method since it allows for the localization of unknown fluorescent dye molecules attached to macromolecular complexes based on single-molecule Förster resonance energy transfer (smFRET) measurements. However, the precision, accuracy, and reliability of structural models derived from results based on such complex calculation schemes are oftentimes difficult to evaluate. Therefore, we present two proof-of-principle benchmark studies where we use smFRET data to localize supposedly unknown positions on a DNA as well as on a protein-nucleic acid complex. Since we use complexes where structural information is available, we can compare Fast-NPS localization to the existing structural data. In particular, we compare different dye models and discuss how both accuracy and precision can be optimized.
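Underlying any smFRET-based localization is the conversion between FRET efficiency and donor-acceptor distance via E = 1/(1 + (r/R0)^6). The helpers below are an illustrative sketch (ours, with a generic Förster radius argument r0, not part of Fast-NPS):

```python
def fret_efficiency(r, r0):
    """FRET efficiency for donor-acceptor distance r and Foerster radius r0."""
    return 1.0 / (1.0 + (r / r0) ** 6)

def fret_distance(efficiency, r0):
    """Invert E = 1/(1 + (r/r0)^6) to recover the distance r."""
    return r0 * (1.0 / efficiency - 1.0) ** (1.0 / 6.0)
```

At E = 0.5 the distance equals the Förster radius, and the sixth-power dependence is what makes FRET such a sensitive molecular ruler near r0.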
Expert system development for commonality analysis in space programs
NASA Technical Reports Server (NTRS)
Yeager, Dorian P.
1987-01-01
This report is a combination of foundational mathematics and software design. A mathematical model of the Commonality Analysis problem was developed and some important properties discovered. The complexity of the problem is described herein and techniques, both deterministic and heuristic, for reducing that complexity are presented. Weaknesses are pointed out in the existing software (System Commonality Analysis Tool) and several improvements are recommended. It is recommended that: (1) an expert system for guiding the design of new databases be developed; (2) a distributed knowledge base be created and maintained for the purpose of encoding the commonality relationships between design items in commonality databases; (3) a software module be produced which automatically generates commonality alternative sets from commonality databases using the knowledge associated with those databases; and (4) a more complete commonality analysis module be written which is capable of generating any type of feasible solution.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pullum, Laura L; Symons, Christopher T
2011-01-01
Machine learning is used in many applications, from machine vision to speech recognition to decision support systems, and is used to test applications. However, though much has been done to evaluate the performance of machine learning algorithms, little has been done to verify the algorithms or examine their failure modes. Moreover, complex learning frameworks often require stepping beyond black box evaluation to distinguish between errors based on natural limits on learning and errors that arise from mistakes in implementation. We present a conceptual architecture, failure model and taxonomy, and failure modes and effects analysis (FMEA) of a semi-supervised, multi-modal learning system, and provide specific examples from its use in a radiological analysis assistant system. The goal of the research described in this paper is to provide a foundation from which dependability analysis of systems using semi-supervised, multi-modal learning can be conducted. The methods presented provide a first step towards that overall goal.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lawton, Craig R.
2015-01-01
The military is undergoing a significant transformation as it modernizes for the information age and adapts to address an emerging asymmetric threat beyond traditional cold war era adversaries. Techniques such as traditional large-scale, joint services war gaming analysis are no longer adequate to support program evaluation activities and mission planning analysis at the enterprise level because the operating environment is evolving too quickly. New analytical capabilities are necessary to address modernization of the Department of Defense (DoD) enterprise. This presents a significant opportunity for Sandia in supporting the nation at this transformational enterprise scale. Although Sandia has significant experience with engineering system of systems (SoS) and Complex Adaptive System of Systems (CASoS), significant fundamental research is required to develop modeling, simulation, and analysis capabilities at the enterprise scale. This report documents an enterprise modeling framework which will enable senior level decision makers to better understand their enterprise and required future investments.
Microgravity isolation system design: A modern control analysis framework
NASA Technical Reports Server (NTRS)
Hampton, R. D.; Knospe, C. R.; Allaire, P. E.; Grodsinsky, C. M.
1994-01-01
Many acceleration-sensitive, microgravity science experiments will require active vibration isolation from the manned orbiters on which they will be mounted. The isolation problem, especially in the case of a tethered payload, is a complex three-dimensional one that is best suited to modern-control design methods. These methods, although more powerful than their classical counterparts, can nonetheless go only so far in meeting the design requirements for practical systems. Once a tentative controller design is available, it must still be evaluated to determine whether or not it is fully acceptable, and to compare it with other possible design candidates. Realistically, such evaluation will be an inherent part of a necessary iterative design process. In this paper, an approach is presented for applying complex mu-analysis methods to a closed-loop vibration isolation system (experiment plus controller). An analysis framework is presented for evaluating nominal stability, nominal performance, robust stability, and robust performance of active microgravity isolation systems, with emphasis on the effective use of mu-analysis methods.
NASA Astrophysics Data System (ADS)
Varney, Philip; Green, Itzhak
2014-11-01
Numerous methods are available to calculate rotordynamic whirl frequencies, including analytic methods, finite element analysis, and the transfer matrix method. The typical real-valued transfer matrix (RTM) suffers from several deficiencies, including lengthy computation times and the inability to distinguish forward and backward whirl. Though application of complex coordinates in rotordynamic analysis is not novel per se, specific advantages gained from using such coordinates in a transfer matrix analysis have yet to be elucidated. The present work employs a complex coordinate redefinition of the transfer matrix to obtain reduced forms of the elemental transfer matrices in inertial and rotating reference frames, including external stiffness and damping. Application of the complex-valued state variable redefinition results in a reduction of the 8×8 RTM to the 4×4 Complex Transfer Matrix (CTM). The CTM is advantageous in that it intrinsically separates forward and backward whirl, eases symbolic manipulation by halving the transfer matrices’ dimension, and provides significant improvement in computation time. A symbolic analysis is performed on a simple overhung rotor to demonstrate the mathematical motivation for whirl frequency separation. The CTM's utility is further shown by analyzing a rotordynamic system supported by viscoelastic elastomer rings. Viscoelastic elastomer ring supports can provide significant damping while reducing the cost and complexity associated with conventional components such as squeeze film dampers. The stiffness and damping of a viscoelastic damper ring are determined herein as a function of whirl frequency using the viscoelastic correspondence principle and a constitutive fractional calculus viscoelasticity model. The CTM is then employed to obtain the characteristic equation, where the whirl frequency dependent stiffness and damping of the elastomer supports are included. 
The Campbell diagram is shown, demonstrating the CTM's ability to intrinsically separate synchronous whirl direction for a non-trivial rotordynamic system. Good agreement is found between the CTM results and previously obtained analytic and experimental results for the elastomer ring supported rotordynamic system.
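The intrinsic separation of forward and backward whirl by complex coordinates can be seen in a small signal-processing sketch (our own illustration, not the paper's transfer-matrix code): with r = x + iy, positive spectral frequencies correspond to forward whirl and negative frequencies to backward whirl.

```python
import numpy as np

def whirl_direction(x, y, h):
    """Dominant whirl direction from planar orbit samples x(t), y(t).

    Forms the complex coordinate r = x + i*y and inspects its spectrum:
    positive frequencies are forward whirl, negative are backward, which
    is exactly the separation the complex formulation makes intrinsic.
    """
    r = np.asarray(x) + 1j * np.asarray(y)
    spec = np.abs(np.fft.fft(r - r.mean()))
    freqs = np.fft.fftfreq(len(r), d=h)   # h is the sample spacing
    dominant = freqs[np.argmax(spec)]
    return "forward" if dominant > 0 else "backward"
```

In the real-valued formulation the spectra of x(t) and y(t) alone cannot distinguish the two orbit senses; the complex coordinate resolves the ambiguity in one step.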
NASA Astrophysics Data System (ADS)
Abdel-Aty, Mahmoud
2016-07-01
The modeling of a complex system requires the analysis of all microscopic constituents and in particular of their interactions [1]. Interest in this research field has increased, considering also recent developments in the information sciences. However, interaction among scholars working in various fields of the applied sciences can be considered the true motor for the definition of a general framework for the analysis of complex systems. In particular, biological systems constitute the platform where many scientists have decided to collaborate in order to gain a global description of the system. Among others, cancer-immune system competition (see [2] and the review papers [3,4]) has attracted much attention.
Some aspects of mathematical and chemical modeling of complex chemical processes
NASA Technical Reports Server (NTRS)
Nemes, I.; Botar, L.; Danoczy, E.; Vidoczy, T.; Gal, D.
1983-01-01
Some theoretical questions involved in the mathematical modeling of the kinetics of complex chemical processes are discussed. The analysis is carried out for the homogeneous oxidation of ethylbenzene in the liquid phase. Particular attention is given to the determination of the general characteristics of chemical systems from an analysis of mathematical models developed on the basis of linear algebra.
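One standard linear-algebra characteristic of a kinetic model is its set of conservation laws, given by the left null space of the stoichiometric matrix. The sketch below illustrates that generic idea (it is our own example, not the authors' ethylbenzene model):

```python
import numpy as np

def conservation_laws(S, tol=1e-10):
    """Left null space of stoichiometric matrix S (species x reactions).

    Each returned row w satisfies w @ S = 0, so the quantity w . c is
    conserved by the dynamics dc/dt = S v(c), whatever the rate laws v.
    """
    # Null space of S^T equals the left null space of S.
    u, s, vt = np.linalg.svd(S.T)
    rank = int(np.sum(s > tol))
    return vt[rank:]                  # rows span the left null space
```

For the toy reaction A -> B, with S = [[-1], [1]], the single conservation law is proportional to [1, 1]: the total amount of A plus B is constant, independent of the rate constant.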
Discrimination of complex mixtures by a colorimetric sensor array: coffee aromas.
Suslick, Benjamin A; Feng, Liang; Suslick, Kenneth S
2010-03-01
The analysis of complex mixtures presents a difficult challenge even for modern analytical techniques, and the ability to discriminate among closely similar such mixtures often remains problematic. Coffee provides a readily available archetype of such highly multicomponent systems. The use of a low-cost, sensitive colorimetric sensor array for the detection and identification of coffee aromas is reported. The color changes of the sensor array were used as a digital representation of the array response and analyzed with standard statistical methods, including principal component analysis (PCA) and hierarchical clustering analysis (HCA). PCA revealed that the sensor array has exceptionally high dimensionality with 18 dimensions required to define 90% of the total variance. In quintuplicate runs of 10 commercial coffees and controls, no confusions or errors in classification by HCA were observed in 55 trials. In addition, the effects of temperature and time in the roasting of green coffee beans were readily observed and distinguishable with a resolution better than 10 degrees C and 5 min, respectively. Colorimetric sensor arrays demonstrate excellent potential for complex systems analysis in real-world applications and provide a novel method for discrimination among closely similar complex mixtures.
Multiscale entropy-based methods for heart rate variability complexity analysis
NASA Astrophysics Data System (ADS)
Silva, Luiz Eduardo Virgilio; Cabella, Brenno Caetano Troca; Neves, Ubiraci Pereira da Costa; Murta Junior, Luiz Otavio
2015-03-01
Physiologic complexity is an important concept for characterizing time series from biological systems, and, combined with multiscale analysis, it can contribute to the comprehension of many complex phenomena. Although multiscale entropy has been applied to physiological time series, it measures irregularity as a function of scale. In this study we propose and evaluate a set of three complexity metrics as functions of time scale. The complexity metrics are derived from nonadditive entropy supported by the generation of surrogate data, i.e., SDiffqmax, qmax and qzero. In order to assess the accuracy of the proposed complexity metrics, receiver operating characteristic (ROC) curves were built and the areas under the curves were computed for three physiological situations. Heart rate variability (HRV) time series from normal sinus rhythm, atrial fibrillation, and congestive heart failure data sets were analyzed. Results show that the proposed complexity metrics are accurate and robust when compared to classic entropic irregularity metrics. Furthermore, SDiffqmax is the most accurate at lower scales, whereas qmax and qzero are the most accurate when higher time scales are considered. The multiscale complexity analysis described here showed potential to assess complex physiological time series and deserves further investigation in a wider context.
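The coarse-graining and entropy steps that multiscale methods share can be sketched as follows. This is the classic multiscale sample-entropy pipeline, not the nonadditive-entropy metrics (SDiffqmax, qmax, qzero) proposed in the study, and all parameters are illustrative:

```python
import numpy as np

# A minimal multiscale sample-entropy sketch: coarse-grain the series at each
# time scale, then compute sample entropy of the coarse-grained series.

def coarse_grain(x, scale):
    """Non-overlapping window means at the given time scale."""
    n = len(x) // scale
    return x[:n * scale].reshape(n, scale).mean(axis=1)

def sample_entropy(x, m=2, r=0.2):
    """SampEn: -log of the conditional probability that sequences matching
    for m points (Chebyshev distance <= r) also match for m+1 points."""
    def count(mm):
        temp = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        c = 0
        for i in range(len(temp)):
            d = np.max(np.abs(temp[i + 1:] - temp[i]), axis=1)
            c += np.sum(d <= r)
        return c
    b, a = count(m), count(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

rng = np.random.default_rng(0)
noise = rng.uniform(size=400)
for scale in (1, 2, 4):
    print(scale, sample_entropy(coarse_grain(noise, scale)))
```

Plotting such entropy values against scale is the basic multiscale-entropy curve that the study's nonadditive metrics refine.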
Pattern dynamics of the reaction-diffusion immune system.
Zheng, Qianqian; Shen, Jianwei; Wang, Zhijie
2018-01-01
In this paper, we will investigate the effect of diffusion, which is ubiquitous in nature, on the immune system using a reaction-diffusion model in order to understand the dynamical behavior of complex patterns and control the dynamics of different patterns. Through control theory and linear stability analysis of local equilibrium, we obtain the optimal condition under which the system loses stability and a Turing pattern occurs. By combining mathematical analysis and numerical simulation, we show the possible patterns and how these patterns evolve. In addition, we establish a bridge between the complex patterns and the biological mechanism using the results from a previous study in Nature Cell Biology. The results in this paper can help us better understand the biological significance of the immune system.
Brooks, Susan A
2006-06-01
A major challenge for the biotechnology industry is to engineer the glycosylation pathways of expression systems to synthesize recombinant proteins with human glycosylation. Inappropriate glycosylation can result in reduced activity, limited half-life in circulation and unwanted immunogenicity. In this review, the complexities of glycosylation in human cells are explained and compared with glycosylation in bacteria, yeasts, fungi, insects, plants and nonhuman mammalian species. Key advances in the engineering of the glycosylation of expression systems are highlighted. Advances in the challenging and technically complex field of glycan analysis are also described. The emergence of a new generation of expression systems with sophisticated engineering for humanized glycosylation of glycoproteins appears to be on the horizon.
Rand, Troy J.; Myers, Sara A.; Kyvelidou, Anastasia; Mukherjee, Mukul
2015-01-01
A healthy biological system is characterized by a temporal structure that exhibits fractal properties and is highly complex. Unhealthy systems demonstrate lowered complexity and either greater or lesser predictability in the temporal structure of a time series. The purpose of this research was to determine whether support surface translations with different temporal structures would affect the temporal structure of the center of pressure (COP) signal. Eight healthy young participants stood on a force platform that was translated in the anteroposterior (AP) direction for input conditions of varying complexity: white noise, pink noise, brown noise, and a sine wave. Detrended fluctuation analysis was used to characterize the long-range correlations of the COP time series in the AP direction. Repeated measures ANOVA revealed differences among conditions (P < .001). The less complex support surface translations resulted in a less complex COP compared to normal standing. A quadratic trend analysis demonstrated an inverted-U shape across an increasing order of predictability of the conditions (P < .001). The ability to influence the complexity of postural control through support surface translations can have important implications for rehabilitation. PMID:25994281
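Detrended fluctuation analysis as used above can be sketched in a few lines; the window sizes and test signals here are illustrative choices, not the study's settings:

```python
import numpy as np

# Minimal DFA sketch: integrate the mean-subtracted signal, detrend it
# linearly inside windows of size n, and fit the scaling exponent alpha
# as the slope of log F(n) versus log n.

def dfa_alpha(x, windows=(8, 16, 32, 64, 128)):
    y = np.cumsum(x - np.mean(x))          # integrated profile
    fluct = []
    for n in windows:
        sq = []
        for i in range(len(y) // n):
            seg = y[i * n:(i + 1) * n]
            t = np.arange(n)
            trend = np.polyval(np.polyfit(t, seg, 1), t)
            sq.append(np.mean((seg - trend) ** 2))
        fluct.append(np.sqrt(np.mean(sq)))
    alpha, _ = np.polyfit(np.log(windows), np.log(fluct), 1)
    return alpha

rng = np.random.default_rng(1)
white = rng.standard_normal(4000)
brown = np.cumsum(white)                   # integrated white noise
print("white:", dfa_alpha(white))          # expected near 0.5
print("brown:", dfa_alpha(brown))          # expected near 1.5
```

Alpha near 0.5 indicates uncorrelated fluctuations; values toward 1.0 and beyond indicate the long-range correlations associated with healthy, complex sway.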
NASA Astrophysics Data System (ADS)
Munoz-Carpena, R.; Muller, S. J.; Chu, M.; Kiker, G. A.; Perz, S. G.
2014-12-01
Model complexity resulting from the need to integrate environmental system components cannot be overstated. In particular, additional emphasis is urgently needed on rational approaches to guide decision making through the uncertainties surrounding the integrated system across decision-relevant scales. However, in spite of the difficulties that the consideration of modeling uncertainty represents for the decision process, it should not be avoided, or the value and science behind the models will be undermined. These two issues, i.e., the need for coupled models that can answer the pertinent questions and the need for models that do so with sufficient certainty, are the key indicators of a model's relevance. Model relevance is inextricably linked with model complexity. Although model complexity has advanced greatly in recent years, there has been little work to rigorously characterize the threshold of relevance in integrated and complex models. Formally assessing the relevance of a model in the face of increasing complexity would be valuable because there is growing unease among developers and users of complex models about the cumulative effects of various sources of uncertainty on model outputs. In particular, this issue has prompted doubt over whether the considerable effort going into further elaborating complex models will in fact yield the expected payback. New approaches have been proposed recently to evaluate the uncertainty-complexity-relevance modeling trilemma (Muller, Muñoz-Carpena and Kiker, 2011) by incorporating state-of-the-art global sensitivity and uncertainty analysis (GSA/UA) in every step of model development so as to quantify not only the uncertainty introduced by the addition of new environmental components, but also the effect that these new components have on existing components (interactions, non-linear responses).
Outputs from the analysis can also be used to quantify system resilience (stability, alternative states, thresholds or tipping points) in the face of environmental and anthropogenic change (Perz, Muñoz-Carpena, Kiker and Holt, 2013), and, through Monte Carlo mapping of potential management activities over the most important factors or processes, to influence the system towards behavioral (desirable) outcomes (Chu-Agor, Muñoz-Carpena et al., 2012).
The Shock and Vibration Bulletin. Part 1. Summaries of Presented Papers
1974-10-01
Selected contents: S. Smith, R. C. Stroud, G. A. Hamma, W. L. Hallaver, and R. C. Yee, "MODALAB - A New System for Structural Dynamic Testing, II: Analysis ..."; A. Burkhard and R. Scott, "Analysis and Flight Test Correlation of Vibroacoustic ..."; "Methods for the Analysis of Elastically Supported Isolation Systems"; G. L. Fox, "Impact on Complex ..."
Establishment of an in vitro transcription system for Peste des petits ruminants virus.
Yunus, Mohammad; Shaila, Melkote S
2012-12-05
Peste des petits ruminants virus (PPRV) is a nonsegmented negative-strand RNA virus of the genus Morbillivirus within the family Paramyxoviridae. Negative-strand RNA viruses are known to carry the nucleocapsid (N) protein, phosphoprotein (P) and RNA polymerase (L protein) packaged within the virion, which together possess all activities required for transcription, post-transcriptional modification of mRNA and replication. In order to understand the mechanism of transcription and replication of the virus, an in vitro transcription reconstitution system is required. In the present work, an in vitro transcription system has been developed with ribonucleoprotein (RNP) complex purified from virus-infected cells, as well as partially purified recombinant polymerase (L-P) complex from insect cells, together with an N-RNA template (genomic RNA encapsidated by N protein) isolated from virus-infected cells. The requirements for this transcription reconstitution have been defined. Transcription of viral genes in the in vitro system was confirmed by PCR amplification of cDNAs corresponding to individual transcripts using gene-specific primers. To measure the relative expression levels of viral transcripts, real-time PCR analysis was carried out. qPCR analysis of the transcription products made in vitro showed a gradient of polarity of transcription from the 3' end to the 5' end of the genome, similar to that exhibited by the virus in infected cells. This report describes, for the first time, the development of an in vitro transcription reconstitution system for PPRV with RNP complex purified from infected cells and recombinant L-P complex expressed in insect cells. Both complexes were able to synthesize all the mRNA species in vitro, exhibiting a gradient of polarity in transcription.
A Molecular Dynamic Modeling of Hemoglobin-Hemoglobin Interactions
NASA Astrophysics Data System (ADS)
Wu, Tao; Yang, Ye; Sheldon Wang, X.; Cohen, Barry; Ge, Hongya
2010-05-01
In this paper, we present a study of hemoglobin-hemoglobin interaction with model reduction methods. We begin with a simple spring-mass system with given parameters (mass and stiffness). With this known system, we compare the mode superposition method with Singular Value Decomposition (SVD) based Principal Component Analysis (PCA). Through PCA we are able to recover the principal direction of this system, namely the modal direction, which is matched with the eigenvector derived from the mode superposition analysis. The same technique is then implemented in a much more complicated hemoglobin-hemoglobin molecular interaction model, in which thousands of atoms in hemoglobin molecules are coupled with tens of thousands of T3 water molecule models. In this model, complex inter-atomic and inter-molecular potentials are replaced by nonlinear springs. We employ the same method to obtain the most significant modes and their frequencies of this complex dynamical system. More complex physical phenomena can then be further studied with these coarse-grained models.
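The opening comparison (PCA via SVD versus mode superposition) can be sketched for a known two-degree-of-freedom spring-mass system; the masses, stiffnesses, and modal amplitudes below are arbitrary illustrations:

```python
import numpy as np

# Sketch: for a known 2-DOF spring-mass system, the leading principal
# component of trajectory snapshots (via SVD) recovers the dominant
# vibration mode obtained from mode superposition.

m = 1.0
K = np.array([[2.0, -1.0],
              [-1.0, 2.0]])               # two unit masses, three unit springs
w2, V = np.linalg.eigh(K / m)             # squared modal frequencies, mode shapes
t = np.linspace(0.0, 50.0, 2000)

# Mode superposition: excite mode 0 strongly, mode 1 weakly.
X = (1.0 * np.outer(V[:, 0], np.cos(np.sqrt(w2[0]) * t))
     + 0.05 * np.outer(V[:, 1], np.cos(np.sqrt(w2[1]) * t)))

U, s, _ = np.linalg.svd(X, full_matrices=False)
pc1 = U[:, 0]                             # leading principal direction
overlap = abs(pc1 @ V[:, 0])
print("overlap with dominant mode shape:", overlap)
```

The overlap is close to 1, i.e., the data-driven principal direction matches the analytically known dominant mode shape, which is the identification step that is then scaled up to the hemoglobin model.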
NASA Technical Reports Server (NTRS)
Schmidt, R. J.; Dodds, R. H., Jr.
1985-01-01
The dynamic analysis of complex structural systems using the finite element method and multilevel substructured models is presented. The fixed-interface method is selected for substructure reduction because of its efficiency, accuracy, and adaptability to restart and reanalysis. This method is extended to the reduction of substructures which are themselves composed of reduced substructures. The implementation and performance of the method in a general purpose software system is emphasized. Solution algorithms consistent with the chosen data structures are presented. It is demonstrated that successful finite element software requires the use of software executives to supplement the algorithmic language. The complexity of the implementation of restart and reanalysis procedures illustrates the need for executive systems to support the noncomputational aspects of the software. It is shown that significant computational efficiencies can be achieved through proper use of substructuring and reduction techniques without sacrificing solution accuracy. The restart and reanalysis capabilities and the flexible procedures for multilevel substructured modeling give economical yet accurate analyses of complex structural systems.
Self-Driving Cars and Engineering Ethics: The Need for a System Level Analysis.
Borenstein, Jason; Herkert, Joseph R; Miller, Keith W
2017-11-13
The literature on self-driving cars and ethics continues to grow. Yet much of it focuses on ethical complexities emerging from an individual vehicle. That is an important but insufficient step towards determining how the technology will impact human lives and society more generally. What must complement ongoing discussions is a broader, system level of analysis that engages with the interactions and effects that these cars will have on one another and on the socio-technical systems in which they are embedded. To bring the conversation of self-driving cars to the system level, we make use of two traffic scenarios which highlight some of the complexities that designers, policymakers, and others should consider related to the technology. We then describe three approaches that could be used to address such complexities and their associated shortcomings. We conclude by bringing attention to the "Moral Responsibility for Computing Artifacts: The Rules", a framework that can provide insight into how to approach ethical issues related to self-driving cars.
The Internet As a Large-Scale Complex System
NASA Astrophysics Data System (ADS)
Park, Kihong; Willinger, Walter
2005-06-01
The Internet may be viewed as a "complex system" with diverse features and many components that can give rise to unexpected emergent phenomena, revealing much about its own engineering. This book brings together chapter contributions from a workshop held at the Santa Fe Institute in March 2001. The volume captures a snapshot of some features of the Internet that may be fruitfully approached using a complex systems perspective, that is, using interdisciplinary tools and methods to tackle the subject area. The Internet penetrates the socioeconomic fabric of everyday life; a broader and deeper grasp of the Internet may be needed to meet the challenges facing the future. The resulting empirical data have already proven to be invaluable for gaining novel insights into the network's spatio-temporal dynamics, and can be expected to become even more important when trying to explain the Internet's complex and emergent behavior in terms of elementary networking-based mechanisms. The discoveries of fractal or self-similar network traffic traces, power-law behavior in network topology, and World Wide Web connectivity are instances of unsuspected, emergent system traits. Another important factor at the heart of fair, efficient, and stable sharing of network resources is user behavior. Network systems, when inhabited by selfish or greedy users, take on the traits of a noncooperative multi-party game, and their stability and efficiency are integral to understanding the overall system and its dynamics. Lastly, fault tolerance and robustness of large-scale network systems can exhibit spatial and temporal correlations whose effective analysis and management may benefit from rescaling techniques applied in certain physical and biological systems. The present book brings together several of the leading workers involved in the analysis of complex systems with the future development of the Internet.
NASA Astrophysics Data System (ADS)
Marlin, Benjamin
Education planning provides the policy maker and the decision maker a logical framework in which to develop and implement education policy. At the international level, education planning is often confounded by both internal and external complexities, making the development of education policy difficult. This research presents a discrete event simulation in which individual students and teachers flow through the system across a variable time horizon. This simulation is then used with advancements in design of experiments, multivariate statistical analysis, and data envelopment analysis to provide a methodology designed to assist the international education planning community. We propose that this methodology will provide the education planner with insights into the complexity of the education system, the effects of both endogenous and exogenous factors upon the system, and the implications of policies as they pertain to potential futures of the system. We do this recognizing that there are multiple actors and stochastic events in play which, although they cannot be accurately forecast, must be accounted for within the education model. To test both the implementation and the usefulness of such a model and to prove its relevance, we chose the Afghan education system as the focal point of this research. The Afghan education system is a complex, real-world system with competing actors, dynamic requirements, and ambiguous states. At the time of this writing, Afghanistan is at a pivotal point as a nation, and has been the recipient of a tremendous amount of international support and attention. Finally, Afghanistan is a fragile state, and the proliferation of the current disparity in education across gender, districts, and ethnicity could provide the catalyst to drive the country into hostility. In order to prevent the failure of the current government, it is essential that the education system be able to meet the demands of the Afghan people.
This work provides insights into the Afghan education system, including the implications of security, the potential effects of societal issues, and prescriptive policy options. In using the proposed methodology, we provide justification for the future use of larger complex simulations in education planning, especially when such a simulation is integrated with efficient design of experiments and data envelopment analysis.
VoroTop: Voronoi cell topology visualization and analysis toolkit
NASA Astrophysics Data System (ADS)
Lazar, Emanuel A.
2018-01-01
This paper introduces a new open-source software program called VoroTop, which uses Voronoi topology to analyze local structure in atomic systems. Strengths of this approach include its abilities to analyze high-temperature systems and to characterize complex structure such as grain boundaries. This approach enables the automated analysis of systems and mechanisms previously not possible.
Care coordination of multimorbidity: a scoping study
Burau, Viola
2015-01-01
Background A key challenge in healthcare systems worldwide is the large number of patients who suffer from multimorbidity; despite this, most systems are organized within a single-disease framework. Objective The present study addresses two issues: the characteristics and preconditions of care coordination for patients with multimorbidity; and the factors that promote or inhibit care coordination at the levels of provider organizations and healthcare professionals. Design The analysis is based on a scoping study, which combines a systematic literature search with a qualitative thematic analysis. The search was conducted in November 2013 and included the PubMed, CINAHL, and Web of Science databases, as well as the Cochrane Library, websites of relevant organizations and a hand-search of reference lists. The analysis included studies with a wide range of designs, from industrialized countries, in English, German and the Scandinavian languages, which focused on both multimorbidity/comorbidity and coordination of integrated care. Results The analysis included 47 of the 226 identified studies. The central theme emerging was complexity. This related to both specific medical conditions of patients with multimorbidity (case complexity) and the organization of care delivery at the levels of provider organizations and healthcare professionals (care complexity). Conclusions In terms of how to approach care coordination, one approach is to reduce complexity and the other is to embrace complexity. Either way, future research must take a more explicit stance on complexity and also gain a better understanding of the role of professionals as a prerequisite for the development of new care coordination interventions. PMID:29090157
Aeropropulsion 1987. Session 2: Aeropropulsion Structures Research
NASA Technical Reports Server (NTRS)
1987-01-01
Aeropropulsion systems present unique problems to the structural engineer. The extremes in operating temperatures, rotational effects, and behaviors of advanced material systems combine into complexities that require advances in many scientific disciplines involved in structural analysis and design procedures. This session provides an overview of the complexities of aeropropulsion structures and the theoretical, computational, and experimental research conducted to achieve the needed advances.
Miettinen, Sari; Ashorn, Ulla; Lehto, Juhani
2013-01-01
Rehabilitation in Finland is a good example of functions divided among several welfare sectors, such as health services and social services. The rehabilitation system in Finland is a complex one, and there have been many efforts to create a coordinated entity. The purpose of this study is to open up a complex welfare system at the upper policy level and to understand the meaning of coordination at the level of service delivery. We shed light in particular on the national rehabilitation policy in Finland and how the policy has tried to overcome the negative effects of institutional complexity. In this study we used qualitative content analysis and frame analysis. As a result, we identified four different welfare state frames with distinct features of policy problems, policy alternatives and institutional failure. The rehabilitation policy in Finland seems to be divided into different components, which may cause problems at the level of service delivery and thus in the integration of services. Bringing these components together could, at the policy level, enable a shared view of the rights of different population groups, effective management of integration at the level of service delivery, and also an opportunity for change throughout the rehabilitation system.
NASA Technical Reports Server (NTRS)
Bedrossian, Nazareth; Jang, Jiann-Woei; McCants, Edward; Omohundro, Zachary; Ring, Tom; Templeton, Jeremy; Zoss, Jeremy; Wallace, Jonathan; Ziegler, Philip
2011-01-01
Draper Station Analysis Tool (DSAT) is a computer program, built on commercially available software, for simulating and analyzing complex dynamic systems. Heretofore used in designing and verifying guidance, navigation, and control systems of the International Space Station, DSAT has a modular architecture that lends itself to modification for application to spacecraft or terrestrial systems. DSAT consists of user-interface, data-structures, simulation-generation, analysis, plotting, documentation, and help components. DSAT automates the construction of simulations and the process of analysis. DSAT provides a graphical user interface (GUI), plus a Web-enabled interface, similar to the GUI, that enables a remotely located user to gain access to the full capabilities of DSAT via the Internet and Web-browser software. Data structures are used to define the GUI, the Web-enabled interface, simulations, and analyses. Three data structures define the type of analysis to be performed: closed-loop simulation, frequency response, and/or stability margins. DSAT can be executed on almost any workstation, desktop, or laptop computer. DSAT provides better than an order of magnitude improvement in cost, schedule, and risk assessment for simulation-based design and verification of complex dynamic systems.
Functional Genomics Assistant (FUGA): a toolbox for the analysis of complex biological networks
2011-01-01
Background Cellular constituents such as proteins, DNA, and RNA form a complex web of interactions that regulate biochemical homeostasis and determine the dynamic cellular response to external stimuli. It follows that detailed understanding of these patterns is critical for the assessment of fundamental processes in cell biology and pathology. Representation and analysis of cellular constituents through network principles is a promising and popular analytical avenue towards a deeper understanding of molecular mechanisms in a system-wide context. Findings We present Functional Genomics Assistant (FUGA) - an extensible and portable MATLAB toolbox for the inference of biological relationships, graph topology analysis, random network simulation, network clustering, and functional enrichment statistics. In contrast to conventional differential expression analysis of individual genes, FUGA offers a framework for the study of system-wide properties of biological networks and highlights putative molecular targets using concepts of systems biology. Conclusion FUGA offers a simple and customizable framework for network analysis in a variety of systems biology applications. It is freely available for individual or academic use at http://code.google.com/p/fuga. PMID:22035155
NASA Astrophysics Data System (ADS)
Delignières, Didier; Marmelat, Vivien
2014-01-01
In this paper, we analyze empirical data, accounting for coordination processes between complex systems (bimanual coordination, interpersonal coordination, and synchronization with a fractal metronome), by using a recently proposed method: detrended cross-correlation analysis (DCCA). This work is motivated by the strong anticipation hypothesis, which supposes that coordination between complex systems is not achieved on the basis of local adaptations (i.e., correction, predictions), but results from a more global matching of complexity properties. Indeed, recent experiments have evidenced a very close correlation between the scaling properties of the series produced by two coordinated systems, despite a quite weak local synchronization. We hypothesized that strong anticipation should result in the presence of long-range cross-correlations between the series produced by the two systems. Results allow a detailed analysis of the effects of coordination on the fluctuations of the series produced by the two systems. In the long term, series tend to present similar scaling properties, with clear evidence of long-range cross-correlation. Short-term results strongly depend on the nature of the task. Simulation studies allow disentangling the respective effects of noise and short-term coupling processes on DCCA results, and suggest that the matching of long-term fluctuations could be the result of short-term coupling processes.
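A minimal DCCA sketch in the spirit of the method described above follows; the windowing choices and shared-source test signals are illustrative assumptions, not the paper's experimental series:

```python
import numpy as np

# Detrended cross-correlation analysis sketch: integrate both series,
# linearly detrend them inside windows of size n, average the covariance
# of the residuals, and fit a scaling exponent on the log-log plot.

def dcca(x, y, windows=(8, 16, 32, 64)):
    """Return the DCCA scaling exponent of two equally long series."""
    X, Y = np.cumsum(x - x.mean()), np.cumsum(y - y.mean())
    F = []
    for n in windows:
        covs = []
        for i in range(len(X) // n):
            t = np.arange(n)
            xs, ys = X[i * n:(i + 1) * n], Y[i * n:(i + 1) * n]
            rx = xs - np.polyval(np.polyfit(t, xs, 1), t)
            ry = ys - np.polyval(np.polyfit(t, ys, 1), t)
            covs.append(np.mean(rx * ry))
        F.append(np.sqrt(np.abs(np.mean(covs))))
    lam, _ = np.polyfit(np.log(windows), np.log(F), 1)
    return lam

rng = np.random.default_rng(2)
common = rng.standard_normal(4000)
x = common + 0.3 * rng.standard_normal(4000)   # two noisy observations of
y = common + 0.3 * rng.standard_normal(4000)   # the same underlying signal
print("DCCA exponent:", dcca(x, y))
```

For two noisy observations of a common white-noise source the exponent sits near 0.5; long-range cross-correlated series, as hypothesized under strong anticipation, would yield a larger exponent.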
Markov and non-Markov processes in complex systems by the dynamical information entropy
NASA Astrophysics Data System (ADS)
Yulmetyev, R. M.; Gafarov, F. M.
1999-12-01
We consider Markov and non-Markov processes in complex systems by the dynamical information Shannon entropy (DISE) method. The influence and important role of the two mutually dependent channels of entropy, alternation (creation or generation of correlation) and anti-correlation (destruction or annihilation of correlation), have been discussed. The developed method has been used for the analysis of complex systems of various natures: slow neutron scattering in liquid cesium, psychology (short-term numeral and pattern human memory and the effect of stress on the dynamical tapping test), random dynamics of RR intervals in human ECG (the problem of diagnosing various diseases of the human cardiovascular system), and chaotic dynamics of the parameters of financial markets and ecological systems.
Managing interoperability and complexity in health systems.
Bouamrane, M-M; Tao, C; Sarkar, I N
2015-01-01
In recent years, we have witnessed substantial progress in the use of clinical informatics systems to support clinicians during episodes of care, manage specialised domain knowledge, perform complex clinical data analysis and improve the management of health organisations' resources. However, the vision of fully integrated health information ecosystems, which provide relevant information and useful knowledge at the point of care, remains elusive. This journal Focus Theme reviews some of the enduring challenges of interoperability and complexity in clinical informatics systems. Furthermore, a range of approaches is proposed in order to address, harness and resolve some of the many remaining issues towards a greater integration of health information systems and the extraction of useful or new knowledge from heterogeneous electronic data repositories.
Control system design and analysis using the INteractive Controls Analysis (INCA) program
NASA Technical Reports Server (NTRS)
Bauer, Frank H.; Downing, John P.
1987-01-01
The INteractive Controls Analysis (INCA) program was developed at the Goddard Space Flight Center to provide a user-friendly, efficient environment for the design and analysis of linear control systems. Since its inception, INCA has found extensive use in the design, development, and analysis of control systems for spacecraft, instruments, robotics, and pointing systems. Moreover, the results of the analytic tools embedded in INCA have been flight-proven with at least three currently orbiting spacecraft. This paper describes the INCA program and illustrates, using a flight-proven example, how the package can perform complex design analyses with relative ease.
Risk analysis with a fuzzy-logic approach of a complex installation
NASA Astrophysics Data System (ADS)
Peikert, Tim; Garbe, Heyno; Potthast, Stefan
2016-09-01
This paper introduces a procedural method based on fuzzy logic to systematically analyze the risk to an electronic system in an intentional electromagnetic environment (IEME). The method analyzes the susceptibility of a complex electronic installation with respect to intentional electromagnetic interference (IEMI). It combines the advantages of well-known techniques such as fault tree analysis (FTA), electromagnetic topology (EMT) and Bayesian networks (BN), and extends them with an approach to handle uncertainty. This approach uses fuzzy sets, membership functions and fuzzy logic to handle uncertainty with probability functions and linguistic terms. The linguistic terms add to the risk analysis the knowledge of experts on the investigated system or environment.
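The paper's actual membership functions are not given in the abstract; the sketch below illustrates the fuzzy-logic ingredients it names, with invented breakpoints and a standard min-operator fuzzy AND:

```python
def triangular(x, a, b, c):
    """Degree of membership of x in a triangular fuzzy set peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_risk(susceptibility, field_v_per_m):
    """Risk is 'high' to the degree that the installation's susceptibility is
    'high' AND the interfering field is 'strong' (fuzzy AND = min).
    All breakpoints are hypothetical, chosen only for illustration."""
    high_susceptibility = triangular(susceptibility, 0.4, 1.0, 1.6)
    strong_field = triangular(field_v_per_m, 50.0, 100.0, 150.0)
    return min(high_susceptibility, strong_field)
```

A defuzzification step would then map the resulting membership degree back onto linguistic risk terms ("low", "medium", "high").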
Complexity and dynamics of switched human balance control during quiet standing.
Nema, Salam; Kowalczyk, Piotr; Loram, Ian
2015-10-01
In this paper, we use a combination of numerical simulations, time series analysis, and complexity measures to investigate the dynamics of switched systems with noise, which are often used as models of human balance control during quiet standing. We link the results with complexity measures found in experimental data of human sway motion during quiet standing. The control model ensuring balance, which we use, is based on an act-and-wait control concept: a human controller is switched on when a certain sway angle is reached; otherwise, there is no active control present. Given time-series data, we determine what a typical control-strategy pattern looks like in our model system. We detect the switched nonlinearity in the system using a frequency analysis method in the absence of noise. We also analyse the effect of time delay on the existence of limit cycles in the system in the absence of noise. We perform entropy and detrended fluctuation analyses with a view to linking the switchings (and the dead zone) with the occurrence of complexity in the model system in the presence of noise. Finally, we perform entropy and detrended fluctuation analyses on experimental data and link the results with the numerical findings in our model example.
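A toy version of the act-and-wait idea can be sketched as follows; the linearized dynamics, gain, dead-zone width, and time step are illustrative assumptions, not the paper's model:

```python
def simulate_sway(theta0=0.05, dead_zone=0.02, gain=5.0, dt=0.001, steps=5000):
    """Inverted-pendulum-like sway with switched control: the controller acts
    only while the sway angle is outside the dead zone (act-and-wait)."""
    theta, omega = theta0, 0.0
    switches = 0
    was_active = False
    for _ in range(steps):
        active = abs(theta) > dead_zone      # controller switches on here
        u = -gain * theta if active else 0.0 # no active control in dead zone
        if active and not was_active:
            switches += 1
        was_active = active
        omega += (theta + u) * dt            # linearized unstable dynamics
        theta += omega * dt
    return theta, switches
```

Inside the dead zone the sway grows unstably; outside it, the control torque pulls it back, producing the intermittent, bounded oscillation that the complexity measures are applied to.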
Expert systems for space power supply - Design, analysis, and evaluation
NASA Technical Reports Server (NTRS)
Cooper, Ralph S.; Thomson, M. Kemer; Hoshor, Alan
1987-01-01
The feasibility of applying expert systems to the conceptual design, analysis, and evaluation of space power supplies in particular, and complex systems in general is evaluated. To do this, the space power supply design process and its associated knowledge base were analyzed and characterized in a form suitable for computer emulation of a human expert. The existing expert system tools and the results achieved with them were evaluated to assess their applicability to power system design. Some new concepts for combining program architectures (modular expert systems and algorithms) with information about the domain were applied to create a 'deep' system for handling the complex design problem. NOVICE, a code to solve a simplified version of a scoping study of a wide variety of power supply types for a broad range of missions, has been developed, programmed, and tested as a concrete feasibility demonstration.
Systems Proteomics for Translational Network Medicine
Arrell, D. Kent; Terzic, Andre
2012-01-01
Universal principles underlying network science, and their ever-increasing applications in biomedicine, underscore the unprecedented capacity of systems biology based strategies to synthesize and resolve massive high throughput generated datasets. Enabling previously unattainable comprehension of biological complexity, systems approaches have accelerated progress in elucidating disease prediction, progression, and outcome. Applied to the spectrum of states spanning health and disease, network proteomics establishes a collation, integration, and prioritization algorithm to guide mapping and decoding of proteome landscapes from large-scale raw data. Providing unparalleled deconvolution of protein lists into global interactomes, integrative systems proteomics enables objective, multi-modal interpretation at molecular, pathway, and network scales, merging individual molecular components, their plurality of interactions, and functional contributions for systems comprehension. As such, network systems approaches are increasingly exploited for objective interpretation of cardiovascular proteomics studies. Here, we highlight network systems proteomic analysis pipelines for integration and biological interpretation through protein cartography, ontological categorization, pathway and functional enrichment and complex network analysis. PMID:22896016
Using cognitive work analysis to explore activity allocation within military domains.
Jenkins, D P; Stanton, N A; Salmon, P M; Walker, G H; Young, M S
2008-06-01
Cognitive work analysis (CWA) is frequently advocated as an approach for the analysis of complex socio-technical systems. Much of the current CWA literature within the military domain pays particular attention to its initial phases; work domain analysis and contextual task analysis. Comparably, the analysis of the social and organisational constraints receives much less attention. Through the study of a helicopter mission planning system software tool, this paper describes an approach for investigating the constraints affecting the distribution of work. The paper uses this model to evaluate the potential benefits of the social and organisational analysis phase within a military context. The analysis shows that, through its focus on constraints, the approach provides a unique description of the factors influencing the social organisation within a complex domain. This approach appears to be compatible with existing approaches and serves as a validation of more established social analysis techniques. As part of the ergonomic design of mission planning systems, the social organisation and cooperation analysis phase of CWA provides a constraint-based description informing allocation of function between key actor groups. This approach is useful because it poses questions related to the transfer of information and optimum working practices.
Multi-enzyme logic network architectures for assessing injuries: digital processing of biomarkers.
Halámek, Jan; Bocharova, Vera; Chinnapareddy, Soujanya; Windmiller, Joshua Ray; Strack, Guinevere; Chuang, Min-Chieh; Zhou, Jian; Santhosh, Padmanabhan; Ramirez, Gabriela V; Arugula, Mary A; Wang, Joseph; Katz, Evgeny
2010-12-01
A multi-enzyme biocatalytic cascade processing simultaneously five biomarkers characteristic of traumatic brain injury (TBI) and soft tissue injury (STI) was developed. The system operates as a digital biosensor based on concerted function of 8 Boolean AND logic gates, resulting in the decision about the physiological conditions based on the logic analysis of complex patterns of the biomarkers. The system represents the first example of a multi-step/multi-enzyme biosensor with the built-in logic for the analysis of complex combinations of biochemical inputs. The approach is based on recent advances in enzyme-based biocomputing systems and the present paper demonstrates the potential applicability of biocomputing for developing novel digital biosensor networks.
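The concerted AND-gate logic can be sketched in software terms; how the five biomarkers split between the TBI and STI branches below, and the thresholds implied by logic-0/1 inputs, are assumptions for illustration only:

```python
def and_gate(a, b):
    """Boolean AND: logic-1 output only when both biomarker inputs are logic-1."""
    return 1 if (a and b) else 0

def injury_decision(tbi_inputs, sti_inputs):
    """Cascade the AND gates so an injury is flagged only when every one of
    its characteristic biomarkers is at its logic-1 level."""
    tbi, sti = tbi_inputs[0], sti_inputs[0]
    for m in tbi_inputs[1:]:
        tbi = and_gate(tbi, m)
    for m in sti_inputs[1:]:
        sti = and_gate(sti, m)
    return {"TBI": tbi, "STI": sti}
```

In the biochemical system the "gates" are enzyme reactions rather than code, but the decision structure is the same: a single low biomarker suppresses the corresponding injury flag.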
O'Neill, M A; Hilgetag, C C
2001-08-29
Many problems in analytical biology, such as the classification of organisms, the modelling of macromolecules, or the structural analysis of metabolic or neural networks, involve complex relational data. Here, we describe a software environment, the portable UNIX programming system (PUPS), which has been developed to allow efficient computational representation and analysis of such data. The system can also be used as a general development tool for database and classification applications. As the complexity of analytical biology problems may lead to computation times of several days or weeks even on powerful computer hardware, the PUPS environment gives support for persistent computations by providing mechanisms for dynamic interaction and homeostatic protection of processes. Biological objects and their interrelations are also represented in a homeostatic way in PUPS. Object relationships are maintained and updated by the objects themselves, thus providing a flexible, scalable and current data representation. Based on the PUPS environment, we have developed an optimization package, CANTOR, which can be applied to a wide range of relational data and which has been employed in different analyses of neuroanatomical connectivity. The CANTOR package makes use of the PUPS system features by modifying candidate arrangements of objects within the system's database. This restructuring is carried out via optimization algorithms that are based on user-defined cost functions, thus providing flexible and powerful tools for the structural analysis of the database content. The use of stochastic optimization also enables the CANTOR system to deal effectively with incomplete and inconsistent data. Prototypical forms of PUPS and CANTOR have been coded and used successfully in the analysis of anatomical and functional mammalian brain connectivity, involving complex and inconsistent experimental data. 
In addition, PUPS has been used for solving multivariate engineering optimization problems and to implement the digital identification system (DAISY), a system for the automated classification of biological objects. PUPS is implemented in ANSI-C under the POSIX.1 standard and is to a great extent architecture- and operating-system independent. The software is supported by systems libraries that allow multi-threading (the concurrent processing of several database operations), as well as the distribution of the dynamic data objects and library operations over clusters of computers. These attributes make the system easily scalable, and in principle allow the representation and analysis of arbitrarily large sets of relational data. PUPS and CANTOR are freely distributed (http://www.pups.org.uk) as open-source software under the GNU license agreement.
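CANTOR's actual optimization algorithms are not specified in the abstract; the sketch below illustrates the general idea of cost-function-driven stochastic rearrangement of database objects, using a hypothetical inversion-count cost and simple hill climbing over random swaps:

```python
import random

def inversions(arrangement):
    """A user-defined cost function: number of out-of-order pairs."""
    return sum(1 for i in range(len(arrangement))
               for j in range(i + 1, len(arrangement))
               if arrangement[i] > arrangement[j])

def optimize_arrangement(items, cost, steps=5000, seed=42):
    """Stochastic optimization: propose random swaps of candidate objects,
    keep only those swaps that lower the user-defined cost."""
    rng = random.Random(seed)
    arrangement = list(items)
    current = cost(arrangement)
    for _ in range(steps):
        i, j = rng.sample(range(len(arrangement)), 2)
        arrangement[i], arrangement[j] = arrangement[j], arrangement[i]
        proposed = cost(arrangement)
        if proposed < current:
            current = proposed
        else:                                # revert a worsening swap
            arrangement[i], arrangement[j] = arrangement[j], arrangement[i]
    return arrangement
```

Because only the cost function encodes domain knowledge, the same loop can restructure very different relational data, which is the flexibility the CANTOR description emphasizes; a stochastic acceptance rule (e.g. simulated annealing) would additionally tolerate the incomplete and inconsistent data mentioned in the text.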
NASA Astrophysics Data System (ADS)
Slathia, Goldy; Bamzai, K. K.
2017-11-01
A lanthanum chloride-thiourea-l-tartaric acid coordinated complex was grown in the form of single crystals by slow evaporation of supersaturated solutions at room temperature. The coordinated complex crystallizes in the orthorhombic crystal system with space group Pnma. Crystallinity and purity were examined by powder X-ray diffraction. Fourier transform infrared and Raman spectroscopy provide evidence of the structure and mode of coordination. Scanning electron microscopy (SEM) shows the evolution of morphology brought about by the increasing proportion of lanthanum chloride. The band transitions due to the C=O and C=S chromophores remain active in the grown complexes and are recorded in the UV-vis optical spectrum. Thermal effects such as dehydration, melting and decomposition were observed by thermogravimetric and differential thermal analysis (TGA/DTA). Electrical properties were studied by dielectric analysis in the 100 Hz-30 MHz frequency range at various temperatures. An increase in the dielectric constant was observed with increasing lanthanum concentration in the coordinated complex.
NASA Astrophysics Data System (ADS)
Katpatal, Yashwant B.; Rishma, C.; Singh, Chandan K.
2018-05-01
The Gravity Recovery and Climate Experiment (GRACE) satellite mission is aimed at assessment of groundwater storage under different terrestrial conditions. The main objective of the presented study is to highlight the significance of aquifer complexity in order to improve the performance of GRACE in monitoring groundwater. The Vidarbha region of Maharashtra, central India, was selected as the study area, since it comprises a simple aquifer system in the western region and a complex aquifer system in the eastern region. Groundwater-level trends of the different aquifer systems and the spatial and temporal variation of the terrestrial water storage anomaly were analysed to understand the groundwater scenario. Field evaluation of GRACE involved selecting four pixels of GRACE output covering the different aquifer systems, where each GRACE pixel encompasses 50-90 monitoring wells. Groundwater storage anomalies (GWSA) were derived for each pixel for the period 2002 to 2015 using the Release 05 (RL05) monthly GRACE gravity models and the Global Land Data Assimilation System (GLDAS) land-surface models (GWSAGRACE), as well as from the actual field data (GWSAActual). Correlation analysis between GWSAGRACE and GWSAActual was performed using linear regression. The Pearson and Spearman methods show that the performance of GRACE is good in the region with simple aquifers but poorer in the region with multiple aquifer systems. The study highlights the importance of incorporating the sensitivity of GRACE when estimating groundwater storage in complex aquifer systems in future studies.
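The correlation step can be illustrated with stdlib-only implementations of the two coefficients named in the abstract (the Spearman ranking below does not handle ties, and the series values are invented):

```python
from statistics import mean

def pearson(x, y):
    """Pearson product-moment correlation coefficient."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def spearman(x, y):
    """Spearman rank correlation (no tie handling; for illustration only)."""
    def ranks(v):
        order = sorted(range(len(v)), key=v.__getitem__)
        r = [0.0] * len(v)
        for rank, idx in enumerate(order):
            r[idx] = float(rank)
        return r
    return pearson(ranks(x), ranks(y))
```

Applied per pixel to the GWSAGRACE and GWSAActual time series, a coefficient near 1 indicates the good agreement reported over the simple aquifer, and lower values the poorer agreement over the multi-aquifer region.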
29 CFR 1910.119 - Process safety management of highly hazardous chemicals.
Code of Federal Regulations, 2011 CFR
2011-07-01
... complexity of the process will influence the decision as to the appropriate PHA methodology to use. All PHA... process hazard analysis in sufficient detail to support the analysis. (3) Information pertaining to the...) Relief system design and design basis; (E) Ventilation system design; (F) Design codes and standards...
29 CFR 1910.119 - Process safety management of highly hazardous chemicals.
Code of Federal Regulations, 2010 CFR
2010-07-01
... complexity of the process will influence the decision as to the appropriate PHA methodology to use. All PHA... process hazard analysis in sufficient detail to support the analysis. (3) Information pertaining to the...) Relief system design and design basis; (E) Ventilation system design; (F) Design codes and standards...
Data Curation and Visualization for MuSIASEM Analysis of the Nexus
NASA Astrophysics Data System (ADS)
Renner, Ansel
2017-04-01
A novel software-based approach to relational analysis applying recent theoretical advancements of the Multi-Scale Integrated Analysis of Societal and Ecosystem Metabolism (MuSIASEM) accounting framework is presented. This research explores and explains underutilized ways in which software can assist complex system analysis across the stages of data collection, exploration, analysis and dissemination, in a transparent and collaborative manner. The work is being conducted as part of, and in support of, the four-year European Commission H2020 project Moving Towards Adaptive Governance in Complexity: Informing Nexus Security (MAGIC). In MAGIC, theoretical advancements to MuSIASEM propose a powerful new approach to spatial-temporal WEFC relational analysis, in accordance with a structural-functional scaling mechanism appropriate for biophysically relevant complex system analyses. The software is designed primarily in JavaScript using the Angular2 model-view-controller framework and the Data-Driven Documents (D3) library. These design choices clarify and modularize data flow, simplify practitioners' work, allow for and assist stakeholder involvement, and advance collaboration at all stages. Data requirements and a scalable, robust yet lightweight data structure are first explained. Algorithms to process the data are then explored. Data interfaces and data-visualization approaches are lastly presented and described.
Case for Deploying Complex Systems Utilizing Commodity Components
NASA Technical Reports Server (NTRS)
Bryant, Barry S.; Pitts, R. Lee; Ritter, George
2003-01-01
This viewgraph presentation describes a study of the transition of computer networks and software engineering at the Huntsville Operations Support Center (HOSC) from a client/server UNIX-based system to a client/server system based on commodity-priced and open-system components. Topics covered include: an overview of HOSC ground support systems, an analysis of changes to the existing ground support system, an analysis of options considered for the transition to a new system, and a consideration of goals for a new system.
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
CDS (Change Detection Systems) is a mechanism for rapid visual analysis using complex image-alignment algorithms. CDS is controlled with a simple interface designed for use by anyone who can operate a digital camera. A challenge of complex industrial systems such as nuclear power plants is to accurately identify changes in systems, structures and components that may critically impact the operation of the facility. CDS can provide a means of early intervention before such issues evolve into safety and production challenges.
Lee, Jung Ae; Kim, Chul Yong; Yang, Dae Sik; Yoon, Won Sup; Park, Young Je; Lee, Suk; Kim, Young Bum
2014-01-01
To investigate the effectiveness of a respiratory guidance system in 4-dimensional computed tomography (4DCT) based respiratory-gated radiation therapy (RGRT) by comparing respiratory signals and dosimetric analysis of treatment plans. The respiratory amplitude and period of free, audio device-guided, and complex system-guided breathing were evaluated in eleven patients with lung or liver cancers. The dosimetric parameters were assessed by comparing the free-breathing CT plan and the 4DCT-based 30-70% maximal intensity projection (MIP) plan. Complex system-guided breathing showed significantly less variation in respiratory amplitude and period compared with free or audio-guided breathing with regard to the root-mean-square errors (RMSE) of full inspiration (P = 0.031), full expiration (P = 0.007), and period (P = 0.007). The dosimetric parameters of normal liver or lung, including V(5 Gy), V(10 Gy), V(20 Gy), V(30 Gy), V(40 Gy), and V(50 Gy), were superior in the 4DCT MIP plan over the free-breathing CT plan. The reproducibility and regularity of respiratory amplitude and period were significantly improved with complex system-guided breathing compared with free or audio-guided breathing. In addition, the treatment plan based on the 4DCT MIP images acquired with complex system-guided breathing showed better normal-tissue sparing than that based on the free-breathing CT.
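The RMSE regularity comparison can be sketched as follows; the peak amplitudes are hypothetical values invented for illustration, not patient data:

```python
from statistics import mean
from math import sqrt

def rmse_about_mean(peaks):
    """Root-mean-square deviation of breathing-cycle peaks from their mean:
    a rough regularity measure (lower means more reproducible breathing)."""
    m = mean(peaks)
    return sqrt(sum((p - m) ** 2 for p in peaks) / len(peaks))

# Hypothetical full-inspiration amplitudes (mm), invented for illustration:
free_breathing   = [10.2, 12.5, 9.1, 13.0, 8.7]
guided_breathing = [10.4, 10.1, 10.3, 9.9, 10.2]
```

A lower RMSE for the guided series corresponds to the improved reproducibility the study reports for complex system-guided breathing.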
Evaluation of RCAS Inflow Models for Wind Turbine Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tangler, J.; Bir, G.
The finite element structural modeling in the Rotorcraft Comprehensive Analysis System (RCAS) provides a state-of-the-art approach to aeroelastic analysis. This, coupled with its ability to model all turbine components, results in a methodology that can simulate the complex system interactions characteristic of large wind turbines. In addition, RCAS is uniquely capable of modeling advanced control algorithms and the resulting dynamic responses.
Dynamical analysis of the global business-cycle synchronization.
Lopes, António M; Tenreiro Machado, J A; Huffstot, John S; Mata, Maria Eugénia
2018-01-01
This paper reports the dynamical analysis of the business cycles of 12 (developed and developing) countries over the last 56 years by applying computational techniques used for tackling complex systems. They reveal long-term convergence and country-level interconnections because of close contagion effects caused by bilateral networking exposure. Interconnectivity determines the magnitude of cross-border impacts. Local features and shock propagation complexity also may be true engines for local configuration of cycles. The algorithmic modeling proves to represent a solid approach to study the complex dynamics involved in the world economies.
SPAR reference manual. [for stress analysis
NASA Technical Reports Server (NTRS)
Whetstone, W. D.
1974-01-01
SPAR is a system of related programs which may be operated either in batch or demand (teletype) mode. Information exchange between programs is automatically accomplished through one or more direct access libraries, known collectively as the data complex. Card input is command-oriented, in free-field form. Capabilities available in the first production release of the system are fully documented, and include linear stress analysis, linear bifurcation buckling analysis, and linear vibrational analysis.
Complexity analysis of the Next Gen Air Traffic Management System: trajectory based operations.
Lyons, Rhonda
2012-01-01
According to Federal Aviation Administration traffic predictions, our Air Traffic Management (ATM) system is currently operating at 150 percent of capacity, and traffic is forecast to increase to a staggering 250 percent within the next two decades [17]. This will require a major redesign of the system. Today's ATM system is complex. It is designed to provide air traffic services safely, economically, and efficiently through the cost-effective provision of facilities and seamless services in collaboration with multiple agents; however, contrary to this vision, the system is loosely integrated and suffers tremendously from antiquated equipment and saturated airways. The new Next Generation (NextGen) ATM system is designed to transform the current system into an agile, robust and responsive set of operations that can safely manage the growing needs of a projected, increasingly complex and diverse set of air transportation system users and massive projected worldwide traffic rates. This revolutionary technology-centric system is dynamically complex and much more sophisticated than its predecessor. ATM system failures could yield large-scale catastrophic consequences, as it is a safety-critical system. This work attempts to describe complexity and the complex nature of the NextGen ATM system and Trajectory Based Operations. Complex human-factors interactions within NextGen are analyzed using a proposed dual experimental approach designed to identify hazards and gaps and to elicit emergent hazards that would not be visible if the analyses were conducted in isolation. Suggestions are made, along with a proposal for future human-factors research in the safety-critical NextGen TBO environment.
Creative strategies of businesses with the holistic eigensolution in manufacturing industries
NASA Astrophysics Data System (ADS)
Zeichen, Gerfried; Huray, Paul G.
1998-10-01
It is the mission of this contribution to recognize and synthesize the efforts in industry and in management science to strengthen our techniques and tools for successfully solving increasingly complex leadership problems in manufacturing industries. With the highly developed work-sharing method for cost-efficient mass production (the so-called Taylorism principle) invented at the beginning of the 20th century, and the opening of the world market for global sales of goods and services, gigantic progress in living standards was achieved. At the beginning of the 21st century, however, we need new ideas and methods for managing ever-increasing complexity. The holistic eigensolution presents a new operational framework for viewing and controlling the behavior of businesses. In contrast to the traditional process of viewing complex business systems through the intricate analysis of every part of the system, the authors have employed a technique used by physicists to understand the characteristic 'eigen' behaviors of complex physical systems. This method of systems analysis is achieved by observing interactions between the parts in a whole. This kind of analysis has a rigorous mathematical foundation in the physical world and can be employed to understand most natural phenomena. Within a holistic framework, the observer is challenged to view the system from just the right perspective so that characteristic eigenmodes reveal themselves. The conclusion of the article describes why intelligent manufacturing science in particular, especially in a broader sense, has the responsibility and the opportunity to develop the holistic eigensolution framework as a Taylorism II principle for the 21st century.
Complex Behavior of Contaminant Flux and the Ecology of the Lower Mississippi River
NASA Astrophysics Data System (ADS)
Barton, C. C.; Manheim, F. T.; De Cola, L.; Bollinger, J. E.; Jenkins, J. A.
2001-12-01
This presentation is an overview of a collaborative NSF/USGS/Tulane-funded multi-scale study of the Lower Mississippi River system. The study examines the system in three major dimensional realms: space, time, and complexity (systems and their hierarchies). Researchers at Tulane University and the U.S. Geological Survey have initiated a collaborative effort to study the interacting elements which directly or indirectly affect the water quality, ecology and physical condition of the Mississippi River. These researchers include experts in the fields of water-quality chemistry, geochemistry, hydrologic modeling, bioengineering, biology, fish ecology, statistics, complexity analysis, epidemiology, and computer science. Underlying this research are large databases that permit quantitative analysis of the system over the past 40 years. Results to date show that the variation in discharge and the contaminant flux each scale independently and exhibit fractal scaling, the signature geometry of nonlinear dynamical and complex systems. Public perception is that the Lower Mississippi River is a health hazard, but for the past decade traditional water-quality measurements have shown that contaminants are within current regulatory guidelines for human consumption. This difference between public perception and scientific reality represents a complex scientific and social issue. The connections and feedback within the ecological system and the Mississippi River are few because engineering structures isolate the lower Mississippi River from its surroundings. Investigation of the connections and feedback between human health and the ecological health of the River and the surrounding region, as well as perceptions of these states of health, holds promise for explaining epidemiological patterns of human disease.
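Fractal scaling of a flux series is typically detected as a straight line in log-log space; the sketch below estimates the scaling exponent by least-squares regression on synthetic data that follows a known power law (the data and the exponent 0.75 are invented for illustration):

```python
import math

def scaling_exponent(scales, fluctuations):
    """Least-squares slope in log-log space: for F(s) ~ s**a, the slope
    estimates the fractal scaling exponent a."""
    xs = [math.log(s) for s in scales]
    ys = [math.log(f) for f in fluctuations]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# Synthetic fluxes that follow F(s) = 2 * s**0.75 exactly:
scales = [4, 8, 16, 32, 64]
fluxes = [2.0 * s ** 0.75 for s in scales]
```

Applied to the discharge and contaminant-flux records, an exponent that is stable across decades of scale is the signature of the fractal scaling the study reports.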
ERIC Educational Resources Information Center
Greene, Jeffrey Alan; Azevedo, Roger
2009-01-01
In this study, we used think-aloud verbal protocols to examine how various macro-level processes of self-regulated learning (SRL; e.g., planning, monitoring, strategy use, handling of task difficulty and demands) were associated with the acquisition of a sophisticated mental model of a complex biological system. Numerous studies examine how…
Conveying the Complex: Updating U.S. Joint Systems Analysis Doctrine with Complexity Theory
2013-12-10
An Estimate of the Vertical Variability of Temperature at KSC Launch Complex 39-B
NASA Technical Reports Server (NTRS)
Brenton, James
2017-01-01
The purpose of this analysis is to determine the vertical variability of the air temperature below 500 feet at Launch Complex (LC) 39-B at Kennedy Space Center (KSC). This analysis utilizes data from the LC39-B Lightning Protection System (LPS) Towers and the 500 foot Tower 313. The results of this analysis will be used to help evaluate the ambient air temperature Launch Commit Criteria (LCC) for the Exploration Mission 1 launch.
To the systematization of failure analysis for perturbed systems (in German)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Haller, U.
1974-01-01
The paper investigates the reliable functioning of complex technical systems. Of main importance is the question of how the functioning of technical systems which may fail or whose design still has some faults can be determined in the very earliest planning stages. The present paper is to develop a functioning schedule and to look for possible methods of systematic failure analysis of systems with stochastic failures. (RW/AK)
Tool Support for Parametric Analysis of Large Software Simulation Systems
NASA Technical Reports Server (NTRS)
Schumann, Johann; Gundy-Burlet, Karen; Pasareanu, Corina; Menzies, Tim; Barrett, Tony
2008-01-01
The analysis of large and complex parameterized software systems, e.g., systems simulation in aerospace, is very complicated and time-consuming due to the large parameter space, and the complex, highly coupled nonlinear nature of the different system components. Thus, such systems are generally validated only in regions local to anticipated operating points rather than through characterization of the entire feasible operational envelope of the system. We have addressed the factors deterring such an analysis with a tool to support envelope assessment: we utilize a combination of advanced Monte Carlo generation with n-factor combinatorial parameter variations to limit the number of cases, but still explore important interactions in the parameter space in a systematic fashion. Additional test-cases, automatically generated from models (e.g., UML, Simulink, Stateflow) improve the coverage. The distributed test runs of the software system produce vast amounts of data, making manual analysis impossible. Our tool automatically analyzes the generated data through a combination of unsupervised Bayesian clustering techniques (AutoBayes) and supervised learning of critical parameter ranges using the treatment learner TAR3. The tool has been developed around the Trick simulation environment, which is widely used within NASA. We will present this tool with a GN&C (Guidance, Navigation and Control) simulation of a small satellite system.
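The n-factor combinatorial parameter variation described above can be sketched, for n = 2 (pairwise coverage), as a small greedy test-case generator; the parameter names and values below are illustrative assumptions, not taken from the Trick simulation or the actual tool.

```python
import itertools
import random

def pairwise_cases(params, seed=0):
    """Greedily build a small test set covering every pair of parameter values.

    params: dict mapping parameter name -> list of candidate values.
    Returns a list of dicts, each a complete parameter assignment.
    """
    rng = random.Random(seed)
    names = sorted(params)
    # Every (param, value, param, value) pair that must co-occur in some case.
    uncovered = set()
    for a, b in itertools.combinations(names, 2):
        for va in params[a]:
            for vb in params[b]:
                uncovered.add((a, va, b, vb))
    cases = []
    while uncovered:
        # Seed a new case from one uncovered pair, fill the remaining
        # parameters randomly, then remove every pair this case covers.
        a, va, b, vb = next(iter(uncovered))
        case = {n: rng.choice(params[n]) for n in names}
        case[a], case[b] = va, vb
        covered = {(x, case[x], y, case[y])
                   for x, y in itertools.combinations(names, 2)}
        uncovered -= covered
        cases.append(case)
    return cases
```

For three hypothetical parameters with 2 x 3 x 2 values, exhaustive testing needs 12 cases, while a pairwise set typically needs far fewer while still exercising every two-way interaction.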
Model-based safety analysis of human-robot interactions: the MIRAS walking assistance robot.
Guiochet, Jérémie; Hoang, Quynh Anh Do; Kaaniche, Mohamed; Powell, David
2013-06-01
Robotic systems have to cope with various execution environments while guaranteeing safety, and in particular when they interact with humans during rehabilitation tasks. These systems are often critical since their failure can lead to human injury or even death. However, such systems are difficult to validate due to their high complexity and the fact that they operate within complex, variable and uncertain environments (including users), in which it is difficult to foresee all possible system behaviors. Because of the complexity of human-robot interactions, rigorous and systematic approaches are needed to assist the developers in the identification of significant threats and the implementation of efficient protection mechanisms, and in the elaboration of a sound argumentation to justify the level of safety that can be achieved by the system. For threat identification, we propose a method called HAZOP-UML based on a risk analysis technique adapted to system description models, focusing on human-robot interaction models. The output of this step is then injected in a structured safety argumentation using the GSN graphical notation. Those approaches have been successfully applied to the development of a walking assistant robot which is now in clinical validation.
Xie, Mingxia; Wang, Jiayao; Chen, Ke
2017-01-01
This study investigates the basic characteristics and proposes a concept for the complex system of geographical conditions (CSGC). By analyzing the DPSIR model and its correlation with the index system, we selected indexes for geographical conditions according to resources, ecology, environment, economy and society parameters to build a system. This system consists of four hierarchies: index, classification, element and target levels. We evaluated the elements and indexes of the complex system using the TOPSIS method and a general model coordinating multiple complex systems. On this basis, a coordination analysis of geographical conditions is applied to cities in Henan province, China. The following conclusions were reached: (1) According to the pressure, state and impact of geographical conditions, relatively consistent measures are taken around the cities, but with conflicting results. (2) The coordination degree of geographical conditions is small among regions showing large differences in classification index value, and the degree of coordination of such regions is prone to extreme values; the smaller the difference, the larger the coordination degree. (3) The coordinated development of geographical conditions in Henan province is at the point-axis stage.
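The TOPSIS evaluation step mentioned above can be sketched as a minimal implementation of the standard method (vector normalization, then distance to the ideal and anti-ideal solutions); the weights and criteria in the usage below are illustrative only, not the study's actual index system.

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Rank alternatives by relative closeness to the ideal solution (TOPSIS).

    matrix : (n_alternatives, n_criteria) array of raw index values.
    weights: criterion weights, summing to 1.
    benefit: boolean per criterion; True means larger is better.
    Returns closeness scores in [0, 1]; higher = closer to the ideal.
    """
    m = np.asarray(matrix, dtype=float)
    # Vector-normalise each criterion column, then apply its weight.
    v = weights * m / np.linalg.norm(m, axis=0)
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_pos = np.linalg.norm(v - ideal, axis=1)  # distance to ideal
    d_neg = np.linalg.norm(v - anti, axis=1)   # distance to anti-ideal
    return d_neg / (d_pos + d_neg)
```

An alternative that dominates on every criterion scores exactly 1, and one dominated on every criterion scores 0, which makes the method convenient for ranking regions by coordination degree.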
Advanced Stoichiometric Analysis of Metabolic Networks of Mammalian Systems
Orman, Mehmet A.; Berthiaume, Francois; Androulakis, Ioannis P.; Ierapetritou, Marianthi G.
2013-01-01
Metabolic engineering tools have been widely applied to living organisms to gain a comprehensive understanding about cellular networks and to improve cellular properties. Metabolic flux analysis (MFA), flux balance analysis (FBA), and metabolic pathway analysis (MPA) are among the most popular tools in stoichiometric network analysis. Although application of these tools into well-known microbial systems is extensive in the literature, various barriers prevent them from being utilized in mammalian cells. Limited experimental data, complex regulatory mechanisms, and the requirement of more complex nutrient media are some major obstacles in mammalian cell systems. However, mammalian cells have been used to produce therapeutic proteins, to characterize disease states or related abnormal metabolic conditions, and to analyze the toxicological effects of some medicinally important drugs. Therefore, there is a growing need for extending metabolic engineering principles to mammalian cells in order to understand their underlying metabolic functions. In this review article, advanced metabolic engineering tools developed for stoichiometric analysis including MFA, FBA, and MPA are described. Applications of these tools in mammalian cells are discussed in detail, and the challenges and opportunities are highlighted. PMID:22196224
Sturmberg, Joachim P; Martin, Carmel M; Katerndahl, David A
2014-01-01
Over the past 7 decades, theories in the systems and complexity sciences have had a major influence on academic thinking and research. We assessed the impact of complexity science on general practice/family medicine. We performed a historical integrative review using the following systematic search strategy: medical subject heading [humans] combined in turn with the terms complex adaptive systems, nonlinear dynamics, systems biology, and systems theory, limited to general practice/family medicine and published before December 2010. A total of 16,242 articles were retrieved, of which 49 were published in general practice/family medicine journals. Hand searches and snowballing retrieved another 35. After a full-text review, we included 56 articles dealing specifically with systems sciences and general/family practice. General practice/family medicine engaged with the emerging systems and complexity theories in 4 stages. Before 1995, articles tended to explore common phenomenologic general practice/family medicine experiences. Between 1995 and 2000, articles described the complex adaptive nature of this discipline. Those published between 2000 and 2005 focused on describing the system dynamics of medical practice. After 2005, articles increasingly applied the breadth of complex science theories to health care, health care reform, and the future of medicine. This historical review describes the development of general practice/family medicine in relation to complex adaptive systems theories, and shows how systems sciences more accurately reflect the discipline's philosophy and identity. Analysis suggests that general practice/family medicine first embraced systems theories through conscious reorganization of its boundaries and scope, before applying empirical tools. 
Future research should concentrate on applying nonlinear dynamics and empirical modeling to patient care, and to organizing and developing local practices, engaging in community development, and influencing health care reform.
NASA Astrophysics Data System (ADS)
Varela, Consuelo; Tarquis, Ana M.; Blanco-Gutiérrez, Irene; Estebe, Paloma; Toledo, Marisol; Martorano, Lucieta
2015-04-01
Social-ecological systems are linked complex systems that represent interconnected human and biophysical processes evolving and adapting across temporal and spatial scales. In the real world, social-ecological systems pose substantial challenges for modeling. In this regard, Fuzzy Cognitive Maps (FCMs) have proven to be a useful method for capturing the functioning of this type of system. FCMs are a semi-quantitative type of cognitive map that represent a system as a set of relevant factors and weighted links showing the strength and direction of cause-effect relationships among factors. FCMs can therefore be interpreted as complex system structures or complex networks, and recent research has applied complex network concepts to the analysis of FCMs that represent social-ecological systems. Key to the FCM tool is its capacity to accommodate feedback loops and to incorporate stakeholder knowledge in its construction. Previous research has also demonstrated its potential to represent system dynamics and simulate the effects of changes in the system, such as policy interventions. To illustrate this analysis, we developed a series of participatory FCMs for the study of the ecological and human systems related to biodiversity conservation in two case studies in the Amazonian region: the Bolivian lowlands of Guarayos and the Tapajos National Forest in Brazil. The research was carried out in the context of the EU project ROBIN and is based on a series of stakeholder workshops analyzing the current state of the socio-ecological environment in the Amazonian forest, reflecting conflicts and challenges for biodiversity conservation and human development. Stakeholders included all relevant actors in the local case studies, namely farmers, environmental groups, producer organizations, local and provincial authorities and scientists. 
In both case studies we illustrate the use of complex network concepts, such as the adjacency matrix and centrality properties (e.g., PageRank and betweenness centrality). Different measures of network centrality indicate that deforestation and loss of biodiversity are the most relevant factors in the FCMs of the two case studies analyzed. In both cases agricultural expansion emerges as a key driver of deforestation. The lack of policy coordination and weak implementation and enforcement are also highly influential factors. The analysis of the system's dynamics suggests that in the case of Bolivia forest fires and deforestation are likely to continue in the immediate future as illegal activities are maintained and poverty increases. In the case of Brazil a decrease in available viable economic activities is driving further deforestation and ecosystem services loss. Overall, the research demonstrates how using FCMs together with complex network analysis can support policy development by identifying key elements and processes upon which policy makers and institutions can take action. Acknowledgements The authors would like to acknowledge the EU project ROBIN (The Role of Biodiversity in Climate Change Mitigation, EC FP7, no 283093) and the Spanish project AL14-PID-12 (Biodiversidad y cambio climático en la Amazonía: Perspectivas socio-económicas y ambientales) of the UPM Latin America Cooperation Program for funding this research.
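Treating an FCM as a weighted network, as described above, can be sketched as follows. The three-concept map, its weights, and the simple degree-centrality measure are illustrative assumptions, not the study's actual maps or the specific centrality measures it used.

```python
import numpy as np

def fcm_step(state, W):
    """One FCM update: each concept integrates weighted influences, squashed to (0, 1)."""
    return 1.0 / (1.0 + np.exp(-(state + state @ W)))

# Hypothetical 3-concept map: agricultural expansion -> deforestation -> biodiversity loss.
concepts = ["agri_expansion", "deforestation", "biodiv_loss"]
W = np.array([  # W[i, j] = signed influence of concept i on concept j (adjacency matrix)
    [0.0, 0.8, 0.0],
    [0.0, 0.0, 0.9],
    [0.0, 0.0, 0.0],
])

# Degree centrality: total absolute influence a concept sends plus receives.
centrality = np.abs(W).sum(axis=1) + np.abs(W).sum(axis=0)

state = np.array([1.0, 0.0, 0.0])  # scenario: strong agricultural expansion
for _ in range(20):                # iterate until the map settles
    state = fcm_step(state, W)
```

In this toy map the centrality vector already singles out deforestation as the most connected concept, and the simulated scenario shows the activation propagating from agricultural expansion through to biodiversity loss.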
Dynamic Systems Modeling in Educational System Design & Policy
ERIC Educational Resources Information Center
Groff, Jennifer Sterling
2013-01-01
Over the last several hundred years, local and national educational systems have evolved from relatively simple systems to incredibly complex, interdependent, policy-laden structures, to which many question their value, effectiveness, and direction they are headed. System Dynamics is a field of analysis used to guide policy and system design in…
A measuring tool for tree-rings analysis
NASA Astrophysics Data System (ADS)
Shumilov, Oleg; Kanatjev, Alexander; Kasatkina, Elena
2013-04-01
A special tool has been created for measuring and analysing annual tree-ring widths. It consists of a professional scanner, a computer system and software. In many respects this complex is not inferior to similar systems (LINTAB, WinDENDRO), and in comparison with manual measurement systems it offers a number of advantages: productivity gain, the possibility of archiving measurement results at any stage of processing, and operator comfort. New software has been developed that allows the processing of samples of different types (cores, saw cuts), including samples that are difficult to process because of complex wood structure (inhomogeneous growth in different directions; missing, light and false rings, etc.). The software can analyse pictures made with optical scanners or with analog or digital cameras. It was written in C++ and is compatible with modern versions of Windows. Annual ring widths are measured along paths traced interactively. These paths can have any orientation and can be created so that ring widths are measured perpendicular to ring boundaries. A graph of ring width as a function of year is displayed on screen during the analysis and can be used for visual and numerical cross-dating and comparison with other series or master chronologies. Ring widths are saved to text files in a special format, and those files are converted to the format accepted for data conservation in the International Tree-Ring Data Bank. The complex is universal in application, which will allow its use for solving a variety of problems in biology and ecology. With its help, long-term juniper (1328-2004) and pine (1445-2005) tree-ring chronologies have been reconstructed from samples collected on the Kola Peninsula (northwestern Russia).
Monitoring and analysis of data from complex systems
NASA Technical Reports Server (NTRS)
Dollman, Thomas; Webster, Kenneth
1991-01-01
Some of the methods, systems, and prototypes that have been tested for monitoring and analyzing the data from several spacecraft and vehicles at the Marshall Space Flight Center are introduced. For the Huntsville Operations Support Center (HOSC) infrastructure, the Marshall Integrated Support System (MISS) provides a migration path to the state-of-the-art workstation environment. Its modular design makes it possible to implement the system in stages on multiple platforms without the need for all components to be in place at once. The MISS provides a flexible, user-friendly environment for monitoring and controlling orbital payloads. In addition, new capabilities and technology may be incorporated into MISS with greater ease. The use of information systems technology in advanced prototype phases, as adjuncts to mainline activities, is used to evaluate new computational techniques for monitoring and analysis of complex systems. Much of the software described (specifically, HSTORESIS (Hubble Space Telescope Operational Readiness Expert Safemode Investigation System), DRS (Device Reasoning Shell), DART (Design Alternatives Rational Tool), elements of the DRA (Document Retrieval Assistant), and software for the PPS (Peripheral Processing System) and the HSPP (High-Speed Peripheral Processor)) is available with supporting documentation, and may be applicable to other system monitoring and analysis applications.
AN ASSESSMENT OF CENTRAL-STATION COGENERATION SYSTEMS FOR INDUSTRIAL COMPLEXES
This report assesses the potential for cogeneration system development based on an analysis of the economic, environmental, energy efficiency and social aspects of such systems. The cogeneration system is an application of the principle of cogeneration in which utility-sized powe...
Planning representation for automated exploratory data analysis
NASA Astrophysics Data System (ADS)
St. Amant, Robert; Cohen, Paul R.
1994-03-01
Igor is a knowledge-based system for exploratory statistical analysis of complex systems and environments. Igor has two related goals: to help automate the search for interesting patterns in data sets, and to help develop models that capture significant relationships in the data. We outline a language for Igor, based on techniques of opportunistic planning, which balances control and opportunism. We describe the application of Igor to the analysis of the behavior of Phoenix, an artificial intelligence planning system.
NASA Astrophysics Data System (ADS)
Christensen, Claire Petra
Across diverse fields ranging from physics to biology, sociology, and economics, the technological advances of the past decade have engendered an unprecedented explosion of data on highly complex systems with thousands, if not millions of interacting components. These systems exist at many scales of size and complexity, and it is becoming ever-more apparent that they are, in fact, universal, arising in every field of study. Moreover, they share fundamental properties---chief among these, that the individual interactions of their constituent parts may be well-understood, but the characteristic behaviour produced by the confluence of these interactions---by these complex networks---is unpredictable; in a nutshell, the whole is more than the sum of its parts. There is, perhaps, no better illustration of this concept than the discoveries being made regarding complex networks in the biological sciences. In particular, though the sequencing of the human genome in 2003 was a remarkable feat, scientists understand that the "cellular-level blueprints" for the human being are cellular-level parts lists, but they say nothing (explicitly) about cellular-level processes. The challenge of modern molecular biology is to understand these processes in terms of the networks of parts---in terms of the interactions among proteins, enzymes, genes, and metabolites---as it is these processes that ultimately differentiate animate from inanimate, giving rise to life! It is the goal of systems biology---an umbrella field encapsulating everything from molecular biology to epidemiology in social systems---to understand processes in terms of fundamental networks of core biological parts, be they proteins or people. By virtue of the fact that there are literally countless complex systems, not to mention tools and techniques used to infer, simulate, analyze, and model these systems, it is impossible to give a truly comprehensive account of the history and study of complex systems. 
The author's own publications have contributed network inference, simulation, modeling, and analysis methods to the much larger body of work in systems biology, and indeed, in network science. The aim of this thesis is therefore twofold: to present this original work in the historical context of network science, but also to provide sufficient review and reference regarding complex systems (with an emphasis on complex networks in systems biology) and tools and techniques for their inference, simulation, analysis, and modeling, such that the reader will be comfortable in seeking out further information on the subject. The review-like Chapters 1, 2, and 4 are intended to convey the co-evolution of network science and the slow but noticeable breakdown of boundaries between disciplines in academia as research and comparison of diverse systems has brought to light the shared properties of these systems. It is the author's hope that these chapters impart some sense of the remarkable and rapid progress in complex systems research that has led to this unprecedented academic synergy. Chapters 3 and 5 detail the author's original work in the context of complex systems research. Chapter 3 presents the methods and results of a two-stage modeling process that generates candidate gene-regulatory networks of the bacterium B. subtilis from experimentally obtained, yet mathematically underdetermined microchip array data. These networks are then analyzed from a graph theoretical perspective, and their biological viability is critiqued by comparing the networks' graph theoretical properties to those of other biological systems. The results of topological perturbation analyses revealing commonalities in behavior at multiple levels of complexity are also presented, and are shown to be an invaluable means by which to ascertain the level of complexity to which the network inference process is robust to noise. 
Chapter 5 outlines a learning algorithm for the development of a realistic, evolving social network (a city) into which a disease is introduced. The results of simulations in populations spanning two orders of magnitude are compared to prevaccine era measles data for England and Wales and demonstrate that the simulations are able to capture the quantitative and qualitative features of epidemics in populations as small as 10,000 people. The work presented in Chapter 5 validates the utility of network simulation in concurrently probing contact network dynamics and disease dynamics.
Structural model of control system for hydraulic stepper motor complex
NASA Astrophysics Data System (ADS)
Obukhov, A. D.; Dedov, D. L.; Kolodin, A. N.
2018-03-01
The article considers the problem of developing a structural model of the control system for a hydraulic stepper drive complex. A comparative analysis of stepper drives and an assessment of the applicability of HSM to problems requiring accurate displacement in space with subsequent positioning of the object are carried out. The presented structural model of the automated control system of the multi-spindle complex of hydraulic stepper drives reflects the main components of the system, as well as the process of its control based on the transfer of control signals from the controller to the solenoid valves. The models and methods described in the article can be used to formalize the control process in technical systems based on the application of hydraulic stepper drives, and allow switching from mechanical control to automated control.
The mysteries of the diffusion region in asymmetric systems
NASA Astrophysics Data System (ADS)
Hesse, M.; Aunai, N.; Zenitani, S.; Kuznetsova, M. M.; Birn, J.
2013-12-01
Unlike in symmetric systems, where symmetry dictates a comparatively simple structure of the reconnection region, asymmetric systems offer a surprising, much more complex, structure of the diffusion region. Beyond the well-known lack of colocation of flow stagnation and magnetic null, the physical mechanism underpinning the reconnection electric field also appears to be considerably more complex. In this presentation, we will perform a detailed analysis of the reconnection diffusion region in an asymmetric system. We will show that, unlike in symmetric systems, the immediate reconnection electric field is not given by electron pressure tensor nongyrotropies, but by electron inertial contributions. We will further discuss the role of pressure nongyrotropies, and we will study the origin of the complex structures of electron distributions in the central part of the diffusion region.
Chung, Younjin; Salvador-Carulla, Luis; Salinas-Pérez, José A; Uriarte-Uriarte, Jose J; Iruin-Sanz, Alvaro; García-Alonso, Carlos R
2018-04-25
Decision-making in mental health systems should be supported by the evidence-informed knowledge transfer of data. Since mental health systems are inherently complex, involving interactions between its structures, processes and outcomes, decision support systems (DSS) need to be developed using advanced computational methods and visual tools to allow full system analysis, whilst incorporating domain experts in the analysis process. In this study, we use a DSS model developed for interactive data mining and domain expert collaboration in the analysis of complex mental health systems to improve system knowledge and evidence-informed policy planning. We combine an interactive visual data mining approach, the self-organising map network (SOMNet), with an operational expert knowledge approach, expert-based collaborative analysis (EbCA), to develop a DSS model. The SOMNet was applied to the analysis of healthcare patterns and indicators of three different regional mental health systems in Spain, comprising 106 small catchment areas and providing healthcare for over 9 million inhabitants. Based on the EbCA, the domain experts in the development team guided and evaluated the analytical processes and results. Another group of 13 domain experts in mental health systems planning and research evaluated the model based on the analytical information of the SOMNet approach for processing information and discovering knowledge in a real-world context. Through the evaluation, the domain experts assessed the feasibility and technology readiness level (TRL) of the DSS model. The SOMNet, combined with the EbCA, effectively processed evidence-based information when analysing system outliers, explaining global and local patterns, and refining key performance indicators with their analytical interpretations. The evaluation results showed that the DSS model was judged feasible by the domain experts and reached level 7 of the TRL (system prototype demonstration in operational environment). 
This study supports the benefits of combining health systems engineering (SOMNet) and expert knowledge (EbCA) to analyse the complexity of health systems research. The use of the SOMNet approach contributes to the demonstration of DSS for mental health planning in practice.
Permutation entropy analysis of financial time series based on Hill's diversity number
NASA Astrophysics Data System (ADS)
Zhang, Yali; Shang, Pengjian
2017-12-01
In this paper the permutation entropy based on Hill's diversity number (Nn,r) is introduced as a new way to assess the complexity of a complex dynamical system such as the stock market. We test the performance of this method with simulated data. Results show that Nn,r with appropriate parameters is more sensitive to changes in the system and describes the trends of complex systems clearly. In addition, we study stock closing price series comprising six indices (three US and three Chinese stock indices) over different periods; Nn,r can quantify the changes in complexity of stock market data. Moreover, Nn,r yields richer information and reveals some properties of the differences between the US and Chinese stock indices.
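A sketch of a Hill-number-based permutation entropy follows. The paper's exact definition of Nn,r may differ in detail (e.g. normalisation), but the basic construction, ordinal-pattern frequencies fed into a Hill diversity number of order r, can be written as:

```python
import math
from collections import Counter

def permutation_hill_number(series, n=3, r=2.0):
    """Hill's diversity number of order r over ordinal patterns of length n.

    Counts the relative frequency p_i of each ordinal (permutation) pattern
    of n consecutive values, then returns N = (sum_i p_i^r)^(1/(1-r));
    r = 1 is taken as the limit exp(Shannon entropy). N ranges from 1
    (a single pattern, e.g. a monotone series) to n! (all patterns equally
    likely); higher N means more pattern diversity, i.e. more complexity.
    """
    patterns = Counter(
        tuple(sorted(range(n), key=lambda k: series[i + k]))
        for i in range(len(series) - n + 1)
    )
    total = sum(patterns.values())
    probs = [c / total for c in patterns.values()]
    if r == 1.0:
        return math.exp(-sum(p * math.log(p) for p in probs))
    return sum(p ** r for p in probs) ** (1.0 / (1.0 - r))
```

For a strictly monotone series only one ordinal pattern occurs and N is 1 for any r, while an irregular series yields a larger N, which is the sensitivity to system change that the abstract describes.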
Traditional Chinese medicine: potential approaches from modern dynamical complexity theories.
Ma, Yan; Zhou, Kehua; Fan, Jing; Sun, Shuchen
2016-03-01
Despite the widespread use of traditional Chinese medicine (TCM) in clinical settings, proving its effectiveness via scientific trials is still a challenge. TCM views the human body as a complex dynamical system, and focuses on the balance of the human body, both internally and with its external environment. Such fundamental concepts require investigations using system-level quantification approaches, which are beyond conventional reductionism. Only methods that quantify dynamical complexity can bring new insights into the evaluation of TCM. In a previous article, we briefly introduced the potential value of Multiscale Entropy (MSE) analysis in TCM. This article aims to explain the existing challenges in TCM quantification, to introduce the consistency of dynamical complexity theories and TCM theories, and to inspire future system-level research on health and disease.
Complexity Studies and Security in the Complex World: An Epistemological Framework of Analysis
NASA Astrophysics Data System (ADS)
Mesjasz, Czeslaw
The impact of systems thinking can be found in numerous security-oriented research efforts: the early works on the international system (Pitirim Sorokin, Quincy Wright), the first models of military conflict and war (Frederick Lanchester, Lewis F. Richardson), national and military security (the origins of the RAND Corporation), the development of game-theory-based conflict studies, International Relations, the classical security studies of Morton A. Kaplan and Karl W. Deutsch [Mesjasz 1988], and the contemporary broadened concepts of security proposed by the Copenhagen School [Buzan et al 1998]. At present it may even be stated that the new military and non-military threats to contemporary complex society, such as low-intensity conflicts, regional conflicts, terrorism and environmental disturbances, cannot be grasped without ideas taken from modern complex systems studies.
NASA Technical Reports Server (NTRS)
Lee, Alice T.; Gunn, Todd; Pham, Tuan; Ricaldi, Ron
1994-01-01
This handbook documents the three software analysis processes the Space Station Software Analysis team uses to assess space station software, including their backgrounds, theories, tools, and analysis procedures. Potential applications of these analysis results are also presented. The first section describes how software complexity analysis provides quantitative information on code, such as code structure and risk areas, throughout the software life cycle. Software complexity analysis allows an analyst to understand the software structure, identify critical software components, assess risk areas within a software system, identify testing deficiencies, and recommend program improvements. Performing this type of analysis during the early design phases of software development can positively affect the process, and may prevent later, much larger, difficulties. The second section describes how software reliability estimation and prediction analysis, or software reliability, provides a quantitative means to measure the probability of failure-free operation of a computer program, and describes the two tools used by JSC to determine failure rates and design tradeoffs between reliability, costs, performance, and schedule.
Advances In High Temperature (Viscoelastoplastic) Material Modeling for Thermal Structural Analysis
NASA Technical Reports Server (NTRS)
Arnold, Steven M.; Saleeb, Atef F.
2005-01-01
High-temperature applications demand high-performance materials: 1) complex thermomechanical loading; 2) complex material response, requiring time-dependent/hereditary models (viscoelastic/viscoplastic); and 3) comprehensive characterization (tensile, creep, relaxation) for a variety of material systems.
GIS Toolsets for Planetary Geomorphology and Landing-Site Analysis
NASA Astrophysics Data System (ADS)
Nass, Andrea; van Gasselt, Stephan
2015-04-01
Modern Geographic Information Systems (GIS) allow expert and lay users alike to load and position geographic data and perform simple to highly complex surface analyses. For many applications dedicated and ready-to-use GIS tools are available in standard software systems while other applications require the modular combination of available basic tools to answer more specific questions. This also applies to analyses in modern planetary geomorphology where many such (basic) tools can be used to build complex analysis tools, e.g. in image- and terrain-model analysis. Apart from the simple application of sets of different tools, many complex tasks require a more sophisticated design for storing and accessing data using databases (e.g. ArcHydro for hydrological data analysis). In planetary sciences, complex database-driven models are often required to efficiently analyse potential landing sites or store rover data, but also geologic mapping data can be efficiently stored and accessed using database models rather than stand-alone shapefiles. For landing-site analyses, relief and surface roughness estimates are two common concepts that are of particular interest and for both, a number of different definitions co-exist. We here present an advanced toolset for the analysis of image and terrain-model data with an emphasis on extraction of landing site characteristics using established criteria. We provide working examples and particularly focus on the concepts of terrain roughness as it is interpreted in geomorphology and engineering studies.
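One of the co-existing roughness definitions mentioned here is the local standard deviation of elevation in a sliding window over a digital elevation model; a minimal numpy sketch (window size and the toy DEM are illustrative, not the toolset's implementation):

```python
import numpy as np

def roughness_std(dem, window=3):
    """Terrain roughness as the local standard deviation of elevation
    in a sliding window (edge cells use replicated-border padding)."""
    pad = window // 2
    padded = np.pad(dem, pad, mode="edge")
    out = np.empty(dem.shape, dtype=float)
    for i in range(dem.shape[0]):
        for j in range(dem.shape[1]):
            out[i, j] = padded[i:i + window, j:j + window].std()
    return out

# A flat toy DEM with a single 9 m spike in the middle:
dem = np.array([[0.0, 0.0, 0.0],
                [0.0, 9.0, 0.0],
                [0.0, 0.0, 0.0]])
r = roughness_std(dem)  # r[1, 1] == sqrt(8) ~ 2.83 m
```

Other definitions (e.g. standard deviation of slope, or detrended residuals) plug into the same windowed scheme.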
A Hardware Model Validation Tool for Use in Complex Space Systems
NASA Technical Reports Server (NTRS)
Davies, Misty Dawn; Gundy-Burlet, Karen L.; Limes, Gregory L.
2010-01-01
One of the many technological hurdles that must be overcome in future missions is the challenge of validating as-built systems against the models used for design. We propose a technique composed of intelligent parameter exploration in concert with automated failure analysis as a scalable method for the validation of complex space systems. The technique is impervious to discontinuities and linear dependencies in the data, and can handle dimensionalities consisting of hundreds of variables over tens of thousands of experiments.
Rehfuess, Eva A; Best, Nicky; Briggs, David J; Joffe, Mike
2013-12-06
Effective interventions require evidence on how individual causal pathways jointly determine disease. Based on the concept of systems epidemiology, this paper develops Diagram-based Analysis of Causal Systems (DACS) as an approach to analyze complex systems, and applies it by examining the contributions of proximal and distal determinants of childhood acute lower respiratory infections (ALRI) in sub-Saharan Africa. Diagram-based Analysis of Causal Systems combines the use of causal diagrams with multiple routinely available data sources, using a variety of statistical techniques. In a step-by-step process, the causal diagram evolves from conceptual (based on a priori knowledge and assumptions), through operational (informed by data availability and then subjected to empirical testing), to integrated (synthesizing information from multiple datasets). In our application, we apply different regression techniques to Demographic and Health Survey (DHS) datasets for Benin, Ethiopia, Kenya and Namibia and a pooled World Health Survey (WHS) dataset for sixteen African countries. Explicit strategies are employed to make decisions transparent about the inclusion/omission of arrows, the sign and strength of the relationships, and homogeneity/heterogeneity across settings. Findings about the current state of evidence on the complex web of socio-economic, environmental, behavioral and healthcare factors influencing childhood ALRI, based on DHS and WHS data, are summarized in an integrated causal diagram. Notably, solid fuel use is structured by socio-economic factors and increases the risk of childhood ALRI mortality. Diagram-based Analysis of Causal Systems is a means of organizing the current state of knowledge about a specific area of research, and a framework for integrating statistical analyses across a whole system.
This partly a priori approach is explicit about causal assumptions guiding the analysis and about researcher judgment, and wrong assumptions can be reversed following empirical testing. This approach is well-suited to dealing with complex systems, in particular where data are scarce.
2013-01-01
Background Effective interventions require evidence on how individual causal pathways jointly determine disease. Based on the concept of systems epidemiology, this paper develops Diagram-based Analysis of Causal Systems (DACS) as an approach to analyze complex systems, and applies it by examining the contributions of proximal and distal determinants of childhood acute lower respiratory infections (ALRI) in sub-Saharan Africa. Results Diagram-based Analysis of Causal Systems combines the use of causal diagrams with multiple routinely available data sources, using a variety of statistical techniques. In a step-by-step process, the causal diagram evolves from conceptual based on a priori knowledge and assumptions, through operational informed by data availability which then undergoes empirical testing, to integrated which synthesizes information from multiple datasets. In our application, we apply different regression techniques to Demographic and Health Survey (DHS) datasets for Benin, Ethiopia, Kenya and Namibia and a pooled World Health Survey (WHS) dataset for sixteen African countries. Explicit strategies are employed to make decisions transparent about the inclusion/omission of arrows, the sign and strength of the relationships and homogeneity/heterogeneity across settings. Findings about the current state of evidence on the complex web of socio-economic, environmental, behavioral and healthcare factors influencing childhood ALRI, based on DHS and WHS data, are summarized in an integrated causal diagram. Notably, solid fuel use is structured by socio-economic factors and increases the risk of childhood ALRI mortality. Conclusions Diagram-based Analysis of Causal Systems is a means of organizing the current state of knowledge about a specific area of research, and a framework for integrating statistical analyses across a whole system. 
This partly a priori approach is explicit about causal assumptions guiding the analysis and about researcher judgment, and wrong assumptions can be reversed following empirical testing. This approach is well-suited to dealing with complex systems, in particular where data are scarce. PMID:24314302
Guo, Zhong; Johnston, Wayne; Kovtun, Oleksiy; Mureev, Sergey; Bröcker, Cornelia; Ungermann, Christian; Alexandrov, Kirill
2013-01-01
Biochemical and structural analysis of macromolecular protein assemblies remains challenging due to technical difficulties in recombinant expression, engineering and reconstitution of multisubunit complexes. Here we use a recently developed cell-free protein expression system based on the protozoan Leishmania tarentolae to produce in vitro all six subunits of the 600 kDa HOPS and CORVET membrane tethering complexes. We demonstrate that both subcomplexes and the entire HOPS complex can be reconstituted in vitro, resulting in a comprehensive subunit interaction map. To our knowledge this is the largest eukaryotic protein complex reconstituted in vitro to date. Using truncation and interaction analysis, we demonstrate that the complex is assembled through short hydrophobic sequences located in the C-terminus of the individual Vps subunits. Based on these data we propose a model of the HOPS and CORVET complex assembly that reconciles the available biochemical and structural data. PMID:24312556
DEVELOPMENT PLAN FOR THE CAUSAL ANALYSIS ...
The Causal Analysis/Diagnosis Decision Information System (CADDIS) is a web-based system that provides technical support for states, tribes and other users of the Office of Water's Stressor Identification Guidance. The Stressor Identification Guidance provides a rigorous and scientifically defensible method for determining the causes of biological impairments of aquatic ecosystems. It is being used by states as part of the TMDL process and is being applied to other impaired ecosystems such as Superfund sites. However, because of the complexity of causal relationships in ecosystems, and because the guidance includes a strength-of-evidence analysis which uses multiple causal considerations, the process is complex and information intensive. CADDIS helps users deal with that inherent complexity. Increasingly, the regulatory, remedial, and restoration actions taken to manage impaired environments are based on measurement and analysis of the biotic community. When an aquatic assemblage has been identified as impaired, an accurate and defensible assessment of the cause can help ensure that appropriate actions are taken. The U.S. EPA's Stressor Identification Guidance describes a methodology for identifying the most likely causes of observed impairments in aquatic systems. Stressor identification requires extensive knowledge of the mechanisms, symptoms, and stressor-response relationships for various specific stressors as well as the ability to use that knowledge in a
On the sensitivity of complex, internally coupled systems
NASA Technical Reports Server (NTRS)
Sobieszczanski-Sobieski, Jaroslaw
1988-01-01
A method is presented for computing sensitivity derivatives with respect to independent (input) variables for complex, internally coupled systems, while avoiding the cost and inaccuracy of finite differencing performed on the entire system analysis. The method entails two alternative algorithms: the first is based on the classical implicit function theorem formulated on residuals of governing equations, and the second develops the system sensitivity equations in a new form using the partial (local) sensitivity derivatives of the output with respect to the input of each part of the system. A few application examples are presented to illustrate the discussion.
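The second algorithm can be illustrated on a toy system of two coupled scalar subsystems: the local partial sensitivities of each part are assembled into the system sensitivity equations, (I - J) dy/dx = df/dx, and one small linear solve replaces finite differencing over the full analysis. All partial-derivative values below are invented for illustration:

```python
import numpy as np

# Local partial sensitivities of each subsystem, assumed obtained cheaply
# by differentiating that subsystem alone (hypothetical values, consistent
# with y1 = x + 0.5*y2 and y2 = 2*x + 0.25*y1):
df1_dx, df1_dy2 = 1.0, 0.5
df2_dx, df2_dy1 = 2.0, 0.25

# System sensitivity equations: (I - J) * dy/dx = df/dx,
# where J holds the cross-coupling partials of each output w.r.t. the other.
A = np.array([[1.0, -df1_dy2],
              [-df2_dy1, 1.0]])
b = np.array([df1_dx, df2_dx])
dy_dx = np.linalg.solve(A, b)  # total derivatives dy1/dx, dy2/dx
```

For this toy coupling the exact answers are dy1/dx = 16/7 and dy2/dx = 18/7, recovered without ever finite-differencing the coupled system.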
Topological analysis of long-chain branching patterns in polyolefins.
Bonchev, D; Markel, E; Dekmezian, A
2001-01-01
Patterns in molecular topology and complexity for long-chain branching are quantitatively described. The Wiener number, the topological complexity index, and a new index of 3-starness are used to quantify polymer structure. General formulas for these indices were derived for the cases of 3-arm star, H-shaped, and B-arm comb polymers. The factors affecting complexity in monodisperse polymer systems are ranked as follows: number of arms > arm length > arm central position approximately equal to arm clustering > total molecular weight approximately equal to backbone molecular weight. Topological indices change rapidly and then plateau as the molecular weight of branches on a polyolefin backbone increases from 0 to 5 kD. Complexity calculations relate 2-arm or 3-arm comb structures to the corresponding 3-arm stars of equivalent complexity but much higher molecular weight. In a subsequent paper, we report the application of topological analysis for developing structure/property relationships for monodisperse polymers. While the focus of the present work is on the description of monodisperse, well-defined architectures, the methods may be extended to the description of polydisperse systems.
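The Wiener number used above is the sum of shortest-path (bond-count) distances over all pairs of repeat units in the molecular graph. A small breadth-first-search sketch on a 3-arm star (the graph encoding and helper are illustrative, not the paper's code):

```python
from collections import deque

def wiener_index(adj):
    """Sum of shortest-path distances over all unordered vertex pairs,
    computed by a breadth-first search from every vertex."""
    total = 0
    for src in adj:
        dist = {src: 0}
        q = deque([src])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        total += sum(dist.values())
    return total // 2  # each unordered pair was counted twice

# Minimal 3-arm star: central unit 0 bonded to units 1, 2, 3
star = {0: [1, 2, 3], 1: [0], 2: [0], 3: [0]}
w = wiener_index(star)  # 3 center-arm pairs at distance 1 + 3 arm-arm at 2 = 9
```

The same routine applies unchanged to H-shaped and comb architectures once their adjacency is written out.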
Old River Control Complex Sedimentation Investigation
2015-06-01
The investigation was conducted via a combination of field data collection and laboratory analysis, geomorphic assessments, and efforts to describe the shoaling processes and sediment transport in the two-river system. The geomorphic assessment utilized … District, New Orleans.
SAINT: A combined simulation language for modeling man-machine systems
NASA Technical Reports Server (NTRS)
Seifert, D. J.
1979-01-01
SAINT (Systems Analysis of Integrated Networks of Tasks) is a network modeling and simulation technique for the design and analysis of complex man-machine systems. SAINT provides the conceptual framework for representing systems that consist of discrete task elements, continuous state variables, and interactions between them. It also provides a mechanism for combining human performance models and dynamic system behaviors in a single modeling structure. The SAINT technique is described and applications of SAINT are discussed.
Koorehdavoudi, Hana; Bogdan, Paul
2016-01-01
Biological systems are frequently categorized as complex systems due to their capabilities of generating spatio-temporal structures from apparent random decisions. In spite of research on analyzing biological systems, we lack a quantifiable framework for measuring their complexity. To fill this gap, in this paper, we develop a new paradigm to study a collective group of N agents moving and interacting in a three-dimensional space. Our paradigm helps to identify the spatio-temporal states of the motion of the group and their associated transition probabilities. This framework enables the estimation of the free energy landscape corresponding to the identified states. Based on the energy landscape, we quantify missing information, emergence, self-organization and complexity for a collective motion. We show that the collective motion of the group of agents evolves to reach the most probable state with relatively lowest energy level and lowest missing information compared to other possible states. Our analysis demonstrates that natural groups of animals exhibit a higher degree of emergence, self-organization and complexity over time. Consequently, this algorithm can be integrated into new frameworks to engineer collective motions to achieve certain degrees of emergence, self-organization and complexity. PMID:27297496
NASA Astrophysics Data System (ADS)
Koorehdavoudi, Hana; Bogdan, Paul
2016-06-01
Biological systems are frequently categorized as complex systems due to their capabilities of generating spatio-temporal structures from apparent random decisions. In spite of research on analyzing biological systems, we lack a quantifiable framework for measuring their complexity. To fill this gap, in this paper, we develop a new paradigm to study a collective group of N agents moving and interacting in a three-dimensional space. Our paradigm helps to identify the spatio-temporal states of the motion of the group and their associated transition probabilities. This framework enables the estimation of the free energy landscape corresponding to the identified states. Based on the energy landscape, we quantify missing information, emergence, self-organization and complexity for a collective motion. We show that the collective motion of the group of agents evolves to reach the most probable state with relatively lowest energy level and lowest missing information compared to other possible states. Our analysis demonstrates that natural groups of animals exhibit a higher degree of emergence, self-organization and complexity over time. Consequently, this algorithm can be integrated into new frameworks to engineer collective motions to achieve certain degrees of emergence, self-organization and complexity.
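One common entropy-based formalization of these three quantities takes emergence as the normalized Shannon entropy of the state occupation probabilities, self-organization as its complement, and complexity as their balance; this is a sketch under that assumption, and the paper's exact definitions may differ:

```python
import math

def eso_complexity(probs):
    """Entropy-based emergence E, self-organization S, and complexity C,
    computed from the occupation probabilities of the identified states."""
    H = -sum(p * math.log2(p) for p in probs if p > 0)
    H_max = math.log2(len(probs))
    E = H / H_max        # emergence: normalized Shannon entropy
    S = 1.0 - E          # self-organization: departure from disorder
    C = 4.0 * E * S      # complexity: maximal when order and disorder balance
    return E, S, C

# Two of four states equally occupied: order and disorder in balance
E, S, C = eso_complexity([0.5, 0.5, 0.0, 0.0])  # E = 0.5, S = 0.5, C = 1.0
```

Fully uniform occupation gives E = 1 and C = 0 (pure disorder), while a single dominant state gives E = 0 and C = 0 (pure order), matching the intuition that complexity peaks in between.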
NASA Technical Reports Server (NTRS)
Moore, B., III; Kaufmann, R.; Reinhold, C.
1981-01-01
Systems analysis and control theory considerations are given to simulations of both individual components and total systems, in order to develop a reliable control strategy for a Controlled Ecological Life Support System (CELSS) which includes complex biological components. Because of the numerous nonlinearities and tight coupling within the biological component, classical control theory may be inadequate and the statistical analysis of factorial experiments more useful. The range in control characteristics of particular species may simplify the overall task by providing an appropriate balance of stability and controllability to match species function in the overall design. The ultimate goal of this research is the coordination of biological and mechanical subsystems in order to achieve a self-supporting environment.
Performance analysis of Integrated Communication and Control System networks
NASA Technical Reports Server (NTRS)
Halevi, Y.; Ray, A.
1990-01-01
This paper presents statistical analysis of delays in Integrated Communication and Control System (ICCS) networks that are based on asynchronous time-division multiplexing. The models are obtained in closed form for analyzing control systems with randomly varying delays. The results of this research are applicable to ICCS design for complex dynamical processes like advanced aircraft and spacecraft, autonomous manufacturing plants, and chemical and processing plants.
Investigating dynamical complexity in the magnetosphere using various entropy measures
NASA Astrophysics Data System (ADS)
Balasis, Georgios; Daglis, Ioannis A.; Papadimitriou, Constantinos; Kalimeri, Maria; Anastasiadis, Anastasios; Eftaxias, Konstantinos
2009-09-01
The complex system of the Earth's magnetosphere corresponds to an open spatially extended nonequilibrium (input-output) dynamical system. The nonextensive Tsallis entropy has recently been introduced as an appropriate information measure to investigate dynamical complexity in the magnetosphere. The method has been employed for analyzing Dst time series and gave promising results, detecting the complexity dissimilarity among different physiological and pathological magnetospheric states (i.e., prestorm activity and intense magnetic storms, respectively). This paper explores the applicability and effectiveness of a variety of computable entropy measures (e.g., block entropy, Kolmogorov entropy, T complexity, and approximate entropy) to the investigation of dynamical complexity in the magnetosphere. We show that as the magnetic storm approaches there is clear evidence of significantly lower complexity in the magnetosphere. The observed higher degree of organization of the system agrees with that inferred previously, from an independent linear fractal spectral analysis based on wavelet transforms. This convergence between nonlinear and linear analyses provides a more reliable detection of the transition from the quiet time to the storm time magnetosphere, thus showing evidence that the occurrence of an intense magnetic storm is imminent. More precisely, we claim that our results suggest an important principle: a significant complexity decrease and an increase in persistency in Dst time series can be confirmed as the magnetic storm approaches, which can be used as diagnostic tools for the magnetospheric injury (global instability). Overall, approximate entropy and Tsallis entropy yield superior results for detecting dynamical complexity changes in the magnetosphere in comparison to the other entropy measures presented herein. 
Ultimately, the analysis tools developed in the course of this study for the treatment of the Dst index can prove useful for space weather applications.
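The Tsallis entropy underlying the analysis is, for a discrete probability distribution (estimated, say, from a histogram of Dst values), a one-liner; the distributions and the choice of q below are purely illustrative:

```python
def tsallis_entropy(probs, q):
    """Nonextensive Tsallis entropy S_q = (1 - sum_i p_i**q) / (q - 1), k = 1."""
    return (1.0 - sum(p ** q for p in probs if p > 0)) / (q - 1.0)

# Uniform occupation of 4 states vs. a single dominant state (q = 2 here
# only for easy arithmetic; Dst studies typically use a q somewhat below 2):
quiet = tsallis_entropy([0.25, 0.25, 0.25, 0.25], q=2)  # 0.75: disordered, complex
storm = tsallis_entropy([1.0, 0.0, 0.0, 0.0], q=2)      # 0.0: highly organized
```

The drop from the first value toward the second mirrors the complexity decrease the abstract reports as a storm approaches.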
Expert diagnostics system as a part of analysis software for power mission operations
NASA Technical Reports Server (NTRS)
Harris, Jennifer A.; Bahrami, Khosrow A.
1993-01-01
The operation of interplanetary spacecraft at JPL has become an increasingly complex activity. This complexity is due to advanced spacecraft designs and ambitious mission objectives which lead to operations requirements that are more demanding than those of any previous mission. For this reason, several productivity enhancement measures are underway at JPL within mission operations, particularly in the spacecraft analysis area. These measures aimed at spacecraft analysis include: the development of a multi-mission, multi-subsystem operations environment; the introduction of automated tools into this environment; and the development of an expert diagnostics system. This paper discusses an effort to integrate the above mentioned productivity enhancement measures. A prototype was developed that integrates an expert diagnostics system into a multi-mission, multi-subsystem operations environment using the Galileo Power / Pyro Subsystem as a testbed. This prototype will be discussed in addition to background information associated with it.
A methodology for system-of-systems design in support of the engineering team
NASA Astrophysics Data System (ADS)
Ridolfi, G.; Mooij, E.; Cardile, D.; Corpino, S.; Ferrari, G.
2012-04-01
Space missions have experienced a trend of increasing complexity in the last decades, resulting in the design of very complex systems formed by many elements and sub-elements working together to meet the requirements. In a classical approach, especially in a company environment, the two steps of design-space exploration and optimization are usually performed by experts inferring on major phenomena, making assumptions and doing some trial-and-error runs on the available mathematical models. This is done especially in the very early design phases where most of the costs are locked-in. With the objective of supporting the engineering team and the decision-makers during the design of complex systems, the authors developed a modelling framework for a particular category of complex, coupled space systems called System-of-Systems. Once modelled, the System-of-Systems is solved using a computationally cheap parametric methodology, named the mixed-hypercube approach, based on the utilization of a particular type of fractional factorial design-of-experiments, and analysis of the results via global sensitivity analysis and response surfaces. As an applicative example, a system-of-systems of a hypothetical human space exploration scenario for the support of a manned lunar base is presented. The results demonstrate that using the mixed-hypercube to sample the design space, an optimal solution is reached with a limited computational effort, providing support to the engineering team and decision makers thanks to sensitivity and robustness information. The analysis of the system-of-systems model that was implemented shows that the logistic support of a human outpost on the Moon for 15 years is still feasible with currently available launcher classes. The results presented in this paper have been obtained in cooperation with Thales Alenia Space—Italy, in the framework of a regional programme called STEPS. 
STEPS—Sistemi e Tecnologie per l'EsPlorazione Spaziale is a research project co-financed by Piedmont Region and firms and universities of the Piedmont Aerospace District in the ambit of the P.O.R-F.E.S.R. 2007-2013 program.
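The mixed-hypercube approach combines a fractional factorial design-of-experiments with hypercube sampling; its space-filling ingredient can be sketched with a plain Latin hypercube sampler (a generic stand-in, not the authors' implementation):

```python
import numpy as np

def latin_hypercube(n_samples, n_vars, seed=None):
    """Latin-hypercube sample in the unit hypercube: each variable's range is
    split into n_samples equal strata and every stratum is used exactly once."""
    rng = np.random.default_rng(seed)
    # one random point inside each stratum, then decouple the variables
    u = (np.arange(n_samples)[:, None] + rng.random((n_samples, n_vars))) / n_samples
    for j in range(n_vars):
        rng.shuffle(u[:, j])
    return u

X = latin_hypercube(16, 5, seed=0)  # 16 design points over 5 design variables
```

Each of the 16 rows would then be run through the system-of-systems model, with global sensitivity analysis and response surfaces fitted to the resulting outputs.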
Metabolic Compartmentation – A System Level Property of Muscle Cells
Saks, Valdur; Beraud, Nathalie; Wallimann, Theo
2008-01-01
Problems of quantitative investigation of intracellular diffusion and compartmentation of metabolites are analyzed. Principal controversies in recently published analyses of these problems for the living cells are discussed. It is shown that the formal theoretical analysis of diffusion of metabolites based on Fick's equation and using fixed diffusion coefficients for diluted homogenous aqueous solutions, but applied to biological systems in vivo without any comparison with experimental results, may lead to misleading conclusions, which are contradictory to most biological observations. However, if the same theoretical methods are used for analysis of actual experimental data, the apparent diffusion constants obtained are orders of magnitude lower than those in diluted aqueous solutions. Thus, it can be concluded that local restrictions of diffusion of metabolites in a cell are system-level properties caused by the complex structural organization of the cells, macromolecular crowding, cytoskeletal networks and the organization of metabolic pathways into multienzyme complexes and metabolons. This results in microcompartmentation of metabolites, their channeling between enzymes and in modular organization of cellular metabolic networks. The perspectives of further studies of these complex intracellular interactions in the framework of Systems Biology are discussed. PMID:19325782
High frequency vibration analysis by the complex envelope vectorization.
Giannini, O; Carcaterra, A; Sestieri, A
2007-06-01
The complex envelope displacement analysis (CEDA) is a procedure to solve high frequency vibration and vibro-acoustic problems, providing the envelope of the physical solution. CEDA is based on a variable transformation mapping the high frequency oscillations into signals of low frequency content and has been successfully applied to one-dimensional systems. However, the extension to plates and vibro-acoustic fields met serious difficulties so that a general revision of the theory was carried out, leading finally to a new method, the complex envelope vectorization (CEV). In this paper the CEV method is described, underlying merits and limits of the procedure, and a set of applications to vibration and vibro-acoustic problems of increasing complexity are presented.
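The low-frequency envelope that CEDA and CEV target can be illustrated with the classical analytic-signal construction (a generic sketch of envelope extraction via the FFT, equivalent to scipy.signal.hilbert for even-length signals; it is not the CEV algorithm itself):

```python
import numpy as np

def complex_envelope(x):
    """Envelope of a real oscillatory signal via the analytic signal:
    zero the negative frequencies, double the positive ones, take |.|."""
    n = len(x)
    h = np.zeros(n)
    h[0] = 1.0
    h[1:n // 2] = 2.0   # double positive-frequency bins
    h[n // 2] = 1.0     # keep the Nyquist bin (n even)
    return np.abs(np.fft.ifft(np.fft.fft(x) * h))

# A 400 Hz carrier with a slowly varying (3 Hz) amplitude: the envelope
# recovers the low-frequency content without resolving the fast oscillations.
t = np.linspace(0.0, 1.0, 4096, endpoint=False)
slow = 1.0 + 0.5 * np.sin(2 * np.pi * 3 * t)
x = slow * np.cos(2 * np.pi * 400 * t)
env = complex_envelope(x)  # env tracks `slow`, not the 400 Hz carrier
```

Working with such envelopes is what lets a high-frequency vibration problem be represented by far fewer degrees of freedom.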
A probabilistic process model for pelagic marine ecosystems informed by Bayesian inverse analysis
Marine ecosystems are complex systems with multiple pathways that produce feedback cycles, which may lead to unanticipated effects. Models abstract this complexity and allow us to predict, understand, and hypothesize. In ecological models, however, the paucity of empirical data...
Ferrazzi, Priscilla; Krupa, Terry
2015-09-01
Studies that seek to understand and improve health care systems benefit from qualitative methods that employ theory to add depth, complexity, and context to analysis. Theories used in health research typically emerge from social science, but these can be inadequate for studying complex health systems. Mental health rehabilitation programs for criminal courts are complicated by their integration within the criminal justice system and by their dual health-and-justice objectives. In a qualitative multiple case study exploring the potential for these mental health court programs in Arctic communities, we assess whether a legal theory, known as therapeutic jurisprudence, functions as a useful methodological theory. Therapeutic jurisprudence, recruited across discipline boundaries, succeeds in guiding our qualitative inquiry at the complex intersection of mental health care and criminal law by providing a framework foundation for directing the study's research questions and the related propositions that focus our analysis. © The Author(s) 2014.
Structural analysis and design for the development of floating photovoltaic energy generation system
NASA Astrophysics Data System (ADS)
Yoon, S. J.; Joo, H. J.; Kim, S. H.
2018-06-01
In this paper, we discuss the structural analysis and design for the development of a floating photovoltaic energy generation system. A series of research efforts conducted to develop the system, from the analysis and design of the structural system to its installation, is discussed. In the structural system supporting the solar panels, PFRP materials and SMC FRP materials were used. A unit module structure is fabricated and then the unit module structures are connected to each other to assemble the whole PV energy generation complex. This system is connected directly to the power grid. In addition, extensive monitoring of the efficiency of electricity generation and the soundness of the structural system is in progress for further system enhancement.
Some Observations on the Current Status of Performing Finite Element Analyses
NASA Technical Reports Server (NTRS)
Raju, Ivatury S.; Knight, Norman F., Jr; Shivakumar, Kunigal N.
2015-01-01
Aerospace structures are complex high-performance structures. Advances in reliable and efficient computing and modeling tools are enabling analysts to consider complex configurations, build complex finite element models, and perform analysis rapidly. Many of the early career engineers of today are very proficient in the usage of modern computers, computing engines, complex software systems, and visualization tools. These young engineers are becoming increasingly efficient in building complex 3D models of complicated aerospace components. However, current trends demonstrate blind acceptance of finite element analysis results. This paper is aimed at raising awareness of this situation. Examples of common encounters are presented. To overcome the current trends, some guidelines and suggestions for analysts, senior engineers, and educators are offered.
NASA Technical Reports Server (NTRS)
Clancey, William J.; Lee, Pascal; Sierhuis, Maarten; Norvig, Peter (Technical Monitor)
2001-01-01
Living and working on Mars will require model-based computer systems for maintaining and controlling complex life support, communication, transportation, and power systems. This technology must work properly on the first three-year mission, augmenting human autonomy, without adding yet more complexity to be diagnosed and repaired. One design method is to work with scientists in analog (Mars-like) settings to understand how they prefer to work, what constraints will be imposed by the Mars environment, and how to ameliorate difficulties. We describe how we are using empirical requirements analysis to prototype model-based tools at a research station in the High Canadian Arctic.
Electromagnetic game modeling through Tensor Analysis of Networks and Game Theory
NASA Astrophysics Data System (ADS)
Maurice, Olivier; Reineix, Alain; Lalléchère, Sébastien
2014-10-01
A complex system involves events coming from natural behaviors. However complicated machines may be, they remain far from the complexity of natural systems. Currently, economics is one of the rare sciences trying to find ways to model human behavior; these attempts involve game theory and psychology. Our purpose is to develop a formalism able to handle both game and hardware modeling. We first present the Tensorial Analysis of Networks, used for the material part of the system. Then, we detail the mathematical objects defined in order to describe the evolution of the system and its gaming side. To illustrate the discussion we consider the case of a drone whose electronics can be disturbed by a radar field but which must fly as close as possible to this radar.
Integrating GIS and ABM to Explore Spatiotemporal Dynamics
NASA Astrophysics Data System (ADS)
Sun, M.; Jiang, Y.; Yang, C.
2013-12-01
Agent-based modeling, as a methodology for bottom-up exploration that accounts for the adaptive behavior and heterogeneity of system components, can help discover the development and patterns of complex social and environmental systems. However, ABM is a computationally intensive process, especially when the number of system components becomes large and the agent-agent/agent-environment interactions are modeled in detail. Most traditional ABM frameworks, developed for the CPU, do not offer satisfactory computing capacity. To address this problem, and with the emergence of advanced techniques, GPU computing with CUDA can provide a powerful parallel structure to enable complex simulation of spatiotemporal dynamics. In this study, we first develop a GPU-based ABM system. Secondly, in order to visualize the dynamics generated from the movement of agents and the change of agent/environmental attributes during the simulation, we integrate GIS into the ABM system. Advanced geovisualization technologies can be utilized for representing spatiotemporal change events, such as 2D/3D maps with state-of-the-art symbols, space-time cubes, and multiple layers each of which presents the pattern at one time-stamp. Thirdly, visual analytics, including interactive tools (e.g. grouping, filtering, linking), is included in our ABM-GIS system to help users conduct real-time data exploration during the simulation. Analyses such as flow analysis and spatial cluster analysis can be integrated according to the geographical problem to be explored.
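The per-agent independence that makes ABM a natural fit for CUDA can be seen in a CPU sketch of one update tick (agents, grid size, and the movement rule are all illustrative, not the study's model):

```python
import numpy as np

def step(pos, size, rng):
    """One simulation tick: each agent moves to a random neighboring cell on a
    toroidal grid. Every agent's update is independent of the others, which is
    exactly the structure a CUDA kernel exploits (one thread per agent)."""
    moves = rng.integers(-1, 2, size=pos.shape)  # dx, dy drawn from {-1, 0, 1}
    return (pos + moves) % size                  # wrap around the grid edges

rng = np.random.default_rng(42)
pos = rng.integers(0, 100, size=(10_000, 2))  # 10,000 agents on a 100 x 100 grid
for _ in range(50):
    pos = step(pos, 100, rng)  # each tick's positions could feed a GIS layer
```

Adding agent-agent interaction (e.g. reading neighboring cells) complicates the memory-access pattern but keeps the same one-thread-per-agent decomposition.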
Systems biology: A tool for charting the antiviral landscape.
Bowen, James R; Ferris, Martin T; Suthar, Mehul S
2016-06-15
The host antiviral programs that are initiated following viral infection form a dynamic and complex web of responses that we have collectively termed "the antiviral landscape". Conventional approaches to studying antiviral responses have primarily used reductionist systems to assess the function of a single or a limited subset of molecules. Systems biology is a holistic approach that considers the entire system as a whole, rather than individual components or molecules. Systems biology-based approaches facilitate an unbiased and comprehensive analysis of the antiviral landscape, while allowing for the discovery of emergent properties that are missed by conventional approaches. The antiviral landscape can be viewed as a hierarchy of complexity, beginning at the whole organism level and progressing downward to isolated tissues, populations of cells, and single cells. In this review, we will discuss how systems biology has been applied to better understand the antiviral landscape at each of these layers. At the organismal level, the Collaborative Cross is an invaluable genetic resource for assessing how genetic diversity influences the antiviral response. Whole tissue and isolated bulk cell transcriptomics serves as a critical tool for the comprehensive analysis of antiviral responses at both the tissue and cellular levels of complexity. Finally, new techniques in single cell analysis are emerging tools that will revolutionize our understanding of how individual cells within a bulk infected cell population contribute to the overall antiviral landscape.
Stoichiometry for binding and transport by the twin arginine translocation system.
Celedon, Jose M; Cline, Kenneth
2012-05-14
Twin arginine translocation (Tat) systems transport large folded proteins across sealed membranes. Tat systems accomplish this feat with three membrane components organized in two complexes. In thylakoid membranes, cpTatC and Hcf106 comprise a large receptor complex containing an estimated eight cpTatC-Hcf106 pairs. Protein transport occurs when Tha4 joins the receptor complex as an oligomer of uncertain size that is thought to form the protein-conducting structure. Here, binding analyses with intact membranes or purified complexes indicate that each receptor complex could bind eight precursor proteins. Kinetic analysis of translocation showed that each precursor-bound site was independently functional for transport, and, with sufficient Tha4, all sites were concurrently active for transport. Tha4 titration determined that ∼26 Tha4 protomers were required for transport of each OE17 (oxygen-evolving complex subunit of 17 kD) precursor protein. Our results suggest that, when fully saturated with precursor proteins and Tha4, the Tat translocase is an ∼2.2-megadalton complex that can individually transport eight precursor proteins or cooperatively transport multimeric precursors.
NASA Astrophysics Data System (ADS)
Azougagh, Yassine; Benhida, Khalid; Elfezazi, Said
2016-02-01
In this paper, the focus is on studying the performance of complex systems in a supply chain context by developing a structured modelling approach based on the ASDI methodology (Analysis, Specification, Design and Implementation), combining Petri net modelling with simulation in ARENA. The linear approach typically followed for this kind of problem runs into modelling difficulties owing to the complexity and the number of parameters of concern. The approach used in this work therefore structures the modelling so as to cover all aspects of the performance study. The structured modelling approach is first introduced before being applied to the case of an industrial system in the phosphate field. The performance indicators obtained from the developed models made it possible to test the behaviour and fluctuations of this system and to develop improved models of the current situation. In addition, this paper shows how the ARENA software can be adopted to simulate complex systems effectively. The method in this research can be applied to investigate various improvement scenarios and their consequences before implementing them in reality.
Systems Genetics as a Tool to Identify Master Genetic Regulators in Complex Disease.
Moreno-Moral, Aida; Pesce, Francesco; Behmoaras, Jacques; Petretto, Enrico
2017-01-01
Systems genetics stems from systems biology and similarly employs integrative modeling approaches to describe the perturbations and phenotypic effects observed in a complex system. However, in the case of systems genetics the main source of perturbation is naturally occurring genetic variation, which can be analyzed at the systems-level to explain the observed variation in phenotypic traits. In contrast with conventional single-variant association approaches, the success of systems genetics has been in the identification of gene networks and molecular pathways that underlie complex disease. In addition, systems genetics has proven useful in the discovery of master trans-acting genetic regulators of functional networks and pathways, which in many cases revealed unexpected gene targets for disease. Here we detail the central components of a fully integrated systems genetics approach to complex disease, starting from assessment of genetic and gene expression variation, linking DNA sequence variation to mRNA (expression QTL mapping), gene regulatory network analysis and mapping the genetic control of regulatory networks. By summarizing a few illustrative (and successful) examples, we highlight how different data-modeling strategies can be effectively integrated in a systems genetics study.
Holistic School Leadership: Systems Thinking as an Instructional Leadership Enabler
ERIC Educational Resources Information Center
Shaked, Haim; Schechter, Chen
2016-01-01
As instructional leadership involves attempts to understand and improve complex systems, this study explored principals' perceptions regarding possible contributions of systems thinking to instructional leadership. Based on a qualitative analysis, systems thinking was perceived by middle and high school principals to contribute to the following…
A novel conformation of gel grown biologically active cadmium nicotinate
NASA Astrophysics Data System (ADS)
Nair, Lekshmi P.; Bijini, B. R.; Divya, R.; Nair, Prabitha B.; Eapen, S. M.; Dileep Kumar, B. S.; Nishanth Kumar, S.; Nair, C. M. K.; Deepa, M.; Rajendra Babu, K.
2017-11-01
The elimination of toxic heavy metals by forming stable coordination compounds with biologically active ligands is applicable in drug design. A new crystalline complex of cadmium with nicotinic acid was grown at ambient temperature using the single gel diffusion method; its crystal structure differs from those already reported. Single-crystal X-ray diffraction reveals that the crystal belongs to the monoclinic system, space group P21/c, with cell dimensions a = 17.220(2) Å, b = 10.2480(2) Å, c = 7.229(9) Å, β = 91.829(4)°. Powder X-ray diffraction analysis confirmed the crystallinity of the sample. The unidentate mode of coordination between the metal atom and the carboxylate group is supported by Fourier transform infrared spectral data. Thermal analysis confirms the thermal stability of the complex; kinetic and thermodynamic parameters are also calculated. The stoichiometry of the complex is confirmed by elemental analysis. UV-visible spectral analysis shows the wide transparency window of the complex in the visible region; the band gap of the complex is found to be 3.92 eV. The complex shows excellent antibacterial and antifungal activity.
Chai, Liyuan; Yang, Jinqin; Zhang, Ning; Wu, Pin-Jiun; Li, Qingzhu; Wang, Qingwei; Liu, Hui; Yi, Haibo
2017-09-01
Aqueous complexes between ferric iron (Fe(III)) and arsenate (As(V)) are indispensable for understanding the mobility of arsenic (As) in Fe(III)-As(V)-rich systems. In this study, the aqueous Fe(III)-As(V) complexes FeH₂AsO₄²⁺ and FeHAsO₄⁺ were postulated based on qualitative analysis of UV-Vis spectra in both Fe(III)-As(V)-HClO₄ and Fe(III)-As(V)-H₂SO₄ systems. Subsequently, monodentate structures were evidenced by Fe K-edge EXAFS and modeled as [FeH₂AsO₄(H₂O)₅]²⁺ and [FeHAsO₄(H₂O)₅]⁺ by DFT. The feature band at ∼280 nm was verified as electron excitation chiefly from Fe-As-bridged O atoms to the d-orbitals of Fe in [FeH₂AsO₄(H₂O)₅]²⁺ and [FeHAsO₄(H₂O)₅]⁺. The structural and spectral information on Fe(III)-As(V) complexes will enable future speciation analysis in Fe(III)-As(V)-rich systems.
NASA Astrophysics Data System (ADS)
Alfi, Nafiseh; Khorasani-Motlagh, Mozhgan; Rezvani, Ali Reza; Noroozifar, Meissam; Molčanov, Krešimir
2017-06-01
A heteroleptic europium coordination compound formulated as [Eu(phen)2(OH2)2(Cl)2](Cl)(H2O) (phen = 1,10-phenanthroline) has been synthesized and characterized by elemental analysis, FT-IR spectroscopy, and single-crystal X-ray diffraction. Crystal structure analysis reveals that the complex crystallizes in the orthorhombic system with space group Pca21. Electronic absorption and various emission methods were employed to investigate the binding of the europium(III) complex to fish salmon deoxyribonucleic acid (FS-DNA) and bovine serum albumin (BSA). Furthermore, the binding constants, binding sites, and the corresponding thermodynamic parameters of the interaction with FS-DNA and BSA were calculated based on the van't Hoff equation. The thermodynamic parameters reflect the exothermic nature of the binding process (ΔH° < 0 and ΔS° < 0). The experimental results indicate that [Eu(phen)2(OH2)2(Cl)2](Cl)(H2O) binds to FS-DNA in a non-intercalative mode, with groove binding being the preferred mode. The complex also exhibits excellent in vitro antimicrobial activity against standard bacterial strains.
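The thermodynamic parameters mentioned above rest on the van't Hoff relation ln K = -ΔH°/(RT) + ΔS°/R. As a hedged illustration of the two-temperature estimate (the binding constants used in any example would be synthetic, not the paper's data, and the function name is invented):

```python
import math

R = 8.314  # gas constant, J·mol⁻¹·K⁻¹

def vant_hoff(K1, T1, K2, T2):
    """Two-point van't Hoff estimate: from binding constants K1, K2
    measured at absolute temperatures T1, T2, recover ΔH° and ΔS°
    (assumed temperature-independent over [T1, T2]) and ΔG° at T1.
    Units: J/mol for ΔH° and ΔG°, J/(mol·K) for ΔS°."""
    dH = R * math.log(K2 / K1) / (1.0 / T1 - 1.0 / T2)
    dS = R * math.log(K1) + dH / T1
    dG = dH - T1 * dS  # algebraically equal to -R*T1*ln(K1)
    return dH, dS, dG
```

Negative ΔH° and ΔS°, as reported in the abstract, correspond to K decreasing with temperature and an enthalpy-driven binding process.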
Phillips, Andrew B; Merrill, Jacqueline
2012-01-01
Many complex markets such as banking and manufacturing have benefited significantly from technology adoption. Each of these complex markets experienced increased efficiency, quality, security, and customer involvement as a result of technology transformation in their industry. Healthcare has not benefited to the same extent. We provide initial findings from a policy analysis of complex markets and the features of these transformations that can influence health technology adoption and acceptance.
System-level simulation of liquid filling in microfluidic chips.
Song, Hongjun; Wang, Yi; Pant, Kapil
2011-06-01
Liquid filling in microfluidic channels is a complex process that depends on a variety of geometric, operating, and material parameters such as microchannel geometry, flow velocity/pressure, liquid surface tension, and contact angle of channel surface. Accurate analysis of the filling process can provide key insights into the filling time, air bubble trapping, and dead zone formation, and help evaluate trade-offs among the various design parameters and lead to optimal chip design. However, efficient modeling of liquid filling in complex microfluidic networks continues to be a significant challenge. High-fidelity computational methods, such as the volume of fluid method, are prohibitively expensive from a computational standpoint. Analytical models, on the other hand, are primarily applicable to idealized geometries and, hence, are unable to accurately capture chip level behavior of complex microfluidic systems. This paper presents a parametrized dynamic model for the system-level analysis of liquid filling in three-dimensional (3D) microfluidic networks. In our approach, a complex microfluidic network is deconstructed into a set of commonly used components, such as reservoirs, microchannels, and junctions. The components are then assembled according to their spatial layout and operating rationale to achieve a rapid system-level model. A dynamic model based on the transient momentum equation is developed to track the liquid front in the microchannels. The principle of mass conservation at the junction is used to link the fluidic parameters in the microchannels emanating from the junction. Assembly of these component models yields a set of differential and algebraic equations, which upon integration provides temporal information of the liquid filling process, particularly liquid front propagation (i.e., the arrival time).
The models are used to simulate the transient liquid filling process in a variety of microfluidic constructs and in a multiplexer, representing a complex microfluidic network. The accuracy (relative error less than 7%) and orders-of-magnitude speedup (30 000X-4 000 000X) of our system-level models are verified by comparison against 3D high-fidelity numerical studies. Our findings clearly establish the utility of our models and simulation methodology for fast, reliable analysis of liquid filling to guide the design optimization of complex microfluidic networks.
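The paper's component models are built on the transient momentum equation for a full network; as a much simpler single-channel stand-in, the classical Lucas-Washburn relation (viscous-capillary regime, inertia and gravity neglected) illustrates the kind of front-tracking ODE being integrated. The function name and parameter values below are illustrative assumptions, not the authors' model.

```python
import math

def lucas_washburn_front(t_end, dt, radius, gamma, mu, theta_deg, x0=1e-4):
    """Explicit-Euler integration of the Lucas-Washburn equation
        dx/dt = r * gamma * cos(theta) / (4 * mu * x)
    for the liquid-front position x(t) (meters) in a single cylindrical
    microchannel of radius r, surface tension gamma (N/m), viscosity mu
    (Pa·s), and contact angle theta. x0 > 0 avoids the t = 0 singularity."""
    k = radius * gamma * math.cos(math.radians(theta_deg)) / (4.0 * mu)
    x, t = x0, 0.0
    while t < t_end - 1e-15:
        x += dt * k / x
        t += dt
    return x
```

Comparing against the closed-form front position sqrt(x0² + 2·k·t) gives a quick check on the integration, the same kind of verification the paper performs against high-fidelity simulation.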
NASA Astrophysics Data System (ADS)
Nuh, M. Z.; Nasir, N. F.
2017-08-01
Biodiesel is a fuel comprised of mono-alkyl esters of long-chain fatty acids derived from renewable lipid feedstocks such as vegetable oils and animal fats. Biodiesel production is a complex process that requires systematic design and optimization. However, no case study has applied the process systems engineering (PSE) element of batch-process superstructure optimization, which involves complex problems and uses mixed-integer nonlinear programming (MINLP). PSE offers a solution to complex engineering systems by enabling the use of viable tools and techniques to better manage and comprehend system complexity. This study aims, first, to apply PSE tools to the simulation and optimization of the biodiesel process and to develop mathematical models for the plant components of cases A, B, and C using published kinetic data; secondly, to determine the economics of biodiesel production, focusing on heterogeneous catalysis; and finally, to develop a superstructure for biodiesel production using a heterogeneous catalyst. The mathematical models are developed from the superstructure, and the resulting mixed-integer nonlinear model is solved and the economic analysis estimated using MATLAB. With the objective function of minimizing annual production cost, the optimized batch process of case C costs 23.2587 million USD. Overall, implementing PSE has streamlined the modelling, design, and cost estimation, resolving the complexity of batch biodiesel production and processing.
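The abstract does not specify its superstructure model, so here is a deliberately toy sketch of the MINLP idea: enumerate the discrete (binary) route choices and, for each, minimize a convex cost over the continuous batch size. The route names and cost curves are invented for illustration; a real problem of this size and nonlinearity needs a proper MINLP solver.

```python
def optimize_superstructure(routes, q_lo=1.0, q_hi=100.0, iters=200):
    """Toy superstructure optimization: choose one processing route (the
    binary decision) and a batch size q (the continuous decision) to
    minimize annual cost = fixed + variable(q). Tiny MINLPs like this can
    be solved by enumerating the binaries and doing a 1-D search on each
    continuous subproblem; `routes` is a list of (name, fixed_cost, var_fn)."""
    def ternary_min(f, lo, hi):
        # Ternary search: valid because each toy cost curve is unimodal.
        for _ in range(iters):
            m1 = lo + (hi - lo) / 3.0
            m2 = hi - (hi - lo) / 3.0
            if f(m1) < f(m2):
                hi = m2
            else:
                lo = m1
        q = (lo + hi) / 2.0
        return q, f(q)

    best = None
    for name, fixed, var in routes:
        q, cost = ternary_min(lambda q: fixed + var(q), q_lo, q_hi)
        if best is None or cost < best[2]:
            best = (name, q, cost)
    return best
```

With two hypothetical routes, e.g. a cheaper-per-unit "heterogeneous" route with a higher fixed cost, the enumeration picks the route whose optimized total is lowest, mirroring (in miniature) the catalyst comparison in the study.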
GESA--a two-dimensional processing system using knowledge base techniques.
Rowlands, D G; Flook, A; Payne, P I; van Hoff, A; Niblett, T; McKee, S
1988-12-01
The successful analysis of two-dimensional (2-D) polyacrylamide electrophoresis gels demands considerable experience and understanding of the protein system under investigation as well as knowledge of the separation technique itself. The present work concerns the development of a computer system for analysing 2-D electrophoretic separations which incorporates concepts derived from artificial intelligence research such that non-experts can use the technique as a diagnostic or identification tool. Automatic analysis of 2-D gel separations has proved to be extremely difficult using statistical methods. Non-reproducibility of gel separations is also difficult to overcome using automatic systems. However, the human eye is extremely good at recognising patterns in images, and human intervention in semi-automatic computer systems can reduce the computational complexities of fully automatic systems. Moreover, the expertise and understanding of an "expert" is invaluable in reducing system complexity if it can be encapsulated satisfactorily in an expert system. The combination of user-intervention in the computer system together with the encapsulation of expert knowledge characterises the present system. The domain within which the system has been developed is that of wheat grain storage proteins (gliadins) which exhibit polymorphism to such an extent that cultivars can be uniquely identified by their gliadin patterns. The system can be adapted to other domains where a range of polymorphic protein sub-units exist. In its generalised form, the system can also be used for comparing more complex 2-D gel electrophoretic separations.
An Approach to Experimental Design for the Computer Analysis of Complex Phenomenon
NASA Technical Reports Server (NTRS)
Rutherford, Brian
2000-01-01
The ability to make credible system assessments, predictions and design decisions related to engineered systems and other complex phenomena is key to a successful program for many large-scale investigations in government and industry. Recently, many of these large-scale analyses have turned to computational simulation to provide much of the required information. Addressing specific goals in the computer analysis of these complex phenomena is often accomplished through the use of performance measures that are based on system response models. The response models are constructed using computer-generated responses together with physical test results where possible. They are often based on probabilistically defined inputs and generally require estimation of a set of response modeling parameters. As a consequence, the performance measures are themselves distributed quantities reflecting these variabilities and uncertainties. Uncertainty in the values of the performance measures leads to uncertainties in predicted performance and can cloud the decisions required of the analysis. A specific goal of this research has been to develop methodology that will reduce this uncertainty in an analysis environment where limited resources and system complexity together restrict the number of simulations that can be performed. An approach has been developed that is based on evaluation of the potential information provided for each "intelligently selected" candidate set of computer runs. Each candidate is evaluated by partitioning the performance measure uncertainty into two components - one component that could be explained through the additional computational simulation runs and a second that would remain uncertain. The portion explained is estimated using a probabilistic evaluation of likely results for the additional computational analyses based on what is currently known about the system.
The set of runs indicating the largest potential reduction in uncertainty is then selected and the computational simulations are performed. Examples are provided to demonstrate this approach on small scale problems. These examples give encouraging results. Directions for further research are indicated.
Intelligent control of a planning system for astronaut training.
Ortiz, J; Chen, G
1999-07-01
This work intends to design, analyze and solve, from the systems control perspective, a complex, dynamic, and multiconstrained planning system for generating training plans for crew members of the NASA-led International Space Station. Various intelligent planning systems have been developed within the framework of artificial intelligence. These planning systems generally lack a rigorous mathematical formalism to allow a reliable and flexible methodology for their design, modeling, and performance analysis in a dynamical, time-critical, and multiconstrained environment. Formulating the planning problem in the domain of discrete-event systems under a unified framework such that it can be modeled, designed, and analyzed as a control system will provide a self-contained theory for such planning systems. This will also provide a means to certify various planning systems for operations in the dynamical and complex environments in space. The work presented here completes the design, development, and analysis of an intricate, large-scale, and representative mathematical formulation for intelligent control of a real planning system for Space Station crew training. This planning system has been tested and used at NASA-Johnson Space Center.
Reversible heart rhythm complexity impairment in patients with primary aldosteronism
NASA Astrophysics Data System (ADS)
Lin, Yen-Hung; Wu, Vin-Cent; Lo, Men-Tzung; Wu, Xue-Ming; Hung, Chi-Sheng; Wu, Kwan-Dun; Lin, Chen; Ho, Yi-Lwun; Stowasser, Michael; Peng, Chung-Kang
2015-08-01
Excess aldosterone secretion in patients with primary aldosteronism (PA) impairs their cardiovascular system. Heart rhythm complexity analysis, derived from heart rate variability (HRV), is a powerful tool to quantify the complex regulatory dynamics of human physiology. We prospectively analyzed 20 patients with aldosterone producing adenoma (APA) who underwent adrenalectomy and 25 patients with essential hypertension (EH). The heart rate data were analyzed by conventional HRV and heart rhythm complexity analysis including detrended fluctuation analysis (DFA) and multiscale entropy (MSE). We found APA patients had significantly decreased DFAα2 on DFA analysis and decreased area 1-5, area 6-15, and area 6-20 on MSE analysis (all p < 0.05). Area 1-5, area 6-15, and area 6-20 in the MSE study correlated significantly with log-transformed renin activity and the log-transformed aldosterone-renin ratio (all p ≤ 0.01). The conventional HRV parameters were comparable between PA and EH patients. After adrenalectomy, all the altered DFA and MSE parameters improved significantly (all p < 0.05), while the conventional HRV parameters did not change. Our results suggest that heart rhythm complexity is impaired in APA patients and that this is at least partially reversed by adrenalectomy.
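Multiscale entropy, used above, coarse-grains the heart-rate series and computes sample entropy at each scale; the "area 1-5" style indices are sums of this curve over scale ranges. A minimal O(n²) sketch follows; note the tolerance r is taken here as an absolute value, whereas in practice it is usually set to 0.15 × SD of the original series.

```python
import math

def coarse_grain(x, scale):
    """MSE coarse-graining: average consecutive, non-overlapping windows
    of length `scale`."""
    n = len(x) // scale
    return [sum(x[i * scale:(i + 1) * scale]) / scale for i in range(n)]

def sample_entropy(x, m=2, r=0.15):
    """Sample entropy SampEn(m, r) = -ln(A/B), where B counts pairs of
    templates of length m within tolerance r (Chebyshev distance) and A
    counts pairs of length m + 1. Quadratic in len(x); illustrative only."""
    n = len(x)
    def matches(length):
        c = 0
        for i in range(n - length):
            for j in range(i + 1, n - length):
                if max(abs(x[i + k] - x[j + k]) for k in range(length)) <= r:
                    c += 1
        return c
    b, a = matches(m), matches(m + 1)
    return float("inf") if a == 0 or b == 0 else -math.log(a / b)

def multiscale_entropy(x, max_scale=5, m=2, r=0.15):
    """SampEn of the coarse-grained series at scales 1..max_scale; summing
    this curve over a scale range gives an 'area' index like those above."""
    return [sample_entropy(coarse_grain(x, s), m, r)
            for s in range(1, max_scale + 1)]
```

A highly regular series yields near-zero entropy at every scale, whereas physiologic complexity shows up as sustained entropy across scales, which is why reduced MSE areas indicate impaired regulation.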
Hofmann, Matthias J.; Koelsch, Patrick
2015-01-01
Vibrational sum-frequency generation (SFG) spectroscopy has become an established technique for in situ surface analysis. While spectral recording procedures and hardware have been optimized, unique data analysis routines have yet to be established. The SFG intensity is related to probing geometries and properties of the system under investigation, such as the absolute square of the second-order susceptibility, |χ(2)|². A conventional SFG intensity measurement does not grant access to the complex parts of χ(2) unless further assumptions have been made. It is therefore difficult, sometimes impossible, to establish a unique fitting solution for SFG intensity spectra. Recently, interferometric phase-sensitive SFG or heterodyne detection methods have been introduced to measure real and imaginary parts of χ(2) experimentally. Here, we demonstrate that iterative phase-matching between complex spectra retrieved from maximum entropy method analysis and fitting of intensity SFG spectra (iMEMfit) leads to a unique solution for the complex parts of χ(2) and enables quantitative analysis of SFG intensity spectra. A comparison between complex parts retrieved by iMEMfit applied to intensity spectra and phase sensitive experimental data shows excellent agreement between the two methods.
A Review of Diagnostic Techniques for ISHM Applications
NASA Technical Reports Server (NTRS)
Patterson-Hine, Ann; Biswas, Gautam; Aaseng, Gordon; Narasimhan, Sriam; Pattipati, Krishna
2005-01-01
System diagnosis is an integral part of any Integrated System Health Management application. Diagnostic applications make use of system information from the design phase, such as safety and mission assurance analysis, failure modes and effects analysis, hazards analysis, functional models, fault propagation models, and testability analysis. In modern process control and equipment monitoring systems, topological and analytic models of the nominal system, derived from design documents, are also employed for fault isolation and identification. Depending on the complexity of the monitored signals from the physical system, diagnostic applications may involve straightforward trending and feature extraction techniques to retrieve the parameters of importance from the sensor streams. They also may involve very complex analysis routines, such as signal processing, learning or classification methods to derive the parameters of importance to diagnosis. The process that is used to diagnose anomalous conditions from monitored system signals varies widely across the different approaches to system diagnosis. Rule-based expert systems, case-based reasoning systems, model-based reasoning systems, learning systems, and probabilistic reasoning systems are examples of the many diverse approaches to diagnostic reasoning. Many engineering disciplines have specific approaches to modeling, monitoring and diagnosing anomalous conditions. Therefore, there is no "one-size-fits-all" approach to building diagnostic and health monitoring capabilities for a system. For instance, the conventional approaches to diagnosing failures in rotorcraft applications are very different from those used in communications systems. Further, online and offline automated diagnostic applications are integrated into an operations framework with flight crews, flight controllers and maintenance teams.
While the emphasis of this paper is automation of health management functions, striking the correct balance between automated and human-performed tasks is a vital concern.
Developing interprofessional education online: An ecological systems theory analysis.
Bluteau, Patricia; Clouder, Lynn; Cureton, Debra
2017-07-01
This article relates the findings of a discourse analysis of an online asynchronous interprofessional learning initiative involving two UK universities. The impact of the initiative is traced over three intensive periods of online interaction, each of several weeks' duration, occurring over a three-year period, through an analysis of a random sample of discussion forum threads. The corpus of rich data drawn from the forums is interpreted using ecological systems theory, which highlights the complexity of interaction of individual, social and cultural elements. Ecological systems theory adopts a life course approach to understand how development occurs through processes of progressively more complex reciprocal interaction between people and their environment. This lens provides a novel approach for analysis and interpretation of findings with respect to the impact of pre-registration interprofessional education and the interaction between the individual and their social and cultural contexts as they progress through 3/4 years of their programmes. Development is mapped over time (the chronosystem) to highlight the complexity of interaction across microsystems (individual), mesosystems (curriculum and institutional/care settings), exosystems (community/wider local context), and macrosystems (national context and culture). This article illustrates the intricacies of students' interprofessional development over time and the interactive effects of social ecological components in terms of professional knowledge and understanding, wider appreciation of health and social care culture and identity work. The implications for contemporary pre-registration interprofessional education and the usefulness and applicability of ecological systems theory for future research and development are considered.
Trenholm, Susan; Ferlie, Ewan
2013-09-01
We employ complexity theory to analyse the English National Health Service (NHS)'s organisational response to resurgent tuberculosis across London. Tennison (2002) suggests that complexity theory could fruitfully explore a healthcare system's response to this complex and emergent phenomenon: we explore this claim here. We also bring in established New Public Management principles to enhance our empirical analysis, which is based on data collected between late 2009 and mid-2011. We find that the operation of complexity theory based features, especially self-organisation, are significantly impacted by the macro context of a New Public Management-based regime which values control, measurement and risk management more than innovation, flexibility and lateral system building. We finally explore limitations and suggest perspectives for further research.
NASA Astrophysics Data System (ADS)
Bosikov, I. I.; Klyuev, R. V.; Revazov, V. Ch; Pilieva, D. E.
2018-03-01
The article describes the research and analysis of hazardous processes occurring in a natural-industrial system and the assessment of the effectiveness of its functioning using mathematical models. Studies of the functioning regularities of natural-industrial systems are becoming increasingly relevant in connection with the task of modernizing production and the economy of Russia as a whole. Given the significant amount of poorly structured data, it is difficult to establish regulations for the effective functioning of production processes and social and natural complexes under which sustainable development of the natural-industrial system of the mining and processing complex would be ensured. The scientific and applied problems whose solution makes it possible to formalize the hidden structural patterns of the natural-industrial system's functioning, and to make organizational and technological management decisions that improve the system's efficiency, are therefore highly relevant.
ERIC Educational Resources Information Center
Skinner, Anna; Diller, David; Kumar, Rohit; Cannon-Bowers, Jan; Smith, Roger; Tanaka, Alyssa; Julian, Danielle; Perez, Ray
2018-01-01
Background: Contemporary work in the design and development of intelligent training systems employs task analysis (TA) methods for gathering knowledge that is subsequently encoded into task models. These task models form the basis of intelligent interpretation of student performance within education and training systems. Also referred to as expert…
Phase locking route behind complex periodic windows in a forced oscillator
NASA Astrophysics Data System (ADS)
Jan, Hengtai; Tsai, Kuo-Ting; Kuo, Li-wei
2013-09-01
Chaotic systems react in complex ways to an external driving force; even for low-dimensional oscillators, the routes to synchronization are diverse. We propose a stroboscope-based method for analyzing driven chaotic systems in their phase space. From two statistical quantities generated from time series, we can determine the system state and the driving behavior simultaneously. We demonstrate our method on a driven bi-stable system, which shows complex periodic windows under a proper driving force. With increasing periodic driving force, a route from interior periodic oscillation to phase synchronization through a chaotic state can be found. Periodic windows can also be identified and the circumstances under which they occur distinguished. The statistical results are supported by conditional Lyapunov exponent analysis, demonstrating the method's power for analyzing unknown time series.
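The abstract does not specify its driven bi-stable system, so as a hypothetical stand-in the forced Duffing oscillator illustrates the core of the method: stroboscopic (once-per-drive-period) sampling of the phase-space trajectory. All parameter values below are illustrative assumptions.

```python
import math

def strobe_duffing(F=0.0, omega=1.2, delta=0.3, x0=0.5, v0=0.0,
                   n_periods=100, steps_per_period=400):
    """Integrate the forced bi-stable Duffing oscillator
        x'' = -delta*x' + x - x**3 + F*cos(omega*t)
    with fixed-step RK4 and record the state (x, v) once per driving
    period T = 2*pi/omega, i.e. the stroboscopic (Poincare) section used
    to classify the driven dynamics."""
    T = 2.0 * math.pi / omega
    dt = T / steps_per_period

    def f(t, x, v):
        return v, -delta * v + x - x ** 3 + F * math.cos(omega * t)

    x, v, t = x0, v0, 0.0
    points = []
    for _ in range(n_periods):
        for _ in range(steps_per_period):
            k1x, k1v = f(t, x, v)
            k2x, k2v = f(t + dt / 2, x + dt / 2 * k1x, v + dt / 2 * k1v)
            k3x, k3v = f(t + dt / 2, x + dt / 2 * k2x, v + dt / 2 * k2v)
            k4x, k4v = f(t + dt, x + dt * k3x, v + dt * k3v)
            x += dt / 6 * (k1x + 2 * k2x + 2 * k3x + k4x)
            v += dt / 6 * (k1v + 2 * k2v + 2 * k3v + k4v)
            t += dt
        points.append((x, v))
    return points
```

With F > 0, the strobe points distinguish phase-locked responses (a few tightly clustered points) from chaotic ones (a scattered cloud), which is what statistics computed on the section can quantify.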
Estimating the Effects of Damping Treatments on the Vibration of Complex Structures
2012-09-26
[Fragmented outline and figure-caption text; recoverable content:] 4.1 Introduction; 4.3 Literature review (4.3.1 CLD theory; 4.3.2 temperature profiling); 4.4 Constrained layer damping analysis; 4.5 Results. Constrained layer damping (CLD) treatment systems are widely used in complex structures to dissipate vibration. Figures note coordinate systems and length scales; "constraining layer", "viscoelastic layer" and "base layer" follow the nomenclature used throughout the CLD analysis.
ERIC Educational Resources Information Center
Vogler, Jane S.; Schallert, Diane L.; Jordan, Michelle E.; Song, Kwangok; Sanders, Anke J. Z.; Te Chiang, Yueh-hui Yan; Lee, Ji-Eun; Park, Jeongbin Hannah; Yu, Li-Tang
2017-01-01
Complex adaptive systems theory served as a framework for this qualitative study exploring the process of how meaning emerges from the collective interactions of individuals in a synchronous online discussion through their shared words about a topic. In an effort to bridge levels of analysis from the individual to the small group to the community,…
1983-12-01
while at the same time improving its operational efficiency. Through their integration and use, System Program Managers have a comprehensive analytical... systems. The NRLA program is hosted on the CREATE Operating System and contains approximately 5,500 lines of computer code. It consists of a main... associated with alternative maintenance plans. As the technological complexity of weapons systems has increased, new and innovative logistical support
NASA Technical Reports Server (NTRS)
Gore, Brian F.
2011-01-01
As automation and advanced technologies are introduced into transport systems ranging from the Next Generation Air Transportation System termed NextGen, to the advanced surface transportation systems as exemplified by the Intelligent Transportation Systems, to future systems designed for space exploration, there is an increased need to validly predict how the future systems will be vulnerable to error given the demands imposed by the assistive technologies. One formalized approach to study the impact of assistive technologies on the human operator in a safe and non-obtrusive manner is through the use of human performance models (HPMs). HPMs play an integral role when complex human-system designs are proposed, developed, and tested. One HPM tool termed the Man-machine Integration Design and Analysis System (MIDAS) is a NASA Ames Research Center HPM software tool that has been applied to predict human-system performance in various domains since 1986. MIDAS is a dynamic, integrated HPM and simulation environment that facilitates the design, visualization, and computational evaluation of complex man-machine system concepts in simulated operational environments. The paper will discuss a range of aviation-specific applications including an approach used to model human error for NASA's Aviation Safety Program, and what-if analyses to evaluate flight deck technologies for NextGen operations. This chapter will culminate by raising two challenges for the field of predictive HPMs for complex human-system designs that evaluate assistive technologies: that of (1) model transparency and (2) model validation.
Hierarchical modeling and robust synthesis for the preliminary design of large scale complex systems
NASA Astrophysics Data System (ADS)
Koch, Patrick Nathan
Large-scale complex systems are characterized by multiple interacting subsystems and the analysis of multiple disciplines. The design and development of such systems inevitably requires the resolution of multiple conflicting objectives. The size of complex systems, however, prohibits the development of comprehensive system models, and thus these systems must be partitioned into their constituent parts. Because simultaneous solution of individual subsystem models is often not manageable, iteration is inevitable and often excessive. In this dissertation these issues are addressed through the development of a method for hierarchical robust preliminary design exploration, which facilitates concurrent system and subsystem design exploration and the concurrent generation of robust system and subsystem specifications for the preliminary design of multi-level, multi-objective, large-scale complex systems. This method is developed through the integration and expansion of current design techniques: (1) hierarchical partitioning and modeling techniques for partitioning large-scale complex systems into more tractable parts and allowing integration of subproblems for system synthesis, (2) statistical experimentation and approximation techniques for increasing both the efficiency and the comprehensiveness of preliminary design exploration, and (3) noise modeling techniques for implementing robust preliminary design when approximate models are employed. The method developed and associated approaches are illustrated through their application to the preliminary design of a commercial turbofan turbine propulsion system; the turbofan system-level problem is partitioned into engine cycle and configuration design, and a compressor module is integrated for more detailed subsystem-level design exploration, improving system evaluation.
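The statistical experimentation and approximation step can be illustrated with a small sketch: fit a quadratic response surface to a handful of sampled "expensive" subsystem analyses so that system-level exploration can iterate on the cheap surrogate instead. The test function, sample count, and noise level below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(3)

# 30 sampled design points in a 2-variable design space, with a
# hypothetical "expensive analysis" output plus measurement noise.
x = rng.uniform(-1, 1, size=(30, 2))
y = (1.0 + 2.0 * x[:, 0] - x[:, 1] + 0.5 * x[:, 0] * x[:, 1]
     + 0.05 * rng.normal(size=30))

# Design matrix with intercept, linear, interaction, and pure quadratic
# terms; least squares gives the response-surface coefficients.
A = np.column_stack([np.ones(30), x[:, 0], x[:, 1],
                     x[:, 0] * x[:, 1], x[:, 0] ** 2, x[:, 1] ** 2])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

def surrogate(x1, x2):
    """Cheap approximation used in place of the expensive analysis."""
    return coef @ np.array([1.0, x1, x2, x1 * x2, x1 ** 2, x2 ** 2])
```

In the hierarchical setting, each subsystem would expose a surrogate like this to the system-level exploration loop.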
Preliminary Work Domain Analysis for Human Extravehicular Activity
NASA Technical Reports Server (NTRS)
McGuire, Kerry; Miller, Matthew; Feigh, Karen
2015-01-01
A work domain analysis (WDA) of human extravehicular activity (EVA) is presented in this study. A formative methodology such as Cognitive Work Analysis (CWA) offers a new perspective to the knowledge gained from the past 50 years of living and working in space for the development of future EVA support systems. EVA is a vital component of human spaceflight and provides a case study example of applying a work domain analysis (WDA) to a complex sociotechnical system. The WDA presented here illustrates how the physical characteristics of the environment, hardware, and life support systems of the domain guide the potential avenues and functional needs of future EVA decision support system development.
Using value-based analysis to influence outcomes in complex surgical systems.
Kirkpatrick, John R; Marks, Stanley; Slane, Michele; Kim, Donald; Cohen, Lance; Cortelli, Michael; Plate, Juan; Perryman, Richard; Zapas, John
2015-04-01
Value-based analysis (VBA) is a management strategy used to determine changes in value (quality/cost) when a usual practice (UP) is replaced by a best practice (BP). Previously validated in clinical initiatives, its usefulness in complex systems is unknown. To answer this question, we used VBA to correct deficiencies in cardiac surgery at Memorial Healthcare System. Cardiac surgery is a complex surgical system that lends itself to VBA because outcomes metrics provided by the Society of Thoracic Surgeons provide an estimate of quality; cost is available from the Centers for Medicare and Medicaid Services and other contemporary sources; the UP can be determined; and the BP can be established. Analysis of the UP at Memorial Healthcare System revealed considerable deficiencies in selection of patients for surgery; the surgery itself, including choice of procedure and outcomes; aftercare; follow-up; and control of expenditures. To correct these deficiencies, each UP was replaced with a BP. Changes included replacement of most of the cardiac surgeons; conversion to an employed physician model; restructuring of a heart surgery unit; recruitment of cardiac anesthesiologists; introduction of an interactive educational program; elimination of unsafe practices; and reduction of cost. There was a significant (p < 0.01) reduction in readmissions, complications, and mortality between 2009 and 2013. Memorial Healthcare System was 1 of only 17 (1.7%) of the 1,009 database participants to achieve a Society of Thoracic Surgeons 3-star rating in all 3 measured categories. Even as quality improved substantially, the cost per case and the length of stay declined. These changes created a savings opportunity of $14 million, with actual savings of $10.4 million. These findings suggest that VBA can be a powerful tool to enhance value (quality/cost) in a complex surgical system. Copyright © 2015 American College of Surgeons. Published by Elsevier Inc. All rights reserved.
[Social actors and phenomenologic modelling].
Laflamme, Simon
2012-05-01
The phenomenological approach holds a quasi-monopoly on analyses of the individual and of subjectivity in the social sciences. However, the conceptual apparatus associated with this approach is very restrictive: the human being must be understood as rational, conscious, intentional, interested, and autonomous. As a result, a large dimension of human activity cannot be taken into consideration: everything that does not fit into these analytical categories (the nonrational, the nonconscious, etc.). Moreover, this approach cannot really move toward a relational analysis, except between individuals predefined by its conceptual apparatus. This lack of complexity makes it difficult to establish links between phenomenology and systemic analysis, in which relation (and its derivatives, such as recursiveness, dialectic, and correlation) plays an essential role. This article proposes a way for systemic analysis to apprehend the individual in his complexity.
NASA Technical Reports Server (NTRS)
Davies, Misty D.; Gundy-Burlet, Karen
2010-01-01
A useful technique for the validation and verification of complex flight systems is Monte Carlo Filtering -- a global sensitivity analysis that tries to find the inputs and ranges that are most likely to lead to a subset of the outputs. A thorough exploration of the parameter space for complex integrated systems may require thousands of experiments and hundreds of controlled and measured variables. Tools for analyzing this space often have limitations caused by the numerical problems associated with high dimensionality and caused by the assumption of independence of all of the dimensions. To combat both of these limitations, we propose a technique that uses a combination of the original variables with the derived variables obtained during a principal component analysis.
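The proposed combination can be sketched minimally: run a principal component analysis on the (correlated) inputs, then apply Monte Carlo filtering on the derived variables by comparing their distributions between runs that do and do not fall in an output subset of interest. The data-generating model below is synthetic and purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical experiment matrix: 1000 runs of a simulated system with
# 5 correlated input parameters and one scalar output metric.
n_runs = 1000
latent = rng.normal(size=(n_runs, 2))
mix = np.array([[1.0, 0.5, 0.2, 0.0, 0.3],
                [0.0, 0.8, 0.9, 1.0, 0.1]])
inputs = latent @ mix + 0.1 * rng.normal(size=(n_runs, 5))
output = inputs[:, 0] + inputs[:, 3] ** 2 + 0.1 * rng.normal(size=n_runs)

# Principal component analysis via SVD on the standardized inputs,
# giving derived variables that are mutually uncorrelated.
z = (inputs - inputs.mean(axis=0)) / inputs.std(axis=0)
_, _, vt = np.linalg.svd(z, full_matrices=False)
scores = z @ vt.T            # derived (principal component) variables

# Monte Carlo filtering: partition runs by whether the output falls in
# the subset of interest (here, the top decile), then compare the derived
# variable distributions between the two groups.
fail = output > np.quantile(output, 0.9)
separation = [abs(scores[fail, k].mean() - scores[~fail, k].mean())
              for k in range(scores.shape[1])]
most_implicated = int(np.argmax(separation))
```

Working in the derived-variable space sidesteps the independence assumption that limits filtering on the raw, correlated inputs.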
Autonomous perception and decision making in cyber-physical systems
NASA Astrophysics Data System (ADS)
Sarkar, Soumik
2011-07-01
The cyber-physical system (CPS) is a relatively new interdisciplinary technology area that includes the general class of embedded and hybrid systems. CPSs require integration of computation and physical processes that involves the aspects of physical quantities such as time, energy and space during information processing and control. The physical space is the source of information and the cyber space makes use of the generated information to make decisions. This dissertation proposes an overall architecture of autonomous perception-based decision & control of complex cyber-physical systems. Perception involves the recently developed framework of Symbolic Dynamic Filtering for abstraction of physical world in the cyber space. For example, under this framework, sensor observations from a physical entity are discretized temporally and spatially to generate blocks of symbols, also called words that form a language. A grammar of a language is the set of rules that determine the relationships among words to build sentences. Subsequently, a physical system is conjectured to be a linguistic source that is capable of generating a specific language. The proposed technology is validated on various (experimental and simulated) case studies that include health monitoring of aircraft gas turbine engines, detection and estimation of fatigue damage in polycrystalline alloys, and parameter identification. Control of complex cyber-physical systems involve distributed sensing, computation, control as well as complexity analysis. A novel statistical mechanics-inspired complexity analysis approach is proposed in this dissertation. In such a scenario of networked physical systems, the distribution of physical entities determines the underlying network topology and the interaction among the entities forms the abstract cyber space. 
It is envisioned that the general contributions, made in this dissertation, will be useful for potential application areas such as smart power grids and buildings, distributed energy systems, advanced health care procedures and future ground and air transportation systems.
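The symbolization step of Symbolic Dynamic Filtering can be sketched simply: partition the signal range into cells, map each sample to a symbol, and summarize the symbol stream by its word (state) frequencies. The equal-probability partitioning and the test signal below are illustrative assumptions, not the dissertation's implementation.

```python
import numpy as np

def symbolize(series, n_symbols=4):
    """Partition the signal range into equal-probability cells and map
    each sample to a symbol (a simplified maximum-entropy partitioning,
    one common choice in symbolic dynamic filtering)."""
    edges = np.quantile(series, np.linspace(0, 1, n_symbols + 1)[1:-1])
    return np.digitize(series, edges)

def word_distribution(symbols, depth=2):
    """Frequencies of fixed-length words; this state-probability vector
    is the compact cyber-space representation of the physical signal."""
    words = {}
    for i in range(len(symbols) - depth + 1):
        w = tuple(symbols[i:i + depth])
        words[w] = words.get(w, 0) + 1
    total = sum(words.values())
    return {w: c / total for w, c in words.items()}

rng = np.random.default_rng(2)
t = np.linspace(0, 20 * np.pi, 4000)
nominal = np.sin(t) + 0.1 * rng.normal(size=t.size)   # hypothetical sensor data
p_nominal = word_distribution(symbolize(nominal))
```

Anomaly detection would then compare this word distribution against one computed from a possibly degraded signal, e.g. via a distance between the two probability vectors.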
Research on the EDM Technology for Micro-holes at Complex Spatial Locations
NASA Astrophysics Data System (ADS)
Y Liu, J.; Guo, J. M.; Sun, D. J.; Cai, Y. H.; Ding, L. T.; Jiang, H.
2017-12-01
To meet the demands of machining micro-holes at complex spatial locations, several key technical problems are solved, including development of the micro-Electrical Discharge Machining (micro-EDM) power supply system, design of the host structure, and the machining process technology. By developing a low-voltage power supply circuit, a high-voltage circuit, a micro and precision machining circuit, and a clearance detection system, a narrow-pulse, high-frequency, six-axis EDM power supply system is developed to meet the demands of micro-hole discharge machining. By combining CAD structure design, CAE simulation analysis, modal testing, ODS (Operational Deflection Shapes) testing, and theoretical analysis, the host construction and key axes of the machine tool are optimized to meet the positioning demands of the micro-holes. A dedicated deionized water filtration system is developed to keep the machining process stable. The machining equipment and process technology developed in this paper are verified by developing the micro-hole processing flow and testing on the real machine tool. The final test results show that the efficient micro-EDM pulse power supply system, machine tool host system, deionized filtration system, and processing method developed in this paper meet the demands of machining micro-holes at complex spatial locations.
The Speech Community in Evolutionary Language Dynamics
ERIC Educational Resources Information Center
Blythe, Richard A.; Croft, William A.
2009-01-01
Language is a complex adaptive system: Speakers are agents who interact with each other, and their past and current interactions feed into speakers' future behavior in complex ways. In this article, we describe the social cognitive linguistic basis for this analysis of language and a mathematical model developed in collaboration between…
Weiss, Michael; Hultsch, Henrike; Adam, Iris; Scharff, Constance; Kipper, Silke
2014-06-22
The singing of song birds can form complex signal systems comprised of numerous subunits sung with distinct combinatorial properties that have been described as syntax-like. This complexity has inspired inquiries into similarities of bird song to human language; but the quantitative analysis and description of song sequences is a challenging task. In this study, we analysed song sequences of common nightingales (Luscinia megarhynchos) by means of a network analysis. We translated long nocturnal song sequences into networks of song types with song transitions as connectors. As network measures, we calculated shortest path length and transitivity and identified the 'small-world' character of nightingale song networks. Besides comparing network measures with conventional measures of song complexity, we also found a correlation between network measures and age of birds. Furthermore, we determined the numbers of in-coming and out-going edges of each song type, characterizing transition patterns. These transition patterns were shared across males for certain song types. Playbacks with different transition patterns provided first evidence that these patterns are responded to differently and thus play a role in singing interactions. We discuss potential functions of the network properties of song sequences in the framework of vocal leadership. Network approaches provide biologically meaningful parameters to describe the song structure of species with extremely large repertoires and complex rules of song retrieval.
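The translation of a song sequence into a network and the reported measures can be sketched in plain Python. The song sequence below is hypothetical; mean shortest path length and transitivity are computed on the undirected projection of the transition graph, and the in-/out-degrees characterize each song type's transition pattern.

```python
from collections import defaultdict, deque
from itertools import combinations

# Hypothetical song sequence: each letter is a song type; consecutive
# types define transition edges, as in the nightingale network analysis.
sequence = list("ABCABDACEBDAEC" * 3)

out_edges = defaultdict(set)
in_edges = defaultdict(set)
for a, b in zip(sequence, sequence[1:]):
    if a != b:
        out_edges[a].add(b)
        in_edges[b].add(a)

nodes = sorted(set(sequence))
undirected = {n: (out_edges[n] | in_edges[n]) - {n} for n in nodes}

def shortest_path_length(src, dst):
    """Breadth-first search on the undirected song-type graph."""
    seen, frontier = {src}, deque([(src, 0)])
    while frontier:
        node, d = frontier.popleft()
        if node == dst:
            return d
        for nxt in undirected[node]:
            if nxt not in seen:
                seen.add(nxt)
                frontier.append((nxt, d + 1))
    return float("inf")

# Mean shortest path length over all song-type pairs.
pairs = list(combinations(nodes, 2))
mean_path = sum(shortest_path_length(a, b) for a, b in pairs) / len(pairs)

# Transitivity: 3 * triangles / connected triples.
triangles = sum(1 for a, b, c in combinations(nodes, 3)
                if b in undirected[a] and c in undirected[a]
                and c in undirected[b])
triples = sum(len(undirected[n]) * (len(undirected[n]) - 1) // 2
              for n in nodes)
transitivity = 3 * triangles / triples if triples else 0.0

# In-/out-degree per song type characterizes its transition pattern.
degree_profile = {n: (len(in_edges[n]), len(out_edges[n])) for n in nodes}
```

A short mean path combined with high transitivity relative to a random graph of the same size is the "small-world" signature the study reports.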
Analysis of chaos in high-dimensional wind power system.
Wang, Cong; Zhang, Hongli; Fan, Wenhui; Ma, Ping
2018-01-01
A comprehensive analysis of the chaos of a high-dimensional wind power system is performed in this study. A high-dimensional wind power system is more complex than most power systems. An 11-dimensional wind power system proposed by Huang, which has not been analyzed in previous studies, is investigated. When the system is affected by external disturbances, including single-parameter and periodic disturbances, or when its parameters change, the chaotic dynamics of the wind power system are analyzed and chaotic parameter ranges are obtained. The existence of chaos is confirmed by calculating and analyzing all state variables' Lyapunov exponents and the state-variable sequence diagrams. Theoretical analysis and numerical simulations show that chaos will occur in the wind power system when parameter variations and external disturbances reach a certain degree.
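For a map whose derivative is known, the largest Lyapunov exponent used to confirm chaos is simply the orbit average of the log absolute derivative. A one-dimensional illustration on the logistic map follows (not the 11-dimensional wind power system, which requires the Jacobian-based or orbit-separation variants of the same idea); the parameter values are standard textbook choices.

```python
import math

def largest_lyapunov_logistic(r, x0=0.4, n_transient=1000, n_iter=20000):
    """Average log-derivative along an orbit of the logistic map
    x -> r*x*(1-x); positive values signal chaos, negative values a
    periodic window."""
    x = x0
    for _ in range(n_transient):        # discard the transient
        x = r * x * (1 - x)
    acc = 0.0
    for _ in range(n_iter):
        x = r * x * (1 - x)
        acc += math.log(abs(r * (1 - 2 * x)))   # log |f'(x)|
    return acc / n_iter

chaotic = largest_lyapunov_logistic(4.0)   # fully chaotic regime, ~ln 2
periodic = largest_lyapunov_logistic(3.2)  # stable 2-cycle, negative
```

For the high-dimensional case, the same averaging is applied to the singular values of products of Jacobians along the orbit, which yields the full Lyapunov spectrum.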
Complexity and chaos control in a discrete-time prey-predator model
NASA Astrophysics Data System (ADS)
Din, Qamar
2017-08-01
We investigate the complex behavior and chaos control in a discrete-time prey-predator model. Taking into account the Leslie-Gower prey-predator model, we propose a discrete-time prey-predator system with predator partially dependent on prey and investigate the boundedness, existence and uniqueness of positive equilibrium and bifurcation analysis of the system by using center manifold theorem and bifurcation theory. Various feedback control strategies are implemented for controlling the bifurcation and chaos in the system. Numerical simulations are provided to illustrate theoretical discussion.
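One way to see the route from a stable equilibrium to complex behavior in a discrete prey-predator map is to count distinct late-time states as the prey growth parameter increases. The Ricker-type modified Leslie-Gower discretization and all parameter values below are illustrative; the paper's exact map may differ.

```python
import math

def leslie_gower_step(x, y, r, s=0.4, a=1.0, h=0.2):
    # One Ricker-type discretization of a modified Leslie-Gower
    # prey-predator model: prey x grows logistically and is suppressed
    # by predator y; the predator's carrying capacity tracks x + h.
    x_next = x * math.exp(r * (1 - x) - a * y)
    y_next = y * math.exp(s * (1 - y / (x + h)))
    return x_next, y_next

def attractor_size(r, n_transient=2000, n_sample=200):
    """Rough count of distinct late-time prey values: 1 suggests a
    stable equilibrium, a few a periodic cycle, many a complex or
    chaotic regime."""
    x, y = 0.6, 0.6
    for _ in range(n_transient):
        x, y = leslie_gower_step(x, y, r)
    seen = set()
    for _ in range(n_sample):
        x, y = leslie_gower_step(x, y, r)
        seen.add(round(x, 4))
    return len(seen)

low = attractor_size(1.5)    # low growth rate: stable coexistence
high = attractor_size(3.5)   # high growth rate: non-equilibrium dynamics
```

A feedback control study would add a term such as -k*(x - x_star) inside the prey exponent and repeat the same scan to check which gains restore a small attractor.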
NASA Astrophysics Data System (ADS)
Sharma, Raj Pal; Saini, Anju; Kumar, Santosh; Kumar, Jitendra; Sathishkumar, Ranganathan; Venugopalan, Paloth
2017-01-01
A new anionic copper(II) complex, (MeImH)2[Cu(pfbz)4] (1), where MeImH = 2-methylimidazolium and pfbz = pentafluorobenzoate, has been isolated by reacting copper(II) sulfate pentahydrate, pentafluorobenzoic acid, and 2-methylimidazole in an ethanol:water mixture in a 1:2:2 molar ratio. Complex 1 has been characterized by elemental analysis, thermogravimetric analysis, spectroscopic techniques (UV-Vis, FT-IR), and conductance measurements. The complex salt crystallizes in the monoclinic crystal system with space group C2/c. Single-crystal X-ray structure determination revealed the presence of discrete ions: the [Cu(pfbz)4]2- anion and two 2-methylimidazolium cations (C4H7N2)+. The crystal lattice is stabilized by strong hydrogen bonding and F⋯F interactions between the cationic-anionic and anionic-anionic moieties, respectively, besides π-π interactions.
Intelligibility in microbial complex systems: Wittgenstein and the score of life.
Baquero, Fernando; Moya, Andrés
2012-01-01
Knowledge in microbiology is reaching an extreme level of diversification and complexity, which paradoxically results in a strong reduction in the intelligibility of microbial life. In our days, the "score of life" metaphor is more accurate to express the complexity of living systems than the classic "book of life." Music and life can be represented at lower hierarchical levels by music scores and genomic sequences, and such representations have a generational influence in the reproduction of music and life. If music can be considered as a representation of life, such representation remains as unthinkable as life itself. The analysis of scores and genomic sequences might provide mechanistic, phylogenetic, and evolutionary insights into music and life, but not about their real dynamics and nature, which is still maintained unthinkable, as was proposed by Wittgenstein. As complex systems, life or music is composed by thinkable and only showable parts, and a strategy of half-thinking, half-seeing is needed to expand knowledge. Complex models for complex systems, based on experiences on trans-hierarchical integrations, should be developed in order to provide a mixture of legibility and imageability of biological processes, which should lead to higher levels of intelligibility of microbial life.
Complex Networks/Foundations of Information Systems
2013-03-06
...the benefit of feedback or dynamic correlations in coding and protocol. Using Rényi correlation analysis and entropy to model this wider class of... dynamic heterogeneous conditions. Lizhong Zheng, MIT: Rényi channel correlation analysis (connected to geometric curvature); network channel...
Comparative analysis of techniques for evaluating the effectiveness of aircraft computing systems
NASA Technical Reports Server (NTRS)
Hitt, E. F.; Bridgman, M. S.; Robinson, A. C.
1981-01-01
Performability analysis is a technique developed for evaluating the effectiveness of fault-tolerant computing systems in multiphase missions. Performability was evaluated for its accuracy, practical usefulness, and relative cost. The evaluation was performed by applying performability and the fault tree method to a set of sample problems ranging from simple to moderately complex. The problems involved as many as five outcomes, two to five mission phases, permanent faults, and some functional dependencies. Transient faults and software errors were not considered. A different analyst was responsible for each technique. Significantly more time and effort were required to learn performability analysis than the fault tree method. Performability is inherently as accurate as fault tree analysis. For the sample problems, fault trees were more practical and less time consuming to apply, while performability required less ingenuity and was more checkable. Performability offers some advantages for evaluating very complex problems.
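The fault tree side of the comparison reduces to combining basic-event probabilities through AND/OR gates under an independence assumption. The two-phase mission, the event set, and the probabilities below are hypothetical, chosen only to show the mechanics.

```python
# Minimal fault-tree evaluation for a hypothetical two-phase mission:
# basic-event failure probabilities combine through OR (union) and
# AND (intersection) gates, assuming independent events.

def gate_or(*ps):
    """Probability that at least one input event occurs."""
    prob = 1.0
    for p in ps:
        prob *= (1.0 - p)
    return 1.0 - prob

def gate_and(*ps):
    """Probability that all input events occur."""
    prob = 1.0
    for p in ps:
        prob *= p
    return prob

# Hypothetical basic events: two redundant computers and a shared bus.
p_computer = 1e-3   # per-phase failure probability of one computer
p_bus = 1e-4        # per-phase failure probability of the bus

# The system fails in a phase if both computers fail or the bus fails.
p_phase = gate_or(gate_and(p_computer, p_computer), p_bus)

# The mission fails if either of the two phases fails.
p_mission = gate_or(p_phase, p_phase)
```

Performability generalizes this binary fail/succeed picture by assigning probabilities to several graded outcome levels per phase, which is what makes it better suited to the very complex cases the paper describes.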
NASA Astrophysics Data System (ADS)
Wlodarczyk, Jakub; Kierdaszuk, Borys
2005-08-01
Decays of tyrosine fluorescence in protein-ligand complexes are described by a model of a continuous distribution of fluorescence lifetimes. The resulting analytical power-like decay function provides good fits to highly complex fluorescence kinetics. Moreover, it is a manifestation of the so-called Tsallis q-exponential function, which is suitable for describing systems with long-range interactions, memory effects, and fluctuations of the characteristic fluorescence lifetime. The proposed decay functions were applied to the analysis of fluorescence decays of tyrosine in a protein, the enzyme purine nucleoside phosphorylase from E. coli (the product of the deoD gene), free in aqueous solution and in a complex with formycin A (an inhibitor) and orthophosphate (a co-substrate). The power-like function provides new information about enzyme-ligand complex formation based on a physically justified heterogeneity parameter directly related to the lifetime distribution. A measure of the heterogeneity parameter in the enzyme systems is provided by the variance of the fluorescence lifetime distribution. The possible number of deactivation channels and the excited-state mean lifetime can be easily derived without a priori knowledge of the complexity of the studied system. Moreover, the proposed model is simpler than the traditional multi-exponential one, and better describes the heterogeneous nature of the studied systems.
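A small sketch of fitting the power-like (Tsallis q-exponential) decay to synthetic data follows; q plays the role of the heterogeneity parameter, reducing to a single exponential as q approaches 1. The synthetic parameters and the coarse grid-search fit are illustrative assumptions, not the authors' fitting procedure.

```python
import numpy as np

def q_exponential_decay(t, i0, tau, q):
    # Tsallis q-exponential decay: for q -> 1 this reduces to a single
    # exponential; q > 1 gives the heavy-tailed, power-like decay used
    # for heterogeneous fluorescence kinetics.
    if abs(q - 1.0) < 1e-9:
        return i0 * np.exp(-t / tau)
    return i0 * np.power(1.0 + (q - 1.0) * t / tau, -1.0 / (q - 1.0))

rng = np.random.default_rng(1)
t = np.linspace(0.0, 20.0, 400)              # time axis, e.g. ns
true_i0, true_tau, true_q = 1.0, 3.0, 1.25   # synthetic "measurement"
data = q_exponential_decay(t, true_i0, true_tau, true_q)
data = data + 0.01 * rng.normal(size=t.size)

# Coarse grid search for (tau, q) with i0 fixed by the t = 0 point;
# a real analysis would use a proper nonlinear least-squares fit.
i0_hat = data[0]
best = min(((tau, q) for tau in np.linspace(1, 6, 51)
            for q in np.linspace(1.01, 1.6, 60)),
           key=lambda p: np.sum((data - q_exponential_decay(t, i0_hat, *p)) ** 2))
tau_hat, q_hat = best
```

The fitted q then feeds the heterogeneity interpretation: the further q is above 1, the broader the implied lifetime distribution.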
Applying Early Systems Engineering: Injecting Knowledge into the Capability Development Process
2012-10-01
...involves early use of systems engineering and technical analyses to supplement the existing operational analysis techniques currently used in... complexity, and costs of systems now being developed require tight coupling between operational requirements stated in the CDD, system requirements... Keywords: Capability Development, Competitive Prototyping, Knowledge Points, Early Systems Engineering
MacGregor, Hayley; McKenzie, Andrew; Jacobs, Tanya; Ullauri, Angelica
2018-04-25
In 2011, a decision was made to scale up a pilot innovation involving 'adherence clubs' as a form of differentiated care for HIV positive people in the public sector antiretroviral therapy programme in the Western Cape Province of South Africa. In 2016 we were involved in the qualitative aspect of an evaluation of the adherence club model, the overall objective of which was to assess the health outcomes for patients accessing clubs through epidemiological analysis, and to conduct a health systems analysis to evaluate how the model of care performed at scale. In this paper we adopt a complex adaptive systems lens to analyse planned organisational change through intervention in a state health system. We explore the challenges associated with taking to scale a pilot that began as a relatively simple innovation by a non-governmental organisation. Our analysis reveals how a programme initially representing a simple, unitary system in terms of management and clinical governance had evolved into a complex, differentiated care system. An innovation that was assessed as an excellent idea and received political backing worked well whilst supported on a small scale. However, as scaling up progressed, challenges have emerged at the same time as support has waned. We identified a 'tipping point' at which the system was more likely to fail, as vulnerabilities magnified and the capacity for adaptation was exceeded. Yet the study also revealed the impressive capacity that a health system can have for catalysing novel approaches. We argue that innovation in large-scale, complex programmes in health systems is a continuous process that requires ongoing support and attention to new innovation as challenges emerge. Rapid scaling up is also likely to require recourse to further resources, and a culture of iterative learning to address emerging challenges and mitigate complex system errors.
These are necessary steps to the future success of adherence clubs as a cornerstone of differentiated care. Further research is needed to assess the equity and quality outcomes of a differentiated care model and to ensure the inclusive distribution of the benefits to all categories of people living with HIV.
NASA Astrophysics Data System (ADS)
Li, Shu-Bin; Cao, Dan-Ni; Dang, Wen-Xiu; Zhang, Lin
As a new cross-discipline, complexity science has penetrated every field of economy and society. With the arrival of big data, research in complexity science has reached a new peak. In recent years, it has offered a new perspective on traffic control through complex networks theory. The interaction of various kinds of information in a traffic system forms a huge complex system. A new mesoscopic traffic flow model is developed with variable speed limit (VSL), and the simulation process is designed based on complex networks theory combined with the proposed model. This paper studies the effect of VSL on dynamic traffic flow and then analyzes the optimal control strategy of VSL in different network topologies. The conclusions of this research are meaningful for putting forward reasonable transportation plans and developing effective traffic management and control measures to support traffic management departments.
Clinical quality needs complex adaptive systems and machine learning.
Marsland, Stephen; Buchan, Iain
2004-01-01
The vast increase in clinical data has the potential to bring about large improvements in clinical quality and other aspects of healthcare delivery. However, such benefits do not come without cost. The analysis of such large datasets, particularly where the data may have to be merged from several sources and may be noisy and incomplete, is a challenging task. Furthermore, the introduction of clinical changes is a cyclical task, meaning that the processes under examination operate in an environment that is not static. We suggest that traditional methods of analysis are unsuitable for the task, and identify complexity theory and machine learning as areas that have the potential to facilitate the examination of clinical quality. By its nature the field of complex adaptive systems deals with environments that change because of the interactions that have occurred in the past. We draw parallels between health informatics and bioinformatics, which has already started to successfully use machine learning methods.
Chandler, Jacqueline; Rycroft-Malone, Jo; Hawkes, Claire; Noyes, Jane
2016-02-01
To examine the application of core concepts from Complexity Theory to explain the findings from a process evaluation undertaken in a trial evaluating implementation strategies for recommendations about reducing surgical fasting times. The proliferation of evidence-based guidance requires a greater focus on its implementation. Theory is required to explain the complex processes across the multiple healthcare organizational levels. This social healthcare context involves the interaction between professionals, patients, and the organizational systems in care delivery. Complexity Theory may provide an explanatory framework to explain the complexities inherent in implementation in social healthcare contexts. A secondary thematic analysis of qualitative process evaluation data informed by Complexity Theory. Seminal texts applying Complexity Theory to the social context were annotated, key concepts extracted, and core Complexity Theory concepts identified. These core concepts were applied as a theoretical lens to provide an explanation of themes from a process evaluation of a trial evaluating the implementation of strategies to reduce surgical fasting times. Sampled substantive texts provided a representative spread of theoretical development and application of Complexity Theory from the late 1990s to 2013 in social science, healthcare, management, and philosophy. Five Complexity Theory core concepts extracted were 'self-organization', 'interaction', 'emergence', 'system history' and 'temporality'. Application of these concepts suggests routine surgical fasting practice is habituated in the social healthcare system and therefore it cannot easily be reversed. A reduction in fasting times requires an incentivised new approach to emerge in the surgical system's priority of completing the operating list. The application of Complexity Theory provides a useful explanation for resistance to change fasting practice.
Its utility in implementation research warrants further attention and evaluation. © 2015 John Wiley & Sons Ltd.
Challenges in Visual Analysis of Ensembles
Crossno, Patricia
2018-04-12
Modeling physical phenomena through computational simulation increasingly relies on generating a collection of related runs, known as an ensemble. In this paper, we explore the challenges we face in developing analysis and visualization systems for large and complex ensemble data sets, which we seek to understand without having to view the results of every simulation run. Implementing approaches and ideas developed in response to this goal, we demonstrate the analysis of a 15K run material fracturing study using Slycat, our ensemble analysis system.
Reliability evaluation methodology for NASA applications
NASA Technical Reports Server (NTRS)
Taneja, Vidya S.
1992-01-01
Liquid rocket engine technology has been characterized by the development of complex systems containing a large number of subsystems, components, and parts. The trend toward even larger and more complex systems is continuing. Liquid rocket engineers have focused mainly on performance-driven designs to increase the payload delivery of a launch vehicle for a given mission. In other words, although the failure of a single inexpensive part or component may cause the failure of the system, reliability in general has not been considered as one of the system parameters like cost or performance. Until now, quantification of reliability has not been a consideration during system design and development in the liquid rocket industry. Engineers and managers have long been aware that the reliability of a system increases during development, but no serious attempts have been made to quantify it. As a result, a method to quantify reliability during design and development is needed. This includes the application of probabilistic models which utilize both engineering analysis and test data. Classical methods require the use of operating data for reliability demonstration. In contrast, the method described in this paper is based on similarity, analysis, and testing combined with Bayesian statistical analysis.
Visualizing driving forces of spatially extended systems using the recurrence plot framework
NASA Astrophysics Data System (ADS)
Riedl, Maik; Marwan, Norbert; Kurths, Jürgen
2017-12-01
The increasing availability of highly resolved spatio-temporal data leads to new opportunities as well as challenges in many scientific disciplines such as climatology, ecology or epidemiology. This allows more detailed insights into the investigated spatially extended systems. However, this development needs advanced techniques of data analysis which go beyond standard linear tools, since closer examination often reveals nonlinear phenomena, for example threshold effects. One of these tools is the recurrence plot approach, which has been successfully applied to the description of complex systems. Using this technique's power of visualization, we propose the analysis of the local minima of the underlying distance matrix in order to display driving forces of spatially extended systems. The potential of this novel idea is demonstrated by the analysis of the chlorophyll concentration and the sea surface temperature in the Southern California Bight. We are able not only to confirm the influence of El Niño events on the phytoplankton growth in this region but also to corroborate two previously discussed regime shifts in the California current system. This new finding underlines the power of the proposed approach and promises new insights into other complex systems.
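As a rough illustration of the recurrence idea in the abstract above, the sketch below builds a distance matrix for a one-dimensional signal and thresholds it into a binary recurrence matrix. The function name, threshold, and toy signal are assumptions for illustration only, not the authors' implementation (which analyzes the local minima of the distance matrix for spatio-temporal data).

```python
import numpy as np

def recurrence_matrix(series, threshold):
    """Binary recurrence plot: R[i, j] = 1 when states i and j are closer
    than the threshold in the underlying distance matrix."""
    x = np.asarray(series, dtype=float)
    # Pairwise distance matrix of the (here one-dimensional) state sequence
    dist = np.abs(x[:, None] - x[None, :])
    return (dist <= threshold).astype(int), dist

# Toy example: a periodic signal revisits earlier states, which shows up
# as diagonal structures parallel to the main diagonal of R.
t = np.linspace(0, 4 * np.pi, 200)
R, dist = recurrence_matrix(np.sin(t), threshold=0.1)
print(R.shape)    # (200, 200)
print(R.trace())  # the main diagonal is always recurrent: 200
```

In the paper's setting the interesting information sits in the local minima of `dist` rather than in the thresholded matrix itself, but the distance matrix constructed here is the common starting point.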
Ziemann, Alexandra; Fouillet, Anne; Brand, Helmut; Krafft, Thomas
2016-01-01
Introduction Syndromic surveillance aims at augmenting traditional public health surveillance with timely information. To gain a head start, it mainly analyses existing data such as from web searches or patient records. Despite the setup of many syndromic surveillance systems, there is still much doubt about the benefit of the approach. There are diverse interactions between performance indicators such as timeliness and various system characteristics. This makes the performance assessment of syndromic surveillance systems a complex endeavour. We assessed whether the comparison of several syndromic surveillance systems through Qualitative Comparative Analysis helps to evaluate performance and identify key success factors. Materials and Methods We compiled case-based, mixed data on performance and characteristics of 19 syndromic surveillance systems in Europe from scientific and grey literature and from site visits. We identified success factors by applying crisp-set Qualitative Comparative Analysis. We focused on two main areas of syndromic surveillance application: seasonal influenza surveillance and situational awareness during different types of potentially health threatening events. Results We found that syndromic surveillance systems might detect the onset or peak of seasonal influenza earlier if they analyse non-clinical data sources. Timely situational awareness during different types of events is supported by an automated syndromic surveillance system capable of analysing multiple syndromes. To our surprise, the analysis of multiple data sources was not a key success factor for situational awareness. Conclusions We suggest considering these key success factors when designing or further developing syndromic surveillance systems. Qualitative Comparative Analysis helped in interpreting complex, mixed data on small-N cases and resulted in concrete and practically relevant findings. PMID:27182731
MVP-CA Methodology for the Expert System Advocate's Advisor (ESAA)
DOT National Transportation Integrated Search
1997-11-01
The Multi-Viewpoint Clustering Analysis (MVP-CA) tool is a semi-automated tool to provide a valuable aid for comprehension, verification, validation, maintenance, integration, and evolution of complex knowledge-based software systems. In this report,...
NASA Astrophysics Data System (ADS)
Guy, Nathaniel
This thesis explores new ways of looking at telemetry data, from a time-correlative perspective, in order to see patterns within the data that may suggest root causes of system faults. It was thought initially that visualizing an animated Pearson Correlation Coefficient (PCC) matrix for telemetry channels would be sufficient to give new understanding; however, testing showed that the high dimensionality and the difficulty of examining change over time impeded understanding with this approach. Different correlative techniques, combined with the time curve visualization proposed by Bach et al. (2015), were adapted to visualize both raw telemetry and telemetry data correlations. Review revealed that these new techniques give insights into the data and an intuitive grasp of data families, which shows the effectiveness of this approach for enhancing system understanding and assisting with root cause analysis for complex aerospace systems.
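A static version of the correlation view described in the abstract above can be sketched with a Pearson correlation matrix over telemetry channels. The channel data below are invented toy values, not the thesis's telemetry; the animated variant would recompute this matrix over a sliding time window.

```python
import numpy as np

def pcc_matrix(channels):
    """Pearson correlation matrix across telemetry channels.

    channels: array of shape (n_channels, n_samples); returns an
    (n_channels, n_channels) matrix of pairwise correlations."""
    return np.corrcoef(channels)

# Toy telemetry: channel 1 closely tracks channel 0;
# channel 2 is perfectly anti-correlated with channel 0.
rng = np.random.default_rng(0)
base = rng.normal(size=500)
data = np.vstack([base, base + 0.01 * rng.normal(size=500), -base])
C = pcc_matrix(data)
print(np.round(C, 2))
```

Highly correlated channels cluster near +1 off the diagonal, which is exactly the "data family" structure the thesis seeks; the high dimensionality problem arises because the matrix grows quadratically with the channel count.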
Complexity analysis and mathematical tools towards the modelling of living systems.
Bellomo, N; Bianca, C; Delitala, M
2009-09-01
This paper is a review and critical analysis of the mathematical kinetic theory of active particles applied to the modelling of large living systems made up of interacting entities. The first part of the paper is focused on a general presentation of the mathematical tools of the kinetic theory of active particles. The second part provides a review of a variety of mathematical models in life sciences, namely complex social systems, opinion formation, evolution of epidemics with virus mutations, and vehicular traffic, crowds and swarms. All the applications are technically related to the mathematical structures reviewed in the first part of the paper. The overall contents are based on the concept that living systems, unlike inert matter, have the ability to develop behaviour geared towards their survival, or simply to improve the quality of their life. In some cases, the behaviour evolves in time and generates destructive and/or proliferative events.
NASA Astrophysics Data System (ADS)
Gao, Lingyu; Li, Xinghua; Guo, Qianrui; Quan, Jing; Hu, Zhengyue; Su, Zhikun; Zhang, Dong; Liu, Peilu; Li, Haopeng
2018-01-01
The internal structure of an off-axis three-mirror system is typically complex. Mirror installation errors in assembly affect the imaging line-of-sight and further degrade the image quality. Due to the complexity of the optical path in an off-axis three-mirror optical system, a straightforward theoretical analysis of the variations of the imaging line-of-sight is extremely difficult. In order to simplify the theoretical analysis, an equivalent single-mirror system is proposed and presented in this paper. In addition, the mathematical model of the single-mirror system is established and accurate expressions for the imaging coordinates are derived. Using the simulation software ZEMAX, both the off-axis three-mirror model and the single-mirror model are established. By adjusting the position of the mirror and simulating the line-of-sight rotation of the optical system, the variations of the imaging coordinates are clearly observed. The final simulation results show that in the off-axis three-mirror system, the sensitivity of the imaging coordinate to the rotation of the line-of-sight is approximately 30 µm/″, while in the single-mirror system it is 31.5 µm/″. Compared to the simulation results of the off-axis three-mirror model, the 5% relative error of the single-mirror model fully satisfies the requirement of equivalent analysis and verifies its validity. This paper presents a new method to analyze how the installation error of a mirror in an off-axis three-mirror system influences the imaging line-of-sight. Moreover, the off-axis three-mirror model is fully equivalent to the single-mirror model in theoretical analysis.
Ground-Based Aerosol Measurements | Science Inventory ...
Atmospheric particulate matter (PM) is a complex chemical mixture of liquid and solid particles suspended in air (Seinfeld and Pandis 2016). Measurements of this complex mixture form the basis of our knowledge regarding particle formation, source-receptor relationships, data to test and verify complex air quality models, and how PM impacts human health, visibility, global warming, and ecological systems (EPA 2009). Historically, PM samples have been collected on filters or other substrates with subsequent chemical analysis in the laboratory, and this is still the major approach for routine networks (Chow 2005; Solomon et al. 2014) as well as in research studies. In this approach, air, at a specified flow rate and time period, is typically drawn through an inlet, usually a size selective inlet, and then drawn through filters, …
Odean, Rosalie; Nazareth, Alina; Pruden, Shannon M.
2015-01-01
Developmental systems theory posits that development cannot be segmented by influences acting in isolation, but should be studied through a scientific lens that highlights the complex interactions between these forces over time (Overton, 2013a). This poses a unique challenge for developmental psychologists studying complex processes like language development. In this paper, we advocate for the combining of highly sophisticated data collection technologies in an effort to move toward a more systemic approach to studying language development. We investigate the efficiency and appropriateness of combining eye-tracking technology and the LENA (Language Environment Analysis) system, an automated language analysis tool, in an effort to explore the relation between language processing in early development, and external dynamic influences like parent and educator language input in the home and school environments. Eye-tracking allows us to study language processing via eye movement analysis; these eye movements have been linked to both conscious and unconscious cognitive processing, and thus provide one means of evaluating cognitive processes underlying language development that does not require the use of subjective parent reports or checklists. The LENA system, on the other hand, provides automated language output that describes a child’s language-rich environment. In combination, these technologies provide critical information not only about a child’s language processing abilities but also about the complexity of the child’s language environment. Thus, when used in conjunction these technologies allow researchers to explore the nature of interacting systems involved in language development. PMID:26379591
Complex socio-technical systems: Characterization and management guidelines.
Righi, Angela Weber; Saurin, Tarcisio Abreu
2015-09-01
Although ergonomics has paid increasing attention to the perspective of complexity, methods for its operationalization are scarce. This study introduces a framework for the operationalization of the "attribute view" of complexity, which involves: (i) the delimitation of the socio-technical system (STS); (ii) the description of four complexity attributes, namely a large number of elements in dynamic interactions, a wide diversity of elements, unexpected variability, and resilience; (iii) the assessment of six management guidelines, namely design slack, give visibility to processes and outcomes, anticipate and monitor the impacts of small changes, monitor the gap between prescription and practice, encourage diversity of perspectives when making decisions, and create an environment that supports resilience; and (iv) the identification of leverage points for improving the STS design, based on both the analysis of relationships among the attributes and their classification as irreducible/manageable complexity, and liability/asset. The use of the framework is illustrated by the study of an emergency department of a University hospital. Data collection involved analysis of documents, observations of work at the front-line, interviews with employees, and the application of questionnaires. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.
Trade Studies of Space Launch Architectures using Modular Probabilistic Risk Analysis
NASA Technical Reports Server (NTRS)
Mathias, Donovan L.; Go, Susie
2006-01-01
A top-down risk assessment in the early phases of space exploration architecture development can provide understanding and intuition of the potential risks associated with new designs and technologies. In this approach, risk analysts draw from their past experience and the heritage of similar existing systems as a source for reliability data. This top-down approach captures the complex interactions of the risk driving parts of the integrated system without requiring detailed knowledge of the parts themselves, which is often unavailable in the early design stages. Traditional probabilistic risk analysis (PRA) technologies, however, suffer several drawbacks that limit their timely application to complex technology development programs. The most restrictive of these is a dependence on static planning scenarios, expressed through fault and event trees. Fault trees incorporating comprehensive mission scenarios are routinely constructed for complex space systems, and several commercial software products are available for evaluating fault statistics. These static representations cannot capture the dynamic behavior of system failures without substantial modification of the initial tree. Consequently, the development of dynamic models using fault tree analysis has been an active area of research in recent years. This paper discusses the implementation and demonstration of dynamic, modular scenario modeling for integration of subsystem fault evaluation modules using the Space Architecture Failure Evaluation (SAFE) tool. SAFE is a C++ code that was originally developed to support NASA's Space Launch Initiative. It provides a flexible framework for system architecture definition and trade studies. SAFE supports extensible modeling of dynamic, time-dependent risk drivers of the system and functions at the level of fidelity for which design and failure data exists. The approach is scalable, allowing inclusion of additional information as detailed data becomes available. 
The tool performs a Monte Carlo analysis to provide statistical estimates. Example results of an architecture system reliability study are summarized for an exploration system concept using heritage data from liquid-fueled expendable Saturn V/Apollo launch vehicles.
Modeling of the Human - Operator in a Complex System Functioning Under Extreme Conditions
NASA Astrophysics Data System (ADS)
Getzov, Peter; Hubenova, Zoia; Yordanov, Dimitar; Popov, Wiliam
2013-12-01
Problems related to the operation of sophisticated control systems for objects functioning under extreme conditions are examined, along with the impact of the effectiveness of the operator's activity on the systems as a whole. The necessity of creating complex simulation models reflecting operator activity is discussed. The organizational and technical system of an unmanned aviation complex is described as a sophisticated ergatic system. A computer realization of the main subsystems of the algorithmic model of the human as a controlling system is implemented, and specialized software for data processing and analysis is developed. An original computer model of the human as a tracking system has been implemented. A model of the unmanned complex for operator training and the formation of a mental model in emergency situations, implemented in the Matlab-Simulink environment, has been synthesized. As a unit of the control loop, the pilot (operator) is viewed in simplified form as an autocontrol system consisting of three main interconnected subsystems: sensitive organs (perception sensors); the central nervous system; and executive organs (muscles of the arms, legs, back). A theoretical data model for predicting the level of operator information load in ergatic systems is proposed. It allows the assessment and prediction of the effectiveness of a real working operator. A simulation model of operator activity during takeoff, based on Petri nets, has been synthesized.
Engineering education as a complex system
NASA Astrophysics Data System (ADS)
Gattie, David K.; Kellam, Nadia N.; Schramski, John R.; Walther, Joachim
2011-12-01
This paper presents a theoretical basis for cultivating engineering education as a complex system that will prepare students to think critically and make decisions with regard to poorly understood, ill-structured issues. Integral to this theoretical basis is a solution space construct developed and presented as a benchmark for evaluating problem-solving orientations that emerge within students' thinking as they progress through an engineering curriculum. It is proposed that the traditional engineering education model, while analytically rigorous, is characterised by properties that, although necessary, are insufficient for preparing students to address complex issues of the twenty-first century. A Synthesis and Design Studio model for engineering education is proposed, which maintains the necessary rigor of analysis within a uniquely complex yet sufficiently structured learning environment.
Structural system reliability calculation using a probabilistic fault tree analysis method
NASA Technical Reports Server (NTRS)
Torng, T. Y.; Wu, Y.-T.; Millwater, H. R.
1992-01-01
The development of a new probabilistic fault tree analysis (PFTA) method for calculating structural system reliability is summarized. The proposed PFTA procedure includes: developing a fault tree to represent the complex structural system, constructing an approximation function for each bottom event, determining a dominant sampling sequence for all bottom events, and calculating the system reliability using an adaptive importance sampling method. PFTA is suitable for complicated structural problems that require computer-intensive computer calculations. A computer program has been developed to implement the PFTA.
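A minimal sketch of Monte Carlo evaluation of a small fault tree, in the spirit of the PFTA approach above, though without its approximation functions or adaptive importance sampling. The gate structure and bottom-event probabilities are hypothetical, chosen only so the estimate can be checked against a closed-form answer.

```python
import random

def simulate_top_event_probability(p_fail, n_trials=100_000, seed=1):
    """Monte Carlo estimate for a toy fault tree:
    TOP = A AND (B OR C), with independent bottom events A, B, C."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_trials):
        # Sample each bottom event as an independent Bernoulli trial.
        a, b, c = (rng.random() < p_fail[k] for k in ("A", "B", "C"))
        if a and (b or c):  # top event: AND gate over A and an OR gate
            failures += 1
    return failures / n_trials

p = {"A": 0.1, "B": 0.2, "C": 0.3}
est = simulate_top_event_probability(p)
exact = 0.1 * (1 - (1 - 0.2) * (1 - 0.3))  # 0.1 * 0.44 = 0.044
print(est, exact)
```

For the rare, expensive-to-evaluate bottom events that PFTA targets, plain sampling like this becomes wasteful, which is precisely why the method orders the bottom events into a dominant sampling sequence and uses adaptive importance sampling instead.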
Space shuttle main engine controller assembly, phase C-D. [with lagging system design and analysis
NASA Technical Reports Server (NTRS)
1973-01-01
System design and system analysis and simulation are slightly behind schedule, while design verification testing has improved. Input/output circuit design has improved, but digital computer unit (DCU) and mechanical design continue to lag. Part procurement was impacted by delays in printed circuit board and assembly drawing releases. These delays are the result of problems in generating suitable printed circuit artwork for the very complex and high-density multilayer boards.
A Hybrid Stochastic-Neuro-Fuzzy Model-Based System for In-Flight Gas Turbine Engine Diagnostics
2001-04-05
…Margin (ADM) and (ii) Fault Detection Margin (FDM). Key Words: ANFIS, Engine Health Monitoring, Gas Path Analysis, and Stochastic Analysis. Adaptive Network… The paper illustrates the application of a hybrid Stochastic-Fuzzy-Inference Model-Based System (StoFIS) to fault diagnostics and prognostics for both… operational history monitored on-line by the engine health management (EHM) system. To capture the complex functional relationships between different…
[Adjustment of the German DRG system in 2009].
Wenke, A; Franz, D; Pühse, G; Volkmer, B; Roeder, N
2009-07-01
The 2009 version of the German DRG system brought significant changes for urology concerning the coding of diagnoses, medical procedures and the DRG structure. In view of the political situation and considerable economic pressure, a critical analysis of the 2009 German DRG system is warranted. The relevant diagnoses, medical procedures and German DRGs in the versions 2008 and 2009 were analysed based on the publications of the German DRG Institute (InEK) and the German Institute of Medical Documentation and Information (DIMDI). Changes for 2009 focus on the development of the DRG structure, DRG validation and codes for medical procedures to be used for very complex cases. The outcome of these changes for German hospitals may vary depending on the range of activities. The German DRG system again gained complexity. High demands are made on correct and complete coding of complex urology cases. The quality of case allocation in the German DRG system was improved. On the one hand, some of the old problems (e.g. enterostomata) still persist, while on the other hand new problems evolved out of the attempt to improve the case allocation of highly complex and expensive cases. Time will tell whether the increase in highly specialized DRGs with low case numbers will endure and reach acceptable rates of annual fluctuation.
Energy conservation and analysis and evaluation. [specifically at Slidell Computer Complex
NASA Technical Reports Server (NTRS)
1976-01-01
The survey assembled and made recommendations directed at conserving utilities and reducing the use of energy at the Slidell Computer Complex. Specific items included were: (1) scheduling and controlling the use of gas and electricity, (2) building modifications to reduce energy, (3) replacement of old, inefficient equipment, (4) modifications to control systems, (5) evaluations of economizer cycles in HVAC systems, and (6) corrective settings for thermostats, ductstats, and other temperature and pressure control devices.
Synthesis, characterization and solid-state properties of the [Zn(Hdmmthiol)2]·2H2O complex
NASA Astrophysics Data System (ADS)
Dagdelen, Fethi; Aydogdu, Yildirim; Dey, Kamalendu; Biswas, Susobhan
2016-05-01
The zinc(II) complex with a tridentate thiohydrazone ligand was prepared by a metal template reaction. The reaction of diacetylmonoxime and morpholine N-thiohydrazide with Zn(OAc)2·2H2O under reflux yielded the [Zn(Hdmmthiol)2]·2H2O complex. The complex was characterized by a combination of protocols including elemental analysis, UV-vis, FT-IR, TG and PXRD. The temperature dependence of the electrical conductivity and the optical properties of the [Zn(Hdmmthiol)2]·2H2O complex (the free ligand is denoted H2dmmthiol) were studied. Powder X-ray diffraction (PXRD) was used to investigate the crystal structure of the sample. The zinc complex was shown to crystallize in the triclinic system. Hot-probe measurements showed the zinc complex to have n-type conductivity. Optical absorption analysis showed the complex to display a direct optical transition with a band gap of 2.52 eV.
External Aiding Methods for IMU-Based Navigation
2016-11-26
…Monte Carlo simulation and particle filtering. This approach allows for the utilization of highly complex systems in a black box configuration with minimal… alternative method, which has the advantage of being less computationally demanding, is to use a Kalman filtering-based approach. The particular… Kalman filtering-based approach used here is known as linear covariance analysis. In linear covariance analysis, the nonlinear systems describing the…
Tutorial Workshop on Robotics and Robot Control.
1982-10-26
US Army Tank-Automotive Command, Warren, Michigan; US Army Materiel Systems Analysis Activity, Aberdeen Proving Grounds, Maryland… Technology, Pasadena, California 91103. M. Vur.kovic, Senior Research Associate, Institute for Technoeconomic Systems, Department of Industrial… Further investigation of the action precedence graphs, together with their application to more complex manipulator tasks and analysis of their…
NASA Technical Reports Server (NTRS)
Phillips, D. T.; Manseur, B.; Foster, J. W.
1982-01-01
Alternate definitions of system failure create complex analyses for which analytic solutions are available only for simple, special cases. The GRASP methodology is a computer simulation approach for solving all classes of problems in which both failure and repair events are modeled according to the probability laws of the individual components of the system.
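The failure-and-repair simulation idea behind GRASP can be illustrated with a single-component sketch that alternates exponentially distributed up and down times and accumulates the fraction of time spent up. The function name, distributions, and parameters are assumptions for illustration; a GRASP-style model would simulate many components under the system's failure definition.

```python
import random

def steady_state_availability(mtbf, mttr, horizon=1_000_000.0, seed=2):
    """Discrete-event sketch of alternating exponential failure/repair
    cycles for one component; availability = uptime / total time."""
    rng = random.Random(seed)
    t, up_time = 0.0, 0.0
    while t < horizon:
        up = rng.expovariate(1.0 / mtbf)    # time to next failure
        down = rng.expovariate(1.0 / mttr)  # repair duration
        up_time += up
        t += up + down
    return up_time / t

a = steady_state_availability(mtbf=100.0, mttr=5.0)
print(round(a, 3))  # close to 100 / (100 + 5) ≈ 0.952
```

Because each component is driven by its own probability laws, swapping in a different failure definition (e.g., the system is down only when several components overlap in downtime) changes only the bookkeeping in the loop, not the simulation machinery, which is the flexibility the GRASP approach exploits.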
NASA Technical Reports Server (NTRS)
1983-01-01
Space station systems characteristics and architecture are described. A manned space station operational analysis is performed to determine crew size, crew task complexity and time tables, and crew equipment to support the definition of systems and subsystems concepts. This analysis is used to select and evaluate the architectural options for development.
ERIC Educational Resources Information Center
Shephard, Kerry
2017-01-01
A general inductive analysis was applied to 98 submissions made to a recent review of New Zealand's tertiary education system, primarily to enable those interested to engage with multiple viewpoints about this highly complex educational system. The analysis yielded three substantial themes that reoccur throughout the submissions and that may…
Weck, P J; Schaffner, D A; Brown, M R; Wicks, R T
2015-02-01
The Bandt-Pompe permutation entropy and the Jensen-Shannon statistical complexity are used to analyze fluctuating time series of three different turbulent plasmas: the magnetohydrodynamic (MHD) turbulence in the plasma wind tunnel of the Swarthmore Spheromak Experiment (SSX), drift-wave turbulence of ion saturation current fluctuations in the edge of the Large Plasma Device (LAPD), and fully developed turbulent magnetic fluctuations of the solar wind taken from the Wind spacecraft. The entropy and complexity values are presented as coordinates on the CH plane for comparison among the different plasma environments and other fluctuation models. The solar wind is found to have the highest permutation entropy and lowest statistical complexity of the three data sets analyzed. Both laboratory data sets have larger values of statistical complexity, suggesting that these systems have fewer degrees of freedom in their fluctuations, with SSX magnetic fluctuations having slightly less complexity than the LAPD edge I(sat). The CH plane coordinates are compared to the shape and distribution of a spectral decomposition of the wave forms. These results suggest that fully developed turbulence (solar wind) occupies the lower-right region of the CH plane, and that other plasma systems considered to be turbulent have less permutation entropy and more statistical complexity. This paper presents the use of this statistical analysis tool on solar wind plasma, as well as on an MHD turbulent experimental plasma.
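The Bandt-Pompe permutation entropy used in the abstract above can be sketched as follows. This computes only the normalized entropy coordinate of the CH plane (the Jensen-Shannon statistical complexity is omitted), and the test signals are toy examples, not plasma data.

```python
import itertools
import math
import numpy as np

def permutation_entropy(series, order=3):
    """Normalized Bandt-Pompe permutation entropy: the Shannon entropy
    of the distribution of ordinal (rank) patterns over sliding windows."""
    x = np.asarray(series, dtype=float)
    counts = dict.fromkeys(itertools.permutations(range(order)), 0)
    for i in range(len(x) - order + 1):
        # The ordinal pattern of a window is the permutation that sorts it.
        counts[tuple(np.argsort(x[i:i + order]))] += 1
    total = sum(counts.values())
    probs = [c / total for c in counts.values() if c > 0]
    h = sum(-p * math.log(p) for p in probs)
    return h / math.log(math.factorial(order))  # normalize to [0, 1]

rng = np.random.default_rng(0)
pe_noise = permutation_entropy(rng.normal(size=5000))  # close to 1 for white noise
pe_ramp = permutation_entropy(np.arange(5000.0))       # 0.0 for a monotonic ramp
print(pe_noise, pe_ramp)
```

White noise visits all ordinal patterns nearly uniformly and lands near maximal entropy; a deterministic ramp produces a single pattern and zero entropy, which is why the solar wind data sitting at high entropy in the CH plane is read as fully developed turbulence.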
Haas, Isabelle; Dietel, Thomas; Press, Konstantin; Kol, Moshe; Kempe, Rhett
2013-10-11
Based on two well-established ligand systems, the aminopyridinato (Ap) and the phenoxyimine (FI) ligand systems, new Ap-FI hybrid ligands were developed. Four different Ap-FI hybrid ligands were synthesized through a simple condensation reaction and fully characterized. The reaction of hafnium tetrabenzyl with all four Ap-FI hybrid ligands exclusively led to mono(Ap-FI) complexes of the type [(Ap-FI)HfBn2]. The ligands acted as tetradentate dianionic chelates. Upon activation with tris(pentafluorophenyl)borane, the hafnium-dibenzyl complexes led to highly active catalysts for the polymerization of 1-hexene. Ultrahigh molecular weights and extremely narrow polydispersities support the living nature of this polymerization process. A possible deactivation product of the hafnium catalysts was characterized by single-crystal X-ray analysis and is discussed. The coordination modes of these new ligands were studied with the help of model titanium complexes. The reaction of titanium(IV) isopropoxide with ligand 1 led to a mono(Ap-FI) complex, which showed the desired fac-mer coordination mode. Titanium(IV) isopropoxide reacted with ligand 4 to give a complex of the type [(ApH-FI)2Ti(OiPr)2], which featured the ligand in its monoanionic form. The two titanium complexes were characterized by X-ray crystal-structure analysis. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
[Orthopedic and trauma surgery in the German-DRG-System 2009].
Franz, D; Windolf, J; Siebert, C H; Roeder, N
2009-01-01
The German DRG system was advanced into version 2009. For orthopedic and trauma surgery, significant changes were made concerning the coding of diagnoses and medical procedures and concerning the DRG structure. Relevant diagnoses, medical procedures and G-DRGs in versions 2008 and 2009 were analyzed based on the publications of the German DRG institute (InEK) and the German Institute of Medical Documentation and Information (DIMDI). Changes for 2009 focused on the development of the DRG structure, DRG validation and codes for medical procedures to be used for very complex cases. The impact of these changes on German hospitals may vary depending on their range of activities. The G-DRG system has again gained complexity. High demands are made on the correct and complete coding of complex orthopedic and trauma surgery cases. The quality of case allocation within the G-DRG system was improved. Nevertheless, further adjustments of the G-DRG system, especially for cases with severe injuries, are necessary.
Self-assembly of metal nanostructures on binary alloy surfaces
Duguet, T.; Han, Yong; Yuen, Chad; Jing, Dapeng; Ünal, Barış; Evans, J. W.; Thiel, P. A.
2011-01-01
Deposition of metals on binary alloy surfaces offers new possibilities for guiding the formation of functional metal nanostructures. This idea is explored with scanning tunneling microscopy studies and atomistic-level analysis and modeling of nonequilibrium island formation. For Au/NiAl(110), complex monolayer structures are found and compared with the simple fcc(110) bilayer structure recently observed for Ag/NiAl(110). We also consider a more complex codeposition system, (Ni + Al)/NiAl(110), which offers the opportunity for fundamental studies of self-growth of alloys including deviations from equilibrium ordering. A general multisite lattice-gas model framework enables analysis of structure selection and morphological evolution in these systems. PMID:21097706
NASA Astrophysics Data System (ADS)
Kassem, M.; Soize, C.; Gagliardini, L.
2009-06-01
In this paper, an energy-density field approach applied to the vibroacoustic analysis of complex industrial structures in the low- and medium-frequency ranges is presented. This approach uses a statistical computational model. The analyzed system consists of an automotive vehicle structure coupled with its internal acoustic cavity. The objective of this paper is to make use of the statistical properties of the frequency response functions of the vibroacoustic system observed from previous experimental and numerical work. The frequency response functions are expressed in terms of a dimensionless matrix which is estimated using the proposed energy approach. Using this dimensionless matrix, a simplified vibroacoustic model is proposed.
Dabrzalska, Monika; Zablocka, Maria; Mignani, Serge; Majoral, Jean Pierre; Klajnert-Maculewicz, Barbara
2015-08-15
Dendrimers due to their unique architecture may play an important role in drug delivery systems including chemotherapy, gene therapy and recently, photodynamic therapy as well. We investigated two dendrimer-photosensitizer systems in context of potential use of these systems in photodynamic therapy. The mixtures of an anionic phosphorus dendrimer of the second generation and methylene blue were studied by UV-vis spectroscopy while that of a cationic phosphorus dendrimer (third generation) and rose bengal were investigated by spectrofluorimetric methods. Spectroscopic analysis of these two systems revealed the formation of dendrimer-photosensitizer complexes via electrostatic interactions as well as π stacking. The stoichiometry of the rose bengal-cationic dendrimer complex was estimated to be 7:1 and 9:1 for the methylene blue-anionic dendrimer complex. The results suggest that these polyanionic or polycationic phosphorus dendrimers can be promising candidates as carriers in photodynamic therapy. Copyright © 2015 Elsevier B.V. All rights reserved.
Disaster preparedness in a complex urban system: the case of Kathmandu Valley, Nepal.
Carpenter, Samuel; Grünewald, François
2016-07-01
The city is a growing centre of humanitarian concern. Yet, aid agencies, governments and donors are only beginning to comprehend the scale and, importantly, the complexity of the humanitarian challenge in urban areas. Using the case study of the Kathmandu Valley, Nepal, this paper examines the analytical utility of recent research on complex urban systems in strengthening scholarly understanding of urban disaster risk management, and outlines its operational relevance to disaster preparedness. Drawing on a literature review and 26 interviews with actors from across the Government of Nepal, the International Red Cross and Red Crescent Movement, non-governmental organisations, United Nations agencies, and at-risk communities, the study argues that complexity can be seen as a defining feature of urban systems and the risks that confront them. To manage risk in these systems effectively, preparedness efforts must be based on adaptive and agile approaches, incorporating the use of network analysis, partnerships, and new technologies. © 2016 The Author(s). Disasters © Overseas Development Institute, 2016.
Automated thermal mapping techniques using chromatic image analysis
NASA Technical Reports Server (NTRS)
Buck, Gregory M.
1989-01-01
Thermal imaging techniques are introduced using a chromatic image analysis system and temperature sensitive coatings. These techniques are used for thermal mapping and surface heat transfer measurements on aerothermodynamic test models in hypersonic wind tunnels. Measurements are made on complex vehicle configurations in a timely manner and at minimal expense. The image analysis system uses separate wavelength filtered images to analyze surface spectral intensity data. The system was initially developed for quantitative surface temperature mapping using two-color thermographic phosphors but was found useful in interpreting phase change paint and liquid crystal data as well.
Socio-Technical Systems Analysis in Health Care: A Research Agenda
Bass, Ellen; Bellandi, Tommaso; Gurses, Ayse; Hallbeck, Susan; Mollo, Vanina
2012-01-01
Given the complexity of health care and the ‘people’ nature of healthcare work and delivery, STSA (Sociotechnical Systems Analysis) research is needed to address the numerous quality of care problems observed across the world. This paper describes open STSA research areas, including workload management, physical, cognitive and macroergonomic issues of medical devices and health information technologies, STSA in transitions of care, STSA of patient-centered care, risk management and patient safety management, resilience, and feedback loops between event detection, reporting and analysis and system redesign. PMID:22611480
DOT National Transportation Integrated Search
1974-08-01
DYNALIST, a computer program that extracts complex eigenvalues and eigenvectors for dynamic systems described in terms of matrix equations of motion, has been acquired and made operational at TSC. In this report, simple dynamic systems are used to de...
Complex Communication System and Social Change.
ERIC Educational Resources Information Center
Chang, Won H.
The basic question under examination is the underlying force that brings forth changes in cultural and social organizations. By employing general system theory and communication systemic analysis, the author concludes that communication, especially human communication, is the main vehicle of change. Human interchange, it is suggested, is constant…
The Integrated Farm System Model: A Tool for Whole Farm Nutrient Management Analysis
USDA-ARS?s Scientific Manuscript database
With tighter profit margins and increasing environmental constraints, strategic planning of farm production systems is becoming both more important and more difficult. This is especially true for integrated crop and animal production systems. Animal production is complex with a number of interacting...
An Isotopic Dilution Experiment Using Liquid Scintillation: A Simple Two-System, Two-Phase Analysis.
ERIC Educational Resources Information Center
Moehs, Peter J.; Levine, Samuel
1982-01-01
A simple isotopic dilution analysis whose principles apply to methods of more complex radioanalyses is described. Suitable for students of clinical and instrumental analytical chemistry, the experimental manipulations are kept to a minimum, involving only aqueous extraction before counting. Background information, procedures, and results are discussed.…
Transdisciplinary application of the cross-scale resilience model
Sundstrom, Shana M.; Angeler, David G.; Garmestani, Ahjond S.; Garcia, Jorge H.; Allen, Craig R.
2014-01-01
The cross-scale resilience model was developed in ecology to explain the emergence of resilience from the distribution of ecological functions within and across scales, and as a tool to assess resilience. We propose that the model and the underlying discontinuity hypothesis are relevant to other complex adaptive systems, and can be used to identify and track changes in system parameters related to resilience. We explain the theory behind the cross-scale resilience model, review the cases where it has been applied to non-ecological systems, and discuss some examples of social-ecological, archaeological/anthropological, and economic systems where a cross-scale resilience analysis could add a quantitative dimension to our current understanding of system dynamics and resilience. We argue that the scaling and diversity parameters suitable for a resilience analysis of ecological systems are appropriate for a broad suite of systems where non-normative quantitative assessments of resilience are desired. Our planet is currently characterized by fast environmental and social change, and the cross-scale resilience model has the potential to quantify resilience across many types of complex adaptive systems.
Use of paired simple and complex models to reduce predictive bias and quantify uncertainty
NASA Astrophysics Data System (ADS)
Doherty, John; Christensen, Steen
2011-12-01
Modern environmental management and decision-making is based on the use of increasingly complex numerical models. Such models have the advantage of allowing representation of complex processes and heterogeneous system property distributions inasmuch as these are understood at any particular study site. The latter are often represented stochastically, this reflecting knowledge of the character of system heterogeneity at the same time as it reflects a lack of knowledge of its spatial details. Unfortunately, however, complex models are often difficult to calibrate because of their long run times and sometimes questionable numerical stability. Analysis of predictive uncertainty is also a difficult undertaking when using models such as these. Such analysis must reflect a lack of knowledge of spatial hydraulic property details. At the same time, it must be subject to constraints on the spatial variability of these details born of the necessity for model outputs to replicate observations of historical system behavior. In contrast, the rapid run times and general numerical reliability of simple models often facilitate good calibration and ready implementation of sophisticated methods of calibration-constrained uncertainty analysis. Unfortunately, however, many system and process details on which uncertainty may depend are, by design, omitted from simple models. This can lead to underestimation of the uncertainty associated with many predictions of management interest. The present paper proposes a methodology that attempts to overcome the problems associated with complex models on the one hand and simple models on the other hand, while allowing access to the benefits each of them offers. It provides a theoretical analysis of the simplification process from a subspace point of view, this yielding insights into the costs of model simplification, and into how some of these costs may be reduced.
It then describes a methodology for paired model usage through which predictive bias of a simplified model can be detected and corrected, and postcalibration predictive uncertainty can be quantified. The methodology is demonstrated using a synthetic example based on groundwater modeling environments commonly encountered in northern Europe and North America.
NASA Astrophysics Data System (ADS)
Divya, R.; Nair, Lekshmi P.; Bijini, B. R.; Nair, C. M. K.; Babu, K. Rajendra
2018-05-01
Good quality prismatic crystals of an industrially applicable corrosion-inhibiting barium complex of 1,3,5-triazinane-2,4,6-trione have been grown by the conventional gel method. The crystal structure, packing, and nature of bonds are revealed in the single crystal X-ray diffraction analysis. The crystal has a three-dimensional polymeric structure having a triclinic crystal system with the space group P-1. The powder X-ray diffraction analysis confirms its crystalline nature. The functional groups present in the crystal are identified by Fourier transform infrared spectroscopy. Elemental analysis confirms the stoichiometry of the elements present in the complex. Thermogravimetric analysis and differential thermal analysis reveal its good thermal stability. The optical properties like band gap, refractive index and extinction coefficient are evaluated from the UV-visible spectral analysis. The material's singular property, corrosion inhibition efficiency achieved by adsorption of the sample molecules, is determined by the weight loss method.
Hemojuvelin-hepcidin axis modeled and analyzed using Petri nets.
Formanowicz, Dorota; Kozak, Adam; Głowacki, Tomasz; Radom, Marcin; Formanowicz, Piotr
2013-12-01
A systems biology approach to investigating biological phenomena seems very promising because it is capable of capturing one of the fundamental properties of living organisms, i.e. their inherent complexity. It allows biological entities to be analyzed as complex systems of interacting objects. The first and necessary step of such an analysis is building a precise model of the studied biological system. This model is expressed in the language of some branch of mathematics, for example, differential equations. During the last two decades the theory of Petri nets has proved to be very well suited to building models of biological systems. The structure of these nets reflects the structure of interacting biological molecules and processes. Moreover, on the one hand, Petri nets have an intuitive graphical representation that is very helpful in understanding the structure of the system, and on the other hand, there are many mathematical methods and software tools supporting an analysis of the nets' properties. In this paper a Petri net based model of the hemojuvelin-hepcidin axis involved in the maintenance of human body iron homeostasis is presented. An analysis of the model's properties, based mainly on T-invariants, has been performed and some biological conclusions have been drawn. Copyright © 2013 Elsevier Inc. All rights reserved.
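The T-invariants mentioned above are the non-negative integer firing vectors x satisfying C·x = 0, where C is the place-by-transition incidence matrix; firing each transition x[j] times returns the net to its starting marking. A minimal brute-force sketch for a hypothetical three-transition token cycle (an illustrative toy net, not the hemojuvelin-hepcidin model itself):

```python
import itertools

# Incidence matrix C: rows = places, columns = transitions.
# Toy cycle: t0 moves a token p0->p1, t1 moves p1->p2, t2 moves p2->p0.
C = [
    [-1,  0,  1],   # p0
    [ 1, -1,  0],   # p1
    [ 0,  1, -1],   # p2
]

def is_t_invariant(x):
    """True if firing transition j exactly x[j] times leaves every place unchanged (C @ x == 0, x != 0)."""
    return any(x) and all(
        sum(row[j] * x[j] for j in range(len(x))) == 0 for row in C)

def minimal_t_invariants(max_count=3):
    """Enumerate small firing vectors, keeping only non-dominated (minimal) invariants."""
    found = []
    for x in itertools.product(range(max_count + 1), repeat=len(C[0])):
        if is_t_invariant(x):
            if not any(all(xi >= fi for xi, fi in zip(x, f)) for f in found):
                found.append(x)
    return found
```

For this net the only minimal T-invariant is (1, 1, 1): firing each transition once cycles the token back to its origin. Real tools compute invariants via integer null-space algorithms rather than enumeration, which only scales to toy examples.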
High Density Hydrogen Storage System Demonstration Using NaAlH4 Based Complex Compound Hydrides
DOE Office of Scientific and Technical Information (OSTI.GOV)
Daniel A. Mosher; Xia Tang; Ronald J. Brown
2007-07-27
This final report describes the motivations, activities and results of the hydrogen storage independent project "High Density Hydrogen Storage System Demonstration Using NaAlH4 Based Complex Compound Hydrides" performed by the United Technologies Research Center under the Department of Energy Hydrogen Program, contract # DE-FC36-02AL67610. The objectives of the project were to identify and address the key systems technologies associated with applying complex hydride materials, particularly ones which differ from those for conventional metal hydride based storage. This involved the design, fabrication and testing of two prototype systems based on the hydrogen storage material NaAlH4. Safety testing, catalysis studies, heat exchanger optimization, reaction kinetics modeling, thermochemical finite element analysis, powder densification development and material neutralization were elements included in the effort.
A Complex Systems Approach to More Resilient Multi-Layered Security Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Nathanael J. K.; Jones, Katherine A.; Bandlow, Alisa
In July 2012, protestors cut through security fences and gained access to the Y-12 National Security Complex. This was believed to be a highly reliable, multi-layered security system. This report documents the results of a Laboratory Directed Research and Development (LDRD) project that created a consistent, robust mathematical framework using complex systems analysis algorithms and techniques to better understand the emergent behavior, vulnerabilities and resiliency of multi-layered security systems subject to budget constraints and competing security priorities. Because there are several dimensions to security system performance and a range of attacks that might occur, the framework is multi-objective, allowing a performance frontier to be estimated. This research explicitly uses probability of intruder interruption given detection (PI) as the primary resilience metric. We demonstrate the utility of this framework with both notional as well as real-world examples of Physical Protection Systems (PPSs) and validate using a well-established force-on-force simulation tool, Umbra.
Information theory applications for biological sequence analysis.
Vinga, Susana
2014-05-01
Information theory (IT) addresses the analysis of communication systems and has been widely applied in molecular biology. In particular, alignment-free sequence analysis and comparison greatly benefited from concepts derived from IT, such as entropy and mutual information. This review covers several aspects of IT applications, ranging from genome global analysis and comparison, including block-entropy estimation and resolution-free metrics based on iterative maps, to local analysis, comprising the classification of motifs, prediction of transcription factor binding sites and sequence characterization based on linguistic complexity and entropic profiles. IT has also been applied to high-level correlations that combine DNA, RNA or protein features with sequence-independent properties, such as gene mapping and phenotype analysis, and has also provided models based on communication systems theory to describe information transmission channels at the cell level and also during evolutionary processes. While not exhaustive, this review attempts to categorize existing methods and to indicate their relation with broader transversal topics such as genomic signatures, data compression and complexity, time series analysis and phylogenetic classification, providing a resource for future developments in this promising area.
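The entropy and mutual-information quantities underlying the alignment-free methods surveyed above reduce to simple frequency counts over symbols. A minimal sketch (illustrative only, not any specific tool from the review):

```python
import math
from collections import Counter

def entropy(seq):
    """Shannon entropy (bits) of the symbol distribution of a sequence."""
    n = len(seq)
    return -sum((c / n) * math.log2(c / n) for c in Counter(seq).values())

def mutual_information(a, b):
    """I(A;B) = H(A) + H(B) - H(A,B) for two position-aligned sequences."""
    return entropy(a) + entropy(b) - entropy(list(zip(a, b)))
```

For example, two identical two-letter sequences share 1 bit of mutual information, while sequences whose positions are statistically independent share none; block-entropy estimation applies the same `entropy` function to overlapping k-mers instead of single symbols.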
Using team cognitive work analysis to reveal healthcare team interactions in a birthing unit.
Ashoori, Maryam; Burns, Catherine M; d'Entremont, Barbara; Momtahan, Kathryn
2014-01-01
Cognitive work analysis (CWA) as an analytical approach for examining complex sociotechnical systems has shown success in modelling the work of single operators. The CWA approach incorporates social and team interactions, but a more explicit analysis of team aspects can reveal more information for systems design. In this paper, Team CWA is explored to understand teamwork within a birthing unit at a hospital. Team CWA models are derived from theories and models of teamwork and leverage the existing CWA approaches to analyse team interactions. Team CWA is explained and contrasted with prior approaches to CWA. Team CWA does not replace CWA, but supplements traditional CWA to more easily reveal team information. As a result, Team CWA may be a useful approach to enhance CWA in complex environments where effective teamwork is required. This paper looks at ways of analysing cognitive work in healthcare teams. Team Cognitive Work Analysis, when used to supplement traditional Cognitive Work Analysis, revealed more team information than traditional Cognitive Work Analysis. Team Cognitive Work Analysis should be considered when studying teams.
Preparing new nurses with complexity science and problem-based learning.
Hodges, Helen F
2011-01-01
Successful nurses function effectively with adaptability, improvability, and interconnectedness, and can see emerging and unpredictable complex problems. Preparing new nurses for complexity requires a significant change in prevalent but dated nursing education models for rising graduates. The science of complexity coupled with problem-based learning and peer review contributes a feasible framework for a constructivist learning environment to examine real-time systems data; explore uncertainty, inherent patterns, and ambiguity; and develop skills for unstructured problem solving. This article describes a pilot study of a problem-based learning strategy guided by principles of complexity science in a community clinical nursing course. Thirty-five senior nursing students participated during a 3-year period. Assessments included peer review, a final project paper, reflection, and a satisfaction survey. Results included higher-than-expected levels of student satisfaction, increased breadth and depth in the analysis of complex data, acknowledgment of the community as a complex adaptive system, and overall higher-level thinking skills than in previous years. 2011, SLACK Incorporated.
Yousefpour, Parisa; Atyabi, Fatemeh; Farahani, Ebrahim Vashegani; Sakhtianchi, Ramin; Dinarvand, Rassoul
2011-01-01
This study deals with the preparation and investigation of a nanoscale delivery system for the anticancer drug doxorubicin (DOX) using its complexation with the polyanionic carbohydrate dextran sulfate (DS). Dynamic light scattering, SEM, and zeta potential determination were used to characterize the nanocomplexes. DOX-DS complexation was studied in the presence of ethanol as a hydrogen-bond disrupting agent, NaCl as an electrostatic shielding agent, and chitosan as a positively charged polymer. Thermodynamics of the DOX-DS interaction was studied using isothermal titration calorimetry (ITC). A dialysis method was applied to investigate the release profile of DOX from DOX-DS nanocomplexes. Spherical and smooth-surfaced DOX-DS nanocomplexes (250-500 nm) with negative zeta potential were formed at a DS/DOX (w/w) ratio of 0.4-0.6, with over 90% drug encapsulation efficiency. DOX when complexed with DS showed lower fluorescence emission and 480 nm absorbance plus a 15 nm bathochromic shift in its visible absorbance spectrum. Electrostatic, hydrogen-bonding, and π-π stacking interactions are the main contributing interactions in DOX-DS complexation. Thermal analysis of DOX-DS complexation by ITC revealed that each DOX molecule binds with 3 DS glycosyl monomers. The drug release profile of the nanocomplexes showed a fast DOX release followed by a slow sustained release, leading to release of 32% of entrapped DOX within 15 days. DOX-DS nanocomplexes may serve as a drug delivery system with efficient drug encapsulation and also may be taken into consideration in designing DOX controlled-release systems. PMID:21796249
CAD-Based Aerodynamic Design of Complex Configurations using a Cartesian Method
NASA Technical Reports Server (NTRS)
Nemec, Marian; Aftosmis, Michael J.; Pulliam, Thomas H.
2003-01-01
A modular framework for aerodynamic optimization of complex geometries is developed. By working directly with a parametric CAD system, complex-geometry models are modified and tessellated in an automatic fashion. The use of a component-based Cartesian method significantly reduces the demands on the CAD system, and also provides for robust and efficient flowfield analysis. The optimization is controlled using either a genetic or quasi-Newton algorithm. Parallel efficiency of the framework is maintained even when subject to limited CAD resources by dynamically re-allocating the processors of the flow solver. Overall, the resulting framework can explore designs incorporating large shape modifications and changes in topology.
NASA Technical Reports Server (NTRS)
Stehura, Aaron; Rozek, Matthew
2013-01-01
The complexity of the Mars Science Laboratory (MSL) mission presented the Entry, Descent, and Landing systems engineering team with many challenges in its Verification and Validation (V&V) campaign. This paper describes some of the logistical hurdles related to managing a complex set of requirements, test venues, test objectives, and analysis products in the implementation of a specific portion of the overall V&V program to test the interaction of flight software with the MSL avionics suite. Application-specific solutions to these problems are presented herein, which can be generalized to other space missions and to similar formidable systems engineering problems.
Multifractality and Network Analysis of Phase Transition
Li, Wei; Yang, Chunbin; Han, Jihui; Su, Zhu; Zou, Yijiang
2017-01-01
Many models and real complex systems possess critical thresholds at which the systems shift dramatically from one state to another. The discovery of early warnings in the vicinity of critical points is of great importance for estimating how far systems are from their critical states. Multifractal detrended fluctuation analysis (MF-DFA) and the visibility graph method have been employed to investigate the multifractal and geometrical properties of the magnetization time series of the two-dimensional Ising model. Multifractality of the time series near the critical point has been uncovered from the generalized Hurst exponents and the singularity spectrum. Both long-term correlation and a broad probability density function are identified as the sources of multifractality. The heterogeneous nature of the networks constructed from the magnetization time series validates their fractal properties. The evolution of the topological quantities of the visibility graph, along with the variation of multifractality, serves as a new early warning of phase transition. These methods and results may provide new insights into the analysis of phase transition problems and can be used as early warnings for a variety of complex systems. PMID:28107414
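The visibility graph method referenced above maps a time series to a network: samples i and j are linked when every intermediate sample lies strictly below the straight line joining (i, y[i]) and (j, y[j]). A minimal sketch of the natural visibility criterion (an illustrative O(n³) implementation; production analyses use faster divide-and-conquer algorithms):

```python
def visibility_edges(y):
    """Edge list of the natural visibility graph of a time series y.

    Nodes i < j are connected iff, for every k between them,
    y[k] < y[j] + (y[i] - y[j]) * (j - k) / (j - i).
    """
    n = len(y)
    edges = []
    for i in range(n):
        for j in range(i + 1, n):
            if all(y[k] < y[j] + (y[i] - y[j]) * (j - k) / (j - i)
                   for k in range(i + 1, j)):
                edges.append((i, j))
    return edges
```

Consecutive samples are always mutually visible, so the series chain is a subgraph; topological quantities such as the degree distribution of this graph are what the study tracks as warnings of the transition.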
Mathematical Methods for Optical Physics and Engineering
NASA Astrophysics Data System (ADS)
Gbur, Gregory J.
2011-01-01
1. Vector algebra; 2. Vector calculus; 3. Vector calculus in curvilinear coordinate systems; 4. Matrices and linear algebra; 5. Advanced matrix techniques and tensors; 6. Distributions; 7. Infinite series; 8. Fourier series; 9. Complex analysis; 10. Advanced complex analysis; 11. Fourier transforms; 12. Other integral transforms; 13. Discrete transforms; 14. Ordinary differential equations; 15. Partial differential equations; 16. Bessel functions; 17. Legendre functions and spherical harmonics; 18. Orthogonal functions; 19. Green's functions; 20. The calculus of variations; 21. Asymptotic techniques; Appendices; References; Index.
ISS-CREAM Thermal and Fluid System Design and Analysis
NASA Technical Reports Server (NTRS)
Thorpe, Rosemary S.
2015-01-01
Thermal and Fluids Analysis Workshop (TFAWS), Silver Spring MD NCTS 21070-15. The ISS-CREAM (Cosmic Ray Energetics And Mass for the International Space Station) payload is being developed by an international team and will provide significant cosmic ray characterization over a long time frame. Cold fluid provided by the ISS Exposed Facility (EF) is the primary means of cooling for 5 science instruments and over 7 electronics boxes. Thermal fluid integrated design and analysis was performed for CREAM using a Thermal Desktop model. This presentation will provide some specific design and modeling examples from the fluid cooling system, complex SCD (Silicon Charge Detector) and calorimeter hardware, and integrated payload and ISS level modeling. Features of Thermal Desktop such as CAD simplification, meshing of complex hardware, External References (Xrefs), and FloCAD modeling will be discussed.
Communication Network Integration and Group Uniformity in a Complex Organization.
ERIC Educational Resources Information Center
Danowski, James A.; Farace, Richard V.
This paper contains a discussion of the limitations of research on group processes in complex organizations and the manner in which a procedure for network analysis in on-going systems can reduce problems. The research literature on group uniformity processes and on theoretical models of these processes from an information processing perspective…
Complexity Framework for Sustainability: An Analysis of Five Papers
ERIC Educational Resources Information Center
Putnik, Goran D.
2009-01-01
Purpose: The purpose of this paper is to present an examination of the concepts and mechanisms of complexity and learning usability and applicability for management in turbulent environments as well as their examination through the Chaordic system thinking (CST) lenses and framework. Contributing to awareness of how different mechanisms could be…
Graphics Flutter Analysis Methods, an interactive computing system at Lockheed-California Company
NASA Technical Reports Server (NTRS)
Radovcich, N. A.
1975-01-01
An interactive computer graphics system, Graphics Flutter Analysis Methods (GFAM), was developed to complement FAMAS, a matrix-oriented batch computing system, and other computer programs in performing complex numerical calculations using a fully integrated data management system. GFAM has many of the matrix operation capabilities found in FAMAS, but on a smaller scale, and is utilized when the analysis requires a high degree of interaction between the engineer and computer, and schedule constraints exclude the use of batch entry programs. Applications of GFAM to a variety of preliminary design, development design, and project modification programs suggest that interactive flutter analysis using matrix representations is a feasible and cost effective computing tool.
A Novel Human Body Area Network for Brain Diseases Analysis.
Lin, Kai; Xu, Tianlang
2016-10-01
Developments in wireless sensor and mobile communication technology provide an unprecedented opportunity for realizing smart, interactive healthcare systems. Such systems aim to remotely monitor users' health and diagnose diseases. In this paper, we design a novel human body area network for brain disease analysis, named BABDA. Because the brain is one of the most complex organs in the human body, the BABDA system provides four function modules to ensure high-quality analysis results: initial data collection, data correction, data transmission, and comprehensive data analysis. A performance evaluation conducted in a realistic environment against several criteria shows the availability and practicability of the BABDA system.
Mnatsakanyan, Mariam; Stevenson, Paul G; Shock, David; Conlan, Xavier A; Goodie, Tiffany A; Spencer, Kylie N; Barnett, Neil W; Francis, Paul S; Shalliker, R Andrew
2010-09-15
Differences between alkyl, dipole-dipole, hydrogen-bonding, and pi-pi selective surfaces, represented by non-resonance and resonance pi-stationary phases, have been assessed for the separation of 'Ristretto' café espresso by employing 2DHPLC techniques with C18 phase selectivity detection. A geometric approach to factor analysis (GAFA) was used to measure the detected peaks (N), spreading angle (beta), correlation, practical peak capacity (n(p)), and percentage usage of the separation space, as an assessment of selectivity differences between regional quadrants of the two-dimensional separation plane. Although all tested systems were correlated to some degree with the C18 dimension, regional measurement of separation divergence revealed that specific systems performed better for certain sample components. The results illustrate that, because of the complexity of the 'real' sample, obtaining a truly orthogonal two-dimensional system for complex samples of natural origin may be practically impossible. Copyright (c) 2010 Elsevier B.V. All rights reserved.
Yuan, Naiming; Fu, Zuntao; Zhang, Huan; Piao, Lin; Xoplaki, Elena; Luterbacher, Juerg
2015-01-01
In this paper, a new method, detrended partial-cross-correlation analysis (DPCCA), is proposed. Building on detrended cross-correlation analysis (DCCA), the method incorporates a partial-correlation technique, so that the relations between two non-stationary signals can be quantified on different time scales with the influences of other signals removed. We illustrate the advantages of this method with two numerical tests. Test I shows the advantages of DPCCA in handling non-stationary signals, while Test II reveals the "intrinsic" relations between two considered time series with the potential influences of other, unconsidered signals removed. To further show the utility of DPCCA in natural complex systems, we provide new evidence on the winter-time Pacific Decadal Oscillation (PDO) and the winter-time Nino3 Sea Surface Temperature Anomaly (Nino3-SSTA) affecting the summer rainfall over the middle-lower reaches of the Yangtze River (SRYR). By applying DPCCA, significant correlations between SRYR and Nino3-SSTA on time scales of 6 ~ 8 years are found over the period 1951 ~ 2012, while significant correlations between SRYR and PDO arise on time scales of 35 years. With these physically explainable results, we have confidence that DPCCA is a useful method for addressing complex systems. PMID:25634341
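As a rough sketch of the idea behind DCCA and its partial extension (not the authors' implementation; non-overlapping boxes, linear detrending, and the omission of significance testing are simplifying assumptions), the integrated profiles of two series are detrended box by box, the residual covariances are averaged, and DPCCA then inverts the matrix of DCCA coefficients to remove the influence of the other signals:

```python
import numpy as np

def dcca_fluctuation(x, y, n):
    """Average detrended covariance between two series at scale n."""
    X, Y = np.cumsum(x - x.mean()), np.cumsum(y - y.mean())
    covs = []
    for start in range(0, len(X) - n + 1, n):   # non-overlapping boxes
        idx = np.arange(start, start + n)
        t = idx.astype(float)
        # local linear detrending of each integrated profile in the box
        rx = X[idx] - np.polyval(np.polyfit(t, X[idx], 1), t)
        ry = Y[idx] - np.polyval(np.polyfit(t, Y[idx], 1), t)
        covs.append(np.mean(rx * ry))
    return np.mean(covs)

def dcca_coefficient(x, y, n):
    """rho_DCCA(n): detrended covariance normalized by the DFA variances."""
    fxy = dcca_fluctuation(x, y, n)
    return fxy / np.sqrt(dcca_fluctuation(x, x, n) * dcca_fluctuation(y, y, n))

def dpcca_coefficient(series, n):
    """Scale-n partial cross-correlations: invert the rho_DCCA matrix and
    normalize its elements; entry (i, j) relates series i and j with the
    other series' influences removed."""
    k = len(series)
    rho = np.array([[dcca_coefficient(series[i], series[j], n)
                     for j in range(k)] for i in range(k)])
    c = np.linalg.inv(rho)
    d = np.sqrt(np.outer(np.diag(c), np.diag(c)))
    return -c / d
```

For two series the off-diagonal partial coefficient reduces to the ordinary rho_DCCA; the benefit appears with three or more series, where the other signals' influences are partialled out.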
Perez, Bianca; Liberman, Aaron
2011-01-01
This article explores the issues of risk taking and decision making in health care. An analysis of various sociocultural and psychological influences is provided to promote understanding of the dominant mindset in this industry. In tandem with this analysis, the evolution of systems theories is described so as to promote understanding of the relative merits of the mechanistic and complexity philosophies. These philosophies are at odds with each other, both conceptually and practically; however, the complexity approach appears to offer more promising strategies for the growth and development of health care. Recommendations for improving employee competencies and the organizational structure and culture in health care are offered in light of this analysis. These recommendations are relevant to activities that are both clinical and administrative in nature.
Beach, Connor A; Krumm, Christoph; Spanjers, Charles S; Maduskar, Saurabh; Jones, Andrew J; Dauenhauer, Paul J
2016-03-07
Analysis of trace compounds, such as pesticides and other contaminants, within consumer products, fuels, and the environment requires quantification of increasingly complex mixtures of difficult-to-quantify compounds. Many compounds of interest are non-volatile and exhibit poor response in current gas chromatography and flame ionization systems. Here we show the conversion of trimethylsilylated chemical analytes to methane using a quantitative carbon detector (QCD; the Polyarc™ reactor) within a gas chromatograph (GC), thereby enabling enhanced detection (up to 10×) of highly functionalized compounds including carbohydrates, acids, drugs, flavorants, and pesticides. Analysis of a complex mixture of compounds shows that the GC-QCD method provides faster and more accurate analysis of the complex mixtures commonly encountered in everyday products and the environment.
Success rates of a skeletal anchorage system in orthodontics: A retrospective analysis.
Lam, Raymond; Goonewardene, Mithran S; Allan, Brent P; Sugawara, Junji
2018-01-01
To evaluate the premise that skeletal anchorage with SAS miniplates is highly successful and predictable for a range of complex orthodontic movements, this retrospective cross-sectional analysis examined 421 bone plates placed by one clinician in 163 patients (95 female, 68 male; mean age 29.4 years ± 12.02). Simple descriptive statistics were computed for a wide range of malocclusions and desired movements to obtain success, complication, and failure rates. The success rate of skeletal anchorage system miniplates was 98.6%, although approximately 40% of cases experienced mild complications. The most common complication was soft tissue inflammation, which was amenable to focused oral hygiene and antiseptic rinses. Infection occurred in approximately 15% of patients, with a statistically significant correlation with poor oral hygiene. The most common movements were distalization and intrusion of teeth. More than a third of the cases involved complex movements in more than one plane of space. The success rate of skeletal anchorage system miniplates is high and predictable for a wide range of complex orthodontic movements.
Goddard, Kimball E.
1988-01-01
The Cheyenne River system in Western South Dakota has been impacted by the discharge of about 100 million metric tons of gold-mill tailings to Whitewood Creek near Lead, South Dakota. In April 1985, the U.S. Geological Survey initiated an extensive series of research studies to investigate the magnitude of the impact and to define important processes acting on the contaminated sediments present in the system. The report presents all data collected during the 1985 and 1986 water years for these research studies. Some of the data included have been published previously. Hydrologic, geochemical, and biologic data are available for sites on Whitewood Creek, the Belle Fourche and Cheyenne Rivers, and for the Cheyenne River arm of Lake Oahe. Data complexity varies from routine discharge and water quality to very complex photon-correlation spectroscopy and energy-dispersive x-ray analysis. Methods for sample collection, handling and preservation, and laboratory analysis are also presented. No interpretations or complex statistical summaries are included. (USGS)
Rogge, Ryan A; Hansen, Jeffrey C
2015-01-01
Sedimentation velocity experiments measure the transport of molecules in solution under centrifugal force. Here, we describe a method for monitoring the sedimentation of very large biological molecular assemblies using the interference optical systems of the analytical ultracentrifuge. The mass, partial-specific volume, and shape of macromolecules in solution affect their sedimentation rates as reflected in the sedimentation coefficient. The sedimentation coefficient is obtained by measuring the solute concentration as a function of radial distance during centrifugation. Monitoring the concentration can be accomplished using interference optics, absorbance optics, or the fluorescence detection system, each with inherent advantages. The interference optical system captures data much faster than these other optical systems, allowing for sedimentation velocity analysis of extremely large macromolecular complexes that sediment rapidly at very low rotor speeds. Supramolecular oligomeric complexes produced by self-association of 12-mer chromatin fibers are used to illustrate the advantages of the interference optics. Using interference optics, we show that chromatin fibers self-associate at physiological divalent salt concentrations to form structures that sediment between 10,000 and 350,000S. The method for characterizing chromatin oligomers described in this chapter will be generally useful for characterization of any biological structures that are too large to be studied by the absorbance optical system. © 2015 Elsevier Inc. All rights reserved.
SQL is Dead; Long-live SQL: Relational Database Technology in Science Contexts
NASA Astrophysics Data System (ADS)
Howe, B.; Halperin, D.
2014-12-01
Relational databases are often perceived as a poor fit in science contexts: rigid schemas, poor support for complex analytics, unpredictable performance, and significant maintenance and tuning requirements make databases unattractive in settings characterized by heterogeneous data sources, complex analysis tasks, rapidly changing requirements, and limited IT budgets. In this talk, I'll argue that although the value proposition of typical relational database systems is weak in science, the core ideas that power relational databases have become incredibly prolific in open source science software and are emerging as a universal abstraction for both big data and small data. In addition, I'll talk about two open source systems we are building to "jailbreak" the core technology of relational databases and adapt it for use in science. The first is SQLShare, a Database-as-a-Service system supporting collaborative data analysis and exchange by reducing database use to an Upload-Query-Share workflow with no installation, schema design, or configuration required. The second is Myria, a service that supports much larger-scale data and complex analytics, and supports multiple back-end systems. Finally, I'll describe some of the ways our collaborators in oceanography, astronomy, biology, fisheries science, and more are using these systems to replace script-based workflows for reasons of performance, flexibility, and convenience.
Learning the organization: a model for health system analysis for new nurse administrators.
Clark, Mary Jo
2004-01-01
Health systems are large and complex organizations in which multiple components and processes influence system outcomes. In order to effectively position themselves in such organizations, nurse administrators new to a system must gain a rapid understanding of overall system operation. Such understanding is facilitated by use of a model for system analysis. The model presented here examines the dynamic interrelationships between and among internal and external elements as they affect system performance. External elements to be analyzed include environmental factors and characteristics of system clientele. Internal elements flow from the mission and goals of the system and include system culture, services, resources, and outcomes.
NASA Technical Reports Server (NTRS)
Vakil, Sanjay S.; Hansman, R. John
2000-01-01
Autoflight systems in the current generation of aircraft have been implicated in several recent incidents and accidents. A contributory aspect to these incidents may be the manner in which aircraft transition between differing behaviours or 'modes.' The current state of aircraft automation was investigated and the incremental development of the autoflight system was tracked through a set of aircraft to gain insight into how these systems developed. This process appears to have resulted in a system without a consistent global representation. In order to evaluate and examine autoflight systems, a 'Hybrid Automation Representation' (HAR) was developed. This representation was used to examine several specific problems known to exist in aircraft systems. Cyclomatic complexity is an analysis tool from computer science which counts the number of linearly independent paths through a program graph. This approach was extended to examine autoflight mode transitions modelled with the HAR. A survey was conducted of pilots to identify those autoflight mode transitions which airline pilots find difficult. The transitions identified in this survey were analyzed using cyclomatic complexity to gain insight into the apparent complexity of the autoflight system from the perspective of the pilot. Mode transitions which had been identified as complex by pilots were found to have a high cyclomatic complexity. Further examination was made into a set of specific problems identified in aircraft: the lack of a consistent representation of automation, concern regarding appropriate feedback from the automation, and the implications of physical limitations on the autoflight systems. Mode transitions involved in changing to and leveling at a new altitude were identified across multiple aircraft by numerous pilots. Where possible, evaluation and verification of the behaviour of these autoflight mode transitions was investigated via aircraft-specific high fidelity simulators. 
Three solution approaches to concerns regarding autoflight systems, and mode transitions in particular, are presented in this thesis. The first is to use training to modify pilot behaviours, or procedures to work around known problems. The second approach is to mitigate problems by enhancing feedback. The third approach is to modify the process by which automation is designed. The Operator Directed Process forces the consideration and creation of an automation model early in the design process for use as the basis of the software specification and training.
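Cyclomatic complexity for a mode-transition graph of the kind analyzed above can be computed as V(G) = E - N + 2P (edges, nodes, connected components). The mode names and transitions below are hypothetical illustrations, not the thesis's HAR models:

```python
# Cyclomatic complexity of a directed mode-transition graph.
# V(G) = E - N + 2P: E = transitions, N = modes, P = connected components.

def cyclomatic_complexity(edges, num_components=1):
    """Count linearly independent paths through the transition graph."""
    nodes = {n for edge in edges for n in edge}
    return len(edges) - len(nodes) + 2 * num_components

# Hypothetical altitude-change transitions (VS = vertical speed,
# ALT CAP = altitude capture, ALT HOLD = altitude hold).
transitions = [
    ("VS", "ALT CAP"),
    ("ALT CAP", "ALT HOLD"),
    ("ALT CAP", "VS"),       # altitude target changed during capture
    ("ALT HOLD", "VS"),      # new altitude selected
]

print(cyclomatic_complexity(transitions))  # -> 3
```

Adding alternative capture or reversion paths raises E without raising N, so V(G) grows: one numeric handle on why pilots report some transition sets as more complex than others.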
Pseudotargeted MS Method for the Sensitive Analysis of Protein Phosphorylation in Protein Complexes.
Lyu, Jiawen; Wang, Yan; Mao, Jiawei; Yao, Yating; Wang, Shujuan; Zheng, Yong; Ye, Mingliang
2018-05-15
In this study, we present an enrichment-free approach for the sensitive analysis of protein phosphorylation in minute amounts of sample, such as purified protein complexes. The method takes advantage of the high sensitivity of parallel reaction monitoring (PRM). Specifically, low-confidence phosphopeptides identified from the data-dependent acquisition (DDA) data set were used to build a pseudotargeted list for PRM analysis, allowing additional phosphopeptides to be identified with high confidence. Developing this targeted approach is straightforward because the same sample and the same LC system are used for the discovery and targeted analysis phases. No sample fractionation or enrichment was required for the discovery phase, which allows the method to analyze minute amounts of sample. We applied this pseudotargeted MS method to quantitatively examine phosphopeptides in affinity-purified endogenous Shc1 protein complexes at four temporal stages of EGF signaling and identified 82 phospho-sites. To our knowledge, this is the highest number of phospho-sites identified from these protein complexes. This pseudotargeted MS method is highly sensitive in identifying low-abundance phosphopeptides and could be a powerful tool to study the phosphorylation-regulated assembly of protein complexes.
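The pseudotargeted workflow (accept high-confidence DDA identifications directly, route low-confidence ones to a PRM inclusion list for re-validation) can be sketched as a simple filter. The field names and score cutoffs here are invented for illustration; real search engines report different score types:

```python
# Sketch of building a pseudotargeted PRM inclusion list from DDA results.
# Score thresholds and record fields are hypothetical.

def build_prm_list(dda_ids, high_cutoff=0.99, low_cutoff=0.75):
    """Split DDA phosphopeptide IDs into confident hits and PRM targets."""
    confident, targeted = [], []
    for pep in dda_ids:
        if pep["score"] >= high_cutoff:
            confident.append(pep)                    # accept directly
        elif pep["score"] >= low_cutoff:
            targeted.append({"sequence": pep["sequence"],
                             "mz": pep["mz"],
                             "charge": pep["charge"]})  # PRM inclusion entry
    return confident, targeted

dda = [
    {"sequence": "VLS(ph)PK",  "mz": 540.27, "charge": 2, "score": 0.999},
    {"sequence": "AHT(ph)GEK", "mz": 611.31, "charge": 2, "score": 0.82},
    {"sequence": "GGS(ph)LR",  "mz": 498.22, "charge": 2, "score": 0.40},
]
confident, targeted = build_prm_list(dda)
print(len(confident), len(targeted))  # -> 1 1
```

Peptides below the lower cutoff are dropped entirely; the middle band is exactly the population the PRM re-analysis is designed to rescue.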
Towards human-computer synergetic analysis of large-scale biological data.
Singh, Rahul; Yang, Hui; Dalziel, Ben; Asarnow, Daniel; Murad, William; Foote, David; Gormley, Matthew; Stillman, Jonathan; Fisher, Susan
2013-01-01
Advances in technology have led to the generation of massive amounts of complex and multifarious biological data in areas ranging from genomics to structural biology. The volume and complexity of such data lead to significant challenges in its analysis, especially when one seeks to generate hypotheses or explore the underlying biological processes. At the state of the art, the application of automated algorithms followed by perusal and analysis of the results by an expert continues to be the predominant paradigm for analyzing biological data. This paradigm works well in many problem domains. However, it is also limiting, since domain experts are forced to apply their instincts and expertise, such as contextual reasoning, hypothesis formulation, and exploratory analysis, only after the algorithm has produced its results. In many areas where the organization and interaction of the biological processes are poorly understood and exploratory analysis is crucial, what is needed is to integrate domain expertise during the data analysis process and use it to drive the analysis itself. In the context of this background, the results presented in this paper describe advancements along two methodological directions. First, given the context of biological data, we utilize and extend a design approach called experiential computing from multimedia information system design. This paradigm combines information visualization and human-computer interaction with algorithms for exploratory analysis of large-scale and complex data.
In the proposed approach, emphasis is laid on: (1) allowing users to directly visualize, interact with, experience, and explore the data through interoperable visualization-based and algorithmic components; (2) supporting unified query and presentation spaces to facilitate experimentation and exploration; (3) providing external contextual information by assimilating relevant supplementary data; and (4) encouraging user-directed information visualization, data exploration, and hypothesis formulation. Second, to illustrate the proposed design paradigm and measure its efficacy, we describe two prototype web applications. The first, called XMAS (Experiential Microarray Analysis System), is designed for analysis of time-series transcriptional data. The second, called PSPACE (Protein Space Explorer), is designed for holistic analysis of structural and structure-function relationships using interactive low-dimensional maps of the protein structure space. Both systems promote and facilitate human-computer synergy, in which cognitive elements such as domain knowledge, contextual reasoning, and purpose-driven exploration are integrated with a host of powerful algorithmic operations that support large-scale data analysis, multifaceted data visualization, and multi-source information integration. The proposed design philosophy combines visualization, algorithmic components, and cognitive expertise into a seamless processing-analysis-exploration framework that facilitates sense-making, exploration, and discovery. Using XMAS, we present case studies that analyze transcriptional data from two highly complex domains: gene expression in the placenta during human pregnancy and the reaction of marine organisms to heat stress. With PSPACE, we demonstrate how complex structure-function relationships can be explored. These results demonstrate the novelty, advantages, and distinctions of the proposed paradigm.
The results also highlight how domain insights can be combined with algorithms to discover meaningful knowledge and formulate evidence-based hypotheses during the data analysis process. Finally, user studies against comparable systems indicate that both XMAS and PSPACE deliver results with better interpretability while placing lower cognitive loads on the users. XMAS is available at: http://tintin.sfsu.edu:8080/xmas. PSPACE is available at: http://pspace.info/.
Nonlinear dynamics behavior analysis of the spatial configuration of a tendril-bearing plant
NASA Astrophysics Data System (ADS)
Feng, Jingjing; Zhang, Qichang; Wang, Wei; Hao, Shuying
2017-03-01
Tendril-bearing plants take on a spiraling shape as their tendrils climb along a support during growth. The growth characteristics of a tendril-bearer can be simplified to a model of a thin elastic rod with a cylindrical constraint. In this paper, the connection between some typical configuration characteristics of tendrils and complex nonlinear dynamic behavior is qualitatively analyzed. The spatial configuration problem of tendrils can be explained through the study of the nonlinear dynamic behavior of the thin elastic rod system equation. In this study, the complex non-Z2-symmetric critical orbits of the system equation under critical parameters are presented. A new function transformation method that effectively preserves the critical orbit properties is proposed, and a new system of nonlinear differential equations containing complex nonlinear terms is obtained to describe the cross-section position and direction of a rod during climbing. Numerical simulation revealed that the new system can describe the configuration of a rod with reasonable accuracy. To adequately explain the growth regulation of the rod shape, the critical orbit and the configuration of the rod are connected directly. High-precision analytical expressions for these complex non-Z2-symmetric critical orbits are obtained by introducing a suitable analytical method, and these expressions are then used to draw the corresponding three-dimensional configuration figures of an elastic thin rod. Combined with actual tendrils on a live plant, the spatial configuration of the winding knots of a tendril is explained by the concept of a heteroclinic orbit from the perspective of nonlinear dynamics, and the correctness of the theoretical analysis is verified. This theoretical analysis method could also be effectively applied to other similar slender structures.
Anderson, James C; Blake, Alexander J; Moreno, Rafael Bou; Raynel, Guillaume; van Slageren, Joris
2009-11-14
The fixation of CO₂ at ambient temperature has been achieved by the reaction of Ni(cod)₂ and TMEDA in CO₂-saturated THF, which yields a novel hexanuclear nickel(II) μ₃-carbonato-bridged complex [Ni₆(μ₃-CO₃)₄(TMEDA)₆(H₂O)₁₂](OH)₄ in 59% yield. The complex was characterised by MS analysis and the structure corroborated by single-crystal X-ray crystallography. The complex exhibits a rare carbonato binding mode for Ni(II) complexes and moderately strong antiferromagnetic interactions.
Models-3 is a flexible software system designed to simplify the development and use of environmental assessment and other decision support tools. It is designed for applications ranging from regulatory and policy analysis to understanding the complex interactions of atmospheri...
Cognitive Complexity and Verbal Response Mode Use in Discussion.
ERIC Educational Resources Information Center
Kline, Susan L.; And Others
1990-01-01
Uses William B. Stiles' discourse analysis system to determine whether there are general differences in the way individuals varying in construct system development use their utterances to establish understanding with each other. Finds that construct system development was positively correlated with edification and question response mode use, and…
Emergy analysis of a silvo-pastoral system, a case study in southern Portugal
The Mediterranean silvo-pastoral system known as Montado, in Portugal, is a complex land use system composed of an open tree stratum in various densities and an herbaceous layer, used for livestock grazing. Livestock also profit from the acorns, and the grazing contributes to avo...
Analysis of Access Control Policies in Operating Systems
ERIC Educational Resources Information Center
Chen, Hong
2009-01-01
Operating systems rely heavily on access control mechanisms to achieve security goals and defend against remote and local attacks. The complexities of modern access control mechanisms and the scale of policy configurations are often overwhelming to system administrators and software developers. Therefore, mis-configurations are common, and the…
An intelligent decomposition approach for efficient design of non-hierarchic systems
NASA Technical Reports Server (NTRS)
Bloebaum, Christina L.
1992-01-01
The design process associated with large engineering systems requires an initial decomposition of the complex systems into subsystem modules which are coupled through transference of output data. The implementation of such a decomposition approach assumes the ability exists to determine what subsystems and interactions exist and what order of execution will be imposed during the analysis process. Unfortunately, this is quite often an extremely complex task which may be beyond human ability to efficiently achieve. Further, in optimizing such a coupled system, it is essential to be able to determine which interactions figure prominently enough to significantly affect the accuracy of the optimal solution. The ability to determine 'weak' versus 'strong' coupling strengths would aid the designer in deciding which couplings could be permanently removed from consideration or which could be temporarily suspended so as to achieve computational savings with minimal loss in solution accuracy. An approach that uses normalized sensitivities to quantify coupling strengths is presented. The approach is applied to a coupled system composed of analysis equations for verification purposes.
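A minimal sketch of the normalized-sensitivity idea, with invented subsystem equations and an assumed weak/strong threshold (neither taken from the paper): the dimensionless sensitivity S = (df/dy)(y/f) measures how strongly one subsystem's output responds, proportionally, to another's.

```python
# Coupling-strength quantification via normalized sensitivities.
# The subsystem equations and the 0.1 threshold are hypothetical.

def subsystem_a(y_b):
    return 10.0 + 0.02 * y_b       # weakly depends on subsystem B's output

def subsystem_b(y_a):
    return 3.0 * y_a               # strongly depends on subsystem A's output

def normalized_sensitivity(f, y_in, h=1e-6):
    """S = (df/dy) * (y_in / f(y_in)): dimensionless coupling strength."""
    dfdy = (f(y_in + h) - f(y_in - h)) / (2 * h)   # central finite difference
    return dfdy * y_in / f(y_in)

y_a, y_b = 10.64, 31.91            # approximate converged coupled outputs
s_ab = normalized_sensitivity(subsystem_a, y_b)    # A's sensitivity to B
s_ba = normalized_sensitivity(subsystem_b, y_a)    # B's sensitivity to A

THRESHOLD = 0.1   # hypothetical cutoff separating weak from strong coupling
for name, s in [("A<-B", s_ab), ("B<-A", s_ba)]:
    label = "strong" if abs(s) > THRESHOLD else "weak"
    print(f"{name}: S = {s:.3f} ({label})")
```

Couplings classified as weak by such a measure are candidates for temporary suspension or permanent removal during optimization, trading a small loss of accuracy for computational savings.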
NASA Astrophysics Data System (ADS)
Yoon, Susan Anne
Understanding the world through a complex systems lens has recently garnered a great deal of interest in many knowledge disciplines. In the educational arena, interactional studies, through their focus on understanding patterns of system behaviour, including the dynamical processes and trajectories of learning, lend support for investigating how a complex systems approach can inform educational research. This study uses previously existing literature and tools for complex systems applications and seeks to extend this research base by exploring learning outcomes of a complex systems framework when applied to curriculum and instruction. It is argued that by applying the evolutionary dynamics of variation, interaction, and selection, complexity may be harnessed to achieve growth in both the social and cognitive systems of the classroom. Furthermore, if the goal of education, i.e., the social system under investigation, is to teach for understanding, conceptual knowledge of the kind described in Popper's (1972; 1976) World 3 needs to evolve. Both the study of memetic processes and the knowledge building pioneered by Bereiter (cf. Bereiter, 2002) draw on the World 3 notion of ideas existing as conceptual artifacts that can be investigated as products outside of the individual mind, providing an educational lens from which to proceed. The curricular topic addressed is the development of an ethical understanding of the scientific and technological issues of genetic engineering. Eleven grade 8 students are studied as they proceed through 40 hours of curricular instruction based on the complex systems evolutionary framework. Results demonstrate growth in both complex systems thinking and content knowledge of the topic of genetic engineering. Several memetic processes are hypothesized to have influenced how and why ideas change.
Categorized by factors influencing either reflective or non-reflective selection, these processes appear to have exerted differential effects on students' abilities to think and act in complex ways at various points throughout the study. Finally, an analysis of winner and loser memes is offered that is intended to reveal information about the conceptual system---its strengths and deficiencies---that can help educators assess curricular goals and organize and construct additional educational activities.
Mutale, Wilbroad; Ayles, Helen; Bond, Virginia; Chintu, Namwinga; Chilengi, Roma; Mwanamwenge, Margaret Tembo; Taylor, Angela; Spicer, Neil; Balabanova, Dina
2017-04-01
Strong health systems are said to be paramount to achieving effective and equitable health care. The World Health Organization has been advocating system-wide approaches, such as 'systems thinking', to guide intervention design and evaluation. In this paper we report the system-wide effects of a complex health system intervention in Zambia known as Better Health Outcome through Mentorship and Assessment (BHOMA), which aimed to improve service quality. We conducted a qualitative study in three target districts, using a systems thinking conceptual framework to guide the analysis with a focus on intended and unintended consequences of the intervention. NVivo version 10 was used for data analysis. The target communities responded positively to the BHOMA intervention. In the short term, demand for services increased, as predicted, and workload rose accordingly; however, health-worker capacity was not severely affected because clinic lay supporters took over some of the clinicians' work. From a systems perspective, unintended consequences also occurred during the implementation of the BHOMA. We applied an innovative approach to evaluating a complex intervention in low-income settings, exploring empirically how systems thinking can be applied in the context of health system strengthening. Although the intervention had some positive outcomes, by employing system-wide approaches we also identified unintended consequences. © 2015 The Authors. Journal of Evaluation in Clinical Practice published by John Wiley & Sons, Ltd.
«Surgery first» or two stage complex rehabilitation plan for patients with malocclusions.
Andreishchev, A R; Kavrayskaya, A Yu; Nikolaev, A V
2016-01-01
The article considers the staging of complex rehabilitation treatment plans for patients with bite anomalies. The study included 515 patients with various complex malocclusions. Two-stage and conventional three-stage treatment plans are described, and indications for the two-stage ('surgery first') protocol are suggested. An evaluation of the efficiency and stability of the achieved treatment results, obtained with the help of a system for quantitative analysis of dento-oro-facial disorders, is presented.
An unstructured-grid software system for solving complex aerodynamic problems
NASA Technical Reports Server (NTRS)
Frink, Neal T.; Pirzadeh, Shahyar; Parikh, Paresh
1995-01-01
A coordinated effort has been underway over the past four years to elevate unstructured-grid methodology to a mature level. The goal of this endeavor is to provide a validated capability to non-expert users for performing rapid aerodynamic analysis and design of complex configurations. The Euler component of the system is well developed, and is impacting a broad spectrum of engineering needs with capabilities such as rapid grid generation and inviscid flow analysis, inverse design, interactive boundary layers, and propulsion effects. Progress is also being made in the more tenuous Navier-Stokes component of the system. A robust grid generator is under development for constructing quality thin-layer tetrahedral grids, along with a companion Navier-Stokes flow solver. This paper presents an overview of this effort, along with a perspective on the present and future status of the methodology.
NASA Technical Reports Server (NTRS)
Tseng, K.; Morino, L.
1975-01-01
A general formulation is presented for the analysis of steady and unsteady, subsonic and supersonic aerodynamics for complex aircraft configurations. The theoretical formulation, the numerical procedure, a description of the program SOUSSA (steady, oscillatory and unsteady, subsonic and supersonic aerodynamics), and numerical results are included. In particular, generalized forces for fully unsteady (complex-frequency) aerodynamics of a wing-body configuration, AGARD wing-tail interference in both subsonic and supersonic flows, and flutter analysis results are presented. The theoretical formulation is based upon an integral equation which accommodates completely arbitrary motion; steady and oscillatory aerodynamic flows are included as special cases. Small-amplitude, fully transient response in the time domain is also considered. This yields the aerodynamic transfer function (the Laplace transform of the fully unsteady operator) for frequency-domain analysis, which is particularly convenient for linear systems analysis of the whole aircraft.
Ono, N; Hirayama, F; Arima, H; Uekama, K
2001-01-01
The competitive inclusion complexations in ternary phenacetin/competitor/beta-cyclodextrin (beta-CyD) systems were investigated by the solubility method, where m-bromobenzoic acid (m-BBA) and o-toluic acid (o-TA) were used as competitors. The solubility changes of the drug and competitors as a function of beta-CyD concentration in the ternary systems were formulated using their stability constants and intrinsic solubilities. The decrease in solubility of phenacetin caused by the addition of competitors could be quantitatively simulated by the formulation when both drug and competitor give A(L) type solubility diagrams. On the other hand, when one of the guests gives a B(S) type solubility diagram, its solubility change was clearly reflected in that of the other guest: phenacetin gave an A(L) type solubility diagram in the binary phenacetin/beta-CyD system and o-TA gave a B(S) type diagram in the binary o-TA/beta-CyD system, but in the ternary phenacetin/o-TA/beta-CyD system a new plateau region appeared in the original A(L) type diagram of phenacetin. This was explained by the solubilization theory of Higuchi and Connors. The solubility analysis of ternary drug/competitor/CyD systems may be particularly useful for determining the stability constant of a drug whose physicochemical and spectroscopic analyses are difficult, because the constant can be calculated by monitoring the solubility change of a competitor instead of that of the drug. Furthermore, the present results suggest that attention should be paid to the type of the phase solubility diagram, as well as to the magnitude of the stability constant and the solubility of the complex, for a rational formulation design of CyD complexes.
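The A(L)-type mass balance behind this competition can be sketched for 1:1 complexes. The stability constants and intrinsic solubilities below are illustrative placeholders, not the measured phenacetin/o-TA values, and both guests are assumed present as excess solid (so their free concentrations stay at the intrinsic solubilities):

```python
def drug_solubility(c_cyd_total, K_drug, S0_drug, K_comp=0.0, S0_comp=0.0):
    """Total apparent solubility of a drug forming a 1:1 complex with
    beta-CyD, optionally competing with a second guest for the same CyD.
    CyD mass balance: total = free + drug-bound + competitor-bound."""
    c_free = c_cyd_total / (1.0 + K_drug * S0_drug + K_comp * S0_comp)
    return S0_drug * (1.0 + K_drug * c_free)

# Illustrative (not literature) constants, concentrations in mol/L:
K1, S1 = 200.0, 0.005   # "phenacetin-like" drug
K2, S2 = 500.0, 0.010   # competitor

alone = drug_solubility(0.01, K1, S1)          # binary drug/CyD system
shared = drug_solubility(0.01, K1, S1, K2, S2)  # ternary, with competitor
```

With no competitor this reproduces the linear A(L) diagram of slope K1*S1/(1 + K1*S1); adding the competitor lowers the free CyD concentration and hence the apparent drug solubility, which is the effect the abstract quantifies.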
St-Maurice, Justin D; Burns, Catherine M
2017-07-28
Health care is a complex sociotechnical system. Patient treatment is evolving and needs to incorporate the use of technology and new patient-centered treatment paradigms. Cognitive work analysis (CWA) is an effective framework for understanding complex systems, and work domain analysis (WDA) is useful for understanding complex ecologies. Although previous applications of CWA have described patient treatment, their scope of work meant that patients were characterized as biomedical machines rather than as actors involved in their own care. An abstraction hierarchy that characterizes patients as beings with complex social values and priorities is needed. This can help better understand treatment in a modern approach to care. The purpose of this study was to perform a WDA to represent the treatment of patients with medical records. The methods to develop this model included the analysis of written texts and collaboration with subject matter experts. Our WDA represents the ecology through its functional purposes, abstract functions, generalized functions, physical functions, and physical forms. Compared with other work domain models, this model is able to articulate the nuanced balance between medical treatment, patient education, and limited health care resources. Concepts in the analysis were similar to the modeling choices of other WDAs but combined them into a comprehensive, systematic, and contextual overview. The model is helpful for understanding user competencies and needs. Future models could be developed to model the patient's domain and enable the exploration of the shared decision-making (SDM) paradigm. Our work domain model links treatment goals, decision-making constraints, and task workflows. This model can be used by system developers who would like to use ecological interface design (EID) to improve systems. Our hierarchy is the first in a future set that could explore new treatment paradigms.
Future hierarchies could model the patient as a controller and could be useful for mobile app development. ©Justin D St-Maurice, Catherine M Burns. Originally published in JMIR Human Factors (http://humanfactors.jmir.org), 28.07.2017.
Topological framework for local structure analysis in condensed matter
Lazar, Emanuel A.; Han, Jian; Srolovitz, David J.
2015-01-01
Physical systems are frequently modeled as sets of points in space, each representing the position of an atom, molecule, or mesoscale particle. As many properties of such systems depend on the underlying ordering of their constituent particles, understanding that structure is a primary objective of condensed matter research. Although perfect crystals are fully described by a set of translation and basis vectors, real-world materials are never perfect, as thermal vibrations and defects introduce significant deviation from ideal order. Meanwhile, liquids and glasses present yet more complexity. A complete understanding of structure thus remains a central, open problem. Here we propose a unified mathematical framework, based on the topology of the Voronoi cell of a particle, for classifying local structure in ordered and disordered systems that is powerful and practical. We explain the underlying reason why this topological description of local structure is better suited for structural analysis than continuous descriptions. We demonstrate the connection of this approach to the behavior of physical systems and explore how crystalline structure is compromised at elevated temperatures. We also illustrate potential applications to identifying defects in plastically deformed polycrystals at high temperatures, automating analysis of complex structures, and characterizing general disordered systems. PMID:26460045
An online sleep apnea detection method based on recurrence quantification analysis.
Nguyen, Hoa Dinh; Wilkins, Brek A; Cheng, Qi; Benjamin, Bruce Allen
2014-07-01
This paper introduces an online sleep apnea detection method based on heart rate complexity as measured by recurrence quantification analysis (RQA) statistics of heart rate variability (HRV) data. RQA statistics can capture the nonlinear dynamics of a complex cardiorespiratory system during obstructive sleep apnea. In order to obtain a more robust measurement of the nonstationarity of the cardiorespiratory system, we use several fixed-amount-of-nearest-neighbors thresholds for recurrence plot calculation. We integrate a feature selection algorithm based on conditional mutual information to select the most informative RQA features for classification and, hence, to speed up the real-time classification process without degrading the performance of the system. Two types of binary classifiers, i.e., support vector machines and neural networks, are used to differentiate apnea from normal sleep. A soft decision fusion rule is developed to combine the results of these classifiers in order to improve the classification performance of the whole system. Experimental results show that our proposed method achieves better classification results than the previous recurrence analysis-based approach. We also show that our method is flexible and a strong candidate for a truly efficient sleep apnea detection system.
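The core recurrence-plot machinery can be sketched in a few lines. This is not the authors' pipeline (no HRV preprocessing, feature selection, or classifiers); it only illustrates the fixed-amount-of-nearest-neighbors (FAN) thresholding, which pins the recurrence rate at k/N regardless of nonstationarity, plus one standard RQA statistic (determinism) on toy signals:

```python
import math, random

def fan_recurrence(series, dim=2, k=5):
    """Recurrence matrix with FAN thresholding: each delay-embedded point
    marks exactly its k nearest points as recurrences."""
    vecs = [tuple(series[i + d] for d in range(dim))
            for i in range(len(series) - dim + 1)]
    n = len(vecs)
    R = [[0] * n for _ in range(n)]
    for i, v in enumerate(vecs):
        dists = sorted((sum((a - b) ** 2 for a, b in zip(v, w)), j)
                       for j, w in enumerate(vecs) if j != i)
        for _, j in dists[:k]:
            R[i][j] = 1
    return R

def determinism(R):
    """Fraction of recurrence points lying on diagonal lines (length >= 2)."""
    n = len(R)
    rec = line = 0
    for i in range(n):
        for j in range(n):
            if R[i][j]:
                rec += 1
                prev = i > 0 and j > 0 and R[i - 1][j - 1]
                nxt = i < n - 1 and j < n - 1 and R[i + 1][j + 1]
                if prev or nxt:
                    line += 1
    return line / rec

random.seed(0)
sine = [math.sin(0.3 * t) for t in range(200)]
noise = [random.gauss(0, 1) for _ in range(200)]
det_sine = determinism(fan_recurrence(sine))
det_noise = determinism(fan_recurrence(noise))
```

A regular (here periodic) signal yields far more diagonal-line structure, and thus higher determinism, than uncorrelated noise; apneic versus normal HRV segments are separated by differences of this kind across several RQA features.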
Complexity and the Limits of Revolution: What Will Happen to the Arab Spring?
NASA Astrophysics Data System (ADS)
Gard-Murray, Alexander S.; Bar-Yam, Yaneer
The recent social unrest across the Middle East and North Africa has deposed dictators who had ruled for decades. While the events have been hailed as an "Arab Spring" by those who hope that repressive autocracies will be replaced by democracies, what sort of regimes will eventually emerge from the crisis remains far from certain. Here we provide a complex systems framework, validated by historical precedent, to help answer this question. We describe the dynamics of governmental change as an evolutionary process similar to biological evolution, in which complex organizations gradually arise by replication, variation, and competitive selection. Different kinds of governments, however, have differing levels of complexity. Democracies must be more systemically complex than autocracies because of their need to incorporate large numbers of people in decision-making. This difference has important implications for the relative robustness of democratic and autocratic governments after revolutions. Revolutions may disrupt existing evolved complexity, limiting the potential for building more complex structures quickly. Insofar as systemic complexity is reduced by revolution, democracy is harder to create in the wake of unrest than autocracy. Applying this analysis to the Middle East and North Africa, we infer that in the absence of stable institutions or external assistance, new governments are in danger of facing increasingly insurmountable challenges and reverting to autocracy.
Xu, Wei
2007-12-01
This study adopts J. Rasmussen's (1985) abstraction hierarchy (AH) framework as an analytical tool to identify problems and pinpoint opportunities to enhance complex systems. The process of identifying problems and generating recommendations for complex systems using conventional methods is usually conducted based on incompletely defined work requirements. As the complexity of systems rises, the sheer mass of data generated from these methods becomes unwieldy to manage in a coherent, systematic form for analysis. There is little known work on adopting a broader perspective to fill these gaps. AH was used to analyze an aircraft-automation system in order to further identify breakdowns in pilot-automation interactions. Four steps follow: developing an AH model for the system, mapping the data generated by various methods onto the AH, identifying problems based on the mapped data, and presenting recommendations. The breakdowns lay primarily with automation operations that were more goal directed. Identified root causes include incomplete knowledge content and ineffective knowledge structure in pilots' mental models, lack of effective higher-order functional domain information displayed in the interface, and lack of sufficient automation procedures for pilots to effectively cope with unfamiliar situations. The AH is a valuable analytical tool to systematically identify problems and suggest opportunities for enhancing complex systems. It helps further examine the automation awareness problems and identify improvement areas from a work domain perspective. Applications include the identification of problems and generation of recommendations for complex systems as well as specific recommendations regarding pilot training, flight deck interfaces, and automation procedures.
ERIC Educational Resources Information Center
Lukas, George; Feurzeig, Wallace
A description is provided of a computer system designed to aid in the analysis of student programing work. The first section of the report consists of an overview and user's guide. In it, the system input is described in terms of a "dribble file" which records all student inputs generated; also an introduction is given to the aids…
NASA Technical Reports Server (NTRS)
1979-01-01
This document addresses requirements for post-test data reduction in support of the Orbital Flight Tests (OFT) mission evaluation team, specifically those which are planned to be implemented in the ODRC (Orbiter Data Reduction Complex). Only those requirements which have been previously baselined by the Data Systems and Analysis Directorate configuration control board are included. This document serves as the control document between Institutional Data Systems Division and the Integration Division for OFT mission evaluation data processing requirements, and shall be the basis for detailed design of ODRC data processing systems.
Phase-space networks of geometrically frustrated systems.
Han, Yilong
2009-11-01
We illustrate a network approach to phase-space studies using two geometrical frustration models: the antiferromagnet on a triangular lattice and square ice. Their highly degenerate ground states are mapped as discrete networks such that quantitative network analysis can be applied to phase-space studies. The resulting phase spaces share some common features and establish a class of complex networks with unique Gaussian spectral densities. Although phase-space networks are heterogeneously connected, the systems are still ergodic due to random Poisson processes. This network approach can be generalized to the phase spaces of some other complex systems.
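The construction can be sketched by brute force on a toy system. The lattice size (3x3, periodic), couplings, and single-spin-flip adjacency rule below are our illustrative choices, not the paper's setup; the sketch enumerates the degenerate ground-state manifold of a triangular-lattice Ising antiferromagnet and links ground states that differ by one spin flip:

```python
from itertools import product

L = 3  # 3x3 periodic triangular lattice (toy size; real studies use larger)

def site(x, y):
    return (x % L) + L * (y % L)

# Triangular connectivity: each site couples to neighbors along
# +x, +y, and the +x+y diagonal (6 neighbors with periodic wrap).
edges = set()
for x in range(L):
    for y in range(L):
        for dx, dy in ((1, 0), (0, 1), (1, 1)):
            edges.add(tuple(sorted((site(x, y), site(x + dx, y + dy)))))

def energy(spins):
    """Antiferromagnetic Ising energy: E = sum of s_i * s_j over bonds."""
    return sum(spins[i] * spins[j] for i, j in edges)

states = list(product((-1, 1), repeat=L * L))
e_min = min(energy(s) for s in states)
ground = [s for s in states if energy(s) == e_min]

# Phase-space network: nodes are ground states, links are single spin flips
# that stay inside the ground-state manifold.
gset = set(ground)
adjacency = {s: [] for s in ground}
for s in ground:
    for i in range(L * L):
        t = list(s); t[i] = -t[i]; t = tuple(t)
        if t in gset:
            adjacency[s].append(t)
```

Frustration (every triangle must carry at least one unsatisfied bond) is what produces the large degeneracy that `ground` exposes; the degree distribution and spectrum of `adjacency` are the objects the abstract analyzes on much larger lattices.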
Methodology and Results of Mathematical Modelling of Complex Technological Processes
NASA Astrophysics Data System (ADS)
Mokrova, Nataliya V.
2018-03-01
The methodology of system analysis allows us to derive a mathematical model of a complex technological process. A mathematical description of the plasma-chemical process is proposed. The importance of the quenching rate and of the initial temperature decrease time for producing the maximum amount of the target product is confirmed. The results of numerical integration of the system of differential equations can be used to describe reagent concentrations, plasma jet rate and temperature in order to achieve the optimal quenching mode. Such models are applicable both for solving control problems and for predicting future states of sophisticated technological systems.
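Why the quenching rate matters can be shown with a one-channel caricature of such a model. All constants (pre-exponential factor, activation temperature, cooling law) are invented for illustration and are not taken from the paper; the sketch just integrates the decomposition of a target product P while the jet cools, so faster quenching freezes in more product:

```python
import math

def final_yield(tau_quench, T0=3000.0, T_end=600.0, dt=1e-4):
    """Euler integration of a single Arrhenius decomposition channel
    during quenching: dP/dt = -A * exp(-Ea/T(t)) * P, with exponential
    cooling T(t) = T_end + (T0 - T_end) * exp(-t / tau_quench).
    Returns the fraction of product surviving the quench."""
    A, Ea = 1.0e6, 4.0e4   # illustrative rate constant (1/s) and Ea/R (K)
    P, t = 1.0, 0.0
    while True:
        T = T_end + (T0 - T_end) * math.exp(-t / tau_quench)
        if T - T_end < 1.0:          # quench essentially complete
            return P
        P -= dt * A * math.exp(-Ea / T) * P
        t += dt
```

A smaller cooling time constant leaves less time at high temperature, where the decomposition rate is largest, so the retained yield rises as the quench gets faster; optimizing this trade-off is what the full multi-reagent model supports.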
Five schools of thought about complexity: Implications for design and process science
DOE Office of Scientific and Technical Information (OSTI.GOV)
Warfield, J.N.
1996-12-31
The prevalence of complexity is a fact of life in virtually all aspects of system design today. Five schools of thought concerning complexity seem to be present in areas where people strive to gain more facility with difficult issues: (1) Interdisciplinary or Cross-Disciplinary "approaches" or "methods" (fostered by the Association for Integrative Studies, a predominantly liberal-arts faculty activity), (2) Systems Dynamics (fostered by Jay Forrester, Dennis Meadows, Peter Senge, and others closely associated with MIT), (3) Chaos Theory (arising in small groups in many locations), (4) Adaptive Systems Theory (predominantly associated with the Santa Fe Institute), and (5) The Structure-Based school (developed by the author, his colleagues and associates). A comparison of these five schools of thought will be offered, in order to show the implications of them upon the development and application of design and process science. The following criteria of comparison will be used: (a) how complexity is defined, (b) analysis versus synthesis, (c) potential for acquiring practical competence in coping with complexity, and (d) relationship to underlying formalisms that facilitate computer assistance in applications. Through these comparisons, the advantages and disadvantages of each school of thought can be clarified, and the possibilities of changes in the educational system to provide for the management of complexity in system design can be articulated.
Interdisciplinary analysis procedures in the modeling and control of large space-based structures
NASA Technical Reports Server (NTRS)
Cooper, Paul A.; Stockwell, Alan E.; Kim, Zeen C.
1987-01-01
The paper describes a computer software system called the Integrated Multidisciplinary Analysis Tool, IMAT, that has been developed at NASA Langley Research Center. IMAT provides researchers and analysts with an efficient capability to analyze satellite control systems influenced by structural dynamics. Using a menu-driven interactive executive program, IMAT links a relational database to commercial structural and controls analysis codes. The paper describes the procedures followed to analyze a complex satellite structure and control system. The codes used to accomplish the analysis are described, and an example is provided of an application of IMAT to the analysis of a reference space station subject to a rectangular pulse loading at its docking port.
Polytopic vector analysis in igneous petrology: Application to lunar petrogenesis
NASA Technical Reports Server (NTRS)
Shervais, John W.; Ehrlich, R.
1993-01-01
Lunar samples represent a heterogeneous assemblage of rocks with complex inter-relationships that are difficult to decipher using standard petrogenetic approaches. These inter-relationships reflect several distinct petrogenetic trends as well as thermomechanical mixing of distinct components. Additional complications arise from the unequal quality of chemical analyses and from the fact that many samples (e.g., breccia clasts) are too small to be representative of the system from which they derived. Polytopic vector analysis (PVA) is a multivariate procedure used as a tool for exploratory data analysis. PVA allows the analyst to classify samples and clarifies relationships among heterogeneous samples with complex petrogenetic histories. It differs from orthogonal factor analysis in that it uses non-orthogonal multivariate sample vectors to extract sample endmember compositions. The output from a Q-mode (sample-based) factor analysis is the initial step in PVA. The Q-mode analysis, using criteria established by Miesch and by Klovan and Miesch, is used to determine the number of endmembers in the data system. The second step involves determination of endmembers and mixing proportions, with all output expressed in the same geochemical variables as the input. The composition of endmembers is derived by analysis of the variability of the data set. Endmembers need not be present in the data set, nor is it necessary for their composition to be known a priori. Any set of endmembers defines a 'polytope' or classification figure (a triangle for a three-component system, a tetrahedron for a four-component system, a 'five-tope' in four dimensions for a five-component system, et cetera).
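The mixing-proportion step can be sketched for the simplest polytope, a triangle. This illustrates only the final unmixing stage with endmembers assumed known; real PVA derives the endmembers themselves from the variability of the data, and the endmember compositions below are hypothetical, not lunar analyses:

```python
def mixing_proportions(sample, endmembers):
    """Proportions of three endmembers whose mixture reproduces a
    two-variable sample composition: Cramer's rule on the two mixing
    equations plus the closure condition p1 + p2 + p3 = 1."""
    (x1, y1), (x2, y2), (x3, y3) = endmembers
    sx, sy = sample
    det = lambda a, b, c, d, e, f, g, h, i: (
        a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g))
    D = det(x1, x2, x3, y1, y2, y3, 1, 1, 1)   # nonzero if not collinear
    p1 = det(sx, x2, x3, sy, y2, y3, 1, 1, 1) / D
    p2 = det(x1, sx, x3, y1, sy, y3, 1, 1, 1) / D
    p3 = det(x1, x2, sx, y1, y2, sy, 1, 1, 1) / D
    return p1, p2, p3

# Hypothetical endmembers in a two-oxide (e.g. MgO, Al2O3 wt%) space:
ends = [(30.0, 5.0), (10.0, 20.0), (2.0, 12.0)]
mix = tuple(0.5 * a + 0.3 * b + 0.2 * c for a, b, c in zip(*ends))
```

A sample falling outside the triangle would return a negative proportion, which is one way such an analysis signals that the chosen polytope cannot explain that sample.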
Wind turbine wake measurement in complex terrain
NASA Astrophysics Data System (ADS)
Hansen, KS; Larsen, GC; Menke, R.; Vasiljevic, N.; Angelou, N.; Feng, J.; Zhu, WJ; Vignaroli, A.; W, W. Liu; Xu, C.; Shen, WZ
2016-09-01
SCADA data from a wind farm and high-frequency time series measurements obtained with remote scanning systems have been analysed with a focus on the identification of wind turbine wake properties in complex terrain. The analysis indicates that within the flow regime characterized by medium to large downstream distances (more than 5 diameters) from the wake-generating turbine, the wake changes according to local atmospheric conditions, e.g. the vertical wind speed. In very complex terrain the wake effects are often “overruled” by distortion effects due to the terrain complexity or topology.
Complex Nonlinear Dynamic System of Oligopolies Price Game with Heterogeneous Players Under Noise
NASA Astrophysics Data System (ADS)
Liu, Feng; Li, Yaguang
A nonlinear four-oligopoly price game with heterogeneous players, who are boundedly rational and adaptive, is built using two different special demand costs. Based on the theory of complex discrete dynamical systems, the stability and the existing equilibrium points are investigated. The complex dynamic behavior is presented via bifurcation diagrams and Lyapunov exponents to show equilibrium states, bifurcation and chaos under variation of the parameters. As disturbance is ubiquitous in economic systems, this paper focuses on the analysis of the delay feedback control method under noise. Stable dynamics is confirmed to depend mainly on a low price adjustment speed, and when all four players have limited opportunities to stabilize the market, the profits of the new adaptive player facing economies of scale are found to be higher than those of the boundedly rational incumbents.
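The boundedly rational adjustment rule can be sketched with a four-player price map. The linear demand, costs, and adjustment speed below are illustrative stand-ins, not the paper's two special demand/cost functions, and the sketch omits noise and delay feedback; it only shows the stability claim that a low adjustment speed drives prices to the Nash equilibrium:

```python
def simulate(v, a=10.0, c=1.0, d=0.2, steps=5000):
    """Boundedly rational price adjustment for four oligopolists with
    linear demand q_i = a - p_i + d * sum(p_j, j != i) and unit cost c:
        p_i(t+1) = p_i(t) + v * p_i(t) * dProfit_i/dp_i
    where dProfit_i/dp_i = a - 2*p_i + d*(sum(p) - p_i) + c."""
    p = [2.0, 2.1, 1.9, 2.2]
    for _ in range(steps):
        total = sum(p)
        p = [pi + v * pi * (a - 2.0 * pi + d * (total - pi) + c)
             for pi in p]
    return p

# Symmetric Nash price: a - 2p + 3*d*p + c = 0  ->  p* = (a + c) / (2 - 3d)
low = simulate(0.01)   # slow adjustment: converges to the Nash price
```

Raising `v` shrinks the stability region of this map, which is the route to the period-doubling and chaos the bifurcation diagrams in the abstract display.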
NASA Astrophysics Data System (ADS)
Dolotovskii, I. V.; Dolotovskaya, N. V.; Larin, E. A.
2018-05-01
The article presents the architecture and content of a specialized analytical system for monitoring operational conditions, planning the consumption and generation of energy resources, long-term planning of production activities, and developing a strategy for improving the energy complex of gas processing enterprises. A compositional model of structured data on the equipment of the main systems of the power complex is proposed. The correctness of the software modules and the database of the analytical system is confirmed by comparing simulation results with measurements on the equipment of the electric power system at an operating gas processing plant. High accuracy in planning the consumption of fuel and energy resources has been achieved (the error does not exceed 1%). The information and program modules of the analytical system make it possible to develop a strategy for improving the energy complex in the face of changing technological topology and partial uncertainty of economic factors.
Enhancing metaproteomics-The value of models and defined environmental microbial systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Herbst, Florian-Alexander; Lünsmann, Vanessa; Kjeldal, Henrik
Metaproteomics, the large-scale characterization of the entire protein complement of environmental microbiota at a given point in time, has provided new features to study complex microbial communities in order to unravel these black boxes. Some new technical challenges arose that were not an issue for classical proteome analytics and that can be tackled by the application of different model systems. Here, we review current and future model systems for metaproteome analysis. Following a short introduction to microbial communities and metaproteomics, we introduce model systems for clinical and biotechnological research questions, including acid mine drainage, anaerobic digesters, and activated sludge. Model systems are useful to evaluate the challenges encountered within (but not limited to) metaproteomics, including species complexity and coverage, biomass availability, and reliable protein extraction. Moreover, the implementation of model systems can be considered a step forward to better understand microbial community responses and the ecological functions of single member organisms. In the future, improvements are necessary to fully explore complex environmental systems by metaproteomics.
archiDART v3.0: A new data analysis pipeline allowing the topological analysis of plant root systems
Delory, Benjamin M.; Li, Mao; Topp, Christopher N.; Lobet, Guillaume
2018-01-01
Quantifying plant morphology is a very challenging task that requires methods able to capture the geometry and topology of plant organs at various spatial scales. Recently, the use of persistent homology as a mathematical framework to quantify plant morphology has been successfully demonstrated for leaves, shoots, and root systems. In this paper, we present a new data analysis pipeline implemented in the R package archiDART to analyse root system architectures using persistent homology. In addition, we also show that both geometric and topological descriptors are necessary to accurately compare root systems and assess their natural complexity. PMID:29636899
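The topological side of this pipeline can be illustrated in its simplest, 0-dimensional form. archiDART is an R package; the Python sketch below is a language-neutral illustration of 0-dimensional persistent homology only (connected components under a Vietoris-Rips filtration, computed with union-find), not the package's actual persistence computation on root geodesics:

```python
from itertools import combinations

def h0_persistence(points):
    """Death scales of 0-dimensional persistence classes for a point cloud
    under the Vietoris-Rips filtration: every point is born at scale 0, and
    a component dies when it merges into another as the scale grows.
    Equivalent to single-linkage dendrogram merge heights."""
    n = len(points)
    parent = list(range(n))

    def find(i):                       # union-find with path compression
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    def dist(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5

    edges = sorted((dist(points[i], points[j]), i, j)
                   for i, j in combinations(range(n), 2))
    deaths = []
    for w, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:                   # two components merge at scale w
            parent[ri] = rj
            deaths.append(w)
    return deaths

# Toy 1-D cloud: two tight pairs far apart -> two short-lived classes
# (deaths at 1.0) and one long-lived class dying when the pairs merge.
bars = h0_persistence([(0.0,), (1.0,), (9.0,), (10.0,)])
```

Higher-dimensional classes (loops) need a full boundary-matrix reduction, but even this H0 summary already distinguishes clustered from dispersed branching patterns, which is the kind of topological descriptor the paper combines with geometric ones.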
Performance of Complex Spreading MIMO Systems With Interference
2011-06-01
(Extraction fragment: thesis table of contents and list of figures. Chapter III covers performance analysis of DS-PSK MISO systems; Figures 9-11 show the BER of the DS-PSK system under broadband jamming for diversity L = 1, 2, and 3.)
Glucose time series complexity as a predictor of type 2 diabetes
Rodríguez de Castro, Carmen; Vargas, Borja; García Delgado, Emilio; García Carretero, Rafael; Ruiz‐Galiana, Julián; Varela, Manuel
2016-01-01
Abstract Background Complexity analysis of glucose profile may provide valuable information about the gluco‐regulatory system. We hypothesized that a complexity metric (detrended fluctuation analysis, DFA) may have a prognostic value for the development of type 2 diabetes in patients at risk. Methods A total of 206 patients with any of the following risk factors (1) essential hypertension, (2) obesity or (3) a first‐degree relative with a diagnosis of diabetes were included in a survival analysis study for a diagnosis of new onset type 2 diabetes. At inclusion, a glucometry by means of a Continuous Glucose Monitoring System was performed, and DFA was calculated for a 24‐h glucose time series. Patients were then followed up every 6 months, controlling for the development of diabetes. Results In a median follow‐up of 18 months, there were 18 new cases of diabetes (58.5 cases/1000 patient‐years). DFA was a significant predictor for the development of diabetes, with ten events in the highest quartile versus one in the lowest (log‐rank test chi2 = 9, df = 1, p = 0.003), even after adjusting for other relevant clinical and biochemical variables. In a Cox model, the risk of diabetes development increased 2.8 times for every 0.1 DFA units. In a multivariate analysis, only fasting glucose, HbA1c and DFA emerged as significant factors. Conclusions Detrended fluctuation analysis significantly performed as a harbinger of type 2 diabetes development in a high‐risk population. Complexity analysis may help in targeting patients who could be candidates for intensified treatment. Copyright © 2016 The Authors. Diabetes/Metabolism Research and Reviews Published by John Wiley & Sons Ltd. PMID:27253149
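The DFA exponent used as a predictor above is computed in three steps: integrate the mean-centred glucose series, detrend it in windows of several sizes, and regress log fluctuation on log window size. A minimal pure-Python sketch (not the authors' implementation; the window sizes are illustrative):

```python
import math

def dfa_alpha(series, window_sizes=(4, 8, 16, 32)):
    """Detrended fluctuation analysis scaling exponent alpha.
    alpha ~ 0.5 for white noise, ~1.5 for Brownian motion."""
    mean = sum(series) / len(series)
    profile, s = [], 0.0                 # cumulative-sum profile
    for x in series:
        s += x - mean
        profile.append(s)

    def detrended_ms(seg):
        """Mean squared residual after a least-squares linear fit."""
        n = len(seg)
        xs = range(n)
        sx, sy = sum(xs), sum(seg)
        sxx = sum(x * x for x in xs)
        sxy = sum(x * y for x, y in zip(xs, seg))
        b = (n * sxy - sx * sy) / (n * sxx - sx * sx)
        a = (sy - b * sx) / n
        return sum((y - (a + b * x)) ** 2 for x, y in zip(xs, seg)) / n

    log_n, log_f = [], []
    for n in window_sizes:
        ms = [detrended_ms(profile[i:i + n])
              for i in range(0, len(profile) - n + 1, n)]
        log_n.append(math.log(n))
        log_f.append(math.log(math.sqrt(sum(ms) / len(ms))))
    # alpha = slope of log F(n) versus log n
    k = len(log_n)
    sx, sy = sum(log_n), sum(log_f)
    sxx = sum(x * x for x in log_n)
    sxy = sum(x * y for x, y in zip(log_n, log_f))
    return (k * sxy - sx * sy) / (k * sxx - sx * sx)
```

On a continuous glucose record, a higher alpha indicates smoother, more trend-dominated dynamics; the study links quartiles of this exponent to diabetes risk.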
Understanding Teamwork in Trauma Resuscitation through Analysis of Team Errors
ERIC Educational Resources Information Center
Sarcevic, Aleksandra
2009-01-01
An analysis of human errors in complex work settings can lead to important insights into the workspace design. This type of analysis is particularly relevant to safety-critical, socio-technical systems that are highly dynamic, stressful and time-constrained, and where failures can result in catastrophic societal, economic or environmental…
Microvascular Autonomic Composites
2012-01-06
(Extraction fragments: thermogravimetric analysis (TGA) was employed; the double-wall design increased the thermal stability of the microcapsules; Digital Image Correlation (DIC) is a data analysis method applying mathematical correlation to images of fluorescent nanoparticles (Berfield et al. 2006); Theme IV, Experimental Assessment & Analysis, includes 2.4.1 Optical diagnostics for complex microfluidic systems and 2.4.2 Fluorescent thermometry.)
Economic Analysis Case Studies of Battery Energy Storage with SAM
DOE Office of Scientific and Technical Information (OSTI.GOV)
DiOrio, Nicholas; Dobos, Aron; Janzou, Steven
2015-11-01
Interest in energy storage has continued to increase as states like California have introduced mandates and subsidies to spur adoption. This energy storage includes customer-sited behind-the-meter storage coupled with photovoltaics (PV). This paper presents case study results from California and Tennessee, which were performed to assess the economic benefit of customer-installed systems. Different dispatch strategies, including manual scheduling and automated peak-shaving, were explored to determine ideal ways to use the storage system to increase the system value and mitigate demand charges. Incentives, complex electric tariffs, and site-specific load and PV data were used to perform detailed analysis. The analysis was performed using the free, publicly available System Advisor Model (SAM) tool. We find that installation of photovoltaics with a lithium-ion battery system priced at $300/kWh in Los Angeles under a high demand charge utility rate structure and dispatched using perfect day-ahead forecasting yields a positive net present value, while all other scenarios cost the customer more than the savings accrued.
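The net-present-value test applied in these case studies can be illustrated with a toy calculation. All figures below (battery size, install cost, annual demand-charge savings, discount rate) are hypothetical stand-ins, not values from the SAM analysis:

```python
def npv(rate, cashflows):
    """Net present value; cashflows[0] occurs now (year 0)."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows))

# Hypothetical system: a 20 kWh battery at $300/kWh plus $1,500 install,
# saving $1,400/year in demand charges over a 10-year life, 6% discount.
capital = -(300 * 20 + 1500)
cashflows = [capital] + [1400] * 10
project_value = npv(0.06, cashflows)   # positive => economically attractive
```

With these made-up inputs the project clears the hurdle; under a less favourable tariff (smaller annual savings) the sign flips, which is the pattern the paper reports across its scenarios.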
Steinberger, Dina M; Douglas, Stephen V; Kirschbaum, Mark S
2009-09-01
A multidisciplinary team from the University of Wisconsin Hospital and Clinics transplant program used failure mode and effects analysis to proactively examine opportunities for communication and handoff failures across the continuum of care from organ procurement to transplantation. The team performed a modified failure mode and effects analysis that isolated the multiple linked, serial, and complex information exchanges occurring during the transplantation of one solid organ. Failure mode and effects analysis proved effective for engaging a diverse group of invested stakeholders in the analysis and discussion of opportunities to improve the system's resilience for avoiding errors during a time-pressured and complex process.
Measures of complexity in signal analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kurths, J.; Schwarz, U.; Witt, A.
Observational data of natural systems, as measured in astrophysical, geophysical or physiological experiments, are typically quite different from those obtained in laboratories. Due to the peculiarities of these data, well-known characteristics, such as periodicities or fractal dimensions, often do not provide a suitable description. To study such data, we present here the use of measures of complexity, which are mainly based on symbolic dynamics. We distinguish two types of such quantities: traditional measures (e.g. algorithmic complexity), which are measures of randomness, and alternative measures (e.g. ε-complexity), which relate highest complexity to some critical points. It is important to note that there is no optimum measure of complexity. Its choice should depend on the context. Mostly, a combination of some such quantities is appropriate. Applying this concept to three examples in astrophysics, cardiology and cognitive psychology, we show that it can be helpful also in cases where other tools of data analysis fail. © 1996 American Institute of Physics.
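A concrete example of a symbolic-dynamics-based randomness measure is Lempel-Ziv complexity: binarize the signal around its median and count the distinct phrases in the resulting symbol string. The sketch below uses an LZ78-style parse, one of several variants in the literature; it is illustrative and not the specific measures used in the paper:

```python
def symbolize(series):
    """Binarize a signal around its median: '1' above, '0' at or below."""
    med = sorted(series)[len(series) // 2]
    return "".join("1" if x > med else "0" for x in series)

def lz_phrase_count(symbols):
    """LZ78-style complexity: number of distinct phrases, where each
    phrase is the shortest prefix of the remainder not seen before."""
    seen, phrase, count = set(), "", 0
    for ch in symbols:
        phrase += ch
        if phrase not in seen:
            seen.add(phrase)
            count += 1
            phrase = ""
    return count + (1 if phrase else 0)
```

A strongly periodic string tends to parse into few, long phrases, while an irregular string of the same length needs many short ones, so the phrase count serves as a crude randomness score.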
Sturmberg, Joachim P.; Bennett, Jeanette M.; Picard, Martin; Seely, Andrew J. E.
2015-01-01
In this position paper, we submit a synthesis of theoretical models based on physiology, non-equilibrium thermodynamics, and non-linear time-series analysis. Based on an understanding of the human organism as a system of interconnected complex adaptive systems, we seek to examine the relationship between health, complexity, variability, and entropy production, as it might be useful to help understand aging and improve care for patients. We observe that the trajectory of life is characterized by the growth, plateauing and subsequent loss of adaptive function of organ systems, associated with loss of functioning and coordination of systems. Understanding development and aging requires the examination of interdependence among these organ systems. Increasing evidence suggests network interconnectedness and complexity can be captured/measured/associated with the degree and complexity of healthy biologic rhythm variability (e.g., heart and respiratory rate variability). We review physiological mechanisms linking the omics, arousal/stress systems, immune function, and mitochondrial bioenergetics, highlighting their interdependence in normal physiological function and aging. We argue that aging, known to be characterized by a loss of variability, is manifested at multiple scales: within functional units at the small scale, and reflected by diagnostic features at the larger scale. While still controversial and under investigation, it appears conceivable that the integrity of whole-body complexity may be, at least partially, reflected in the degree and variability of intrinsic biologic rhythms, which we believe are related to overall system complexity that may be a defining feature of health and its loss through aging. Harnessing this information for the development of therapeutic and preventative strategies may hold an opportunity to significantly improve the health of our patients across the trajectory of life. PMID:26082722
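One widely used index of the biologic rhythm variability discussed above is sample entropy, which quantifies the (ir)regularity of a time series such as an R-R interval record. A minimal illustrative implementation (O(n²), with the conventional parameters m and r; not tied to any specific study cited here):

```python
import math

def sample_entropy(series, m=2, r=0.2):
    """SampEn = -ln(A/B): B counts pairs of length-m templates whose
    Chebyshev distance is <= r, and A counts the same for length m+1.
    Lower values indicate a more regular (less complex) signal."""
    def match_count(length):
        templates = [series[i:i + length]
                     for i in range(len(series) - length + 1)]
        hits = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):
                if max(abs(a - b) for a, b in
                       zip(templates[i], templates[j])) <= r:
                    hits += 1
        return hits
    return -math.log(match_count(m + 1) / match_count(m))
```

In practice the tolerance r is usually set to a fraction (commonly 0.2) of the series' standard deviation; a perfectly periodic signal scores near zero, consistent with the loss-of-complexity picture of aging sketched above.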
Quantitative trace analysis of complex mixtures using SABRE hyperpolarization.
Eshuis, Nan; van Weerdenburg, Bram J A; Feiters, Martin C; Rutjes, Floris P J T; Wijmenga, Sybren S; Tessari, Marco
2015-01-26
Signal amplification by reversible exchange (SABRE) is an emerging nuclear spin hyperpolarization technique that strongly enhances NMR signals of small molecules in solution. However, such signal enhancements have never been exploited for concentration determination, as the efficiency of SABRE can strongly vary between different substrates or even between nuclear spins in the same molecule. The first application of SABRE for the quantitative analysis of a complex mixture is now reported. Despite the inherent complexity of the system under investigation, which involves thousands of competing binding equilibria, analytes at concentrations in the low micromolar range could be quantified from single-scan SABRE spectra using a standard-addition approach. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
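The standard-addition approach used for quantification works by spiking the sample with known amounts of analyte, fitting signal versus added concentration, and reading the unknown concentration off the x-intercept of the fitted line. A generic numerical sketch (illustrative; the paper's actual processing of SABRE spectra is more involved):

```python
def standard_addition_conc(added, signal):
    """Fit signal = a + b * added by least squares; the unknown
    concentration equals the magnitude of the x-intercept, a / b."""
    n = len(added)
    sx, sy = sum(added), sum(signal)
    sxx = sum(x * x for x in added)
    sxy = sum(x * y for x, y in zip(added, signal))
    b = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    a = (sy - b * sx) / n
    return a / b
```

For a perfectly linear response signal = 2·(c0 + added) with c0 = 5 (in arbitrary concentration units), additions of 0-3 units give signals 10, 12, 14, 16, and the fit recovers c0 = 5.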
Cook, Daniel L; Farley, Joel F; Tapscott, Stephen J
2001-01-01
Background: We propose that a computerized, internet-based graphical description language for systems biology will be essential for describing, archiving and analyzing complex problems of biological function in health and disease. Results: We outline here a conceptual basis for designing such a language and describe BioD, a prototype language that we have used to explore the utility and feasibility of this approach to functional biology. Using example models, we demonstrate that a rather limited lexicon of icons and arrows suffices to describe complex cell-biological systems as discrete models that can be posted and linked on the internet. Conclusions: Given available computer and internet technology, BioD may be implemented as an extensible, multidisciplinary language that can be used to archive functional systems knowledge and be extended to support both qualitative and quantitative functional analysis. PMID:11305940
Jatobá, Alessandro; de Carvalho, Paulo Victor R; da Cunha, Amauri Marques
2012-01-01
Work in organizations requires a minimum level of consensus on the understanding of the practices performed. To adopt technological devices to support activities in environments where work is complex, characterized by interdependence among a large number of variables, understanding how work is done not only takes on an even greater importance, but also becomes a more difficult task. Therefore, this study aims to present a method for modeling work in complex systems, which allows improving knowledge about the way activities are performed where these activities do not simply happen by following procedures. Uniting techniques of Cognitive Task Analysis with the concept of Work Process, this work seeks to provide a method capable of delivering a detailed and accurate vision of how people perform their tasks, in order to apply information systems for supporting work in organizations.
Large-scale structural analysis: The structural analyst, the CSM Testbed and the NAS System
NASA Technical Reports Server (NTRS)
Knight, Norman F., Jr.; Mccleary, Susan L.; Macy, Steven C.; Aminpour, Mohammad A.
1989-01-01
The Computational Structural Mechanics (CSM) activity is developing advanced structural analysis and computational methods that exploit high-performance computers. Methods are developed in the framework of the CSM testbed software system and applied to representative complex structural analysis problems from the aerospace industry. An overview of the CSM testbed methods development environment is presented and some numerical methods developed on a CRAY-2 are described. Selected application studies performed on the NAS CRAY-2 are also summarized.
Applying STAMP in Accident Analysis
NASA Technical Reports Server (NTRS)
Leveson, Nancy; Daouk, Mirna; Dulac, Nicolas; Marais, Karen
2003-01-01
Accident models play a critical role in accident investigation and analysis. Most traditional models are based on an underlying chain of events. These models, however, have serious limitations when used for complex, socio-technical systems. Previously, Leveson proposed a new accident model (STAMP) based on system theory. In STAMP, the basic concept is not an event but a constraint. This paper shows how STAMP can be applied to accident analysis using three different views or models of the accident process and proposes a notation for describing this process.
Combining real-time monitoring and knowledge-based analysis in MARVEL
NASA Technical Reports Server (NTRS)
Schwuttke, Ursula M.; Quan, A. G.; Angelino, R.; Veregge, J. R.
1993-01-01
Real-time artificial intelligence is gaining increasing attention for applications in which conventional software methods are unable to meet technology needs. One such application area is the monitoring and analysis of complex systems. MARVEL, a distributed monitoring and analysis tool with multiple expert systems, was developed and successfully applied to the automation of interplanetary spacecraft operations at NASA's Jet Propulsion Laboratory. MARVEL implementation and verification approaches, the MARVEL architecture, and the specific benefits that were realized by using MARVEL in operations are described.
Hoffmann, Jürgen; Wallwiener, Diethelm
2009-04-08
One of the basic prerequisites for generating evidence-based data is the availability of classification systems. Attempts to date to classify breast cancer operations have focussed on specific problems, e.g. the avoidance of secondary corrective surgery for surgical defects, rather than taking a generic approach. Starting from an existing, simpler empirical scheme based on the complexity of breast surgical procedures, which was used in-house primarily in operative report-writing, a novel classification of ablative and breast-conserving procedures initially needed to be developed and elaborated systematically. To obtain proof of principle, a prospectively planned analysis of patient records for all major breast cancer-related operations performed at our breast centre in 2005 and 2006 was conducted using the new classification. Data were analysed using basic descriptive statistics such as frequency tables. A novel two-type, six-tier classification system comprising 12 main categories, 13 subcategories and 39 sub-subcategories of oncological, oncoplastic and reconstructive breast cancer-related surgery was successfully developed. Our system permitted unequivocal classification, without exception, of all 1225 procedures performed in 1166 breast cancer patients in 2005 and 2006. Breast cancer-related surgical procedures can be generically classified according to their surgical complexity. Analysis of all major procedures performed at our breast centre during the study period provides proof of principle for this novel classification system. We envisage various applications for this classification, including uses in randomised clinical trials, guideline development, specialist surgical training, continuing professional development as well as quality of care and public health research.
Advanced Diagnostic and Prognostic Testbed (ADAPT) Testability Analysis Report
NASA Technical Reports Server (NTRS)
Ossenfort, John
2008-01-01
As system designs become more complex, determining the best locations to add sensors and test points for the purpose of testing and monitoring these designs becomes more difficult. Not only must the designer take into consideration all real and potential faults of the system, he or she must also find efficient ways of detecting and isolating those faults. Because sensors and cabling take up valuable space and weight on a system, and given constraints on bandwidth and power, it is even more difficult to add sensors into these complex designs after the design has been completed. As a result, a number of software tools have been developed to assist the system designer in proper placement of these sensors during the system design phase of a project. One of the key functions provided by many of these software programs is a testability analysis of the system: essentially, an evaluation of how observable the system behavior is using available tests. During the design phase, testability metrics can help guide the designer in improving the inherent testability of the design. This may include adding, removing, or modifying tests, breaking up feedback loops, or changing the system to reduce fault propagation. Given a set of test requirements, the analysis can also help to verify that the system will meet those requirements. Of course, a testability analysis requires that a software model of the physical system is available. For the analysis to be most effective in guiding system design, this model should ideally be constructed in parallel with these efforts. The purpose of this paper is to present the final testability results of the Advanced Diagnostic and Prognostic Testbed (ADAPT) after the system model was completed. The tool chosen to build the model and to perform the testability analysis is the Testability Engineering and Maintenance System Designer (TEAMS-Designer).
The TEAMS toolset is intended to be a solution to span all phases of the system, from design and development through health management and maintenance. TEAMS-Designer is the model-building and testability analysis software in that suite.
Climate metrics and aviation : analysis of current understanding and uncertainties
DOT National Transportation Integrated Search
2008-01-22
The impact of climate-altering agents on the atmospheric system is a result of a complex system : of interactions and feedbacks within the atmosphere, and with the oceans, the land surface, the : biosphere and the cryosphere. Climate metrics are used...
A Conceptual Model of the Pasadena Housing System
NASA Technical Reports Server (NTRS)
Hirshberg, Alan S.; Barber, Thomas A.
1971-01-01
During the last 5 years, there have been several attempts at applying systems analysis to complex urban problems. This paper describes one such attempt by a multidisciplinary team of students, engineers, professors, and community representatives. The Project organization is discussed and the interaction of the different disciplines (the process) described. The two fundamental analysis questions posed by the Project were: "Why do houses deteriorate?" and "Why do people move?" The analysis of these questions led to the development of a conceptual system model of housing in Pasadena. The major elements of this model are described, and several conclusions drawn from it are presented.
Mrabet, Yassine; Semmar, Nabil
2010-05-01
The complexity of metabolic systems can be analysed at different scales (metabolites, metabolic pathways, metabolic network map, biological population) and under different aspects (structural, functional, evolutive). To analyse such complexity, metabolic systems need to be decomposed into different components according to different concepts. Four concepts are presented here, consisting in considering metabolic systems as sets of metabolites, chemical reactions, metabolic pathways or successive processes. From a metabolomic dataset, such decompositions are performed using different mathematical methods including correlation, stoichiometric, ordination, classification, combinatorial and kinetic analyses. Correlation analysis detects and quantifies affinities/oppositions between metabolites. Stoichiometric analysis aims, on the one hand, to identify the organisation of a metabolic network into different metabolic pathways and, on the other, to quantify/optimize the metabolic flux distribution through the different chemical reactions of the system. Ordination and classification analyses help to identify different metabolic trends and their associated metabolites in order to highlight chemical polymorphism representing different variability poles of the metabolic system. Then, metabolic processes/correlations responsible for such a polymorphism can be extracted in silico by combining metabolic profiles representative of different metabolic trends according to a weighting bootstrap approach. Finally, the evolution of metabolic processes in time can be analysed by different kinetic/dynamic modelling approaches.
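The first decomposition listed, correlation analysis, can be sketched directly: compute pairwise Pearson correlations between metabolite profiles across samples, where strongly positive values suggest affinity and strongly negative values suggest opposition. An illustrative example with made-up abundances:

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length profiles."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / math.sqrt(vx * vy)

def correlation_matrix(profiles):
    """profiles: dict metabolite -> abundances across samples.
    Returns {(m1, m2): r} for each unordered metabolite pair."""
    names = sorted(profiles)
    return {(a, b): pearson(profiles[a], profiles[b])
            for i, a in enumerate(names) for b in names[i + 1:]}
```

On real metabolomic data, the resulting matrix is typically thresholded or clustered to expose groups of co-varying metabolites before the other decompositions are applied.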
Quantitative analysis of ribosome–mRNA complexes at different translation stages
Shirokikh, Nikolay E.; Alkalaeva, Elena Z.; Vassilenko, Konstantin S.; Afonina, Zhanna A.; Alekhina, Olga M.; Kisselev, Lev L.; Spirin, Alexander S.
2010-01-01
Inhibition of primer extension by ribosome–mRNA complexes (toeprinting) is a proven and powerful technique for studying mechanisms of mRNA translation. Here we have assayed an advanced toeprinting approach that employs fluorescently labeled DNA primers, followed by capillary electrophoresis utilizing standard instruments for sequencing and fragment analysis. We demonstrate that this improved technique is not merely fast and cost-effective, but also brings the primer extension inhibition method up to the next level. The electrophoretic pattern of the primer extension reaction can be characterized with a precision unattainable by the common toeprint analysis utilizing radioactive isotopes. This method allows us to detect and quantify stable ribosomal complexes at all stages of translation, including initiation, elongation and termination, generated during the complete translation process in both the in vitro reconstituted translation system and the cell lysate. We also point out the unique advantages of this new methodology, including the ability to assay sites of the ribosomal complex assembly on several mRNA species in the same reaction mixture. PMID:19910372
Gerlai, Robert
2017-08-01
Analysis of the zebrafish allows one to combine two distinct scientific approaches, comparative ethology and neurobehavioral genetics. Furthermore, this species arguably represents an optimal compromise between system complexity and practical simplicity. This mini-review focuses on a complex form of learning, relational learning and memory, in zebrafish. It argues that zebrafish are capable of this type of learning, and it attempts to show how this species may be useful in the analysis of the mechanisms and the evolution of this complex brain function. The review is not intended to be comprehensive. It is a short opinion piece that reflects the author's own biases, and it draws some of its examples from the work coming from his own laboratory. Nevertheless, it is written in the hope that it will persuade those who have not utilized zebrafish and who may be interested in opening their research horizon to this relatively novel but powerful vertebrate research tool. Copyright © 2017 Elsevier B.V. All rights reserved.
Ikeda, Atsushi; Hennig, Christoph; Rossberg, André; Tsushima, Satoru; Scheinost, Andreas C; Bernhard, Gert
2008-02-15
A multitechnique approach using extended X-ray absorption fine structure (EXAFS) spectroscopy based on iterative transformation factor analysis (ITFA), UV-visible absorption spectroscopy, and density functional theory (DFT) calculations has been performed in order to investigate the speciation of uranium(VI) nitrate species in acetonitrile and to identify the complex structure of individual species in the system. UV-visible spectral titration suggests that there are four different species in the system, that is, pure solvated species, mono-, di-, and trinitrate species. The pure EXAFS spectra of these individual species are extracted by ITFA from the measured spectral mixtures on the basis of the speciation distribution profile calculated from the UV-visible data. Data analysis of the extracted EXAFS spectra, with the help of DFT calculations, reveals the most probable complex structures of the individual species. The pure solvated species corresponds to a uranyl hydrate complex with an equatorial coordination number (CNeq) of 5, [UO2(H2O)5]2+. Nitrate ions tend to coordinate to the uranyl(VI) ion in a bidentate fashion rather than a unidentate one in acetonitrile for all the nitrate species. The mononitrate species forms the complex of [UO2(H2O)3NO3]+ with a CNeq value of 5, while the di- and trinitrate species have a CNeq value of 6, corresponding to [UO2(H2O)2(NO3)2]0 (D2h) and [UO2(NO3)3]- (D3h), respectively.
Le Bihan, Thierry; Robinson, Mark D; Stewart, Ian I; Figeys, Daniel
2004-01-01
Although HPLC-ESI-MS/MS is rapidly becoming an indispensable tool for the analysis of peptides in complex mixtures, the sequence coverage it affords is often quite poor. Low protein expression, resulting in peptide signal intensities that fall below the limit of detection of the MS system, in combination with differences in peptide ionization efficiency plays a significant role in this. A second important factor stems from differences in the physicochemical properties of each peptide and how these properties relate to chromatographic retention and ultimate detection. To identify and understand those properties, we compared data from experimentally identified peptides with data from peptides predicted by in silico digest of all corresponding proteins in the experimental set. Three different extracted complex protein mixtures were used to define a training set for evaluating the amino acid retention coefficients by linear regression analysis. The retention coefficients were also compared with previously published hydrophobicity and retention scales. From this, we have constructed an empirical model that can readily be used to predict peptides that are likely to be observed on our HPLC-ESI-MS/MS system based on their physicochemical properties. Finally, we demonstrated that in silico prediction of peptides and their retention coefficients can be used to generate an inclusion list for a targeted mass spectrometric identification of low-abundance proteins in complex protein samples. This approach is based on experimentally derived data to calibrate the method and therefore may theoretically be applied to any HPLC-MS/MS system on which data are being generated.
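The additive retention model underlying this approach predicts a peptide's retention time as an intercept plus the sum of fitted per-residue coefficients. The coefficient values and intercept below are hypothetical placeholders for illustration, not the values derived in the paper:

```python
# Hypothetical residue retention coefficients (illustrative only);
# in practice such coefficients are fitted by linear regression on
# observed retention times from a training set of identified peptides.
COEFF = {"A": 0.5, "D": -0.8, "E": -0.7, "F": 2.5, "G": 0.0,
         "K": -1.5, "L": 2.3, "P": 0.2, "S": -0.3, "V": 1.7}

def predicted_retention(peptide, intercept=2.0):
    """Additive model: t_R = intercept + sum of residue coefficients."""
    return intercept + sum(COEFF[res] for res in peptide)
```

Peptides whose predicted retention time falls within the gradient window can then be placed on an inclusion list for targeted identification of low-abundance proteins, as the abstract describes.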
Analyzing system safety in lithium-ion grid energy storage
NASA Astrophysics Data System (ADS)
Rosewater, David; Williams, Adam
2015-12-01
As grid energy storage systems become more complex, it grows more difficult to design them for safe operation. This paper first reviews the properties of lithium-ion batteries that can produce hazards in grid scale systems. Then the conventional safety engineering technique Probabilistic Risk Assessment (PRA) is reviewed to identify its limitations in complex systems. To address this gap, new research is presented on the application of Systems-Theoretic Process Analysis (STPA) to a lithium-ion battery based grid energy storage system. STPA is anticipated to fill the gaps recognized in PRA for designing complex systems and hence be more effective or less costly to use during safety engineering. It was observed that STPA is able to capture causal scenarios for accidents not identified using PRA. Additionally, STPA enabled a more rational assessment of uncertainty (all that is not known) thereby promoting a healthy skepticism of design assumptions. We conclude that STPA may indeed be more cost effective than PRA for safety engineering in lithium-ion battery systems. However, further research is needed to determine if this approach actually reduces safety engineering costs in development, or improves industry safety standards.
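For contrast with STPA, the PRA baseline reviewed above combines component failure probabilities through fault-tree gates; for independent basic events the two gate types reduce to simple products. A minimal sketch with hypothetical probabilities (not values from any battery safety study):

```python
def and_gate(probs):
    """All independent basic events occur (product of probabilities)."""
    p = 1.0
    for q in probs:
        p *= q
    return p

def or_gate(probs):
    """At least one independent basic event occurs."""
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

# Hypothetical top event: thermal runaway requires a cell fault
# (either of two initiating faults) AND a failed protection circuit.
p_runaway = and_gate([or_gate([0.01, 0.02]), 0.001])
```

STPA's criticism, echoed in the abstract, is that accidents in complex systems also arise from unsafe control interactions among components that have not failed, which no such event-probability decomposition captures.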