Transportation networks : data, analysis, methodology development and visualization.
DOT National Transportation Integrated Search
2007-12-29
This project provides data compilation, analysis methodology and visualization methodology for the current network data assets of the Alabama Department of Transportation (ALDOT). This study finds that ALDOT is faced with a considerable number of...
Lamontagne, Marie-Eve
2013-01-01
Introduction Integration is a popular strategy to increase the quality of care within systems of care. However, there is no common language, approach or tool allowing for a valid description, comparison and evaluation of integrated care. Social network analysis could be a viable methodology to provide an objective picture of integrated networks. Goal of the article To illustrate social network analysis use in the context of systems of care for traumatic brain injury. Method We surveyed members of a network using a validated questionnaire to determine the links between them. We determined the density, centrality, multiplexity, and quality of the links reported. Results The network was described as moderately dense (0.6), the most prevalent link was knowledge, and four organisation members of a consortium were central to the network. Social network analysis allowed us to create a graphic representation of the network. Conclusion Social network analysis is a useful methodology to objectively characterise integrated networks. PMID:24250281
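The metrics reported above (density, centrality, multiplexity) are standard social network analysis quantities. As a purely illustrative sketch, assuming hypothetical organisations and ties rather than the study's survey data, they can be computed with Python and networkx as follows:

```python
# Hypothetical illustration of the network metrics named in the abstract
# (density, degree centrality); organisations and link types are invented.
import networkx as nx

# Directed ties reported by surveyed organisations: (reporter, partner, link type)
ties = [
    ("RehabCentre", "TraumaUnit", "knowledge"),
    ("TraumaUnit", "RehabCentre", "referral"),
    ("TraumaUnit", "CommunityCare", "knowledge"),
    ("Consortium", "TraumaUnit", "knowledge"),
    ("Consortium", "RehabCentre", "coordination"),
    ("CommunityCare", "Consortium", "knowledge"),
]

G = nx.DiGraph()
for src, dst, kind in ties:
    G.add_edge(src, dst, kind=kind)

print("density:", round(nx.density(G), 2))                   # share of possible ties present
print("in-degree centrality:", nx.in_degree_centrality(G))   # who receives the most ties
# Multiplexity could be approximated by counting distinct link types per dyad.
```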
Optimized planning methodologies of ASON implementation
NASA Astrophysics Data System (ADS)
Zhou, Michael M.; Tamil, Lakshman S.
2005-02-01
Advanced network planning concerns effective network-resource allocation for a dynamic and open business environment. Planning methodologies for ASON implementation based on qualitative analysis and mathematical modeling are presented in this paper. The methodology includes methods for rationalizing technology and architecture, building network and nodal models, and developing dynamic programming for multi-period deployment. The multi-layered nodal architecture proposed here can accommodate various nodal configurations for a multi-plane optical network, and the network modeling presented here computes the required network elements for optimizing resource allocation.
A Methodology to Develop Entrepreneurial Networks: The Tech Ecosystem of Six African Cities
2014-11-01
…The methodology enables us to accurately measure social capital and circumvents the massive effort of mapping an individual's social network before locating the social resources in it. Subject terms: Network Analysis, Economic Networks, Network Topology, Network Classification.
van Diessen, E; Numan, T; van Dellen, E; van der Kooi, A W; Boersma, M; Hofman, D; van Lutterveld, R; van Dijk, B W; van Straaten, E C W; Hillebrand, A; Stam, C J
2015-08-01
Electroencephalogram (EEG) and magnetoencephalogram (MEG) recordings during resting state are increasingly used to study functional connectivity and network topology. Moreover, the number of different analysis approaches is expanding along with the rising interest in this research area. The comparison between studies can therefore be challenging, and discussion is needed to underscore methodological opportunities and pitfalls in functional connectivity and network studies. In this overview we discuss methodological considerations throughout the analysis pipeline of recording and analyzing resting state EEG and MEG data, with a focus on functional connectivity and network analysis. We summarize current common practices with their advantages and disadvantages, provide practical tips, and offer suggestions for future research. Finally, we discuss how methodological choices in resting state research can affect the construction of functional networks. By taking advantage of current best practices and avoiding the most obvious pitfalls, functional connectivity and network studies can be improved and enable a more accurate interpretation and comparison between studies. Copyright © 2014 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
Social Network Analysis: A New Methodology for Counseling Research.
ERIC Educational Resources Information Center
Koehly, Laura M.; Shivy, Victoria A.
1998-01-01
Social network analysis (SNA) uses indices of relatedness among individuals to produce representations of social structures and positions inherent in dyads or groups. SNA methods provide quantitative representations of ongoing transactional patterns in a given social environment. Methodological issues, applications and resources are discussed…
Network Analysis in Comparative Social Sciences
ERIC Educational Resources Information Center
Vera, Eugenia Roldan; Schupp, Thomas
2006-01-01
This essay describes the pertinence of Social Network Analysis (SNA) for the social sciences in general, and discusses its methodological and conceptual implications for comparative research in particular. The authors first present a basic summary of the theoretical and methodological assumptions of SNA, followed by a succinct overview of its…
Toddi A. Steelman; Branda Nowell; Deena Bayoumi; Sarah McCaffrey
2014-01-01
We leverage economic theory, network theory, and social network analytical techniques to bring greater conceptual and methodological rigor to understand how information is exchanged during disasters. We ask, "How can information relationships be evaluated more systematically during a disaster response?" "Infocentric analysis," a term and...
Graphical Tools for Network Meta-Analysis in STATA
Chaimani, Anna; Higgins, Julian P. T.; Mavridis, Dimitris; Spyridonos, Panagiota; Salanti, Georgia
2013-01-01
Network meta-analysis synthesizes direct and indirect evidence in a network of trials that compare multiple interventions and has the potential to rank the competing treatments according to the studied outcome. Despite its usefulness network meta-analysis is often criticized for its complexity and for being accessible only to researchers with strong statistical and computational skills. The evaluation of the underlying model assumptions, the statistical technicalities and presentation of the results in a concise and understandable way are all challenging aspects in the network meta-analysis methodology. In this paper we aim to make the methodology accessible to non-statisticians by presenting and explaining a series of graphical tools via worked examples. To this end, we provide a set of STATA routines that can be easily employed to present the evidence base, evaluate the assumptions, fit the network meta-analysis model and interpret its results. PMID:24098547
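The STATA routines referenced above are not reproduced here. As an illustration of one of the graphical tools they cover (the network plot of the evidence base), the Python sketch below draws treatments as nodes and weights each edge by the number of trials making that comparison; treatment names and trial counts are invented.

```python
# Hypothetical evidence base: (treatment A, treatment B, number of trials)
import networkx as nx
import matplotlib.pyplot as plt

comparisons = [("Placebo", "DrugA", 6), ("Placebo", "DrugB", 3),
               ("DrugA", "DrugB", 2), ("Placebo", "DrugC", 1)]

G = nx.Graph()
for a, b, n_trials in comparisons:
    G.add_edge(a, b, weight=n_trials)

pos = nx.circular_layout(G)
widths = [G[u][v]["weight"] for u, v in G.edges()]
nx.draw_networkx(G, pos, width=widths, node_color="lightgrey")
plt.axis("off")
plt.savefig("evidence_network.png")   # edge thickness reflects how much direct evidence exists
```

Such a plot makes it immediately visible which comparisons rest on many trials and which rely mostly on indirect evidence.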
Wavelet and Multiresolution Analysis for Finite Element Networking Paradigms
NASA Technical Reports Server (NTRS)
Kurdila, Andrew J.; Sharpley, Robert C.
1999-01-01
This paper presents a final report on Wavelet and Multiresolution Analysis for Finite Element Networking Paradigms. The focus of this research is to derive and implement: 1) wavelet-based methodologies for the compression, transmission, decoding, and visualization of three-dimensional finite element geometry and simulation data in a network environment; 2) methodologies for interactive algorithm monitoring and tracking in computational mechanics; and 3) methodologies for interactive algorithm steering for the acceleration of large-scale finite element simulations. Also included in this report are appendices describing the derivation of wavelet-based Particle Image Velocimetry algorithms and reduced-order input-output models for nonlinear systems obtained by utilizing wavelet approximations.
Simulation of Attacks for Security in Wireless Sensor Network
Diaz, Alvaro; Sanchez, Pablo
2016-01-01
The increasing complexity and low-power constraints of current Wireless Sensor Networks (WSN) require efficient methodologies for network simulation and embedded software performance analysis of nodes. In addition, security is also a very important feature that has to be addressed in most WSNs, since they may work with sensitive data and operate in hostile unattended environments. In this paper, a methodology for security analysis of Wireless Sensor Networks is presented. The methodology allows designing attack-aware embedded software/firmware or attack countermeasures to provide security in WSNs. The proposed methodology includes attacker modeling and attack simulation with performance analysis (node’s software execution time and power consumption estimation). After an analysis of different WSN attack types, an attacker model is proposed. This model defines three different types of attackers that can emulate most WSN attacks. In addition, this paper presents a virtual platform that is able to model the node hardware, embedded software and basic wireless channel features. This virtual simulation analyzes the embedded software behavior and node power consumption while it takes into account the network deployment and topology. Additionally, this simulator integrates the previously mentioned attacker model. Thus, the impact of attacks on power consumption and software behavior/execution-time can be analyzed. This provides developers with essential information about the effects that one or multiple attacks could have on the network, helping them to develop more secure WSN systems. This WSN attack simulator is an essential element of the attack-aware embedded software development methodology that is also introduced in this work. PMID:27869710
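The virtual platform and attacker models described above are not publicly reproduced here. A deliberately toy Python sketch, with assumed energy constants, illustrates the kind of question such a simulator answers: how injected attack traffic shortens a node's battery lifetime.

```python
# Toy energy model (all constants are assumed, not taken from the paper):
# a node spends energy per received packet and per second of idle listening.
E_RX_PER_PACKET_MJ = 0.4     # millijoules per received packet (assumed)
E_IDLE_PER_S_MJ = 0.05       # millijoules per second of idle listening (assumed)
BATTERY_MJ = 20_000.0        # usable battery energy (assumed)

def lifetime_hours(packets_per_s: float) -> float:
    """Battery lifetime under a given incoming packet rate."""
    drain_per_s = packets_per_s * E_RX_PER_PACKET_MJ + E_IDLE_PER_S_MJ
    return BATTERY_MJ / drain_per_s / 3600.0

baseline = lifetime_hours(0.2)          # normal sensing traffic
under_attack = lifetime_hours(5.0)      # a flooding attacker injects extra packets
print(f"baseline lifetime:     {baseline:8.1f} h")
print(f"lifetime under attack: {under_attack:8.1f} h")
```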
De Brún, Aoife; McAuliffe, Eilish
2018-03-13
Health systems research recognizes the complexity of healthcare, and the interacting and interdependent nature of components of a health system. To better understand such systems, innovative methods are required to depict and analyze their structures. This paper describes social network analysis as a methodology to depict, diagnose, and evaluate health systems and networks therein. Social network analysis is a set of techniques to map, measure, and analyze social relationships between people, teams, and organizations. Through use of a case study exploring support relationships among senior managers in a newly established hospital group, this paper illustrates some of the commonly used network- and node-level metrics in social network analysis, and demonstrates the value of these maps and metrics to understand systems. Network analysis offers a valuable approach to health systems and services researchers as it offers a means to depict activity relevant to network questions of interest, to identify opinion leaders, influencers, clusters in the network, and those individuals serving as bridgers across clusters. The strengths and limitations inherent in the method are discussed, and the applications of social network analysis in health services research are explored.
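A brief illustration of the node-level metrics mentioned above (identifying bridgers and clusters), using networkx on an invented support-relationship network; the names and ties are hypothetical, not the hospital-group case study data.

```python
# Invented support ties among senior managers; names are hypothetical.
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

edges = [("CEO", "DirOps"), ("CEO", "DirFinance"), ("DirOps", "SiteA_Mgr"),
         ("DirOps", "SiteB_Mgr"), ("SiteA_Mgr", "SiteA_Lead"),
         ("SiteB_Mgr", "SiteB_Lead"), ("DirFinance", "SiteB_Mgr")]
G = nx.Graph(edges)

# Betweenness highlights "bridgers" who connect otherwise separate clusters.
betweenness = nx.betweenness_centrality(G)
print("highest betweenness (bridger):", max(betweenness, key=betweenness.get))

# Community detection reveals clusters in the network.
for i, community in enumerate(greedy_modularity_communities(G)):
    print(f"cluster {i}:", sorted(community))
```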
New methodologies for multi-scale time-variant reliability analysis of complex lifeline networks
NASA Astrophysics Data System (ADS)
Kurtz, Nolan Scot
The cost of maintaining existing civil infrastructure is enormous. Since the livelihood of the public depends on such infrastructure, its state must be managed appropriately using quantitative approaches. Practitioners must consider not only which components are most fragile to hazards, e.g. seismicity, storm surge, hurricane winds, etc., but also how those components participate at the network level, using network analysis. Focusing on particularly damaged components does not necessarily increase network functionality, which is what matters most to the people who depend on such infrastructure. Several network analyses, e.g. S-RDA, LP-bounds, and crude-MCS, and performance metrics, e.g. disconnection bounds and component importance, are available for such purposes. Since these networks already exist, their state over time is also important. If networks are close to chloride sources, deterioration may be a major issue. Information from field inspections may also have large impacts on quantitative models. To address such issues, hazard risk analysis methodologies for deteriorating networks subjected to seismicity, i.e. earthquakes, have been created analytically. A bridge component model has been constructed for these methodologies. The bridge fragilities, which were constructed from data, required a deeper level of analysis as these were relevant for specific structures. Furthermore, chloride-induced deterioration network effects were investigated. Depending on how mathematical models incorporate new information, many approaches are available, such as Bayesian model updating. To make such procedures more flexible, an adaptive importance sampling scheme was created for structural reliability problems. Additionally, such a method handles many kinds of system and component problems with single or multiple important regions of the limit state function. These and previously developed analysis methodologies were found to be strongly sensitive to network size. Special network topologies may be more or less computationally difficult, and the resolution of the network also has large effects. To take advantage of some types of topologies, network hierarchical structures with super-link representation have been used in the literature to increase computational efficiency by analyzing smaller, densely connected networks; however, such structures were based on user input and were at times subjective. To address this, algorithms must be automated and reliable. These hierarchical structures may indicate the structure of the network itself. This risk analysis methodology has been expanded to larger networks using such automated hierarchical structures. Component importance is the most important objective of such network analysis; however, it may only indicate which bridges to inspect or repair earliest and little else. High correlations influence such component importance measures in a negative manner. Additionally, a regional approach is not appropriately modelled. To investigate a more regional view, group importance measures based on hierarchical structures have been created. Such structures may also be used to create regional inspection/repair approaches. Using these analytical, quantitative risk approaches, the next generation of decision makers may make optimal decisions at both the component and the regional level, using information from both network function and the further effects of infrastructure deterioration.
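Of the analyses named in the abstract, crude Monte Carlo simulation (crude-MCS) of network disconnection is the simplest to illustrate. The sketch below, with an invented lifeline topology and assumed link failure probabilities, estimates the probability that a source node becomes disconnected from a sink; the dissertation's actual methods (S-RDA, hierarchical super-links, adaptive importance sampling) are far more elaborate.

```python
# Crude Monte Carlo estimate of source-sink disconnection probability for a
# hypothetical lifeline network; per-link failure probabilities are assumed.
import random
import networkx as nx

links = {("source", "A"): 0.05, ("A", "B"): 0.20,   # (link): failure probability
         ("A", "C"): 0.10, ("B", "sink"): 0.15,     # under the considered hazard
         ("C", "sink"): 0.10}

def disconnection_probability(n_samples: int = 20_000) -> float:
    failures = 0
    for _ in range(n_samples):
        G = nx.Graph()
        G.add_nodes_from(["source", "sink"])
        for (u, v), p_fail in links.items():
            if random.random() > p_fail:            # link survives this sample
                G.add_edge(u, v)
        if not nx.has_path(G, "source", "sink"):
            failures += 1
    return failures / n_samples

print("estimated P(source disconnected from sink):", disconnection_probability())
```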
Kaonga, Nadi Nina; Labrique, Alain; Mechael, Patricia; Akosah, Eric; Ohemeng-Dapaah, Seth; Sakyi Baah, Joseph; Kodie, Richmond; Kanter, Andrew S; Levine, Orin
2013-04-03
The network structure of an organization influences how well or poorly an organization communicates and manages its resources. In the Millennium Villages Project site in Bonsaaso, Ghana, a mobile phone closed user group has been introduced for use by the Bonsaaso Millennium Villages Project Health Team and other key individuals. No assessment on the benefits or barriers of the use of the closed user group had been carried out. The purpose of this research was to make the case for the use of social network analysis methods to be applied in health systems research--specifically related to mobile health. This study used mobile phone voice records of, conducted interviews with, and reviewed call journals kept by a mobile phone closed user group consisting of the Bonsaaso Millennium Villages Project Health Team. Social network analysis methodology complemented by a qualitative component was used. Monthly voice data of the closed user group from Airtel Bharti Ghana were analyzed using UCINET and visual depictions of the network were created using NetDraw. Interviews and call journals kept by informants were analyzed using NVivo. The methodology was successful in helping identify effective organizational structure. Members of the Health Management Team were the more central players in the network, rather than the Community Health Nurses (who might have been expected to be central). Social network analysis methodology can be used to determine the most productive structure for an organization or team, identify gaps in communication, identify key actors with greatest influence, and more. In conclusion, this methodology can be a useful analytical tool, especially in the context of mobile health, health services, and operational and managerial research.
Megacity analysis: a clustering approach to classification
2017-06-01
…kinetic or non-kinetic urban operations. We develop and implement a methodology to classify megacities into groups. Using 33 variables, we construct a… …is interested in these megacity networks and their implications for potential urban operations. We develop a methodology to group like megacities…
A network-based analysis of CMIP5 "historical" experiments
NASA Astrophysics Data System (ADS)
Bracco, A.; Foudalis, I.; Dovrolis, C.
2012-12-01
In computer science, "complex network analysis" refers to a set of metrics, modeling tools and algorithms commonly used in the study of complex nonlinear dynamical systems. Its main premise is that the underlying topology or network structure of a system has a strong impact on its dynamics and evolution. By allowing local and non-local statistical interactions to be investigated, network analysis provides a powerful, but only marginally explored, framework to validate climate models and investigate teleconnections, assessing their strength, range, and impacts on the climate system. In this work we propose a new, fast, robust and scalable methodology to examine, quantify, and visualize climate sensitivity, while constraining general circulation model (GCM) outputs with observations. The goal of our novel approach is to uncover relations in the climate system that are not (or not fully) captured by more traditional methodologies used in climate science and often adopted from nonlinear dynamical systems analysis, and to explain known climate phenomena in terms of the network structure or its metrics. Our methodology is based on a solid theoretical framework and employs mathematical and statistical tools exploited only tentatively in climate research so far. Suitably adapted to the climate problem, these tools can assist in visualizing the trade-offs in representing global links and teleconnections among different data sets. Here we present the methodology, and compare network properties for different reanalysis data sets and a suite of CMIP5 coupled GCM outputs. With an extensive model intercomparison in terms of the climate network that each model leads to, we quantify how well each model reproduces major teleconnections, rank model performances, and identify common or specific errors in comparing model outputs and observations.
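A minimal sketch of the climate-network idea, under strong simplifying assumptions (synthetic series, plain Pearson correlation, a fixed threshold), showing how a node-level metric such as degree summarizes connectivity:

```python
# Build a toy "climate network": nodes are grid points, edges connect points whose
# (synthetic) time series correlate above a threshold. All data are invented.
import numpy as np

rng = np.random.default_rng(0)
n_points, n_months = 50, 240
series = rng.standard_normal((n_points, n_months))
series[:10] += rng.standard_normal(n_months) * 0.8   # inject a shared signal (a fake teleconnection)

corr = np.corrcoef(series)                            # pairwise Pearson correlations
adjacency = (np.abs(corr) > 0.5) & ~np.eye(n_points, dtype=bool)

degree = adjacency.sum(axis=1)                        # how connected each grid point is
print("mean degree:", degree.mean(), "| max degree at grid point", int(degree.argmax()))
```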
Control Theoretic Modeling for Uncertain Cultural Attitudes and Unknown Adversarial Intent
2009-02-01
…Constructive computational tools. …over-reactionary behaviors; 3) analysis of rational social learning in networks: analysis of belief propagation in social networks in various… …general methodology as a predictive device for social network formation and for communication network formation with constraints on the lengths of… Subject terms: social learning, social networks, multiagent systems, game theory.
Igras, Susan; Diakité, Mariam; Lundgren, Rebecka
2017-07-01
In West Africa, social factors influence whether couples with unmet need for family planning act on birth-spacing desires. Tékponon Jikuagou is testing a social network-based intervention to reduce social barriers by diffusing new ideas. Individuals and groups judged socially influential by their communities provide entrée to networks. A participatory social network mapping methodology was designed to identify these diffusion actors. Analysis of monitoring data, in-depth interviews, and evaluation reports assessed the methodology's acceptability to communities and staff and whether it produced valid, reliable data to identify influential individuals and groups who diffuse new ideas through their networks. Results indicated the methodology's acceptability. Communities were actively and equitably engaged. Staff appreciated its ability to yield timely, actionable information. The mapping methodology also provided valid and reliable information by enabling communities to identify highly connected and influential network actors. Consistent with social network theory, this methodology resulted in the selection of informal groups and individuals in both informal and formal positions. In-depth interview data suggest these actors were diffusing new ideas, further confirming their influence/connectivity. The participatory methodology generated insider knowledge of who has social influence, challenging commonly held assumptions. Collecting and displaying information fostered staff and community learning, laying groundwork for social change.
Neural network approach in multichannel auditory event-related potential analysis.
Wu, F Y; Slater, J D; Ramsay, R E
1994-04-01
Even though there are presently no clearly defined criteria for the assessment of P300 event-related potential (ERP) abnormality, it is strongly indicated through statistical analysis that such criteria exist for classifying control subjects and patients with diseases resulting in neuropsychological impairment such as multiple sclerosis (MS). We have demonstrated the feasibility of artificial neural network (ANN) methods in classifying ERP waveforms measured at a single channel (Cz) from control subjects and MS patients. In this paper, we report the results of multichannel ERP analysis and a modified network analysis methodology to enhance automation of the classification rule extraction process. The proposed methodology significantly reduces the work of statistical analysis. It also helps to standardize the criteria of P300 ERP assessment and facilitate the computer-aided analysis on neuropsychological functions.
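The paper's exact network architecture and features are not restated here; the hedged scikit-learn sketch below merely shows the general form of an ANN classifier applied to multichannel ERP feature vectors, using synthetic data.

```python
# Illustrative only: a small feed-forward network classifying synthetic
# "ERP feature" vectors into control vs. patient groups.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
X = rng.standard_normal((120, 16))          # 120 subjects x 16 multichannel ERP features
y = (X[:, 0] + 0.5 * X[:, 3] + rng.standard_normal(120) * 0.5 > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
clf.fit(X_train, y_train)
print("held-out accuracy:", round(clf.score(X_test, y_test), 2))
```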
Semantic Networks and Social Networks
ERIC Educational Resources Information Center
Downes, Stephen
2005-01-01
Purpose: To illustrate the need for social network metadata within semantic metadata. Design/methodology/approach: Surveys properties of social networks and the semantic web, suggests that social network analysis applies to semantic content, argues that semantic content is more searchable if social network metadata is merged with semantic web…
What Does Global Migration Network Say about Recent Changes in the World System Structure?
ERIC Educational Resources Information Center
Zinkina, Julia; Korotayev, Andrey
2014-01-01
Purpose: The aim of this paper is to investigate whether the structure of the international migration system has remained stable through the recent turbulent changes in the world system. Design/methodology/approach: The methodology draws on the social network analysis framework--but with some noteworthy limitations stipulated by the specifics of…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bri Rolston
2005-06-01
Threat characterization is a key component in evaluating the threat faced by control systems. Without a thorough understanding of the threat faced by critical infrastructure networks, adequate resources cannot be allocated or directed effectively to the defense of these systems. Traditional methods of threat analysis focus on identifying the capabilities and motivations of a specific attacker, assessing the value the adversary would place on targeted systems, and deploying defenses according to the threat posed by the potential adversary. Too many effective exploits and tools exist and are easily accessible to anyone with an Internet connection, minimal technical skills, and a significantly reduced motivational threshold for the field of potential adversaries to be narrowed effectively. Understanding how hackers evaluate new IT security research and incorporate significant new ideas into their own tools provides a means of anticipating how IT systems are most likely to be attacked in the future. This research, Attack Methodology Analysis (AMA), could supply pertinent information on how to detect and stop new types of attacks. Since the exploit methodologies and attack vectors developed in the general Information Technology (IT) arena can be converted for use against control system environments, assessing areas in which cutting-edge exploit development and remediation techniques are occurring can provide significant intelligence for control system network exploitation and defense, and a means of assessing threat without identifying the specific capabilities of individual opponents. Attack Methodology Analysis begins with the study of what exploit technology and attack methodologies are being developed by the black-hat and white-hat segments of the Information Technology (IT) security research community. Once a solid understanding of the cutting-edge security research is established, emerging trends in attack methodology can be identified and the gap between those threats and the defensive capabilities of control systems can be analyzed. The results of the gap analysis drive changes in the cyber security of critical infrastructure networks to close the gap between current exploits and existing defenses. The analysis also provides defenders with an idea of how threat technology is evolving and how defenses will need to be modified to address these emerging trends.
Challenges to the Learning Organization in the Context of Generational Diversity and Social Networks
ERIC Educational Resources Information Center
Kaminska, Renata; Borzillo, Stefano
2018-01-01
Purpose: The purpose of this paper is to gain a better understanding of the challenges to the emergence of a learning organization (LO) posed by a context of generational diversity and an enterprise social networking system (ESNS). Design/methodology/approach: This study uses a qualitative methodology based on an analysis of 20 semi-structured…
WGCNA: an R package for weighted correlation network analysis.
Langfelder, Peter; Horvath, Steve
2008-12-29
Correlation networks are increasingly being used in bioinformatics applications. For example, weighted gene co-expression network analysis is a systems biology method for describing the correlation patterns among genes across microarray samples. Weighted correlation network analysis (WGCNA) can be used for finding clusters (modules) of highly correlated genes, for summarizing such clusters using the module eigengene or an intramodular hub gene, for relating modules to one another and to external sample traits (using eigengene network methodology), and for calculating module membership measures. Correlation networks facilitate network based gene screening methods that can be used to identify candidate biomarkers or therapeutic targets. These methods have been successfully applied in various biological contexts, e.g. cancer, mouse genetics, yeast genetics, and analysis of brain imaging data. While parts of the correlation network methodology have been described in separate publications, there is a need to provide a user-friendly, comprehensive, and consistent software implementation and an accompanying tutorial. The WGCNA R software package is a comprehensive collection of R functions for performing various aspects of weighted correlation network analysis. The package includes functions for network construction, module detection, gene selection, calculations of topological properties, data simulation, visualization, and interfacing with external software. Along with the R package we also present R software tutorials. While the methods development was motivated by gene expression data, the underlying data mining approach can be applied to a variety of different settings. The WGCNA package provides R functions for weighted correlation network analysis, e.g. co-expression network analysis of gene expression data. The R package along with its source code and additional material are freely available at http://www.genetics.ucla.edu/labs/horvath/CoexpressionNetwork/Rpackages/WGCNA.
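WGCNA is distributed as an R package (see the URL above). For readers who want the gist of the construction outside R, the following Python sketch, with synthetic data and an assumed soft-thresholding power, reproduces only the core idea: an unsigned soft-threshold correlation adjacency followed by hierarchical module detection. It omits topological overlap, module eigengenes, and the rest of the package's functionality.

```python
# Simplified WGCNA-style pipeline on synthetic expression data (not the R package).
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(2)
n_genes, n_samples = 200, 30
expr = rng.standard_normal((n_genes, n_samples))       # synthetic expression matrix
expr[:40] += rng.standard_normal(n_samples)            # a block of co-expressed genes

beta = 6                                               # soft-thresholding power (assumed)
adjacency = np.abs(np.corrcoef(expr)) ** beta          # unsigned weighted adjacency
dissimilarity = 1.0 - adjacency

# Hierarchical clustering on the dissimilarity; modules = clusters of correlated genes.
Z = linkage(dissimilarity[np.triu_indices(n_genes, k=1)], method="average")
modules = fcluster(Z, t=4, criterion="maxclust")
print("module sizes:", np.bincount(modules)[1:])
```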
Note: Methodology for the analysis of Bluetooth gateways in an implemented scatternet.
Etxaniz, J; Monje, P M; Aranguren, G
2014-03-01
This Note introduces a novel methodology to analyze the time performance of Bluetooth gateways in multi-hop networks, known as scatternets. The methodology is focused on distinguishing between the processing time and the time that each communication between nodes takes along an implemented scatternet. This technique is not only valid for Bluetooth networks but also for other wireless networks that offer access to their middleware in order to include beacons in the operation of the nodes. We show in this Note the results of the tests carried out on a Bluetooth scatternet in order to highlight the reliability and effectiveness of the methodology. The results also validate this technique showing convergence in the results when subtracting the time for the beacons from the delay measurements.
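The beacon-based decomposition described in the Note, separating gateway processing time from time spent on the radio links, amounts to simple timestamp arithmetic; a toy example with invented timestamps is sketched below.

```python
# Toy decomposition of an end-to-end multi-hop delay into per-hop transmission
# and gateway processing times, using (invented) beacon timestamps in ms.
hops = [
    # (arrival at node, departure from node) as recorded by middleware beacons
    {"node": "gateway1", "arrival": 12.0, "departure": 19.5},
    {"node": "gateway2", "arrival": 31.0, "departure": 37.0},
]
sent_at, received_at = 0.0, 52.0            # end-to-end timestamps (invented)

processing = sum(h["departure"] - h["arrival"] for h in hops)
transmission = (received_at - sent_at) - processing
print(f"processing time at gateways: {processing:.1f} ms")
print(f"time on the radio links:     {transmission:.1f} ms")
```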
van Dam, Jesse C J; Schaap, Peter J; Martins dos Santos, Vitor A P; Suárez-Diez, María
2014-09-26
Different methods have been developed to infer regulatory networks from heterogeneous omics datasets and to construct co-expression networks. Each algorithm produces different networks, and efforts have been devoted to automatically integrating them into consensus sets. However, each separate set has an intrinsic value that is diluted and partly lost when building a consensus network. Here we present a methodology to generate co-expression networks and, instead of a consensus network, we propose an integration framework where the different networks are kept and analysed with additional tools to efficiently combine the information extracted from each network. We developed a workflow to efficiently analyse information generated by different inference and prediction methods. Our methodology relies on providing the user the means to simultaneously visualise and analyse the coexisting networks generated by different algorithms, heterogeneous datasets, and a suite of analysis tools. As a showcase, we have analysed the gene co-expression networks of Mycobacterium tuberculosis generated using over 600 expression experiments. Regarding DNA damage repair, we identified SigC as a key control element, 12 new targets for LexA, an updated LexA binding motif, and a potential mismatch repair system. We expanded the DevR regulon with 27 genes while identifying 9 targets wrongly assigned to this regulon. We discovered 10 new genes linked to zinc uptake and a new regulatory mechanism for ZuR. The use of co-expression networks to perform system level analysis allows the development of custom made methodologies. As showcases, we implemented a pipeline to integrate ChIP-seq data and another method to uncover multiple regulatory layers. Our workflow is based on representing the multiple types of information as network representations and presenting these networks in a synchronous framework that allows their simultaneous visualization while keeping specific associations from the different networks. By simultaneously exploring these networks and metadata, we gained insights into regulatory mechanisms in M. tuberculosis that could not be obtained through the separate analysis of each data type.
Gene network analysis: from heart development to cardiac therapy.
Ferrazzi, Fulvia; Bellazzi, Riccardo; Engel, Felix B
2015-03-01
Networks offer a flexible framework to represent and analyse the complex interactions between components of cellular systems. In particular, gene networks inferred from expression data can support the identification of novel hypotheses on regulatory processes. In this review we focus on the use of gene network analysis in the study of heart development. Understanding heart development will promote the elucidation of the aetiology of congenital heart disease and thus possibly improve diagnostics. Moreover, it will help to establish cardiac therapies. For example, understanding cardiac differentiation during development will help to guide stem cell differentiation required for cardiac tissue engineering or to enhance endogenous repair mechanisms. We introduce different methodological frameworks to infer networks from expression data, such as Boolean and Bayesian networks. Then we present currently available temporal expression data in heart development and discuss the use of network-based approaches in published studies. Collectively, our literature-based analysis indicates that gene network analysis constitutes a promising opportunity to infer therapy-relevant regulatory processes in heart development. However, the use of network-based approaches has so far been limited by the small number of samples in available datasets. Thus, we propose to acquire high-resolution temporal expression data to improve the mathematical descriptions of regulatory processes obtained with gene network inference methodologies. Especially probabilistic methods that accommodate the intrinsic variability of biological systems have the potential to contribute to a deeper understanding of heart development.
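Of the frameworks named above, the Boolean network is the easiest to illustrate. The toy model below uses three genes with invented regulatory rules (not taken from cardiac biology) and enumerates its fixed points.

```python
# Toy Boolean gene network: three genes with invented regulatory rules.
from itertools import product

def step(a: bool, b: bool, c: bool) -> tuple:
    """One synchronous update; the rules are purely illustrative."""
    return (b and not c,   # gene A is activated by B, repressed by C
            a,             # gene B follows A
            a or b)        # gene C is activated by A or B

# Enumerate the state space and report fixed points (simple attractors).
for state in product([False, True], repeat=3):
    if step(*state) == state:
        print("fixed point:", state)
```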
Mapping the Field of Educational Administration Research: A Journal Citation Network Analysis
ERIC Educational Resources Information Center
Wang, Yinying; Bowers, Alex J.
2016-01-01
Purpose: The purpose of this paper is to uncover how knowledge is exchanged and disseminated in the educational administration research literature through the journal citation network. Design/Methodology/Approach: Drawing upon social network theory and citation network studies in other disciplines, the authors constructed an educational…
Interfacing Network Simulations and Empirical Data
2009-05-01
…contraceptive innovations in the Cameroon. He found that real-world adoption rates did not follow simulation models when the network relationships were…
GetReal in network meta-analysis: a review of the methodology.
Efthimiou, Orestis; Debray, Thomas P A; van Valkenhoef, Gert; Trelle, Sven; Panayidou, Klea; Moons, Karel G M; Reitsma, Johannes B; Shang, Aijing; Salanti, Georgia
2016-09-01
Pairwise meta-analysis is an established statistical tool for synthesizing evidence from multiple trials, but it is informative only about the relative efficacy of two specific interventions. The usefulness of pairwise meta-analysis is thus limited in real-life medical practice, where many competing interventions may be available for a certain condition and studies informing some of the pairwise comparisons may be lacking. This commonly encountered scenario has led to the development of network meta-analysis (NMA). In the last decade, several applications, methodological developments, and empirical studies in NMA have been published, and the area is thriving as its relevance to public health is increasingly recognized. This article presents a review of the relevant literature on NMA methodology aiming to pinpoint the developments that have appeared in the field. Copyright © 2016 John Wiley & Sons, Ltd.
ERIC Educational Resources Information Center
Baker-Doyle, Kira J.; Yoon, Susan A.
2011-01-01
This paper presents the first in a series of studies on the informal advice networks of a community of teachers in an in-service professional development program. The aim of the research was to use Social Network Analysis as a methodological tool to reveal the social networks developed by the teachers, and to examine whether these networks…
Value-Creating Networks: Organizational Issues and Challenges
ERIC Educational Resources Information Center
Allee, Verna
2009-01-01
Purpose: The purpose of this paper is to provide examples of evaluating value-creating networks and to address the organizational issues and challenges of a network orientation. Design/methodology/approach: Value network analysis was first developed in 1993 and was adapted in 1997 for intangible asset management. It has been applied from shopfloor…
Three-dimensional stochastic adjustment of volcano geodetic network in Arenal volcano, Costa Rica
NASA Astrophysics Data System (ADS)
Muller, C.; van der Laat, R.; Cattin, P.-H.; Del Potro, R.
2009-04-01
Volcano geodetic networks are a key instrument to understanding magmatic processes and, thus, forecasting potentially hazardous activity. These networks are extensively used on volcanoes worldwide and generally comprise a number of different traditional and modern geodetic surveying techniques such as levelling, distances, triangulation and GNSS. However, in most cases, data from the different methodologies are surveyed, adjusted and analysed independently. Experience shows that the problem with this procedure is the mismatch between the excellent correlation of position values within a single technique and the low cross-correlation of such values across different techniques, or when the same network is re-surveyed shortly afterwards with the same technique. Moreover, maintaining a separate, independent network for each geodetic surveying technique strongly increases the logistics and thus the cost of each measurement campaign. It is therefore important to develop geodetic networks which combine the different geodetic surveying techniques, and to adjust the geodetic data jointly in order to better quantify the uncertainties associated with the measured displacements. In order to overcome the lack of inter-methodology data integration, the Geomatic Institute of the University of Applied Sciences of Western Switzerland (HEIG-VD) has developed a methodology which uses TRINET+, a 3D stochastic adjustment software for redundant geodetic networks. The methodology consists of using each geodetic measurement technique for its strengths relative to the other methodologies. Also, the combination of the measurements in a single network allows more cost-effective surveying. The geodetic data are thereafter adjusted and analysed in the same reference frame. The adjustment methodology is based on the least-squares method and links the data with the geometry. TRINET+ also allows a priori simulations of the network to be run, hence testing the quality and resolution to be expected for a given network even before it is built. Moreover, an a posteriori analysis enables measurement errors (antenna height, atmospheric effects, etc.) to be identified, and hence dismissed. Here we present a preliminary effort to apply this technique to volcano deformation. A geodetic network has been developed on the western flank of the Arenal volcano in Costa Rica. It is surveyed with GNSS, angular and EDM (electronic distance measurement) observations. Three measurement campaigns were carried out between February and June 2008. The results show consistent and accurate estimates of deformation and uncertainty for each of the 12 benchmarks surveyed. The three campaigns also demonstrate the repeatability and consistency of the statistical indicators and the displacement vectors. Although this methodology has only recently been applied to volcanoes, we suggest that, due to its cost-effective, high-quality results, it has the potential to be incorporated into the design and analysis of volcano geodetic networks worldwide.
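The adjustment principle mentioned above, a stochastic least-squares combination of heterogeneous observations, can be sketched in a deliberately reduced one-dimensional form: two benchmarks, redundant observations of different assumed precision, solved by weighted least squares. Real TRINET+ adjustments are fully three-dimensional and far richer; all numbers here are invented.

```python
# Weighted least-squares adjustment of a tiny 1-D "network": the unknowns are the
# positions of benchmarks P1 and P2; the observations mix GNSS-like positions and
# an EDM-like distance with different (assumed) standard deviations.
import numpy as np

# Observation equations: A @ [x1, x2] ~= l
A = np.array([[1.0, 0.0],      # GNSS position of P1
              [0.0, 1.0],      # GNSS position of P2
              [-1.0, 1.0]])    # EDM distance P1 -> P2
l = np.array([100.012, 250.047, 150.031])        # observed values in metres (invented)
sigmas = np.array([0.010, 0.010, 0.003])         # per-observation std devs (assumed)

W = np.diag(1.0 / sigmas**2)                     # weight matrix
N = A.T @ W @ A
x_hat = np.linalg.solve(N, A.T @ W @ l)          # adjusted coordinates
cov = np.linalg.inv(N)                           # covariance (up to the variance factor)

print("adjusted positions:", np.round(x_hat, 4))
print("std devs:", np.round(np.sqrt(np.diag(cov)), 4))
```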
ERIC Educational Resources Information Center
Oancea, Alis; Florez Petour, Teresa; Atkinson, Jeanette
2017-01-01
This article introduces a methodological approach for articulating and communicating the impact and value of research: qualitative network analysis using collaborative configuration tracing and visualization. The approach was proposed initially in Oancea ("Interpretations and Practices of Research Impact across the Range of Disciplines…
Assessing Group Interaction with Social Language Network Analysis
NASA Astrophysics Data System (ADS)
Scholand, Andrew J.; Tausczik, Yla R.; Pennebaker, James W.
In this paper we discuss a new methodology, social language network analysis (SLNA), that combines tools from social language processing and network analysis to assess socially situated working relationships within a group. Specifically, SLNA aims to identify and characterize the nature of working relationships by processing artifacts generated with computer-mediated communication systems, such as instant message texts or emails. Because social language processing is able to identify psychological, social, and emotional processes that individuals are not able to fully mask, social language network analysis can clarify and highlight complex interdependencies between group members, even when these relationships are latent or unrecognized.
Network representation of protein interactions: Theory of graph description and analysis.
Kurzbach, Dennis
2016-09-01
A methodological framework is presented for the graph theoretical interpretation of NMR data of protein interactions. The proposed analysis generalizes the idea of network representations of protein structures by expanding it to protein interactions. This approach is based on regularization of residue-resolved NMR relaxation times and chemical shift data and subsequent construction of an adjacency matrix that represents the underlying protein interaction as a graph or network. The network nodes represent protein residues. Two nodes are connected if two residues are functionally correlated during the protein interaction event. The analysis of the resulting network enables the quantification of the importance of each amino acid of a protein for its interactions. Furthermore, the determination of the pattern of correlations between residues yields insights into the functional architecture of an interaction. This is of special interest for intrinsically disordered proteins, since the structural (three-dimensional) architecture of these proteins and their complexes is difficult to determine. The power of the proposed methodology is demonstrated at the example of the interaction between the intrinsically disordered protein osteopontin and its natural ligand heparin. © 2016 The Protein Society.
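A schematic version of the proposed construction, residue-resolved observables turned into an adjacency matrix whose analysis ranks residue importance, is sketched below. The regularization step and the osteopontin-heparin data are not reproduced; everything shown is synthetic.

```python
# Synthetic residue-resolved observables -> correlation-based adjacency -> centrality.
import numpy as np
import networkx as nx

rng = np.random.default_rng(3)
n_residues, n_conditions = 40, 12
# e.g. per-residue chemical-shift or relaxation changes across titration points (synthetic)
observables = rng.standard_normal((n_residues, n_conditions))
observables[5:15] += 2.0 * rng.standard_normal(n_conditions)   # a functionally correlated patch

corr = np.corrcoef(observables)
adjacency = (np.abs(corr) > 0.6) & ~np.eye(n_residues, dtype=bool)

G = nx.from_numpy_array(adjacency.astype(int))
centrality = nx.degree_centrality(G)    # richer measures (betweenness, eigenvector) fit here too
top = max(centrality, key=centrality.get)
print(f"residue {top} is most connected in the interaction network")
```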
Learning as Issue Framing in Agricultural Innovation Networks
ERIC Educational Resources Information Center
Tisenkopfs, Talis; Kunda, Ilona; Šumane, Sandra
2014-01-01
Purpose: Networks are increasingly viewed as entities of learning and innovation in agriculture. In this article we explore learning as issue framing in two agricultural innovation networks. Design/methodology/approach: We combine frame analysis and social learning theories to analyse the processes and factors contributing to frame convergence and…
ERIC Educational Resources Information Center
Putnik, Goran; Costa, Eric; Alves, Cátia; Castro, Hélio; Varela, Leonilde; Shah, Vaibhav
2016-01-01
Social network-based engineering education (SNEE) is designed and implemented as a model of Education 3.0 paradigm. SNEE represents a new learning methodology, which is based on the concept of social networks and represents an extended model of project-led education. The concept of social networks was applied in the real-life experiment,…
Transition Characteristic Analysis of Traffic Evolution Process for Urban Traffic Network
Chen, Hong; Li, Yang
2014-01-01
The characterization of the dynamics of traffic states remains fundamental to seeking solutions of diverse traffic problems. To gain more insight into traffic dynamics in the temporal domain, this paper explores the temporal characteristics and distinct regularity in the traffic evolution process of an urban traffic network. We define traffic state patterns by clustering multidimensional traffic time series using self-organizing maps and construct a pattern transition network model that is appropriate for representing and analyzing the evolution process. The methodology is illustrated by an application to flow-rate data from multiple road sections in the network of Shenzhen's Nanshan District, China. Analysis and numerical results demonstrate that the methodology permits extracting many useful traffic transition characteristics, including stability, preference, activity, and attractiveness. In addition, more information about the relationships between these characteristics is extracted, which should be helpful in understanding the complex behavior of the temporal evolution of traffic patterns. PMID:24982969
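A reduced sketch of this pipeline, with KMeans standing in for the self-organizing map and synthetic flow data, clusters observations into traffic-state patterns and counts pattern-to-pattern transitions:

```python
# Cluster synthetic flow-rate vectors into traffic-state patterns (KMeans as a
# stand-in for SOM), then count pattern-to-pattern transitions over time.
import numpy as np
from collections import Counter
from sklearn.cluster import KMeans

rng = np.random.default_rng(4)
flows = np.vstack([rng.normal(200, 20, (300, 5)),    # free flow on 5 road sections
                   rng.normal(450, 40, (300, 5))])   # congested periods (synthetic)

labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(flows)

transitions = Counter(zip(labels[:-1], labels[1:]))
for (a, b), count in sorted(transitions.items()):
    print(f"pattern {a} -> pattern {b}: {count} transitions")
# Normalizing each row of this count matrix gives transition probabilities, from
# which stability, preference and attractiveness of patterns can be read off.
```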
Konchak, Chad; Prasad, Kislaya
2012-01-01
Objectives To develop a methodology for integrating social networks into traditional cost-effectiveness analysis (CEA) studies. This will facilitate the economic evaluation of treatment policies in settings where health outcomes are subject to social influence. Design This is a simulation study based on a Markov model. The lifetime health histories of a cohort are simulated, and health outcomes compared, under alternative treatment policies. Transition probabilities depend on the health of others with whom there are shared social ties. Setting The methodology developed is shown to be applicable in any healthcare setting where social ties affect health outcomes. The example of obesity prevention is used for illustration under the assumption that weight changes are subject to social influence. Main outcome measures Incremental cost-effectiveness ratio (ICER). Results When social influence increases, treatment policies become more cost effective (have lower ICERs). The policy of only treating individuals who span multiple networks can be more cost effective than the policy of treating everyone. This occurs when the network is more fragmented. Conclusions (1) When network effects are accounted for, they result in very different values of incremental cost-effectiveness ratios (ICERs). (2) Treatment policies can be devised to take network structure into account. The integration makes it feasible to conduct a cost-benefit evaluation of such policies. PMID:23117559
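A heavily simplified sketch of such a simulation: a cohort on an invented friendship network, where each person's annual probability of becoming obese rises with the share of obese contacts, and an ICER is computed for a hypothetical intervention. All parameters, costs, and utilities below are assumed for illustration only.

```python
import random
import networkx as nx

def simulate(treated_nodes, years=20, seed=0):
    """Markov-style cohort simulation on a social network (all parameters invented)."""
    random.seed(seed)                                   # common random numbers across arms
    G = nx.erdos_renyi_graph(60, 0.08, seed=1)          # invented friendship network
    obese = {n: n % 5 == 0 for n in G}                  # roughly 20% initial prevalence
    cost, qalys = 0.0, 0.0
    for _ in range(years):
        new_state = {}
        for n in G:
            contacts = list(G[n])
            exposure = sum(obese[v] for v in contacts) / max(len(contacts), 1)
            p_onset = 0.02 + 0.10 * exposure            # onset risk rises with obese contacts
            if n in treated_nodes:
                p_onset *= 0.5                          # hypothetical intervention halves risk
                cost += 150.0                           # annual intervention cost
            r = random.random()
            new_state[n] = obese[n] or r < p_onset
        obese = new_state
        cost += 800.0 * sum(obese.values())             # annual cost attributable to obesity
        qalys += sum(0.80 if obese[n] else 0.95 for n in G)
    return cost, qalys

c0, q0 = simulate(treated_nodes=set())                  # no intervention
c1, q1 = simulate(treated_nodes=set(range(0, 60, 3)))   # treat a third of the cohort
print("incremental cost:", round(c1 - c0, 1), "| incremental QALYs:", round(q1 - q0, 2))
print("ICER:", round((c1 - c0) / (q1 - q0), 1), "per QALY gained")
```

Treating only individuals who span multiple clusters, as the abstract suggests, corresponds here to choosing treated_nodes by a bridging measure such as betweenness rather than at fixed intervals.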
Batalle, Dafnis; Muñoz-Moreno, Emma; Figueras, Francesc; Bargallo, Nuria; Eixarch, Elisenda; Gratacos, Eduard
2013-12-01
Obtaining individual biomarkers for the prediction of altered neurological outcome is a challenge of modern medicine and neuroscience. Connectomics based on magnetic resonance imaging (MRI) stands as a good candidate to exhaustively extract information from MRI by integrating the information obtained into a few network features that can be used as individual biomarkers of neurological outcome. However, this approach typically requires the use of diffusion and/or functional MRI to extract individual brain networks, which require long acquisition times and are extremely sensitive to motion artifacts, critical problems when scanning fetuses and infants. Extraction of individual networks based on morphological similarity of gray matter is a new approach that benefits from the power of graph theory analysis to describe gray matter morphology as a large-scale morphological network from a typical clinical anatomic acquisition such as T1-weighted MRI. In the present paper we propose a methodology to normalize these large-scale morphological networks to a brain network of standardized size based on a parcellation scheme. The proposed methodology was applied to reconstruct the individual brain networks of 63 one-year-old infants, 41 infants with intrauterine growth restriction (IUGR) and 22 controls, showing altered network features in the IUGR group and their association with neurodevelopmental outcome at two years of age, assessed with the Bayley Scales of Infant and Toddler Development, Third Edition, by means of ordinal regression analysis of the network features. Although it must be more widely assessed, this methodology stands as a good candidate for the development of biomarkers of altered neurodevelopment in the pediatric population. © 2013 Elsevier Inc. All rights reserved.
ERIC Educational Resources Information Center
Munoz, David Andres; Queupil, Juan Pablo; Fraser, Pablo
2016-01-01
Purpose: The purpose of this paper is to analyze collaboration networks and their patterns among higher education institutions (HEIs) in Chile and the Latin American region. This will provide evidence that educational managers can use to properly allocate their efforts to improve collaboration. Design/methodology/approach: This quantitative…
Hilgers, Ralf-Dieter; Bogdan, Malgorzata; Burman, Carl-Fredrik; Dette, Holger; Karlsson, Mats; König, Franz; Male, Christoph; Mentré, France; Molenberghs, Geert; Senn, Stephen
2018-05-11
IDeAl (Integrated designs and analysis of small population clinical trials) is an EU-funded project developing new statistical design and analysis methodologies for clinical trials in small population groups. Here we provide an overview of the IDeAl findings and give recommendations to applied researchers. The description of the findings is broken down by the nine scientific IDeAl work packages and summarizes results from the project's more than 60 publications to date in peer-reviewed journals. In addition, we applied text mining to evaluate the publications and the IDeAl work packages' output in relation to the design and analysis terms derived from the IRDiRC task force report on small population clinical trials. The results are summarized, describing the developments from an applied viewpoint. The main result presented here is a set of 33 practical recommendations drawn from the work, giving researchers comprehensive guidance to the improved methodology. In particular, the findings will help to design and analyse efficient clinical trials in rare diseases with a limited number of patients available. We developed a network representation relating the hot topics developed by the IRDiRC task force on small population clinical trials to IDeAl's work, as well as relating the important methodologies that, by IDeAl's definition, are necessary to consider in the design and analysis of small-population clinical trials. These network representations establish a new perspective on the design and analysis of small-population clinical trials. IDeAl has provided a huge number of options to refine the statistical methodology for small-population clinical trials from various perspectives. A total of 33 recommendations, developed within and related to the work packages, help the researcher to design small-population clinical trials. The route to improvement is displayed in the IDeAl network, which represents the important statistical methodological skills necessary for the design and analysis of small-population clinical trials. The methods are ready for use.
Calibration Testing of Network Tap Devices
DOE Office of Scientific and Technical Information (OSTI.GOV)
Popovsky, Barbara; Chee, Brian; Frincke, Deborah A.
2007-11-14
Understanding the behavior of network forensic devices is important to support prosecutions of malicious conduct on computer networks as well as legal remedies for false accusations of network management negligence. Individuals who seek to establish the credibility of network forensic data must speak competently about how the data was gathered and the potential for data loss. Unfortunately, manufacturers rarely provide information about the performance of low-layer network devices at a level that will survive legal challenges. This paper proposes a first step toward an independent calibration standard by establishing a validation testing methodology for evaluating forensic taps against manufacturer specifications. The methodology and the theoretical analysis that led to its development are offered as a conceptual framework for developing a standard and to "operationalize" network forensic readiness. This paper also provides details of an exemplar test, testing environment, procedures and results.
System data communication structures for active-control transport aircraft, volume 1
NASA Technical Reports Server (NTRS)
Hopkins, A. L.; Martin, J. H.; Brock, L. D.; Jansson, D. G.; Serben, S.; Smith, T. B.; Hanley, L. D.
1981-01-01
Candidate data communication techniques are identified, including dedicated links, local buses, broadcast buses, multiplex buses, and mesh networks. The design methodology for mesh networks is then discussed, including network topology and node architecture. Several concepts of power distribution are reviewed, including current limiting and mesh networks for power. The technology issues of packaging, transmission media, and lightning are addressed, and, finally, the analysis tools developed to aid in the communication design process are described. There are special tools to analyze the reliability and connectivity of networks and more general reliability analysis tools for all types of systems.
Process mapping as a tool for home health network analysis.
Pluto, Delores M; Hirshorn, Barbara A
2003-01-01
Process mapping is a qualitative tool that allows service providers, policy makers, researchers, and other concerned stakeholders to get a "bird's eye view" of a home health care organizational network or a very focused, in-depth view of a component of such a network. It can be used to share knowledge about community resources directed at the older population, identify gaps in resource availability and access, and promote on-going collaborative interactions that encourage systemic policy reassessment and programmatic refinement. This article is a methodological description of process mapping, which explores its utility as a practice and research tool, illustrates its use in describing service-providing networks, and discusses some of the issues that are key to successfully using this methodology.
Martins, Marcelo Ramos; Schleder, Adriana Miralles; Droguett, Enrique López
2014-12-01
This article presents an iterative six-step risk analysis methodology based on hybrid Bayesian networks (BNs). In typical risk analysis, systems are usually modeled as discrete and Boolean variables with constant failure rates via fault trees. Nevertheless, in many cases, it is not possible to perform an efficient analysis using only discrete and Boolean variables. The approach put forward by the proposed methodology makes use of BNs and incorporates recent developments that facilitate the use of continuous variables whose values may have any probability distribution. Thus, this approach makes the methodology particularly useful in cases where the available data for quantifying the probabilities of hazardous events are scarce or nonexistent, there is dependence among events, or when nonbinary events are involved. The methodology is applied to the risk analysis of a regasification system of liquefied natural gas (LNG) on board an FSRU (floating, storage, and regasification unit). LNG is becoming an important energy source option and the world's capacity to produce LNG is surging. Large reserves of natural gas exist worldwide, particularly in areas where the resources exceed the demand. Thus, this natural gas is liquefied for shipping, and the storage and regasification process usually occurs at onshore plants. However, a new option for LNG storage and regasification has been proposed: the FSRU. As very few FSRUs have been put into operation, relevant failure data on FSRU systems are scarce. The results show the usefulness of the proposed methodology for cases where the risk analysis must be performed under considerable uncertainty. © 2014 Society for Risk Analysis.
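To make the flavor of the approach concrete, the sketch below propagates a toy hybrid network by Monte Carlo sampling: a continuous leak-frequency node with a lognormal distribution feeds a discrete leak-occurrence node and a Boolean detection node. The structure and all numbers are hypothetical, not the authors' FSRU model.

```python
# Monte Carlo propagation of a toy hybrid Bayesian network (hypothetical numbers,
# not the FSRU model): a continuous parent feeds discrete/Boolean children.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000                                   # Monte Carlo samples

# Continuous node: annual leak frequency [1/year] with epistemic uncertainty.
leak_rate = rng.lognormal(mean=np.log(0.05), sigma=0.5, size=n)

# Discrete child: at least one leak during one year of operation.
leak_occurs = rng.poisson(leak_rate) > 0

# Boolean child: detection, conditional on a leak having occurred.
p_detect = 0.9
detected = leak_occurs & (rng.random(n) < p_detect)

# Query: probability of an undetected leak (the hazardous top event).
p_undetected = np.mean(leak_occurs & ~detected)
print(f"P(leak) ~ {leak_occurs.mean():.4f}, P(undetected leak) ~ {p_undetected:.4f}")
```

Sampling sidesteps the need to discretize the continuous node, which is the practical advantage hybrid BNs offer over fault trees with constant rates.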
NASA Technical Reports Server (NTRS)
Hopkins, Dale A.; Patnaik, Surya N.
2000-01-01
A preliminary aircraft engine design methodology is being developed that utilizes a cascade optimization strategy together with neural network and regression approximation methods. The cascade strategy employs different optimization algorithms in a specified sequence. The neural network and regression methods are used to approximate solutions obtained from the NASA Engine Performance Program (NEPP), which implements engine thermodynamic cycle and performance analysis models. The new methodology is proving to be more robust and computationally efficient than the conventional optimization approach of using a single optimization algorithm with direct reanalysis. The methodology has been demonstrated on a preliminary design problem for a novel subsonic turbofan engine concept that incorporates a wave rotor as a cycle-topping device. Computations of maximum thrust were obtained for a specific design point in the engine mission profile. The results (depicted in the figure) show a significant improvement in the maximum thrust obtained using the new methodology in comparison to benchmark solutions obtained using NEPP in a manual design mode.
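The sketch below illustrates the general surrogate-assisted idea with stand-in names: a neural network and a quadratic regression are fit to samples of an expensive analysis (a placeholder function standing in for NEPP), and an optimizer then searches the cheap approximation instead of re-running the analysis.

```python
# Surrogate-assisted optimization sketch (placeholder objective, not NEPP).
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from scipy.optimize import minimize

def expensive_analysis(x):
    """Placeholder for a costly engine performance analysis (e.g., thrust vs. design variables)."""
    return np.sin(3 * x[0]) - 0.5 * x[1] ** 2 + x[0] * x[1]

rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(200, 2))            # sampled design points
y = np.array([expensive_analysis(x) for x in X])     # expensive evaluations

# Two approximation models, as in a cascade: neural network and quadratic regression.
nn = MLPRegressor(hidden_layer_sizes=(20,), max_iter=5000, random_state=0).fit(X, y)
quad = make_pipeline(PolynomialFeatures(2), LinearRegression()).fit(X, y)

# Maximize the surrogate prediction (minimize its negative) instead of the true analysis.
res = minimize(lambda x: -nn.predict(x.reshape(1, -1))[0], x0=np.zeros(2),
               bounds=[(-1, 1), (-1, 1)])
print("surrogate-optimal design:", res.x, "predicted objective:", -res.fun)
print("quadratic surrogate at that point:", quad.predict(res.x.reshape(1, -1))[0])
```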
ERIC Educational Resources Information Center
Engel, Anna; Coll, Cesar; Bustos, Alfonso
2013-01-01
This work explores some methodological challenges in the application of Social Network Analysis (SNA) to the study of "Asynchronous Learning Networks" (ALN). Our interest in the SNA is situated within the framework of the study of Distributed Teaching Presence (DTP), understood as the exercise of educational influence, through a multi-method…
A neural network based methodology to predict site-specific spectral acceleration values
NASA Astrophysics Data System (ADS)
Kamatchi, P.; Rajasankar, J.; Ramana, G. V.; Nagpal, A. K.
2010-12-01
A general neural network based methodology that has the potential to replace the computationally-intensive site-specific seismic analysis of structures is proposed in this paper. The basic framework of the methodology consists of a feed forward back propagation neural network algorithm with one hidden layer to represent the seismic potential of a region and soil amplification effects. The methodology is implemented and verified with parameters corresponding to Delhi city in India. For this purpose, strong ground motions are generated at bedrock level for a chosen site in Delhi due to earthquakes considered to originate from the central seismic gap of the Himalayan belt using necessary geological as well as geotechnical data. Surface level ground motions and corresponding site-specific response spectra are obtained by using a one-dimensional equivalent linear wave propagation model. Spectral acceleration values are considered as a target parameter to verify the performance of the methodology. Numerical studies carried out to validate the proposed methodology show that the errors in predicted spectral acceleration values are within acceptable limits for design purposes. The methodology is general in the sense that it can be applied to other seismically vulnerable regions and also can be updated by including more parameters depending on the state-of-the-art in the subject.
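As a rough illustration of the basic framework (one hidden layer mapping source and site parameters to spectral acceleration), the sketch below trains on synthetic pairs; the features, their ranges, and the target relation are hypothetical stand-ins, not the Delhi dataset.

```python
# One-hidden-layer feed-forward network predicting spectral acceleration.
# Features and training targets are synthetic stand-ins for a site-specific dataset.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(1)
n = 500
magnitude = rng.uniform(5.0, 8.5, n)          # earthquake magnitude
distance = rng.uniform(10.0, 300.0, n)        # source-to-site distance [km]
vs30 = rng.uniform(150.0, 760.0, n)           # site shear-wave velocity [m/s]
period = rng.uniform(0.05, 2.0, n)            # spectral period [s]

# Synthetic target with an attenuation-like shape [g], for illustration only.
sa = 1e-2 * np.exp(0.9 * magnitude - 1.3 * np.log(distance + 10)
                   - 0.3 * np.log(vs30 / 400) - 0.5 * period)

X = np.column_stack([magnitude, distance, vs30, period])
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(15,), activation="tanh",
                                   max_iter=10_000, random_state=0))
model.fit(X, np.log(sa))                      # training on log(Sa) is a common choice

query = np.array([[7.5, 120.0, 250.0, 0.5]])  # hypothetical scenario for the site
print("predicted Sa [g]:", np.exp(model.predict(query))[0])
```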
ERIC Educational Resources Information Center
Lockhart, Naorah C.
2017-01-01
Group counselors commonly collaborate in interdisciplinary settings in health care, substance abuse, and juvenile justice. Social network analysis is a methodology rarely used in counseling research yet has potential to examine task group dynamics in new ways. This case study explores the scholarly relationships among 36 members of an…
Bayesian network meta-analysis for cluster randomized trials with binary outcomes.
Uhlmann, Lorenz; Jensen, Katrin; Kieser, Meinhard
2017-06-01
Network meta-analysis is becoming a common approach to combine direct and indirect comparisons of several treatment arms. In recent research, there have been various developments and extensions of the standard methodology. Simultaneously, cluster randomized trials are experiencing increased popularity, especially in the field of health services research, where, for example, medical practices are the units of randomization but the outcome is measured at the patient level. Combining the results of cluster randomized trials is challenging. In this tutorial, we examine and compare different approaches for the incorporation of cluster randomized trials in a (network) meta-analysis. Furthermore, we provide practical insight into the implementation of the models. In simulation studies, it is shown that some of the examined approaches lead to unsatisfactory results. However, there are alternatives which are suitable for combining cluster randomized trials in a network meta-analysis, as they are unbiased and reach accurate coverage rates. In conclusion, the methodology can be extended in such a way that an adequate inclusion of the results obtained in cluster randomized trials becomes feasible. Copyright © 2016 John Wiley & Sons, Ltd.
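One common device for folding cluster randomized trials into a meta-analysis, stated here as a general rule rather than as the tutorial's preferred model, is to deflate arm-level data by the design effect 1 + (m - 1)·ICC before computing effect sizes. A minimal sketch with hypothetical numbers:

```python
# Design-effect adjustment of a cluster randomized trial for a (network) meta-analysis.
# Cluster size, ICC and event counts below are hypothetical.
import math

def effective_counts(events, n, mean_cluster_size, icc):
    """Scale events and sample size by the design effect DE = 1 + (m - 1) * ICC."""
    de = 1.0 + (mean_cluster_size - 1.0) * icc
    return events / de, n / de

e_treat, n_treat = effective_counts(events=120, n=400, mean_cluster_size=25, icc=0.05)
e_ctrl, n_ctrl = effective_counts(events=90, n=410, mean_cluster_size=25, icc=0.05)

# Log odds ratio is unchanged by the common scaling; its standard error inflates.
log_or = math.log((e_treat / (n_treat - e_treat)) / (e_ctrl / (n_ctrl - e_ctrl)))
se = math.sqrt(1 / e_treat + 1 / (n_treat - e_treat) + 1 / e_ctrl + 1 / (n_ctrl - e_ctrl))
print(f"log OR = {log_or:.3f}, SE = {se:.3f}")
```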
The added value of thorough economic evaluation of telemedicine networks.
Le Goff-Pronost, Myriam; Sicotte, Claude
2010-02-01
This paper proposes a thorough framework for the economic evaluation of telemedicine networks. A standard cost analysis methodology was used as the initial base, similar to the evaluation method currently being applied to telemedicine, and to which we suggest adding subsequent stages that enhance the scope and sophistication of the analytical methodology. We completed the methodology with a longitudinal and stakeholder analysis, followed by the calculation of a break-even threshold, a calculation of the economic outcome based on net present value (NPV), an estimate of the social gain through external effects, and an assessment of the probability of social benefits. In order to illustrate the advantages, constraints and limitations of the proposed framework, we tested it in a paediatric cardiology tele-expertise network. The results demonstrate that the project threshold was not reached after the 4 years of the study. Also, the calculation of the project's NPV remained negative. However, the additional analytical steps of the proposed framework allowed us to highlight alternatives that can make this service economically viable. These included: use over an extended period of time, extending the network to other telemedicine specialties, or including it in the services offered by other community hospitals. In sum, the results presented here demonstrate the usefulness of an economic evaluation framework as a way of offering decision makers the tools they need to make comprehensive evaluations of telemedicine networks.
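The net-present-value and break-even steps of such a framework are straightforward to sketch; the cash flows, discount rate, and horizons below are hypothetical, not the paediatric tele-expertise figures.

```python
# Break-even and NPV sketch for a telemedicine network (hypothetical figures).
def npv(rate, cash_flows):
    """Net present value of yearly cash flows; cash_flows[0] is the initial year."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

setup_cost = -250_000                       # year 0: equipment, integration, training
yearly_net_benefit = 45_000                 # avoided transfers minus operating costs
rate = 0.05                                 # discount rate

for horizon in (4, 6, 8, 10):
    flows = [setup_cost] + [yearly_net_benefit] * horizon
    print(f"{horizon:2d} years: NPV = {npv(rate, flows):>10,.0f}")

# A negative NPV at 4 years that turns positive over a longer horizon mirrors the
# point that viability can hinge on extended use or a broader service mix.
```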
Ge, Long; Tian, Jin-hui; Li, Xiu-xia; Song, Fujian; Li, Lun; Zhang, Jun; Li, Ge; Pei, Gai-qin; Qiu, Xia; Yang, Ke-hu
2016-01-01
Because of the methodological complexity of network meta-analyses (NMAs), NMAs may be more vulnerable to methodological risks than conventional pair-wise meta-analyses. Our study aims to investigate the epidemiological characteristics, conduct of literature searches, methodological quality and reporting of the statistical analysis process of NMAs in the field of cancer, based on the PRISMA extension statement and a modified AMSTAR checklist. We identified and included 102 NMAs in the field of cancer. 61 NMAs were conducted using a Bayesian framework; of these, more than half did not report an assessment of convergence (60.66%). Inconsistency was assessed in 27.87% of NMAs. Assessment of heterogeneity in traditional meta-analyses was more common (42.62%) than in NMAs (6.56%). Most NMAs did not report an assessment of similarity (86.89%) and did not use the GRADE tool to assess the quality of evidence (95.08%). 43 NMAs used adjusted indirect comparisons; the methods used were described in 53.49% of them. Only 4.65% of NMAs described the details of handling multi-group trials and 6.98% described the methods of similarity assessment. The median total AMSTAR score was 8.00 (IQR: 6.00–8.25). Methodological quality and reporting of statistical analysis did not substantially differ by selected general characteristics. Overall, the quality of NMAs in the field of cancer was generally acceptable. PMID:27848997
Rivera, José; Carrillo, Mariano; Chacón, Mario; Herrera, Gilberto; Bojorquez, Gilberto
2007-01-01
The development of smart sensors involves the design of reconfigurable systems capable of working with different input sensors. Reconfigurable systems should ideally spend the least possible amount of time on their calibration. An autocalibration algorithm for intelligent sensors should be able to fix major problems such as offset, variation of gain and lack of linearity as accurately as possible. This paper describes a new autocalibration methodology for nonlinear intelligent sensors based on artificial neural networks (ANNs). The methodology involves analysis of several network topologies and training algorithms. The proposed method was compared against the piecewise and polynomial linearization methods. Method comparison was carried out using different numbers of calibration points and several nonlinearity levels of the input signal. The paper shows that the proposed method turned out to have better overall accuracy than the other two methods. In addition to the experimental results and analysis of the complete study, the paper describes the implementation of the ANN in a microcontroller unit (MCU). In order to illustrate the method's capability to build autocalibration and reconfigurable systems, a temperature measurement system was designed and tested. The proposed method is an improvement over classic autocalibration methodologies because it impacts the design process of intelligent sensors and autocalibration methodologies, along with their associated factors, such as time and cost.
A Radial Basis Function Approach to Financial Time Series Analysis
1993-12-01
including efficient methods for parameter estimation and pruning, a pointwise prediction error estimator, and a methodology for controlling the "data...collection of practical techniques to address these issues for a modeling methodology. Radial Basis Function networks. These techniques include efficient... methodology often then amounts to a careful consideration of the interplay between model complexity and reliability. These will be recurrent themes
Backbone of complex networks of corporations: the flow of control.
Glattfelder, J B; Battiston, S
2009-09-01
We present a methodology to extract the backbone of complex networks based on the weight and direction of links, as well as on nontopological properties of nodes. We show how the methodology can be applied in general to networks in which mass or energy is flowing along the links. In particular, the procedure enables us to address important questions in economics, namely, how control and wealth are structured and concentrated across national markets. We report on the first cross-country investigation of ownership networks, focusing on the stock markets of 48 countries around the world. On the one hand, our analysis confirms results expected on the basis of the literature on corporate control, namely, that in Anglo-Saxon countries control tends to be dispersed among numerous shareholders. On the other hand, it also reveals that in the same countries, control is found to be highly concentrated at the global level, namely, lying in the hands of very few important shareholders. Interestingly, the exact opposite is observed for European countries. These results have previously not been reported as they are not observable without the kind of network analysis developed here.
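A toy version of backbone extraction based on link weight and direction can be sketched with networkx; the retention rule below (keep the strongest incoming links until a fixed fraction of each node's incoming weight is covered) is a simplification for illustration, not the authors' flow-of-control algorithm.

```python
# Weight-based backbone extraction from a directed ownership-style network (toy data).
import networkx as nx

G = nx.DiGraph()
# Hypothetical ownership links: (shareholder, company, weight = ownership fraction).
G.add_weighted_edges_from([
    ("FundA", "Corp1", 0.45), ("FundA", "Corp2", 0.30), ("FundB", "Corp1", 0.05),
    ("FundB", "Corp3", 0.60), ("Corp1", "Corp3", 0.20), ("FundC", "Corp2", 0.02),
])

backbone = nx.DiGraph()
coverage = 0.8   # keep the strongest incoming links until 80% of weight is covered
for node in G.nodes:
    in_edges = sorted(G.in_edges(node, data="weight"), key=lambda e: -e[2])
    total = sum(w for _, _, w in in_edges)
    kept = 0.0
    for u, v, w in in_edges:
        if total == 0 or kept / total >= coverage:
            break
        backbone.add_edge(u, v, weight=w)
        kept += w

print("backbone edges:", list(backbone.edges(data="weight")))
```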
The application of complex network time series analysis in turbulent heated jets
DOE Office of Scientific and Technical Information (OSTI.GOV)
Charakopoulos, A. K.; Karakasidis, T. E., E-mail: thkarak@uth.gr; Liakopoulos, A.
2014-06-15
In the present study, we applied the methodology of complex network-based time series analysis to experimental temperature time series from a vertical turbulent heated jet. More specifically, we approach the hydrodynamic problem of discriminating time series corresponding to various regions relative to the jet axis, i.e., distinguishing time series corresponding to regions that are close to the jet axis from time series originating at regions with a different dynamical regime, based on the constructed network properties. Applying the phase space transformation method (k nearest neighbors) and also the visibility algorithm, we transformed time series into networks and evaluated topological properties of the networks such as degree distribution, average path length, diameter, modularity, and clustering coefficient. The results show that the complex network approach allows distinguishing, identifying, and exploring in detail various dynamical regions of the jet flow, and associating them with the corresponding physical behavior. In addition, in order to reject the hypothesis that the studied networks originate from a stochastic process, we generated random networks and compared their statistical properties with those originating from the experimental data. As far as the efficiency of the two methods for network construction is concerned, we conclude that both methodologies lead to network properties that present almost the same qualitative behavior and allow us to reveal the underlying system dynamics.
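The natural visibility algorithm mentioned above is easy to sketch: each time point becomes a node, and two points are linked if the straight line between them stays above every intermediate sample. A minimal O(n^2) reference version on a synthetic series:

```python
# Natural visibility graph of a time series (synthetic data, reference implementation).
import numpy as np
import networkx as nx

def visibility_graph(y):
    """Nodes are time indices; i and j are linked if every sample between them lies
    strictly below the straight line joining (i, y[i]) and (j, y[j])."""
    n = len(y)
    G = nx.Graph()
    G.add_nodes_from(range(n))
    for i in range(n):
        for j in range(i + 1, n):
            line = y[i] + (y[j] - y[i]) * (np.arange(i + 1, j) - i) / (j - i)
            if np.all(y[i + 1:j] < line):
                G.add_edge(i, j)
    return G

rng = np.random.default_rng(3)
series = np.sin(np.linspace(0, 8 * np.pi, 200)) + 0.3 * rng.standard_normal(200)
G = visibility_graph(series)
print("mean degree:", 2 * G.number_of_edges() / G.number_of_nodes())
print("clustering coefficient:", nx.average_clustering(G))
print("average path length:", nx.average_shortest_path_length(G))
```

Comparing these topological quantities between series from different jet regions, and against randomized surrogates, is the kind of discrimination the study describes.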
Díaz Córdova, Diego
2016-01-01
The aim of this article is to introduce two methodological strategies that have not often been utilized in the anthropology of food: agent-based models and social network analysis. In order to illustrate these methods in action, two cases based on materials typical of the anthropology of food are presented. For the first strategy, fieldwork carried out in Quebrada de Humahuaca (province of Jujuy, Argentina) regarding meal recall was used, and for the second, elements of the concept of "domestic consumption strategies" applied by Aguirre were employed. The underlying idea is that, given that eating is recognized as a "total social fact" and, therefore, as a complex phenomenon, the methodological approach must also be characterized by complexity. The greater the number of methods utilized (with the appropriate rigor), the better able we will be to understand the dynamics of feeding in the social environment.
Information security system quality assessment through the intelligent tools
NASA Astrophysics Data System (ADS)
Trapeznikov, E. V.
2018-04-01
The development of technology has shown the necessity of comprehensive analysis of information security in automated systems. Analysis of the subject area indicates the relevance of this study. The research objective is to develop a methodology for assessing the quality of an information security system based on intelligent tools. The basis of the methodology is a model that assesses the information security of the information system through a neural network. The paper presents the security assessment model and its algorithm. The practical implementation of the methodology is represented in the form of a software flow diagram. The practical significance of the model being developed is noted in the conclusions.
Performance analysis of LAN bridges and routers
NASA Technical Reports Server (NTRS)
Hajare, Ankur R.
1991-01-01
Bridges and routers are used to interconnect Local Area Networks (LANs). The performance of these devices is important since they can become bottlenecks in large multi-segment networks. Performance metrics and test methodology for bridges and routers were not standardized. Performance data reported by vendors is not applicable to the actual scenarios encountered in an operational network. However, vendor-provided data can be used to calibrate models of bridges and routers that, along with other models, yield performance data for a network. Several tools are available for modeling bridges and routers - Network II.5 was used. The results of the analysis of some bridges and routers are presented.
A genomic regulatory network for development
NASA Technical Reports Server (NTRS)
Davidson, Eric H.; Rast, Jonathan P.; Oliveri, Paola; Ransick, Andrew; Calestani, Cristina; Yuh, Chiou-Hwa; Minokawa, Takuya; Amore, Gabriele; Hinman, Veronica; Arenas-Mena, Cesar;
2002-01-01
Development of the body plan is controlled by large networks of regulatory genes. A gene regulatory network that controls the specification of endoderm and mesoderm in the sea urchin embryo is summarized here. The network was derived from large-scale perturbation analyses, in combination with computational methodologies, genomic data, cis-regulatory analysis, and molecular embryology. The network contains over 40 genes at present, and each node can be directly verified at the DNA sequence level by cis-regulatory analysis. Its architecture reveals specific and general aspects of development, such as how given cells generate their ordained fates in the embryo and why the process moves inexorably forward in developmental time.
Analysis and Reduction of Complex Networks Under Uncertainty.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ghanem, Roger G
2014-07-31
This effort was a collaboration with Youssef Marzouk of MIT, Omar Knio of Duke University (at the time at Johns Hopkins University) and Habib Najm of Sandia National Laboratories. The objective of this effort was to develop the mathematical and algorithmic capacity to analyze complex networks under uncertainty. Of interest were chemical reaction networks and smart grid networks. The statements of work for USC focused on the development of stochastic reduced models for uncertain networks. The USC team was led by Professor Roger Ghanem and consisted of one graduate student and a postdoc. The contributions completed by the USC team consisted of 1) methodology and algorithms to address the eigenvalue problem, a problem of significance in the stability of networks under stochastic perturbations, 2) methodology and algorithms to characterize probability measures on graph structures with random flows. This is an important problem in characterizing random demand (encountered in smart grid) and random degradation (encountered in infrastructure systems), as well as modeling errors in Markov Chains (with ubiquitous relevance!). 3) methodology and algorithms for treating inequalities in uncertain systems. This is an important problem in the context of models for material failure and network flows under uncertainty where conditions of failure or flow are described in the form of inequalities between the state variables.
Validation of Networks Derived from Snowball Sampling of Municipal Science Education Actors
ERIC Educational Resources Information Center
von der Fehr, Ane; Sølberg, Jan; Bruun, Jesper
2018-01-01
Social network analysis (SNA) has been used in many educational studies in the past decade, but what these studies have in common is that the populations in question in most cases are defined and known to the researchers studying the networks. Snowball sampling is an SNA methodology most often used to study hidden populations, for example, groups…
Consensus-based methodology for detection communities in multilayered networks
NASA Astrophysics Data System (ADS)
Karimi-Majd, Amir-Mohsen; Fathian, Mohammad; Makrehchi, Masoud
2018-03-01
Finding groups of network users who are densely connected with each other has emerged as an interesting problem in the area of social network analysis. These groups, or so-called communities, would be hidden behind the behavior of users. Most studies assume that such behavior could be understood by focusing on user interfaces, their behavioral attributes or a combination of these network layers (i.e., interfaces with their attributes). They also assume that all network layers refer to the same behavior. However, in real-life networks, users' behavior in one layer may differ from their behavior in another one. In order to cope with these issues, this article proposes a consensus-based community detection approach (CBC). CBC finds communities among nodes at each layer, in parallel. The results of the layers are then aggregated using a consensus clustering method. This means that different behaviors can be detected and used in the analysis. Another significant advantage is that the methodology is able to handle missing values. Three experiments on real-life and computer-generated datasets have been conducted in order to evaluate the performance of CBC. The results indicate the superiority and stability of CBC in comparison to other approaches.
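A stripped-down version of the per-layer-then-consensus idea can be sketched with networkx: detect communities independently in each layer, build a co-assignment graph, and regroup nodes that are assigned together in a majority of layers. The two-layer toy network below is hypothetical and the steps simplify the CBC algorithm.

```python
# Consensus community detection across two network layers (toy data, simplified).
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

nodes = range(8)
layer1 = nx.Graph([(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (6, 7), (2, 3)])
layer2 = nx.Graph([(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (5, 6), (6, 7), (5, 7)])
layers = [layer1, layer2]
for g in layers:
    g.add_nodes_from(nodes)

# Step 1: community detection in each layer independently (in parallel in CBC).
partitions = [list(greedy_modularity_communities(g)) for g in layers]

# Step 2: consensus graph, weighted by how often node pairs share a community.
consensus = nx.Graph()
consensus.add_nodes_from(nodes)
for partition in partitions:
    for community in partition:
        for u in community:
            for v in community:
                if u < v:
                    w = consensus.get_edge_data(u, v, {"weight": 0})["weight"]
                    consensus.add_edge(u, v, weight=w + 1)

# Step 3: keep pairs assigned together in a majority of layers, then read off groups.
majority = nx.Graph([(u, v) for u, v, d in consensus.edges(data=True)
                     if d["weight"] > len(layers) / 2])
majority.add_nodes_from(nodes)
print([sorted(c) for c in nx.connected_components(majority)])
```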
Pendular behavior of public transport networks
NASA Astrophysics Data System (ADS)
Izawa, Mirian M.; Oliveira, Fernando A.; Cajueiro, Daniel O.; Mello, Bernardo A.
2017-07-01
In this paper, we propose a methodology that bears close resemblance to the Fourier analysis of the first harmonic to study networks subjected to pendular behavior. In this context, pendular behavior is characterized by people moving from their homes to work in the morning and moving in the opposite direction in the afternoon. Pendular behavior is a relevant phenomenon that takes place in public transport networks because it may reduce the overall efficiency of the system as a result of the asymmetric utilization of the system in different directions. We apply this methodology to the bus transport system of Brasília, which is a city that has commercial and residential activities in distinct boroughs. We show that this methodology can be used to characterize the pendular behavior of this system, identifying the most critical nodes and the times of the day when the system is under the most severe demand.
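The first-harmonic idea can be sketched with a discrete Fourier transform of a node's hourly boarding counts: the magnitude of the one-cycle-per-day component relative to the mean flow signals pendular use, and its phase separates morning-loaded from afternoon-loaded stops. The counts below are synthetic, not the Brasília data.

```python
# First-harmonic characterization of pendular demand at one transport node
# (synthetic hourly boarding counts, not the Brasília dataset).
import numpy as np

hours = np.arange(24)
rng = np.random.default_rng(7)

# A residential stop: strong morning peak, weak afternoon activity.
boardings = (50 * np.exp(-0.5 * ((hours - 8) / 1.5) ** 2)
             + 10 * np.exp(-0.5 * ((hours - 18) / 1.5) ** 2)
             + rng.poisson(2, 24))

spectrum = np.fft.rfft(boardings)
mean_flow = spectrum[0].real / 24          # zero-frequency term = daily mean
first_harmonic = spectrum[1]               # one cycle per day

amplitude = 2 * np.abs(first_harmonic) / 24
phase_hour = (-np.angle(first_harmonic)) % (2 * np.pi) / (2 * np.pi) * 24

print(f"mean flow: {mean_flow:.1f} boardings/h")
print(f"pendularity index (first harmonic / mean): {amplitude / mean_flow:.2f}")
print(f"approximate peak hour from phase: {phase_hour:.1f} h")
```

Repeating this per node and direction, and comparing phases across the network, is one way to flag the asymmetrically used, most critical stops.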
Validation and quantification of uncertainty in coupled climate models using network analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bracco, Annalisa
We developed a fast, robust and scalable methodology to examine, quantify, and visualize climate patterns and their relationships. It is based on a set of notions, algorithms and metrics used in the study of graphs, referred to as complex network analysis. This approach can be applied to explain known climate phenomena in terms of an underlying network structure and to uncover regional and global linkages in the climate system, while comparing general circulation model outputs with observations. The proposed method is based on a two-layer network representation, and is substantially new within the available network methodologies developed for climate studies. At the first layer, gridded climate data are used to identify "areas", i.e., geographical regions that are highly homogeneous in terms of the given climate variable. At the second layer, the identified areas are interconnected with links of varying strength, forming a global climate network. The robustness of the method (i.e. the ability to separate topologically distinct fields while correctly identifying similarities) has been extensively tested. It has been proved that it provides a reliable, fast framework for comparing and ranking the ability of climate models to reproduce observed climate patterns and their connectivity. We further developed the methodology to account for lags in the connectivity between climate patterns and refined our area identification algorithm to account for autocorrelation in the data. The new methodology based on complex network analysis has been applied to state-of-the-art climate model simulations that participated in the last IPCC (Intergovernmental Panel on Climate Change) assessment to verify their performance, quantify uncertainties, and uncover changes in global linkages between past and future projections. Network properties of modeled sea surface temperature and rainfall over 1956–2005 have been constrained towards observations or reanalysis data sets, and their differences quantified using two metrics. Projected changes from 2051 to 2300 under the scenario with the highest representative and extended concentration pathways (RCP8.5 and ECP8.5) have then been determined. The network of models capable of reproducing major climate modes well in the recent past changes little during this century. In contrast, among those models the uncertainties in the projections after 2100 remain substantial, and are primarily associated with divergences in the representation of the modes of variability, particularly of the El Niño Southern Oscillation (ENSO), and their connectivity, and therefore with their intrinsic predictability, more so than with differences in the mean state evolution. Additionally, we evaluated the relation between the size and the 'strength' of the area identified by the network analysis as corresponding to ENSO, noting that only a small subset of models can realistically reproduce the observations.
2008-09-01
...the use of Compendium software to facilitate targeting problem understanding, and the network analysis tool, Palantir, as an efficient and tailored semi-automated means to...
ERIC Educational Resources Information Center
Ardanuy, Jordi; Urbano, Cristobal; Quintana, Lluis
2009-01-01
Introduction: This paper studies the situation of research on Catalan literature between 1976 and 2003 by carrying out a bibliometric and social network analysis of PhD theses defended in Spain. It has a dual aim: to present interesting results for the discipline and to demonstrate the methodological efficacy of scientometric tools in the…
Time-Varying, Multi-Scale Adaptive System Reliability Analysis of Lifeline Infrastructure Networks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gearhart, Jared Lee; Kurtz, Nolan Scot
2014-09-01
The majority of current societal and economic needs world-wide are met by the existing networked, civil infrastructure. Because the cost of managing such infrastructure is high and increases with time, risk-informed decision making is essential for those with management responsibilities for these systems. To address such concerns, a methodology that accounts for new information, deterioration, component models, component importance, group importance, network reliability, hierarchical structure organization, and efficiency concerns has been developed. This methodology analyzes the use of new information through the lens of adaptive Importance Sampling for structural reliability problems. Deterioration, multi-scale bridge models, and time-variant component importance are investigated for a specific network. Furthermore, both bridge and pipeline networks are studied for group and component importance, as well as for hierarchical structures in the context of specific networks. Efficiency is the primary driver throughout this study. With this risk-informed approach, those responsible for management can address deteriorating infrastructure networks in an organized manner.
Liang, Geng
2015-01-01
In this paper, improving the control performance of a networked control system by reducing DTD was investigated from a different perspective. Two different network architectures for system implementation are presented. Analysis and improvements dealing with DTD for the experimental control system are expounded. The effects of control scheme configuration on DTD in the form of FB are investigated, and corresponding improvements by reallocation of FB and rearrangement of the schedule table are proposed. Issues of DTD in a hybrid network are investigated, and corresponding approaches to improve performance, including (1) reducing DTD in a PLC or PAC by way of IEC 61499 and (2) cascade Smith predictive control with BPNN-based identification, are proposed and investigated. Control effects under the proposed methodologies are also given. Experimental and field practice validated these methodologies. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.
Brown, C. Hendricks; Kellam, Sheppard G.; Kaupert, Sheila; Muthén, Bengt O.; Wang, Wei; Muthén, Linda K.; Chamberlain, Patricia; PoVey, Craig L.; Cady, Rick; Valente, Thomas W.; Ogihara, Mitsunori; Prado, Guillermo J.; Pantin, Hilda M.; Gallo, Carlos G.; Szapocznik, José; Czaja, Sara J.; McManus, John W.
2012-01-01
What progress prevention research has made comes through strategic partnerships with communities and institutions that host this research, as well as professional and practice networks that facilitate the diffusion of knowledge about prevention. We discuss partnership issues related to the design, analysis, and implementation of prevention research and especially how rigorous designs, including random assignment, get resolved through a partnership between community stakeholders, institutions, and researchers. These partnerships shape not only study design but also determine the data that can be collected and how results and new methods are disseminated. We also examine a second type of partnership to improve the implementation of effective prevention programs into practice. We draw on social networks to study partnership formation and function. The experience of the Prevention Science and Methodology Group, which itself is a networked partnership between scientists and methodologists, is highlighted. PMID:22160786
Prediction of road accidents: A Bayesian hierarchical approach.
Deublein, Markus; Schubert, Matthias; Adey, Bryan T; Köhler, Jochen; Faber, Michael H
2013-03-01
In this paper a novel methodology for the prediction of the occurrence of road accidents is presented. The methodology utilizes a combination of three statistical methods: (1) gamma-updating of the occurrence rates of injury accidents and injured road users, (2) hierarchical multivariate Poisson-lognormal regression analysis taking into account correlations amongst multiple dependent model response variables and effects of discrete accident count data e.g. over-dispersion, and (3) Bayesian inference algorithms, which are applied by means of data mining techniques supported by Bayesian Probabilistic Networks in order to represent non-linearity between risk indicating and model response variables, as well as different types of uncertainties which might be present in the development of the specific models. Prior Bayesian Probabilistic Networks are first established by means of multivariate regression analysis of the observed frequencies of the model response variables, e.g. the occurrence of an accident, and observed values of the risk indicating variables, e.g. degree of road curvature. Subsequently, parameter learning is done using updating algorithms, to determine the posterior predictive probability distributions of the model response variables, conditional on the values of the risk indicating variables. The methodology is illustrated through a case study using data of the Austrian rural motorway network. In the case study, on randomly selected road segments the methodology is used to produce a model to predict the expected number of accidents in which an injury has occurred and the expected number of light, severe and fatally injured road users. Additionally, the methodology is used for geo-referenced identification of road sections with increased occurrence probabilities of injury accident events on a road link between two Austrian cities. It is shown that the proposed methodology can be used to develop models to estimate the occurrence of road accidents for any road network provided that the required data are available. Copyright © 2012 Elsevier Ltd. All rights reserved.
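The gamma-updating step (method 1 above) is simple to state: with a Gamma(α, β) prior on an occurrence rate and x events observed over an exposure n, the posterior is Gamma(α + x, β + n). A sketch with hypothetical numbers:

```python
# Gamma-Poisson updating of an accident occurrence rate (hypothetical counts).
# Prior Gamma(alpha, beta) on the rate; observing x events over exposure n
# gives the posterior Gamma(alpha + x, beta + n).
from scipy import stats

alpha_prior, beta_prior = 2.0, 4.0      # prior mean 0.5 events per unit exposure
x_observed, exposure = 7, 10.0          # 7 injury accidents over 10 units of exposure

alpha_post = alpha_prior + x_observed
beta_post = beta_prior + exposure

posterior = stats.gamma(a=alpha_post, scale=1.0 / beta_post)
print(f"posterior mean rate: {posterior.mean():.3f}")
print(f"95% credible interval: {posterior.ppf(0.025):.3f} - {posterior.ppf(0.975):.3f}")
```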
Functional approximation using artificial neural networks in structural mechanics
NASA Technical Reports Server (NTRS)
Alam, Javed; Berke, Laszlo
1993-01-01
The artificial neural networks (ANN) methodology is an outgrowth of research in artificial intelligence. In this study, the feed-forward network model that was proposed by Rumelhart, Hinton, and Williams was applied to the mapping of functions that are encountered in structural mechanics problems. Several different network configurations were chosen to train the available data for problems in materials characterization and structural analysis of plates and shells. By using the recall process, the accuracy of these trained networks was assessed.
GFD-Net: A novel semantic similarity methodology for the analysis of gene networks.
Díaz-Montaña, Juan J; Díaz-Díaz, Norberto; Gómez-Vela, Francisco
2017-04-01
Since the popularization of biological network inference methods, it has become crucial to create methods to validate the resulting models. Here we present GFD-Net, the first methodology that applies the concept of semantic similarity to gene network analysis. GFD-Net combines the concept of semantic similarity with the use of gene network topology to analyze the functional dissimilarity of gene networks based on Gene Ontology (GO). The main innovation of GFD-Net lies in the way that semantic similarity is used to analyze gene networks taking into account the network topology. GFD-Net selects a functionality for each gene (specified by a GO term), weights each edge according to the dissimilarity between the nodes at its ends and calculates a quantitative measure of the network functional dissimilarity, i.e. a quantitative value of the degree of dissimilarity between the connected genes. The robustness of GFD-Net as a gene network validation tool was demonstrated by performing a ROC analysis on several network repositories. Furthermore, a well-known network was analyzed showing that GFD-Net can also be used to infer knowledge. The relevance of GFD-Net becomes more evident in Section "GFD-Net applied to the study of human diseases" where an example of how GFD-Net can be applied to the study of human diseases is presented. GFD-Net is available as an open-source Cytoscape app which offers a user-friendly interface to configure and execute the algorithm as well as the ability to visualize and interact with the results (http://apps.cytoscape.org/apps/gfdnet). Copyright © 2017 Elsevier Inc. All rights reserved.
Myneni, Sahiti; Cobb, Nathan K; Cohen, Trevor
2016-01-01
Analysis of user interactions in online communities could improve our understanding of health-related behaviors and inform the design of technological solutions that support behavior change. However, to achieve this we would need methods that provide granular perspective, yet are scalable. In this paper, we present a methodology for high-throughput semantic and network analysis of large social media datasets, combining semi-automated text categorization with social network analytics. We apply this method to derive content-specific network visualizations of 16,492 user interactions in an online community for smoking cessation. Performance of the categorization system was reasonable (average F-measure of 0.74, with system-rater reliability approaching rater-rater reliability). The resulting semantically specific network analysis of user interactions reveals content- and behavior-specific network topologies. Implications for socio-behavioral health and wellness platforms are also discussed.
Motif-Synchronization: A new method for analysis of dynamic brain networks with EEG
NASA Astrophysics Data System (ADS)
Rosário, R. S.; Cardoso, P. T.; Muñoz, M. A.; Montoya, P.; Miranda, J. G. V.
2015-12-01
The major aim of this work was to propose a new association method known as Motif-Synchronization. This method was developed to provide information about the synchronization degree and direction between two nodes of a network by counting the number of occurrences of some patterns between any two time series. The second objective of this work was to present a new methodology for the analysis of dynamic brain networks, by combining the Time-Varying Graph (TVG) method with a directional association method. We further applied the new algorithms to a set of human electroencephalogram (EEG) signals to perform a dynamic analysis of the brain functional networks (BFN).
Integrated Evaluation of Reliability and Power Consumption of Wireless Sensor Networks.
Dâmaso, Antônio; Rosa, Nelson; Maciel, Paulo
2017-11-05
Power consumption is a primary concern in Wireless Sensor Networks (WSNs), and a large number of strategies have been proposed to evaluate it. However, those approaches usually consider neither reliability issues nor the power consumption of applications executing in the network. A central concern is the lack of consolidated solutions that enable us to evaluate the power consumption of applications and the network stack while also considering their reliability. To solve this problem, we introduce a fully automatic solution for designing power consumption aware WSN applications and communication protocols. The solution presented in this paper comprises a methodology to evaluate the power consumption based on the integration of formal models, a set of power consumption and reliability models, a sensitivity analysis strategy to select WSN configurations and a toolbox named EDEN to fully support the proposed methodology. This solution allows accurately estimating the power consumption of WSN applications and the network stack in an automated way.
A modeling framework that can be used to evaluate sedimentation in stream networks is described. This methodology can be used to determine sediment Total Maximum Daily Loads (TMDLs) in sediment impaired waters, and provide the necessary hydrodynamic and sediment-related data t...
Doctoral Students' Identity Positioning in Networked Learning Environments
ERIC Educational Resources Information Center
Koole, Marguerite; Stack, Sara
2016-01-01
In this study, the authors explored identity positioning as perceived by doctoral learners in online, networked-learning environments. The study examined two distance doctoral programs at a Canadian university. It was a qualitative study based on methodologies involving open coding and discourse analysis. The social positioning cycle, based on…
Myneni, Sahiti; Cobb, Nathan K; Cohen, Trevor
2013-01-01
Unhealthy behaviors increase individual health risks and are a socioeconomic burden. Harnessing social influence is perceived as fundamental for interventions to influence health-related behaviors. However, the mechanisms through which social influence occurs are poorly understood. Online social networks provide the opportunity to understand these mechanisms as they digitally archive communication between members. In this paper, we present a methodology for content-based social network analysis, combining qualitative coding, automated text analysis, and formal network analysis such that network structure is determined by the content of messages exchanged between members. We apply this approach to characterize the communication between members of QuitNet, an online social network for smoking cessation. Results indicate that the method identifies meaningful theme-based social sub-networks. Modeling social network data using this method can provide us with theme-specific insights such as the identities of opinion leaders and sub-community clusters. Implications for design of targeted social interventions are discussed.
Detecting large-scale networks in the human brain using high-density electroencephalography.
Liu, Quanying; Farahibozorg, Seyedehrezvan; Porcaro, Camillo; Wenderoth, Nicole; Mantini, Dante
2017-09-01
High-density electroencephalography (hdEEG) is an emerging brain imaging technique that can be used to investigate fast dynamics of electrical activity in the healthy and the diseased human brain. Its applications are however currently limited by a number of methodological issues, among which is the difficulty in obtaining accurate source localizations. In particular, these issues have so far prevented EEG studies from reporting brain networks similar to those previously detected by functional magnetic resonance imaging (fMRI). Here, we report for the first time a robust detection of brain networks from resting state (256-channel) hdEEG recordings. Specifically, we obtained 14 networks previously described in fMRI studies by means of realistic 12-layer head models and exact low-resolution brain electromagnetic tomography (eLORETA) source localization, together with independent component analysis (ICA) for functional connectivity analysis. Our analyses revealed three important methodological aspects. First, brain network reconstruction can be improved by performing source localization using the gray matter as source space, instead of the whole brain. Second, conducting EEG connectivity analyses in individual space rather than on concatenated datasets may be preferable, as it permits incorporating realistic information on head modeling and electrode positioning. Third, the use of a wide frequency band leads to an unbiased and generally accurate reconstruction of several network maps, whereas filtering data in a narrow frequency band may enhance the detection of specific networks and penalize that of others. We hope that our methodological work will contribute to the rise of hdEEG as a powerful tool for brain research. Hum Brain Mapp 38:4631-4643, 2017. © 2017 Wiley Periodicals, Inc.
Social network analysis: Presenting an underused method for nursing research.
Parnell, James Michael; Robinson, Jennifer C
2018-06-01
This paper introduces social network analysis as a versatile method with many applications in nursing research. Social networks have been studied for years in many social science fields. The methods continue to advance but remain unknown to most nursing scholars. Discussion paper. English language and interpreted literature was searched in Ovid Healthstar, CINAHL, PubMed Central, Scopus and hard-copy texts from 1965 - 2017. Social network analysis first emerged in the nursing literature in 1995 and appears minimally through the present day. To convey the versatility and applicability of social network analysis in nursing, hypothetical scenarios are presented. The scenarios are illustrative of three approaches to social network analysis and include key elements of social network research design. The methods of social network analysis are underused in nursing research, primarily because they are unknown to most scholars. However, they offer methodological flexibility and epistemological versatility capable of supporting quantitative and qualitative research. The analytic techniques of social network analysis can add new insight into many areas of nursing inquiry, especially those influenced by cultural norms. Furthermore, visualization techniques associated with social network analysis can be used to generate new hypotheses. Social network analysis can potentially uncover findings not accessible through methods commonly used in nursing research. Social networks can be analysed based on individual-level attributes, whole networks and subgroups within networks. Computations derived from social network analysis may stand alone to answer a research question or be incorporated as variables into robust statistical models. © 2018 John Wiley & Sons Ltd.
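For readers unfamiliar with the kinds of quantities involved, the sketch below computes common individual-level measures (degree, betweenness) and a subgroup partition on a small hypothetical collaboration network; it is illustrative only and not tied to the paper's scenarios.

```python
# Individual-level and subgroup measures on a hypothetical collaboration network.
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

edges = [("Ann", "Beth"), ("Ann", "Carl"), ("Beth", "Carl"), ("Carl", "Dina"),
         ("Dina", "Eve"), ("Dina", "Fay"), ("Eve", "Fay"), ("Fay", "Gil")]
G = nx.Graph(edges)

degree = dict(G.degree())                      # direct ties per person
betweenness = nx.betweenness_centrality(G)     # who bridges otherwise separate groups
subgroups = greedy_modularity_communities(G)   # cohesive subgroups within the unit

print("degree:", degree)
print("most central broker:", max(betweenness, key=betweenness.get))
print("subgroups:", [sorted(c) for c in subgroups])
```

Values such as these can then enter a regression model as predictors, which is the mixed quantitative use described above.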
Goodson, Patricia
2015-01-01
Background. Documented rates of health-related risk behaviors among US adolescents have remained high over time. Studies indicate relationships among mutual friends are a major influence on adolescents' risky behaviors. Social Network Analysis (SNA) can help in understanding the friendship ties affecting individual adolescents' engagement in these behaviors. Moreover, a systematic literature review can synthesize findings from a range of studies using SNA, as well as assess these studies' methodological quality. Review findings also can help health educators and promoters develop more effective programs. Objective. This review systematically examined studies that utilized SNA and the Add Health data (a nationally representative sample) to assess the influence of friendship networks on adolescents' risk behaviors. Methods. We employed the Matrix Method to synthesize and evaluate 15 published studies that met our inclusion and exclusion criteria, retrieved from the Add Health website and 3 major databases (Medline, Eric, and PsycINFO). Moreover, we assigned each study a methodological quality score (MQS). Results. In all studies, friendship networks among adolescents promoted their risky behaviors, including drinking alcohol, smoking, sexual intercourse, and marijuana use. The average MQS was 4.6, an indicator of methodological rigor (scale: 1–9). Conclusion. A better understanding of risky behaviors influenced by friends can be useful for health educators and promoters, as programs targeting friendships might be more effective. Additionally, the overall methodological quality of these reviewed studies was good, as average scores fell above the scale's mid-point. PMID:26157622
Network Data: Statistical Theory and New Models
2016-02-17
During this period of review, Bin Yu worked on many thrusts of high-dimensional statistical theory and methodologies. Her research covered a wide range of topics in statistics including analysis and methods for spectral clustering for sparse and structured networks [2,7,8,21], sparse modeling (e.g. Lasso) [4,10,11,17,18,19], statistical guarantees for the EM algorithm [3], and statistical analysis of algorithm leveraging
Borisjuk, Ljudmilla; Hajirezaei, Mohammad-Reza; Klukas, Christian; Rolletschek, Hardy; Schreiber, Falk
2005-01-01
Modern 'omics'-technologies result in huge amounts of data about life processes. For analysis and data mining purposes this data has to be considered in the context of the underlying biological networks. This work presents an approach for integrating data from biological experiments into metabolic networks by mapping the data onto network elements and visualising the data enriched networks automatically. This methodology is implemented in DBE, an information system that supports the analysis and visualisation of experimental data in the context of metabolic networks. It consists of five parts: (1) the DBE-Database for consistent data storage, (2) the Excel-Importer application for the data import, (3) the DBE-Website as the interface for the system, (4) the DBE-Pictures application for the up- and download of binary (e. g. image) files, and (5) DBE-Gravisto, a network analysis and graph visualisation system. The usability of this approach is demonstrated in two examples.
Kunz, Meik; Dandekar, Thomas; Naseem, Muhammad
2017-01-01
Cytokinins (CKs) play an important role in plant growth and development. Several studies also highlight the modulatory implications of CKs for plant-pathogen interaction. However, the underlying mechanisms by which CKs mediate immune networks in plants are still not fully understood. A detailed analysis of high-throughput transcriptome (RNA-Seq and microarray) datasets under modulated plant CK conditions, and its merger with the cellular interactome (large-scale protein-protein interaction data), has the potential to unlock the contribution of CKs to plant defense. Here, we describe in detail a systems biology methodology pertinent to the acquisition and analysis of various omics datasets that delineate the role of plant CKs in impacting immune pathways in Arabidopsis.
Evaluating multiple determinants of the structure of plant-animal mutualistic networks.
Vázquez, Diego P; Chacoff, Natacha P; Cagnolo, Luciano
2009-08-01
The structure of mutualistic networks is likely to result from the simultaneous influence of neutrality and the constraints imposed by complementarity in species phenotypes, phenologies, spatial distributions, phylogenetic relationships, and sampling artifacts. We develop a conceptual and methodological framework to evaluate the relative contributions of these potential determinants. Applying this approach to the analysis of a plant-pollinator network, we show that information on relative abundance and phenology suffices to predict several aggregate network properties (connectance, nestedness, interaction evenness, and interaction asymmetry). However, such information falls short of predicting the detailed network structure (the frequency of pairwise interactions), leaving a large amount of variation unexplained. Taken together, our results suggest that both relative species abundance and complementarity in spatiotemporal distribution contribute substantially to generating observed network patterns, but that this information is by no means sufficient to predict the occurrence and frequency of pairwise interactions. Future studies could use our methodological framework to evaluate the generality of our findings in a representative sample of study systems with contrasting ecological conditions.
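As a rough illustration of the neutrality component of such a framework, the sketch below draws interaction events with probabilities proportional to the product of species' relative abundances and computes one aggregate property, connectance. All abundances are hypothetical; the published framework additionally incorporates phenology, spatial overlap, phylogeny and sampling effects.

```python
# Hypothetical neutral-style null model: interaction probability proportional to
# the product of the two species' relative abundances (illustration only).
import numpy as np

rng = np.random.default_rng(0)

plant_abundance = np.array([50, 20, 10, 5], dtype=float)        # invented counts
pollinator_abundance = np.array([40, 30, 15, 10, 5], dtype=float)

# Interaction probabilities from relative abundances (normalised outer product).
p = np.outer(plant_abundance / plant_abundance.sum(),
             pollinator_abundance / pollinator_abundance.sum())
p /= p.sum()

# Allocate a fixed number of observed interaction events among species pairs.
n_events = 200
counts = rng.multinomial(n_events, p.ravel()).reshape(p.shape)

# Aggregate property: connectance = realised links / possible links.
connectance = np.count_nonzero(counts) / counts.size
print(f"simulated connectance: {connectance:.2f}")
```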
Development of task network models of human performance in microgravity
NASA Technical Reports Server (NTRS)
Diaz, Manuel F.; Adam, Susan
1992-01-01
This paper discusses the utility of task-network modeling for quantifying human performance variability in microgravity. The data are gathered for: (1) improving current methodologies for assessing human performance and workload in the operational space environment; (2) developing tools for assessing alternative system designs; and (3) developing an integrated set of methodologies for the evaluation of performance degradation during extended duration spaceflight. The evaluation entailed an analysis of the Remote Manipulator System payload-grapple task performed on many shuttle missions. Task-network modeling can be used as a tool for assessing and enhancing human performance in man-machine systems, particularly for modeling long-duration manned spaceflight. Task-network modeling can be directed toward improving system efficiency by increasing the understanding of basic capabilities of the human component in the system and the factors that influence these capabilities.
[Artificial neural networks for decision making in urologic oncology].
Remzi, M; Djavan, B
2007-06-01
This chapter presents a detailed introduction to Artificial Neural Networks (ANNs) and their contribution to modern Urologic Oncology. It includes a description of ANN methodology, points out the differences between Artificial Intelligence and traditional statistical models in terms of usefulness for patients and clinicians, and outlines their advantages over current statistical analysis.
[Social network analysis: a method to improve safety in healthcare organizations].
Marqués Sánchez, Pilar; González Pérez, Marta Eva; Agra Varela, Yolanda; Vega Núñez, Jorge; Pinto Carral, Arrate; Quiroga Sánchez, Enedina
2013-01-01
Patient safety depends on the culture of the healthcare organization, which involves relationships between professionals. This article proposes that the study of these relations should be conducted from a network perspective, using a methodology called Social Network Analysis (SNA). This methodology includes a set of mathematical constructs grounded in Graph Theory. With SNA we can characterize an individual's position in the network (centrality) or the cohesion among team members. SNA thus reveals aspects relevant to safety, such as the kind of links that can increase commitment among professionals, how to build those links, which nodes have more prestige in the team for generating confidence or collaboration, and which professionals serve as intermediaries between the subgroups of a team to transmit information or smooth conflicts. These aspects are useful in establishing a safety culture. SNA can be used to analyze the relations among professionals, the extent to which they communicate errors and spontaneously seek help, and the coordination between departments participating in projects that enhance safety. Professionals thus relate through a network and use the same language, which helps to build a culture. In summary, we propose an approach to safety culture from an SNA perspective that would complement other commonly used methods.
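As a minimal illustration of the measures mentioned above, the following sketch computes density, degree centrality and betweenness for a small, entirely invented consultation network of professionals using the networkx Python library.

```python
# Minimal illustration (hypothetical data): SNA measures for a small network of
# professionals who report consulting one another about safety issues.
import networkx as nx

edges = [("nurse_A", "nurse_B"), ("nurse_B", "physician_C"),
         ("physician_C", "pharmacist_D"), ("nurse_A", "physician_C"),
         ("pharmacist_D", "nurse_E")]
G = nx.Graph(edges)

print("density:", nx.density(G))                       # overall cohesion
print("degree centrality:", nx.degree_centrality(G))   # who is most connected
print("betweenness:", nx.betweenness_centrality(G))    # who brokers between subgroups
```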
The Use of Multi-Criteria Evaluation and Network Analysis in the Area Development Planning Process
2013-03-01
layouts. The alternative layout scoring process, base in multi-criteria evaluation, returns a quantitative score for each alternative layout and a...The purpose of this research was to develop improvements to the area development planning process. These plans are used to improve operations within...an installation sub-section by altering the physical layout of facilities. One methodology was developed based on apply network analysis concepts to
Social Network Analysis for Assessing College-Aged Adults' Health: A Systematic Review.
Patterson, Megan S; Goodson, Patricia
2018-04-13
Social network analysis (SNA) is a useful, emerging method for studying health. College students are especially prone to social influence when it comes to health. This review aimed to identify network variables related to college student health and determine how SNA was used in the literature. A systematic review of relevant literature was conducted in October 2015. Studies employing egocentric or whole network analysis to study college student health were included. We used Garrard's Matrix Method to extract data from reviewed articles (n = 15). Drinking, smoking, aggression, homesickness, and stress were predicted by network variables in the reviewed literature. Methodological inconsistencies concerning boundary specification, data collection, nomination limits, and statistical analyses were revealed across studies. Results show the consistent relationship between network variables and college health outcomes, justifying further use of SNA to research college health. Suggestions and considerations for future use of SNA are provided.
Sparse representation of whole-brain fMRI signals for identification of functional networks.
Lv, Jinglei; Jiang, Xi; Li, Xiang; Zhu, Dajiang; Chen, Hanbo; Zhang, Tuo; Zhang, Shu; Hu, Xintao; Han, Junwei; Huang, Heng; Zhang, Jing; Guo, Lei; Liu, Tianming
2015-02-01
There have been several recent studies that used sparse representation for fMRI signal analysis and activation detection based on the assumption that each voxel's fMRI signal is linearly composed of sparse components. Previous studies have employed sparse coding to model functional networks in various modalities and scales. These prior contributions inspired the exploration of whether/how sparse representation can be used to identify functional networks in a voxel-wise way and on the whole brain scale. This paper presents a novel, alternative methodology of identifying multiple functional networks via sparse representation of whole-brain task-based fMRI signals. Our basic idea is that all fMRI signals within the whole brain of one subject are aggregated into a big data matrix, which is then factorized into an over-complete dictionary basis matrix and a reference weight matrix via an effective online dictionary learning algorithm. Our extensive experimental results have shown that this novel methodology can uncover multiple functional networks that can be well characterized and interpreted in spatial, temporal and frequency domains based on current brain science knowledge. Importantly, these well-characterized functional network components are quite reproducible in different brains. In general, our methods offer a novel, effective and unified solution to multiple fMRI data analysis tasks including activation detection, de-activation detection, and functional network identification. Copyright © 2014 Elsevier B.V. All rights reserved.
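A toy sketch of the factorization idea, not the authors' implementation: simulated voxel signals are stacked into a matrix and decomposed with an online dictionary learner into temporal atoms and sparse loadings that play the role of spatial network maps. The sizes, parameters and use of scikit-learn are assumptions for illustration.

```python
# Toy dictionary-learning sketch for the signal-matrix factorisation idea above.
# Data are simulated; all sizes and parameters are illustrative assumptions.
import numpy as np
from sklearn.decomposition import MiniBatchDictionaryLearning

rng = np.random.default_rng(0)
n_time, n_voxels, n_sources = 120, 500, 5

# Simulate "fMRI": each voxel mixes a few latent temporal sources plus noise.
sources = rng.standard_normal((n_sources, n_time))
mixing = rng.standard_normal((n_voxels, n_sources)) * (rng.random((n_voxels, n_sources)) < 0.2)
signals = mixing @ sources + 0.1 * rng.standard_normal((n_voxels, n_time))

# Learn a temporal dictionary; the sparse codes act as per-voxel spatial loadings.
dico = MiniBatchDictionaryLearning(n_components=10, alpha=1.0, random_state=0)
codes = dico.fit(signals).transform(signals)      # (n_voxels, n_components)
atoms = dico.components_                          # (n_components, n_time)

print("spatial maps:", codes.shape, "temporal atoms:", atoms.shape)
```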
Kemp, Candace L.; Ball, Mary M.; Morgan, Jennifer Craft; Doyle, Patrick J.; Burgess, Elisabeth O.; Dillard, Joy A.; Barmon, Christina E.; Fitzroy, Andrea F.; Helmly, Victoria E.; Avent, Elizabeth S.; Perkins, Molly M.
2018-01-01
In this article, we analyze the research experiences associated with a longitudinal qualitative study of residents’ care networks in assisted living. Using data from researcher meetings, field notes, and memos, we critically examine our design and decision making and accompanying methodological implications. We focus on one complete wave of data collection involving 28 residents and 114 care network members in four diverse settings followed for 2 years. We identify study features that make our research innovative, but that also represent significant challenges. They include the focus and topic; settings and participants; scope and design complexity; nature, modes, frequency, and duration of data collection; and analytic approach. Each feature has methodological implications, including benefits and challenges pertaining to recruitment, retention, data collection, quality, and management, research team work, researcher roles, ethics, and dissemination. Our analysis demonstrates the value of our approach and of reflecting on and sharing methodological processes for cumulative knowledge building. PMID:27651072
Kemp, Candace L; Ball, Mary M; Morgan, Jennifer Craft; Doyle, Patrick J; Burgess, Elisabeth O; Dillard, Joy A; Barmon, Christina E; Fitzroy, Andrea F; Helmly, Victoria E; Avent, Elizabeth S; Perkins, Molly M
2017-07-01
In this article, we analyze the research experiences associated with a longitudinal qualitative study of residents' care networks in assisted living. Using data from researcher meetings, field notes, and memos, we critically examine our design and decision making and accompanying methodological implications. We focus on one complete wave of data collection involving 28 residents and 114 care network members in four diverse settings followed for 2 years. We identify study features that make our research innovative, but that also represent significant challenges. They include the focus and topic; settings and participants; scope and design complexity; nature, modes, frequency, and duration of data collection; and analytic approach. Each feature has methodological implications, including benefits and challenges pertaining to recruitment, retention, data collection, quality, and management, research team work, researcher roles, ethics, and dissemination. Our analysis demonstrates the value of our approach and of reflecting on and sharing methodological processes for cumulative knowledge building.
Social Networks, Engagement and Resilience in University Students.
Fernández-Martínez, Elena; Andina-Díaz, Elena; Fernández-Peña, Rosario; García-López, Rosa; Fulgueiras-Carril, Iván; Liébana-Presa, Cristina
2017-12-01
Analysis of social networks may be a useful tool for understanding the relationship between resilience and engagement, and this could be applied to educational methodologies, not only to improve academic performance, but also to create emotionally sustainable networks. This descriptive study was carried out on 134 university students. We collected the network structural variables, degree of resilience (CD-RISC 10), and engagement (UWES-S). The computer programs used were Excel, UCINET for network analysis, and SPSS for statistical analysis. The analysis revealed mean scores of 28.61 for resilience, 2.98 for absorption, 4.82 for dedication, and 3.13 for vigour. The students had two preferred places for sharing information: the classroom and WhatsApp. The greater the value for engagement, the greater the degree of centrality in the friendship network among students who are beginning their university studies. This relationship becomes reversed as the students move to later academic years. In terms of resilience, the highest values correspond to greater centrality in the friendship networks. The variables of engagement and resilience influenced the university students' support networks.
Social Networks, Engagement and Resilience in University Students
García-López, Rosa; Fulgueiras-Carril, Iván
2017-01-01
Analysis of social networks may be a useful tool for understanding the relationship between resilience and engagement, and this could be applied to educational methodologies, not only to improve academic performance, but also to create emotionally sustainable networks. This descriptive study was carried out on 134 university students. We collected the network structural variables, degree of resilience (CD-RISC 10), and engagement (UWES-S). The computer programs used were Excel, UCINET for network analysis, and SPSS for statistical analysis. The analysis revealed mean scores of 28.61 for resilience, 2.98 for absorption, 4.82 for dedication, and 3.13 for vigour. The students had two preferred places for sharing information: the classroom and WhatsApp. The greater the value for engagement, the greater the degree of centrality in the friendship network among students who are beginning their university studies. This relationship becomes reversed as the students move to later academic years. In terms of resilience, the highest values correspond to greater centrality in the friendship networks. The variables of engagement and resilience influenced the university students’ support networks. PMID:29194361
NASA Astrophysics Data System (ADS)
Mathivanan, N. Rajesh; Mouli, Chandra
2012-12-01
In this work, a new methodology based on artificial neural networks (ANN) has been developed to study the low-velocity impact characteristics of woven glass epoxy laminates of EP3 grade. To train and test the networks, multiple impact cases have been generated using statistical analysis of variance (ANOVA). Experimental tests were performed using an instrumented falling-weight impact-testing machine. Different impact velocities and impact energies on different thicknesses of laminates were considered as the input parameters of the ANN model. This model is a feed-forward back-propagation neural network. Using the input/output data of the experiments, the model was trained and tested. Further, the effects of the low-velocity impact response of the laminates at different energy levels were investigated by studying the cause-effect relationship among the influential factors using response surface methodology. The most significant parameter is determined from the other input variables through ANOVA.
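The following sketch shows the general shape of such a feed-forward back-propagation model using scikit-learn; the input ranges and the response function below are synthetic stand-ins, not the study's measured impact data.

```python
# Illustrative sketch only: a small feed-forward back-propagation network relating
# impact parameters to a response.  The data are synthetic placeholders.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
# Inputs: impact velocity (m/s), impact energy (J), laminate thickness (mm)
X = rng.uniform([1.0, 5.0, 2.0], [5.0, 50.0, 8.0], size=(60, 3))
# Hypothetical response (e.g., peak contact force) with noise, for illustration.
y = 0.8 * X[:, 1] + 2.5 * X[:, 2] - 0.5 * X[:, 0] + rng.normal(0, 1.0, 60)

model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
model.fit(X[:50], y[:50])
print("held-out R^2:", round(model.score(X[50:], y[50:]), 3))
```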
Houston Cole Library Collection Assessment.
ERIC Educational Resources Information Center
Henderson, William Abbot, Ed.; McAbee, Sonja L., Ed.
This document reports on an assessment of the Jacksonville State University's Houston Cole Library collection that employed a variety of methodologies and tools, including list-checking, direct collection examination, shelflist measurement and analysis, WLN (Washington Library Network) conspectus sheets, analysis of OCLC/AMIGOS Collection Analysis…
New York City's Children First Networks: Turning Accountability on Its Head
ERIC Educational Resources Information Center
Wohlstetter, Priscilla; Smith, Joanna; Gallagher, Andrew
2013-01-01
Purpose: The purpose of this paper is to report findings from an exploratory study of New York's Children First Networks (CFNs); to examine what is known about the CFNs thus far, drawing on new empirical research, as well as document review and analysis of secondary sources. Design/methodology/approach: Organizational learning theory guided this…
A Quantitative Methodology for Vetting Dark Network Intelligence Sources for Social Network Analysis
2012-06-01
first algorithm by Erdős and Rényi (Erdős & Rényi, 1959). This earliest algorithm suffers from the fact that its degree distribution is not scale-free. Erdős, P., & Rényi, A. (1959). On random graphs. Publicationes Mathematicae, 6, 290-297.
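The point about degree distributions can be checked quickly. The sketch below (an illustration with assumed parameters, not the report's code) generates an Erdős–Rényi graph with networkx and tabulates its degrees, which cluster around the mean rather than following a heavy-tailed, scale-free law.

```python
# Degree distribution of an Erdős–Rényi G(n, p) graph: approximately Poisson,
# i.e. concentrated around the mean degree n*p rather than scale-free.
import networkx as nx
from collections import Counter

G = nx.gnp_random_graph(n=2000, p=0.005, seed=42)
degree_counts = Counter(dict(G.degree()).values())
print(sorted(degree_counts.items()))   # mass concentrated near n*p = 10
```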
Approaches to Forecasting Demands for Library Network Services. Report No. 10.
ERIC Educational Resources Information Center
Kang, Jong Hoa
The problem of forecasting monthly demands for library network services is considered in terms of using forecasts as inputs to policy analysis models, and in terms of using forecasts to aid in the making of budgeting and staffing decisions. Box-Jenkins time-series methodology, adaptive filtering, and regression approaches are examined and compared…
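A hedged sketch of the Box-Jenkins step using statsmodels and a synthetic monthly series follows; in practice the model order would come from the identification, estimation and diagnostic-checking cycle rather than being fixed in advance.

```python
# ARIMA sketch on a synthetic monthly demand series (illustrative order (1,1,1)).
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)
months = 60
trend = np.linspace(100, 160, months)
demand = trend + 10 * np.sin(2 * np.pi * np.arange(months) / 12) + rng.normal(0, 5, months)

fit = ARIMA(demand, order=(1, 1, 1)).fit()
print(fit.forecast(steps=6))   # next six months of predicted demand
```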
Integrated Evaluation of Reliability and Power Consumption of Wireless Sensor Networks
Dâmaso, Antônio; Maciel, Paulo
2017-01-01
Power consumption is a primary interest in Wireless Sensor Networks (WSNs), and a large number of strategies have been proposed to evaluate it. However, those approaches usually neither consider reliability issues nor the power consumption of applications executing in the network. A central concern is the lack of consolidated solutions that enable us to evaluate the power consumption of applications and the network stack also considering their reliabilities. To solve this problem, we introduce a fully automatic solution to design power consumption aware WSN applications and communication protocols. The solution presented in this paper comprises a methodology to evaluate the power consumption based on the integration of formal models, a set of power consumption and reliability models, a sensitivity analysis strategy to select WSN configurations and a toolbox named EDEN to fully support the proposed methodology. This solution allows accurately estimating the power consumption of WSN applications and the network stack in an automated way. PMID:29113078
Evaluating the Quality of Evidence from a Network Meta-Analysis
Salanti, Georgia; Del Giovane, Cinzia; Chaimani, Anna; Caldwell, Deborah M.; Higgins, Julian P. T.
2014-01-01
Systematic reviews that collate data about the relative effects of multiple interventions via network meta-analysis are highly informative for decision-making purposes. A network meta-analysis provides two types of findings for a specific outcome: the relative treatment effect for all pairwise comparisons, and a ranking of the treatments. It is important to consider the confidence with which these two types of results can enable clinicians, policy makers and patients to make informed decisions. We propose an approach to determining confidence in the output of a network meta-analysis. Our proposed approach is based on methodology developed by the Grading of Recommendations Assessment, Development and Evaluation (GRADE) Working Group for pairwise meta-analyses. The suggested framework for evaluating a network meta-analysis acknowledges (i) the key role of indirect comparisons (ii) the contributions of each piece of direct evidence to the network meta-analysis estimates of effect size; (iii) the importance of the transitivity assumption to the validity of network meta-analysis; and (iv) the possibility of disagreement between direct evidence and indirect evidence. We apply our proposed strategy to a systematic review comparing topical antibiotics without steroids for chronically discharging ears with underlying eardrum perforations. The proposed framework can be used to determine confidence in the results from a network meta-analysis. Judgements about evidence from a network meta-analysis can be different from those made about evidence from pairwise meta-analyses. PMID:24992266
Applications of artificial neural network in AIDS research and therapy.
Sardari, S; Sardari, D
2002-01-01
In recent years considerable effort has been devoted to applying pattern recognition techniques to the complex task of data analysis in drug research. Artificial neural network (ANN) methodology is a modeling method with a great ability to adapt to a new situation, or control an unknown system, using data acquired in previous experiments. In this paper, a brief history of ANN, the basic concepts behind the computing, the mathematical and algorithmic formulation of each of the techniques, and their developmental background are presented. Based on the abilities of ANNs in pattern recognition and estimation of system outputs from known inputs, the neural network can be considered as a tool for molecular data analysis and interpretation. Analysis by neural networks improves classification accuracy and data quantification, and reduces the number of analogues necessary for correct classification of biologically active compounds. Conformational analysis, quantifying the components in mixtures using NMR spectra, aqueous solubility prediction and structure-activity correlation are among the reported applications of ANN as a new modeling method. Ranging from drug design and discovery to structure and dosage form design, the potential pharmaceutical applications of the ANN methodology are significant. In the areas of clinical monitoring, utilization of molecular simulation and design of bioactive structures, ANN would make the study of health and disease status possible and bring the predicted chemotherapeutic response closer to reality.
An extensive assessment of network alignment algorithms for comparison of brain connectomes.
Milano, Marianna; Guzzi, Pietro Hiram; Tymofieva, Olga; Xu, Duan; Hess, Christofer; Veltri, Pierangelo; Cannataro, Mario
2017-06-06
Recently the study of the complex system of connections in neural systems, i.e. the connectome, has gained a central role in neurosciences. The modeling and analysis of connectomes are therefore a growing area. Here we focus on the representation of connectomes by using graph theory formalisms. Macroscopic human brain connectomes are usually derived from neuroimages; the analyzed brains are co-registered in the image domain and brought to a common anatomical space. An atlas is then applied in order to define anatomically meaningful regions that will serve as the nodes of the network - this process is referred to as parcellation. The atlas-based parcellations present some known limitations in cases of early brain development and abnormal anatomy. Consequently, it has been recently proposed to perform atlas-free random brain parcellation into nodes and align brains in the network space instead of the anatomical image space, as a way to deal with the unknown correspondences of the parcels. Such a process requires modeling of the brain using graph theory and the subsequent comparison of the structure of graphs. The latter step may be modeled as a network alignment (NA) problem. In this work, we first define the problem formally, then we test six existing state-of-the-art network aligners on diffusion MRI-derived brain networks. We compare the performances of the algorithms by assessing six topological measures. We also evaluated the robustness of the algorithms to alterations of the dataset. The results confirm that NA algorithms may be applied in cases of atlas-free parcellation for a fully network-driven comparison of connectomes. The analysis shows that MAGNA++ is the best global alignment algorithm. The paper presents a new analysis methodology that uses network alignment for validating atlas-free parcellation of brain connectomes. The methodology has been tested on several brain datasets.
A user exposure based approach for non-structural road network vulnerability analysis
Jin, Lei; Wang, Haizhong; Yu, Le; Liu, Lin
2017-01-01
Focusing on the vulnerability of dense urban road networks in the absence of structural damage, this paper proposes a novel non-structural road network vulnerability analysis framework. Three aspects of the framework are described: (i) the rationale for non-structural road network vulnerability, (ii) metrics for negative consequences that account for varying road conditions, and (iii) a new vulnerability index based on user exposure. Based on the proposed methodology, a case study of the Sioux Falls network, which is regularly threatened by heavy snow during wintertime, is discussed in detail. The vulnerability ranking of links in the Sioux Falls network with respect to the heavy snow scenario is identified. As a result of non-structural consequences accompanied by conceivable degeneration of the network, there are significant increases in generalized travel time costs, which serve as measures of the “emotional hurt” to the topological road network. PMID:29176832
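A simplified sketch of the consequence metric, using networkx on an invented toy network rather than the paper's Sioux Falls data: each link is degraded in turn and the resulting increase in generalized travel cost for an origin-destination pair is recorded.

```python
# Travel-time consequence of degrading each link (hypothetical network/weights).
import networkx as nx

G = nx.Graph()
G.add_weighted_edges_from([("A", "B", 5), ("B", "C", 4), ("A", "D", 7),
                           ("D", "C", 6), ("B", "D", 3)], weight="time")

def od_cost(graph, origin="A", destination="C"):
    return nx.shortest_path_length(graph, origin, destination, weight="time")

base = od_cost(G)
for u, v in list(G.edges()):
    H = G.copy()
    H[u][v]["time"] *= 3          # degraded (e.g., snow-affected) link, assumed factor
    print(f"link {u}-{v}: cost {base} -> {od_cost(H)}")
```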
Detection of white matter lesion regions in MRI using SLIC0 and convolutional neural network.
Diniz, Pedro Henrique Bandeira; Valente, Thales Levi Azevedo; Diniz, João Otávio Bandeira; Silva, Aristófanes Corrêa; Gattass, Marcelo; Ventura, Nina; Muniz, Bernardo Carvalho; Gasparetto, Emerson Leandro
2018-04-19
White matter lesions are non-static brain lesions that have a prevalence rate of up to 98% in the elderly population. Because they may be associated with several brain diseases, it is important that they are detected as soon as possible. Magnetic Resonance Imaging (MRI) provides three-dimensional data with the possibility to detect and emphasize contrast differences in soft tissues, providing rich information about the human soft tissue anatomy. However, the amount of data provided by these images is far too much for manual analysis/interpretation, representing a difficult and time-consuming task for specialists. This work presents a computational methodology capable of detecting regions of white matter lesions of the brain in MRI of FLAIR modality. The techniques highlighted in this methodology are SLIC0 clustering for candidate segmentation and convolutional neural networks for candidate classification. The methodology proposed here consists of four steps: (1) image acquisition, (2) image preprocessing, (3) candidate segmentation and (4) candidate classification. The methodology was applied on 91 magnetic resonance images provided by DASA, and achieved an accuracy of 98.73%, specificity of 98.77% and sensitivity of 78.79% with 0.005 false positives, without any false positive reduction technique, in detection of white matter lesion regions. These results demonstrate the feasibility of using SLIC0 and convolutional neural network techniques to detect white matter lesion regions in brain MRI. Copyright © 2018. Published by Elsevier B.V.
Methodology to model the energy and greenhouse gas emissions of electronic software distributions.
Williams, Daniel R; Tang, Yinshan
2012-01-17
A new electronic software distribution (ESD) life cycle analysis (LCA) methodology and model structure were constructed to calculate energy consumption and greenhouse gas (GHG) emissions. In order to counteract the use of high level, top-down modeling efforts, and to increase result accuracy, a focus upon device details and data routes was taken. In order to compare ESD to a relevant physical distribution alternative, physical model boundaries and variables were described. The methodology was compiled from the analysis and operational data of a major online store which provides ESD and physical distribution options. The ESD method included the calculation of power consumption of data center server and networking devices. An in-depth method to calculate server efficiency and utilization was also included to account for virtualization and server efficiency features. Internet transfer power consumption was analyzed taking into account the number of data hops and networking devices used. The power consumed by online browsing and downloading was also factored into the model. The embedded CO(2)e of server and networking devices was proportioned to each ESD process. Three U.K.-based ESD scenarios were analyzed using the model which revealed potential CO(2)e savings of 83% when ESD was used over physical distribution. Results also highlighted the importance of server efficiency and utilization methods.
Detecting Disease Specific Pathway Substructures through an Integrated Systems Biology Approach
Alaimo, Salvatore; Marceca, Gioacchino Paolo; Ferro, Alfredo; Pulvirenti, Alfredo
2017-01-01
In the era of network medicine, pathway analysis methods play a central role in the prediction of phenotype from high throughput experiments. In this paper, we present a network-based systems biology approach capable of extracting disease-perturbed subpathways within pathway networks in connection with expression data taken from The Cancer Genome Atlas (TCGA). Our system extends pathways with missing regulatory elements, such as microRNAs, and their interactions with genes. The framework enables the extraction, visualization, and analysis of statistically significant disease-specific subpathways through an easy to use web interface. Our analysis shows that the methodology is able to fill the gap in current techniques, allowing a more comprehensive analysis of the phenomena underlying disease states. PMID:29657291
Analysis of optimal phenotypic space using elementary modes as applied to Corynebacterium glutamicum
Gayen, Kalyan; Venkatesh, KV
2006-01-01
Background Quantification of the metabolic network of an organism offers insights into possible ways of developing mutant strains for better productivity of an extracellular metabolite. The first step in this quantification is the enumeration of the stoichiometries of all reactions occurring in a metabolic network. The structural details of the network in combination with experimentally observed accumulation rates of external metabolites can yield the flux distribution at steady state. One such methodology for quantification is the use of elementary modes, which are minimal sets of enzymes connecting external metabolites. Here, we have used a linear objective function subject to elementary modes as constraints to determine the fluxes in the metabolic network of Corynebacterium glutamicum. The feasible phenotypic space was evaluated at various combinations of oxygen and ammonia uptake rates. Results Quantification of the fluxes of the elementary modes in the metabolism of C. glutamicum was formulated as a linear programming problem. The analysis demonstrated that the solution was dependent on the criteria of the objective function when fewer than four accumulation rates of the external metabolites were considered. The analysis yielded feasible ranges of fluxes of elementary modes that satisfy the experimental accumulation rates. In C. glutamicum, the elementary modes relating to biomass synthesis through glycolysis and the TCA cycle were predominantly operational in the initial growth phase. At a later time, the elementary modes contributing to lysine synthesis became active. The oxygen and ammonia uptake rates were shown to be bounded in the phenotypic space due to the stoichiometric constraints of the elementary modes. Conclusion We have demonstrated the use of elementary modes and linear programming to quantify a metabolic network. We have used the methodology to quantify the network of C. glutamicum, which evaluates the set of operational elementary modes at different phases of fermentation. The methodology was also used to determine the feasible solution space for a given set of substrate uptake rates under specific optimization criteria. Such an approach can be used to determine the optimality of the accumulation rates of any metabolite in a given network. PMID:17038164
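A toy version of the linear-programming formulation is sketched below with SciPy; the elementary-mode yield matrix, measured rates and objective are hypothetical placeholders for the full C. glutamicum network.

```python
# Toy LP over elementary-mode fluxes (all yields/rates hypothetical): non-negative
# mode fluxes must reproduce measured external accumulation rates; total mode
# usage is minimised as one possible objective.
import numpy as np
from scipy.optimize import linprog

# Rows: external metabolites (glucose, lysine); columns: elementary modes.
E = np.array([[-1.0, -1.0, -1.0],    # glucose consumed per unit flux of each mode
              [ 0.0,  0.6,  0.3]])   # lysine produced per unit flux of each mode
measured = np.array([-2.0, 0.9])     # observed rates (hypothetical units)

res = linprog(c=np.ones(3),          # minimise total mode usage
              A_eq=E, b_eq=measured,
              bounds=[(0, None)] * 3, method="highs")
print("mode fluxes:", res.x)
```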
Quantitative Analysis of the Interdisciplinarity of Applied Mathematics.
Xie, Zheng; Duan, Xiaojun; Ouyang, Zhenzheng; Zhang, Pengyuan
2015-01-01
The increasing use of mathematical techniques in scientific research leads to the interdisciplinarity of applied mathematics. This viewpoint is validated quantitatively here by statistical and network analysis on the corpus PNAS 1999-2013. A network describing the interdisciplinary relationships between disciplines in a panoramic view is built based on the corpus. Specific network indicators show the hub role of applied mathematics in interdisciplinary research. The statistical analysis on the corpus content finds that algorithms, a primary topic of applied mathematics, positively correlates, increasingly co-occurs, and has an equilibrium relationship in the long-run with certain typical research paradigms and methodologies. The finding can be understood as an intrinsic cause of the interdisciplinarity of applied mathematics.
Sample size and power considerations in network meta-analysis
2012-01-01
Background Network meta-analysis is becoming increasingly popular for establishing comparative effectiveness among multiple interventions for the same disease. Network meta-analysis inherits all methodological challenges of standard pairwise meta-analysis, but with increased complexity due to the multitude of intervention comparisons. One issue that is now widely recognized in pairwise meta-analysis is the issue of sample size and statistical power. This issue, however, has so far only received little attention in network meta-analysis. To date, no approaches have been proposed for evaluating the adequacy of the sample size, and thus power, in a treatment network. Findings In this article, we develop easy-to-use flexible methods for estimating the ‘effective sample size’ in indirect comparison meta-analysis and network meta-analysis. The effective sample size for a particular treatment comparison can be interpreted as the number of patients in a pairwise meta-analysis that would provide the same degree and strength of evidence as that which is provided in the indirect comparison or network meta-analysis. We further develop methods for retrospectively estimating the statistical power for each comparison in a network meta-analysis. We illustrate the performance of the proposed methods for estimating effective sample size and statistical power using data from a network meta-analysis on interventions for smoking cessation including over 100 trials. Conclusion The proposed methods are easy to use and will be of high value to regulatory agencies and decision makers who must assess the strength of the evidence supporting comparative effectiveness estimates. PMID:22992327
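One way to see where such an 'effective sample size' can come from (an illustrative derivation, not necessarily the authors' exact formula) is to assume the variance of an estimate scales as 1/n, so an indirect A-versus-B comparison built from A-versus-C and B-versus-C evidence has variance roughly 1/n_AC + 1/n_BC.

```python
# Back-of-the-envelope effective sample size for an indirect comparison,
# under the assumption that variance is inversely proportional to sample size:
#   1/n_eff = 1/n_AC + 1/n_BC   =>   n_eff = n_AC * n_BC / (n_AC + n_BC)
def effective_sample_size(n_ac: float, n_bc: float) -> float:
    return n_ac * n_bc / (n_ac + n_bc)

print(effective_sample_size(400, 600))   # e.g. 240 'effective' patients
```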
Veilleux, Andrea G.; Stedinger, Jery R.; Eash, David A.
2012-01-01
This paper summarizes methodological advances in regional log-space skewness analyses that support flood-frequency analysis with the log Pearson Type III (LP3) distribution. A Bayesian Weighted Least Squares/Generalized Least Squares (B-WLS/B-GLS) methodology that relates observed skewness coefficient estimators to basin characteristics in conjunction with diagnostic statistics represents an extension of the previously developed B-GLS methodology. B-WLS/B-GLS has been shown to be effective in two California studies. B-WLS/B-GLS uses B-WLS to generate stable estimators of model parameters and B-GLS to estimate the precision of those B-WLS regression parameters, as well as the precision of the model. The study described here employs this methodology to develop a regional skewness model for the State of Iowa. To provide cost effective peak-flow data for smaller drainage basins in Iowa, the U.S. Geological Survey operates a large network of crest stage gages (CSGs) that only record flow values above an identified recording threshold (thus producing a censored data record). CSGs are different from continuous-record gages, which record almost all flow values and have been used in previous B-GLS and B-WLS/B-GLS regional skewness studies. The complexity of analyzing a large CSG network is addressed by using the B-WLS/B-GLS framework along with the Expected Moments Algorithm (EMA). Because EMA allows for the censoring of low outliers, as well as the use of estimated interval discharges for missing, censored, and historic data, it complicates the calculations of effective record length (and effective concurrent record length) used to describe the precision of sample estimators because the peak discharges are no longer solely represented by single values. Thus new record length calculations were developed. The regional skewness analysis for the State of Iowa illustrates the value of the new B-WLS/BGLS methodology with these new extensions.
Biederman, J; Hammerness, P; Sadeh, B; Peremen, Z; Amit, A; Or-Ly, H; Stern, Y; Reches, A; Geva, A; Faraone, S V
2017-05-01
A previous small study suggested that Brain Network Activation (BNA), a novel ERP-based brain network analysis, may have diagnostic utility in attention deficit hyperactivity disorder (ADHD). In this study we examined the diagnostic capability of a new advanced version of the BNA methodology on a larger population of adults with and without ADHD. Subjects were unmedicated right-handed 18- to 55-year-old adults of both sexes with and without a DSM-IV diagnosis of ADHD. We collected EEG while the subjects were performing a response inhibition task (Go/NoGo) and then applied a spatio-temporal Brain Network Activation (BNA) analysis of the EEG data. This analysis produced a display of qualitative measures of brain states (BNA scores) providing information on cortical connectivity. This complex set of scores was then fed into a machine learning algorithm. The BNA analysis of the EEG data recorded during the Go/NoGo task demonstrated a high discriminative capacity between ADHD patients and controls (AUC = 0.92, specificity = 0.95, sensitivity = 0.86 for the Go condition; AUC = 0.84, specificity = 0.91, sensitivity = 0.76 for the NoGo condition). BNA methodology can help differentiate between ADHD and healthy controls based on functional brain connectivity. The data support the utility of the tool to augment clinical examinations by objective evaluation of electrophysiological changes associated with ADHD. Results also support a network-based approach to the study of ADHD.
Building the Material Flow Networks of Aluminum in the 2007 U.S. Economy.
Chen, Wei-Qiang; Graedel, T E; Nuss, Philip; Ohno, Hajime
2016-04-05
Based on the combination of the U.S. economic input-output table and the stocks and flows framework for characterizing anthropogenic metal cycles, this study presents a methodology for building material flow networks of bulk metals in the U.S. economy and applies it to aluminum. The results, which we term the Input-Output Material Flow Networks (IO-MFNs), achieve a complete picture of aluminum flow in the entire U.S. economy and for any chosen industrial sector (illustrated for the Automobile Manufacturing sector). The results are compared with information from our former study on U.S. aluminum stocks and flows to demonstrate the robustness and value of this new methodology. We find that the IO-MFN approach has the following advantages: (1) it helps to uncover the network of material flows in the manufacturing stage in the life cycle of metals; (2) it provides a method that may be less time-consuming but more complete and accurate in estimating new scrap generation, process loss, domestic final demand, and trade of final products of metals, than existing material flow analysis approaches; and, most importantly, (3) it enables the analysis of the material flows of metals in the U.S. economy from a network perspective, rather than merely that of a life cycle chain.
NASA Technical Reports Server (NTRS)
Johnston, William; Tierney, Brian; Lee, Jason; Hoo, Gary; Thompson, Mary
1996-01-01
We have developed and deployed a distributed-parallel storage system (DPSS) in several high-speed asynchronous transfer mode (ATM) wide area network (WAN) testbeds to support several different types of data-intensive applications. Architecturally, the DPSS is a network striped disk array, but is fairly unique in that its implementation allows applications complete freedom to determine optimal data layout, replication and/or coding redundancy strategy, security policy, and dynamic reconfiguration. In conjunction with the DPSS, we have developed a 'top-to-bottom, end-to-end' performance monitoring and analysis methodology that has allowed us to characterize all aspects of the DPSS operating in high-speed ATM networks. In particular, we have run a variety of performance monitoring experiments involving the DPSS in the MAGIC testbed, which is a large-scale, high-speed ATM network, and we describe our experience using the monitoring methodology to identify and correct problems that limit the performance of high-speed distributed applications. Finally, the DPSS is part of an overall architecture for using high-speed WANs to enable the routine, location-independent use of large data-objects. Since this is part of the motivation for a distributed storage system, we describe this architecture.
ERIC Educational Resources Information Center
Brewe, Eric; Bruun, Jesper; Bearden, Ian G.
2016-01-01
We describe "Module Analysis for Multiple Choice Responses" (MAMCR), a new methodology for carrying out network analysis on responses to multiple choice assessments. This method is used to identify modules of non-normative responses which can then be interpreted as an alternative to factor analysis. MAMCR allows us to identify conceptual…
Eilam, David; Portugali, Juval; Blumenfeld-Lieberthal, Efrat
2012-01-01
Background We set out to solve two inherent problems in the study of animal spatial cognition (i) What is a “place”?; and (ii) whether behaviors that are not revealed as differing by one methodology could be revealed as different when analyzed using a different approach. Methodology We applied network analysis to scrutinize spatial behavior of rats tested in either a symmetrical or asymmetrical layout of 4, 8, or 12 objects placed along the perimeter of a round arena. We considered locations as the units of the network (nodes), and passes between locations as the links within the network. Principal Findings While there were only minor activity differences between rats tested in the symmetrical or asymmetrical object layouts, network analysis revealed substantial differences. Viewing ‘location’ as a cluster of stopping coordinates, the key locations (large clusters of stopping coordinates) were at the objects in both layouts with 4 objects. However, in the asymmetrical layout with 4 objects, additional key locations were spaced by the rats between the objects, forming symmetry among the key locations. It was as if the rats had behaviorally imposed symmetry on the physically asymmetrical environment. Based on a previous finding that wayfinding is easier in symmetrical environments, we suggest that when the physical attributes of the environment were not symmetrical, the rats established a symmetric layout of key locations, thereby acquiring a more legible environment despite its complex physical structure. Conclusions and Significance The present study adds a behavioral definition for “location”, a term that so far has been mostly discussed according to its physical attributes or neurobiological correlates (e.g. - place and grid neurons). Moreover, network analysis enabled the assessment of the importance of a location, even when that location did not display any distinctive physical properties. PMID:22815808
A Collaborative Learning Network Approach to Improvement: The CUSP Learning Network.
Weaver, Sallie J; Lofthus, Jennifer; Sawyer, Melinda; Greer, Lee; Opett, Kristin; Reynolds, Catherine; Wyskiel, Rhonda; Peditto, Stephanie; Pronovost, Peter J
2015-04-01
Collaborative improvement networks draw on the science of collaborative organizational learning and communities of practice to facilitate peer-to-peer learning, coaching, and local adaption. Although significant improvements in patient safety and quality have been achieved through collaborative methods, insight regarding how collaborative networks are used by members is needed. Improvement Strategy: The Comprehensive Unit-based Safety Program (CUSP) Learning Network is a multi-institutional collaborative network that is designed to facilitate peer-to-peer learning and coaching specifically related to CUSP. Member organizations implement all or part of the CUSP methodology to improve organizational safety culture, patient safety, and care quality. Qualitative case studies developed by participating members examine the impact of network participation across three levels of analysis (unit, hospital, health system). In addition, results of a satisfaction survey designed to evaluate member experiences were collected to inform network development. Common themes across case studies suggest that members found value in collaborative learning and sharing strategies across organizational boundaries related to a specific improvement strategy. The CUSP Learning Network is an example of network-based collaborative learning in action. Although this learning network focuses on a particular improvement methodology-CUSP-there is clear potential for member-driven learning networks to grow around other methods or topic areas. Such collaborative learning networks may offer a way to develop an infrastructure for longer-term support of improvement efforts and to more quickly diffuse creative sustainment strategies.
[Study on the automatic parameters identification of water pipe network model].
Jia, Hai-Feng; Zhao, Qi-Feng
2010-01-01
Based on an analysis of the problems in developing and applying water pipe network models, automatic identification of model parameters is regarded as a key bottleneck for model application in water supply enterprises. A methodology for automatic identification of water pipe network model parameters based on GIS and SCADA databases is proposed. The core algorithm is then studied: RSA (Regionalized Sensitivity Analysis) is used for automatic recognition of sensitive parameters, MCS (Monte-Carlo Sampling) is used for automatic identification of parameters, and a detailed technical route based on RSA and MCS is presented. A module for automatic identification of water pipe network model parameters is developed. Finally, taking a typical water pipe network as a case, a case study on automatic parameter identification is conducted and satisfactory results are achieved.
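A heavily simplified sketch of the RSA-plus-MCS idea follows; the "hydraulic model" is a one-line placeholder and all numbers are invented, whereas the real module works against GIS and SCADA data.

```python
# Sample candidate roughness values, keep those that reproduce an observed
# pressure (behavioural set), and compare kept/rejected distributions to judge
# parameter sensitivity (Kolmogorov-Smirnov distance).  Everything is illustrative.
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(0)

def simulated_pressure(roughness):                 # placeholder for a hydraulic solver
    return 50.0 - 8.0 * (roughness - 0.01) / 0.01 + rng.normal(0, 0.5)

observed = 46.0
samples = rng.uniform(0.008, 0.020, 2000)          # candidate roughness values
errors = np.array([abs(simulated_pressure(r) - observed) for r in samples])
behavioural = samples[errors < 1.0]                # acceptable fits
non_behavioural = samples[errors >= 1.0]

stat, _ = ks_2samp(behavioural, non_behavioural)
print(f"accepted {behavioural.size} samples; KS distance = {stat:.2f} (sensitive if large)")
print("identified roughness ~", behavioural.mean())
```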
Generalization of Clustering Coefficients to Signed Correlation Networks
Costantini, Giulio; Perugini, Marco
2014-01-01
The recent interest in network analysis applications in personality psychology and psychopathology has put forward new methodological challenges. Personality and psychopathology networks are typically based on correlation matrices and therefore include both positive and negative edge signs. However, some applications of network analysis disregard negative edges, such as computing clustering coefficients. In this contribution, we illustrate the importance of the distinction between positive and negative edges in networks based on correlation matrices. The clustering coefficient is generalized to signed correlation networks: three new indices are introduced that take edge signs into account, each derived from an existing and widely used formula. The performances of the new indices are illustrated and compared with the performances of the unsigned indices, both on a signed simulated network and on a signed network based on actual personality psychology data. The results show that the new indices are more resistant to sample variations in correlation networks and therefore have higher convergence compared with the unsigned indices both in simulated networks and with real data. PMID:24586367
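The sketch below implements one plausible signed generalisation in the spirit of Zhang and Horvath's weighted clustering coefficient (signed products of triangle weights over an absolute-value normalisation); the exact normalisation used in the article may differ.

```python
# One possible signed weighted clustering coefficient (illustrative variant).
import numpy as np

def signed_clustering(W):
    """W: symmetric signed weight matrix (e.g., a correlation matrix), zero diagonal."""
    n = W.shape[0]
    C = np.zeros(n)
    for i in range(n):
        w = W[i]
        numerator = 0.0
        for j in range(n):
            for k in range(n):
                if len({i, j, k}) == 3:
                    numerator += w[j] * W[j, k] * w[k]   # signed triangle products
        denominator = np.sum(np.abs(w)) ** 2 - np.sum(w ** 2)
        C[i] = numerator / denominator if denominator > 0 else 0.0
    return C

W = np.array([[ 0.0, 0.5,  0.4, -0.3],
              [ 0.5, 0.0,  0.6,  0.1],
              [ 0.4, 0.6,  0.0, -0.2],
              [-0.3, 0.1, -0.2,  0.0]])
print(signed_clustering(W))   # values near +1: consistently signed neighbourhoods
```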
Mobile Applications and 4G Wireless Networks: A Framework for Analysis
ERIC Educational Resources Information Center
Yang, Samuel C.
2012-01-01
Purpose: The use of mobile wireless data services continues to increase worldwide. New fourth-generation (4G) wireless networks can deliver data rates exceeding 2 Mbps. The purpose of this paper is to develop a framework of 4G mobile applications that utilize such high data rates and run on small form-factor devices. Design/methodology/approach:…
ERIC Educational Resources Information Center
Weyori, Alirah Emmanuel; Amare, Mulubrhan; Garming, Hildegard; Waibel, Hermann
2018-01-01
Purpose: We assess farm technology adoption in an integrated analysis of social networks and innovation in plantain production in Ghana. The paper explores the strength of social networks in the agricultural innovation systems (AISs) and the effect of AISs on adoption of improved farm technology. Methodology/Approach: The paper uses social network…
Flamm, Christoph; Graef, Andreas; Pirker, Susanne; Baumgartner, Christoph; Deistler, Manfred
2013-01-01
Granger causality is a useful concept for studying causal relations in networks. However, numerical problems occur when applying the corresponding methodology to high-dimensional time series showing co-movement, e.g. EEG recordings or economic data. In order to deal with these shortcomings, we propose a novel method for the causal analysis of such multivariate time series based on Granger causality and factor models. We present the theoretical background, successfully assess our methodology with the help of simulated data and show a potential application in EEG analysis of epileptic seizures. PMID:23354014
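For orientation, a standard pairwise Granger test on simulated data is sketched below with statsmodels; the factor-model extension for high-dimensional, co-moving series that the paper proposes is not reproduced here.

```python
# Pairwise Granger causality test on simulated data: y is driven by lagged x.
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(0)
n = 500
x = rng.standard_normal(n)
y = np.zeros(n)
for t in range(2, n):
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + 0.3 * rng.standard_normal()

data = np.column_stack([y, x])       # second column tested as a cause of the first
results = grangercausalitytests(data, maxlag=2)   # F-tests per lag; small p-values indicate x -> y
```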
ERIC Educational Resources Information Center
2003
The Communication Theory & Methodology Division of the proceedings contains the following 14 papers: "Interaction As a Unit of Analysis for Interactive Media Research: A Conceptualization" (Joo-Hyun Lee and Hairong Li); "Towards a Network Approach of Human Action: Theoretical Concepts and Empirical Observations in Media…
Design and architecture of the Mars relay network planning and analysis framework
NASA Technical Reports Server (NTRS)
Cheung, K. M.; Lee, C. H.
2002-01-01
In this paper we describe the design and architecture of the Mars Network planning and analysis framework that supports generation and validation of efficient planning and scheduling strategies. The goals are to minimize transmission time, minimize delay time, and/or maximize network throughput. The proposed framework would require (1) a client-server architecture to support interactive, batch, WEB, and distributed analysis and planning applications for the relay network analysis scheme, (2) a high-fidelity modeling and simulation environment that expresses link capabilities between spacecraft and between spacecraft and Earth stations as time-varying resources, and spacecraft activities, link priority, Solar System dynamic events, the laws of orbital mechanics, and other limiting factors such as spacecraft power and thermal constraints, and (3) an optimization methodology that casts the resource and constraint models into a standard linear and nonlinear constrained optimization problem that lends itself to commercial off-the-shelf (COTS) planning and scheduling algorithms.
Making Supply Chains Resilient to Floods Using a Bayesian Network
NASA Astrophysics Data System (ADS)
Haraguchi, M.
2015-12-01
Natural hazards distress the global economy by disrupting the interconnected supply chain networks. Manufacturing companies have created cost-efficient supply chains by reducing inventories, streamlining logistics and limiting the number of suppliers. As a result, today's supply chains are profoundly susceptible to systemic risks. In Thailand, for example, the GDP growth rate declined by 76% in 2011 due to prolonged flooding. Thailand incurred economic damage including the loss of USD 46.5 billion, approximately 70% of which was caused by major supply chain disruptions in the manufacturing sector. Similar problems occurred after the Great East Japan Earthquake and Tsunami in 2011, the Mississippi River floods and droughts during 2011 - 2013, and Hurricane Sandy in 2012. This study proposes a methodology for modeling supply chain disruptions using a Bayesian network analysis (BNA) to estimate the expected values of countermeasures against floods, such as inventory management, supplier management and hard infrastructure management. We first performed a spatio-temporal correlation analysis between floods and extreme precipitation data for the last 100 years at a global scale. Then we used a BNA to create synthetic networks that include variables associated with the magnitude and duration of floods, major components of supply chains and market demands. We also included decision variables for countermeasures that would mitigate potential losses caused by supply chain disruptions. Finally, we conducted a cost-benefit analysis by estimating the expected values of these potential countermeasures while conducting a sensitivity analysis. The methodology was applied to the supply chain disruptions caused by the 2011 Thailand floods. Our study demonstrates the typical data requirements for such an analysis, including anonymized supplier network data (i.e., critical dependencies and vulnerability information of suppliers), sourcing data (i.e., locations of suppliers, and production rates and volumes), and data from previous experiences (i.e., companies' risk mitigation strategy decisions).
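A deliberately simple numerical sketch of the expected-value comparison follows; all probabilities, losses and the countermeasure cost are invented, whereas the study derives them from a Bayesian network over flood magnitude and duration, suppliers and demand.

```python
# Expected net benefit of a countermeasure under hypothetical flood probabilities.
flood_probs = {"none": 0.80, "moderate": 0.15, "severe": 0.05}

# Expected loss (USD millions) of a supply chain disruption, by flood severity,
# without and with a countermeasure (e.g., extra inventory at a second supplier).
loss_baseline = {"none": 0.0, "moderate": 40.0, "severe": 300.0}
loss_with_measure = {"none": 0.0, "moderate": 10.0, "severe": 120.0}
countermeasure_cost = 8.0

expected = lambda loss: sum(flood_probs[s] * loss[s] for s in flood_probs)
benefit = expected(loss_baseline) - expected(loss_with_measure) - countermeasure_cost
print(f"expected net benefit of the countermeasure: {benefit:.1f} M USD")
```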
Graph analysis of functional brain networks: practical issues in translational neuroscience
De Vico Fallani, Fabrizio; Richiardi, Jonas; Chavez, Mario; Achard, Sophie
2014-01-01
The brain can be regarded as a network: a connected system where nodes, or units, represent different specialized regions and links, or connections, represent communication pathways. From a functional perspective, communication is coded by temporal dependence between the activities of different brain areas. In the last decade, the abstract representation of the brain as a graph has made it possible to visualize functional brain networks and describe their non-trivial topological properties in a compact and objective way. Nowadays, the use of graph analysis in translational neuroscience has become essential to quantify brain dysfunctions in terms of aberrant reconfiguration of functional brain networks. Despite its evident impact, graph analysis of functional brain networks is not a simple toolbox that can be blindly applied to brain signals. On the one hand, it requires the know-how of all the methodological steps of the pipeline that manipulate the input brain signals and extract the functional network properties. On the other hand, knowledge of the neural phenomenon under study is required to perform physiologically relevant analysis. The aim of this review is to provide practical indications to make sense of brain network analysis and contrast counterproductive attitudes. PMID:25180301
Fortuna, Cinira Magali; Mesquita, Luana Pinho de; Matumoto, Silvia; Monceau, Gilles
2016-09-19
This qualitative study uses institutional analysis as its theoretical and methodological framework, with the objective of analyzing the researchers' implication during a research-intervention and the interferences produced by this analysis. The study involved researchers from courses in medicine, nursing, and dentistry at two universities and workers from a Regional Health Department in follow-up on the implementation of the Stork Network in São Paulo State, Brazil. The researchers worked together in the intervention and in analysis workshops, supported by an external institutional analysis. Two institutions stood out in the analysis: research, established mainly with characteristics of neutrality, and management, with Taylorist characteristics. Differences between researchers and difficulties in identifying actions proper to network management and research were some of the interferences identified. The study concludes that implication analysis is a powerful tool for such studies.
Comparison analysis on vulnerability of metro networks based on complex network
NASA Astrophysics Data System (ADS)
Zhang, Jianhua; Wang, Shuliang; Wang, Xiaoyuan
2018-04-01
This paper analyzes the networked characteristics of three metro networks, and two malicious attacks are employed to investigate their vulnerability in terms of connectivity vulnerability and functionality vulnerability. The networked characteristics and vulnerability of the three metro networks are also compared with each other. The results show that the Shanghai metro network has the largest transport capacity, the Beijing metro network has the best local connectivity, and the Guangzhou metro network has the best global connectivity; moreover, the Beijing metro network has the most homogeneous degree distribution. Furthermore, we find that metro networks are very vulnerable to malicious attacks, and the Guangzhou metro network has the best topological structure and reliability among the three metro networks. The results indicate that the proposed methodology is feasible and effective for investigating the vulnerability of metro networks and exploring better topological structures.
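The kind of connectivity-vulnerability experiment described above can be sketched as follows with networkx; a random scale-free toy graph stands in for the metro topologies, and stations are removed in order of degree while the largest connected component is tracked.

```python
# Targeted-attack simulation on a stand-in topology: remove highest-degree nodes
# and track the fraction of nodes remaining in the largest connected component.
import networkx as nx

G = nx.barabasi_albert_graph(n=200, m=2, seed=1)    # toy stand-in for a metro network
frac = []
H = G.copy()
for _ in range(20):
    target = max(H.degree, key=lambda kv: kv[1])[0]  # highest-degree node
    H.remove_node(target)
    giant = max(nx.connected_components(H), key=len)
    frac.append(len(giant) / G.number_of_nodes())

print([round(f, 2) for f in frac])   # rapid decay indicates high vulnerability
```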
Robust nonlinear variable selective control for networked systems
NASA Astrophysics Data System (ADS)
Rahmani, Behrooz
2016-10-01
This paper is concerned with the networked control of a class of uncertain nonlinear systems. To this end, Takagi-Sugeno (T-S) fuzzy modelling is used to extend the previously proposed variable selective control (VSC) methodology to nonlinear systems. This extension is based upon the decomposition of the nonlinear system into a set of fuzzy-blended locally linearised subsystems and the further application of the VSC methodology to each subsystem. To increase the applicability of the T-S approach for uncertain nonlinear networked control systems, this study considers asynchronous premise variables in the plant and the controller, and then introduces a robust stability analysis and control synthesis. The resulting optimal switching-fuzzy controller provides a minimum guaranteed cost on an H2 performance index. Simulation studies on three nonlinear benchmark problems demonstrate the effectiveness of the proposed method.
Energy consumption analysis for various memristive networks under different learning strategies
NASA Astrophysics Data System (ADS)
Deng, Lei; Wang, Dong; Zhang, Ziyang; Tang, Pei; Li, Guoqi; Pei, Jing
2016-02-01
Recently, various memristive systems have emerged to emulate the efficient computing paradigm of the brain cortex; however, how to make them energy efficient remains unclear, especially from an overall perspective. Here, a systematic, bottom-up energy consumption analysis is demonstrated, covering both the memristor device level and the network learning level. We propose an energy estimation methodology for modulating the memristive synapses, which is simulated in three typical neural networks with different synaptic structures and learning strategies for both offline and online learning. These results provide in-depth insight for creating energy-efficient brain-inspired neuromorphic devices in the future.
2018-01-01
Signaling pathways represent parts of the global biological molecular network which connects them into a seamless whole through complex direct and indirect (hidden) crosstalk whose structure can change during development or in pathological conditions. We suggest a novel methodology, called Googlomics, for the structural analysis of directed biological networks using spectral analysis of their Google matrices, drawing on parallels with quantum scattering theory developed for nuclear and mesoscopic physics and quantum chaos. We introduce an analytical “reduced Google matrix” method for the analysis of biological network structure. The method allows inferring hidden causal relations between the members of a signaling pathway or a functionally related group of genes. We investigate how the structure of hidden causal relations can be reprogrammed as a result of changes in the transcriptional network layer during carcinogenesis. The suggested Googlomics approach rigorously characterizes complex systemic changes in the wiring of large causal biological networks in a computationally efficient way. PMID:29370181
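For orientation, the sketch below constructs only the basic object the method builds on: the Google matrix of a small directed network and its PageRank vector via power iteration. The reduced Google matrix of the paper is a further construction not reproduced here; the toy adjacency matrix and damping factor are illustrative.

```python
# Sketch: Google matrix of a toy directed network and its PageRank vector.
import numpy as np

A = np.array([[0, 1, 1, 0],        # adjacency (row -> column) of a toy directed network
              [0, 0, 1, 0],
              [1, 0, 0, 1],
              [0, 0, 0, 0]], dtype=float)

n = A.shape[0]
S = np.zeros((n, n))
for i in range(n):
    out = A[i].sum()
    S[i] = A[i] / out if out > 0 else np.full(n, 1.0 / n)   # dangling node -> uniform row

alpha = 0.85                                  # usual damping factor
G = alpha * S + (1 - alpha) / n               # Google matrix (row-stochastic)

p = np.full(n, 1.0 / n)
for _ in range(100):                          # power iteration: p <- p G
    p = p @ G
print("PageRank:", p / p.sum())
```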
Recurrence Density Enhanced Complex Networks for Nonlinear Time Series Analysis
NASA Astrophysics Data System (ADS)
Costa, Diego G. De B.; Reis, Barbara M. Da F.; Zou, Yong; Quiles, Marcos G.; Macau, Elbert E. N.
We introduce a new method, which is entitled Recurrence Density Enhanced Complex Network (RDE-CN), to properly analyze nonlinear time series. Our method first transforms a recurrence plot into a figure with a reduced number of points that nevertheless preserves the main and fundamental recurrence properties of the original plot. This resulting figure is then reinterpreted as a complex network, which is further characterized by network statistical measures. We illustrate the computational power of the RDE-CN approach with time series from both the logistic map and experimental fluid flows, which show that our method distinguishes different dynamics as well as traditional recurrence analysis does. Therefore, the proposed methodology characterizes the recurrence matrix adequately, while using a reduced set of points from the original recurrence plots.
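A minimal sketch of the underlying recurrence-network idea (not the RDE-CN point reduction itself): a logistic-map series is turned into a thresholded recurrence matrix, which is then read as the adjacency matrix of a complex network and summarized with standard measures. The threshold and series length are illustrative.

```python
# Sketch: an epsilon-recurrence network from a logistic-map time series.
import numpy as np
import networkx as nx

r, n = 4.0, 500
x = np.empty(n); x[0] = 0.4
for t in range(n - 1):
    x[t + 1] = r * x[t] * (1.0 - x[t])        # logistic map in the chaotic regime

dist = np.abs(x[:, None] - x[None, :])        # pairwise distances between states
eps = 0.05                                    # recurrence threshold (illustrative)
R = (dist <= eps).astype(int)
np.fill_diagonal(R, 0)

G = nx.from_numpy_array(R)                    # recurrence matrix reinterpreted as a network
print("edges        :", G.number_of_edges())
print("transitivity :", nx.transitivity(G))
print("assortativity:", nx.degree_assortativity_coefficient(G))
```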
1982-02-23
Report excerpt (partially recoverable): security measures include segregating the computer and storage from the outside world, administrative security to control access to secure computer facilities, and network security; a dedicated switching architecture (Figure A-2) is described in which a DSSCS TPU handles the communications protocol with the network and GENSER message transmission to the I-S/A AMPE processor (TP No. 022-4668-A).
NASA Technical Reports Server (NTRS)
Dhas, Chris
2000-01-01
NASA's Glenn Research Center (GRC) defines and develops advanced technology for high priority national needs in communications technologies for application to aeronautics and space. GRC tasked Computer Networks and Software Inc. (CNS) to examine protocols and architectures for an In-Space Internet Node. CNS has developed a methodology for network reference models to support NASA's four mission areas: Earth Science, Space Science, Human Exploration and Development of Space (HEDS), and Aerospace Technology. CNS previously developed a report which applied the methodology to three space Internet-based communications scenarios for future missions. CNS conceptualized, designed, and developed space Internet-based communications protocols and architectures for each of the independent scenarios. GRC selected for further analysis the scenario that involved unicast communications between a Low-Earth-Orbit (LEO) International Space Station (ISS) and a ground terminal Internet node via a Tracking and Data Relay Satellite (TDRS) transfer. This report contains a tradeoff analysis on the selected scenario. The analysis examines the performance characteristics of the various protocols and architectures. The tradeoff analysis incorporates the results of a CNS-developed analytical model that examined performance parameters.
Logical Modeling and Dynamical Analysis of Cellular Networks
Abou-Jaoudé, Wassim; Traynard, Pauline; Monteiro, Pedro T.; Saez-Rodriguez, Julio; Helikar, Tomáš; Thieffry, Denis; Chaouiya, Claudine
2016-01-01
The logical (or logic) formalism is increasingly used to model regulatory and signaling networks. Complementing these applications, several groups contributed various methods and tools to support the definition and analysis of logical models. After an introduction to the logical modeling framework and to several of its variants, we review here a number of recent methodological advances to ease the analysis of large and intricate networks. In particular, we survey approaches to determine model attractors and their reachability properties, to assess the dynamical impact of variations of external signals, and to consistently reduce large models. To illustrate these developments, we further consider several published logical models for two important biological processes, namely the differentiation of T helper cells and the control of mammalian cell cycle. PMID:27303434
Kovács, István A.; Palotai, Robin; Szalay, Máté S.; Csermely, Peter
2010-01-01
Background Network communities help the functional organization and evolution of complex networks. However, the development of a method that is both fast and accurate and that provides modular overlaps and partitions of a heterogeneous network has proven to be rather difficult. Methodology/Principal Findings Here we introduce the novel concept of ModuLand, an integrative method family determining overlapping network modules as hills of an influence function-based, centrality-type community landscape, and including several widely used modularization methods as special cases. As various adaptations of the method family, we developed several algorithms, which provide an efficient analysis of weighted and directed networks, and (1) determine pervasively overlapping modules with high resolution; (2) uncover a detailed hierarchical network structure allowing an efficient, zoom-in analysis of large networks; (3) allow the determination of key network nodes and (4) help to predict network dynamics. Conclusions/Significance The concept opens a wide range of possibilities to develop new approaches and applications including network routing, classification, comparison and prediction. PMID:20824084
An Analysis of USSPACECOM’s Space Surveillance Network (SSN) Sensor Tasking Methodology
1992-12-01
Table-of-contents excerpt (recovered section headings): 2.3.2 Collateral Sensors; 2.3.3 Contributing Sensors; 2.4 Space Surveillance Network; 3.1.1 The State Solution; 3.1.2 The State-Transition Matrix; 3.2 Differential …; 4.3.3 Model Verification; 4.4 Differential Corrector.
Young, April M.; Halgin, Daniel S.; DiClemente, Ralph J.; Sterk, Claire E.; Havens, Jennifer R.
2014-01-01
Background An HIV vaccine could substantially impact the epidemic. However, risk compensation (RC), or post-vaccination increase in risk behavior, could present a major challenge. The methodology used in previous studies of risk compensation has been almost exclusively individual-level in focus, and has not explored how increased risk behavior could affect the connectivity of risk networks. This study examined the impact of anticipated HIV vaccine-related RC on the structure of high-risk drug users' sexual and injection risk network. Methods A sample of 433 rural drug users in the US provided data on their risk relationships (i.e., those involving recent unprotected sex and/or injection equipment sharing). Dyad-specific data were collected on likelihood of increasing/initiating risk behavior if they, their partner, or they and their partner received an HIV vaccine. Using these data and social network analysis, a "post-vaccination network" was constructed and compared to the current network on measures relevant to HIV transmission, including network size, cohesiveness (e.g., diameter, component structure, density), and centrality. Results Participants reported 488 risk relationships. Few reported an intention to decrease condom use or increase equipment sharing (4% and 1%, respectively). RC intent was reported in 30 existing risk relationships and vaccination was anticipated to elicit the formation of five new relationships. RC resulted in a 5% increase in risk network size (n = 142 to n = 149) and a significant increase in network density. The initiation of risk relationships resulted in the connection of otherwise disconnected network components, with the largest doubling in size from five to ten. Conclusions This study demonstrates a new methodological approach to studying RC and reveals that behavior change following HIV vaccination could potentially impact risk network connectivity. These data will be valuable in parameterizing future network models that can determine if network-level change precipitated by RC would appreciably impact the vaccine's population-level effectiveness. PMID:24992659
Plazas-Nossa, Leonardo; Hofer, Thomas; Gruber, Günter; Torres, Andres
2017-02-01
This work proposes a methodology for the forecasting of online water quality data provided by UV-Vis spectrometry. To this end, a combination of principal component analysis (PCA) to reduce the dimensionality of a data set and artificial neural networks (ANNs) for forecasting purposes was used. The results obtained were compared with those obtained by using the discrete Fourier transform (DFT). The proposed methodology was applied to four absorbance time series data sets composed of a total of 5705 UV-Vis spectra. Absolute percentage errors obtained by applying the proposed PCA/ANN methodology vary between 10% and 13% for all four study sites. In general terms, the results obtained were hardly generalizable, as they appeared to be highly dependent on specific dynamics of the water system; however, some trends can be outlined. The PCA/ANN methodology gives better results than the PCA/DFT forecasting procedure using a specific spectra range for the following conditions: (i) for the Salitre wastewater treatment plant (WWTP) (first hour) and Graz West R05 (first 18 min), from the last part of the UV range to all of the visible range; (ii) for the Gibraltar pumping station (first 6 min), for all UV-Vis absorbance spectra; and (iii) for the San Fernando WWTP (first 24 min), for all of the UV range to the middle part of the visible range.
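A minimal sketch of the PCA/ANN combination under stated assumptions (simulated spectra, an arbitrary component count and lag, scikit-learn in place of whatever ANN implementation the authors used): spectra are compressed with PCA, the scores are forecast one step ahead with a small neural network, and the forecast spectrum is reconstructed.

```python
# Sketch: compress UV-Vis spectra with PCA, forecast the scores with an ANN, reconstruct.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
spectra = rng.random((1000, 220))              # 1000 time steps x 220 wavelengths (placeholder)

pca = PCA(n_components=5)
scores = pca.fit_transform(spectra)            # dimensionality reduction

lag = 6                                        # use the previous 6 time steps as predictors
X = np.hstack([scores[i:len(scores) - lag + i] for i in range(lag)])
y = scores[lag:]

model = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
model.fit(X[:-100], y[:-100])                  # hold out the last 100 steps for testing

pred_spectra = pca.inverse_transform(model.predict(X[-100:]))
ape = np.abs(pred_spectra - spectra[-100:]) / np.abs(spectra[-100:])
print("mean absolute percentage error:", 100 * ape.mean())
```

Swapping the MLP for a DFT-based extrapolation of the scores would give the PCA/DFT baseline the paper compares against.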
Djomba, Janet Klara; Zaletel-Kragelj, Lijana
2016-12-01
Research on social networks in public health focuses on how social structures and relationships influence health and health-related behaviour. While the sociocentric approach is used to study complete social networks, the egocentric approach is gaining popularity because of its focus on individuals, groups and communities. One of the participants of the healthy lifestyle health education workshop 'I'm moving', included in the study of social support for exercise, was randomly selected. The participant was denoted as the ego and members of her/his social network as the alteri. Data were collected by personal interviews using a self-made questionnaire. Numerical methods and computer programmes for the analysis of social networks were used to demonstrate the analysis. The size, composition and structure of the egocentric social network were obtained by numerical analysis. The analysis of composition included homophily and homogeneity. Moreover, the analysis of structure included the degree of the egocentric network, the strength of the ego-alter ties and the average strength of ties. Visualisation of the network was performed with three freely available computer programmes, namely Egonet.QF, E-net and Pajek. The computer programmes were described and compared in terms of their usefulness. Both numerical analysis and visualisation have their benefits. The decision on which approach to use depends on the purpose of the social network analysis. While numerical analysis can be used in large-scale population-based studies, visualisation of personal networks can help health professionals in creating, implementing and evaluating preventive programmes, especially those focused on behaviour change.
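A minimal numerical sketch of such an egocentric analysis with networkx, using an invented ego, alteri, attribute and tie weights: it computes network size, density, a simple homophily proportion, and mean ego-alter tie strength.

```python
# Sketch of a numerical egocentric-network analysis (all data invented for illustration).
import networkx as nx

G = nx.Graph()
ego = "ego"
alteri = {"A": "female", "B": "male", "C": "female", "D": "female"}
G.add_node(ego, sex="female")
for name, sex in alteri.items():
    G.add_node(name, sex=sex)
    G.add_edge(ego, name, weight=3)            # ego-alter tie strength (e.g. contact frequency)
G.add_edge("A", "C", weight=1)                 # alter-alter ties
G.add_edge("B", "D", weight=2)

E = nx.ego_graph(G, ego)
size = E.number_of_nodes() - 1                 # network size = number of alteri
density = nx.density(E)
homophily = sum(G.nodes[a]["sex"] == G.nodes[ego]["sex"] for a in alteri) / size
avg_tie = sum(G[ego][a]["weight"] for a in alteri) / size

print(f"size={size}, density={density:.2f}, "
      f"homophily={homophily:.2f}, mean tie strength={avg_tie:.2f}")
```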
Masè, Michela; Cristoforetti, Alessandro; Avogaro, Laura; Tessarolo, Francesco; Piccoli, Federico; Caola, Iole; Pederzolli, Carlo; Graffigna, Angelo; Ravelli, Flavia
2015-01-01
The assessment of collagen structure in cardiac pathology, such as atrial fibrillation (AF), is essential for a complete understanding of the disease. This paper introduces a novel methodology for the quantitative description of collagen network properties, based on the combination of nonlinear optical microscopy with a spectral approach of image processing and analysis. Second-harmonic generation (SHG) microscopy was applied to atrial tissue samples from cardiac surgery patients, providing label-free, selective visualization of the collagen structure. The spectral analysis framework, based on 2D-FFT, was applied to the SHG images, yielding a multiparametric description of collagen fiber orientation (angle and anisotropy indexes) and texture scale (dominant wavelength and peak dispersion indexes). The proof-of-concept application of the methodology showed the capability of our approach to detect and quantify differences in the structural properties of the collagen network in AF versus sinus rhythm patients. These results suggest the potential of our approach in the assessment of collagen properties in cardiac pathologies related to a fibrotic structural component.
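A small sketch of the spectral (2D-FFT) descriptors on a synthetic striped image standing in for an SHG acquisition: the power spectrum yields the orientation of the strongest spectral peak, a crude anisotropy index (fraction of power near that orientation), and a dominant texture wavelength. These definitions are illustrative simplifications, not the paper's exact angle, anisotropy, dominant-wavelength and peak-dispersion indexes.

```python
# Sketch: 2D-FFT orientation and texture descriptors on a synthetic striped "fiber" image.
import numpy as np

N = 256
y, x = np.mgrid[0:N, 0:N]
# sinusoidal stripes with wave vector at 30 degrees and a 16-pixel period
img = np.sin(2 * np.pi * (x * np.cos(np.pi / 6) + y * np.sin(np.pi / 6)) / 16.0)

F = np.fft.fftshift(np.fft.fft2(img - img.mean()))
P = np.abs(F) ** 2                                  # power spectrum

ky, kx = np.mgrid[-N // 2:N // 2, -N // 2:N // 2]
radius = np.hypot(kx, ky)
angle = np.degrees(np.arctan2(ky, kx)) % 180        # spectral angle folded to [0, 180)

mask = radius > 2                                   # drop the DC region
weights = P[mask]
dominant_angle = angle[mask][np.argmax(weights)]    # orientation of the strongest spectral peak

# anisotropy: concentration of power within 10 degrees of the dominant orientation
near = np.abs((angle[mask] - dominant_angle + 90) % 180 - 90) < 10
anisotropy = weights[near].sum() / weights.sum()

dominant_wavelength = N / radius[mask][np.argmax(weights)]   # texture scale in pixels
print(dominant_angle, anisotropy, dominant_wavelength)
```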
Dynamic modeling and optimization for space logistics using time-expanded networks
NASA Astrophysics Data System (ADS)
Ho, Koki; de Weck, Olivier L.; Hoffman, Jeffrey A.; Shishko, Robert
2014-12-01
This research develops a dynamic logistics network formulation for lifecycle optimization of mission sequences as a system-level integrated method to find an optimal combination of technologies to be used at each stage of the campaign. This formulation can find the optimal transportation architecture considering its technology trades over time. The proposed methodologies are inspired by ground logistics analysis techniques based on linear programming network optimization. In particular, the time-expanded network and its extension are developed for dynamic space logistics network optimization, trading the quality of the solution against the computational load. In this paper, the methodologies are applied to a human Mars exploration architecture design problem. The results reveal multiple dynamic system-level trades over time and give recommendations for the optimal strategy for the human Mars exploration architecture. The considered trades include those between In-Situ Resource Utilization (ISRU) and propulsion technologies as well as the orbit and depot location selections over time. This research serves as a precursor for eventual permanent settlement and colonization of other planets by humans and for humanity becoming a multi-planet species.
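A toy sketch of a time-expanded logistics network under stated assumptions (three locations, three time steps, invented capacities and costs, plain min-cost flow in networkx): each node is a (location, time) pair, waiting arcs carry storage between steps, and transport arcs move cargo between locations across steps. The paper's formulation additionally models propellant, ISRU and technology trades, which are omitted here.

```python
# Toy time-expanded network solved as a min-cost flow problem.
import networkx as nx

locations = ["Earth", "Orbit", "Mars"]
T = 3
G = nx.DiGraph()

for t in range(T):
    for loc in locations:
        G.add_node((loc, t), demand=0)
G.nodes[("Earth", 0)]["demand"] = -10          # 10 units of cargo available at Earth, t = 0
G.nodes[("Mars", T - 1)]["demand"] = 10        # 10 units required at Mars by the last step

for t in range(T - 1):
    for loc in locations:                      # waiting (storage) arcs
        G.add_edge((loc, t), (loc, t + 1), capacity=100, weight=1)
    G.add_edge(("Earth", t), ("Orbit", t + 1), capacity=12, weight=5)    # launch
    G.add_edge(("Orbit", t), ("Mars", t + 1), capacity=12, weight=20)    # trans-Mars leg

flow = nx.min_cost_flow(G)
print("total cost:", nx.cost_of_flow(G, flow))
for u, targets in flow.items():
    for v, f in targets.items():
        if f > 0:
            print(u, "->", v, ":", f)
```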
Topological data analysis of contagion maps for examining spreading processes on networks.
Taylor, Dane; Klimm, Florian; Harrington, Heather A; Kramár, Miroslav; Mischaikow, Konstantin; Porter, Mason A; Mucha, Peter J
2015-07-21
Social and biological contagions are influenced by the spatial embeddedness of networks. Historically, many epidemics spread as a wave across part of the Earth's surface; however, in modern contagions long-range edges-for example, due to airline transportation or communication media-allow clusters of a contagion to appear in distant locations. Here we study the spread of contagions on networks through a methodology grounded in topological data analysis and nonlinear dimension reduction. We construct 'contagion maps' that use multiple contagions on a network to map the nodes as a point cloud. By analysing the topology, geometry and dimensionality of manifold structure in such point clouds, we reveal insights to aid in the modelling, forecast and control of spreading processes. Our approach highlights contagion maps also as a viable tool for inferring low-dimensional structure in networks.
Topological data analysis of contagion maps for examining spreading processes on networks
NASA Astrophysics Data System (ADS)
Taylor, Dane; Klimm, Florian; Harrington, Heather A.; Kramár, Miroslav; Mischaikow, Konstantin; Porter, Mason A.; Mucha, Peter J.
2015-07-01
Social and biological contagions are influenced by the spatial embeddedness of networks. Historically, many epidemics spread as a wave across part of the Earth's surface; however, in modern contagions long-range edges--for example, due to airline transportation or communication media--allow clusters of a contagion to appear in distant locations. Here we study the spread of contagions on networks through a methodology grounded in topological data analysis and nonlinear dimension reduction. We construct `contagion maps' that use multiple contagions on a network to map the nodes as a point cloud. By analysing the topology, geometry and dimensionality of manifold structure in such point clouds, we reveal insights to aid in the modelling, forecast and control of spreading processes. Our approach highlights contagion maps also as a viable tool for inferring low-dimensional structure in networks.
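A minimal sketch of the contagion-map construction itself, assuming a Watts threshold model on a noisy ring lattice with illustrative parameters: a contagion is seeded at every node in turn, activation times are recorded, and each node's vector of activation times becomes its coordinates in the resulting point cloud, which the paper then examines with dimensionality and topology tools.

```python
# Sketch: contagion map from threshold contagions on a noisy ring lattice.
import networkx as nx
import numpy as np

G = nx.watts_strogatz_graph(100, 6, 0.05, seed=2)   # ring lattice plus a few long-range edges
n = G.number_of_nodes()
T = 0.3                                             # activation threshold (fraction of active neighbours)

def activation_times(seed, max_steps=100):
    active = {seed} | set(G.neighbors(seed))        # seed a node together with its neighbourhood
    times = {v: (0 if v in active else np.inf) for v in G}
    for step in range(1, max_steps + 1):
        new = {v for v in G if times[v] == np.inf
               and sum(times[u] < step for u in G.neighbors(v)) >= T * G.degree(v)}
        if not new:
            break
        for v in new:
            times[v] = step
    return [times[v] for v in range(n)]

# row i = node i's activation times across all contagions (its point-cloud coordinates)
point_cloud = np.array([activation_times(s) for s in range(n)]).T
print(point_cloud.shape)
```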
Moschen, Sebastián; Higgins, Janet; Di Rienzo, Julio A; Heinz, Ruth A; Paniego, Norma; Fernandez, Paula
2016-06-06
In recent years, high throughput technologies have led to an increase of datasets from omics disciplines allowing the understanding of the complex regulatory networks associated with biological processes. Leaf senescence is a complex mechanism controlled by multiple genetic and environmental variables, which has a strong impact on crop yield. Transcription factors (TFs) are key proteins in the regulation of gene expression, regulating different signaling pathways; their function is crucial for triggering and/or regulating different aspects of the leaf senescence process. The study of TF interactions and their integration with metabolic profiles under different developmental conditions, especially for a non-model organism such as sunflower, will open new insights into the details of gene regulation of leaf senescence. Weighted Gene Correlation Network Analysis (WGCNA) and BioSignature Discoverer (BioSD, Gnosis Data Analysis, Heraklion, Greece) were used to integrate transcriptomic and metabolomic data. WGCNA allowed the detection of 10 metabolites and 13 TFs whereas BioSD allowed the detection of 1 metabolite and 6 TFs as potential biomarkers. The comparative analysis demonstrated that three transcription factors were detected through both methodologies, highlighting them as potentially robust biomarkers associated with leaf senescence in sunflower. The complementary use of network and BioSignature Discoverer analysis of transcriptomic and metabolomic data provided a useful tool for identifying candidate genes and metabolites which may have a role during the triggering and development of the leaf senescence process. The WGCNA tool allowed us to design and test a hypothetical network in order to infer relationships across selected transcription factor and metabolite candidate biomarkers involved in leaf senescence, whereas BioSignature Discoverer selected transcripts and metabolites which discriminate between different ages of sunflower plants. The methodology presented here would help to elucidate and predict novel networks and potential biomarkers of leaf senescence in sunflower.
Júnez-Ferreira, H E; Herrera, G S
2013-04-01
This paper presents a new methodology for the optimal design of space-time hydraulic head monitoring networks and its application to the Valle de Querétaro aquifer in Mexico. The selection of the space-time monitoring points is done using a static Kalman filter combined with a sequential optimization method. The Kalman filter requires as input a space-time covariance matrix, which is derived from a geostatistical analysis. A sequential optimization method is used that, at each step, selects the space-time point minimizing a function of the variance. We demonstrate the methodology by applying it to the redesign of the hydraulic head monitoring network of the Valle de Querétaro aquifer with the objective of selecting, from a set of monitoring positions and times, those that minimize the spatiotemporal redundancy. The database for the geostatistical space-time analysis corresponds to information from 273 wells located within the aquifer for the period 1970-2007. A total of 1,435 hydraulic head data were used to construct the experimental space-time variogram. The results show that, of the existing monitoring program that consists of 418 space-time monitoring points, only 178 are not redundant. The implied reduction of monitoring costs was possible because the proposed method is successful in propagating information in space and time.
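A small sketch of the sequential selection idea under stated assumptions: given a (here, toy) covariance matrix over candidate space-time points, the point whose observation most reduces total estimation variance is picked repeatedly, using standard Gaussian conditioning in place of the full Kalman filter and geostatistical machinery.

```python
# Sketch: greedy, variance-minimizing selection of monitoring points from a covariance matrix.
import numpy as np

rng = np.random.default_rng(0)
coords = rng.uniform(0, 10, size=(40, 2))                     # 40 candidate (space, time) points
d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
Sigma = np.exp(-0.5 * (d / 2.0) ** 2) + 1e-6 * np.eye(40)     # toy space-time covariance

noise = 0.1
selected = []
for _ in range(5):                                            # choose 5 monitoring points
    best, best_var = None, np.inf
    for j in range(len(coords)):
        if j in selected:
            continue
        S = selected + [j]
        K = Sigma[np.ix_(S, S)] + noise * np.eye(len(S))
        post = Sigma - Sigma[:, S] @ np.linalg.solve(K, Sigma[S, :])
        total_var = np.trace(post)                            # total posterior variance if S observed
        if total_var < best_var:
            best, best_var = j, total_var
    selected.append(best)
    print(f"picked point {best}, remaining total variance {best_var:.2f}")
```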
Influence of the Time Scale on the Construction of Financial Networks
Emmert-Streib, Frank; Dehmer, Matthias
2010-01-01
Background In this paper we investigate the definition and formation of financial networks. Specifically, we study the influence of the time scale on their construction. Methodology/Principal Findings For our analysis we use correlation-based networks obtained from the daily closing prices of stock market data. More precisely, we use the stocks that currently comprise the Dow Jones Industrial Average (DJIA) and estimate financial networks where nodes correspond to stocks and edges correspond to non-vanishing correlation coefficients. That means an edge is included in the network only if the corresponding correlation coefficient is statistically significantly different from zero. This construction procedure results in unweighted, undirected networks. By separating the time series of stock prices into non-overlapping intervals, we obtain one network per interval. The length of these intervals corresponds to the time scale of the data, whose influence on the construction of the networks is studied in this paper. Conclusions/Significance Numerical analysis of four different measures in dependence on the time scale for the construction of networks allows us to gain insights about the intrinsic time scale of the stock market with respect to a meaningful graph-theoretical analysis. PMID:20949124
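A minimal sketch of this construction under stated assumptions (simulated prices in place of DJIA data, a nominal 0.05 significance level): one unweighted network is built per non-overlapping window, with an edge only where that window's correlation differs significantly from zero.

```python
# Sketch: one correlation network per non-overlapping time window (simulated prices).
import numpy as np
from scipy import stats
import networkx as nx

rng = np.random.default_rng(0)
n_stocks, n_days = 30, 1000
prices = np.cumprod(1 + 0.01 * rng.standard_normal((n_days, n_stocks)), axis=0)
returns = np.diff(np.log(prices), axis=0)

interval = 50                                       # time scale: length of each window
networks = []
for start in range(0, len(returns) - interval + 1, interval):
    win = returns[start:start + interval]
    G = nx.Graph()
    G.add_nodes_from(range(n_stocks))
    for i in range(n_stocks):
        for j in range(i + 1, n_stocks):
            r, p = stats.pearsonr(win[:, i], win[:, j])
            if p < 0.05:                            # keep only statistically significant correlations
                G.add_edge(i, j)
    networks.append(G)

print([round(nx.density(G), 3) for G in networks[:5]])   # density as a function of window
```

Re-running with different values of `interval` is the experiment the paper performs: graph measures are tracked as the time scale changes.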
Resting-State Functional Connectivity in the Infant Brain: Methods, Pitfalls, and Potentiality.
Mongerson, Chandler R L; Jennings, Russell W; Borsook, David; Becerra, Lino; Bajic, Dusica
2017-01-01
Early brain development is characterized by rapid growth and perpetual reconfiguration, driven by a dynamic milieu of heterogeneous processes. Postnatal brain plasticity is associated with increased vulnerability to environmental stimuli. However, little is known regarding the ontogeny and temporal manifestations of inter- and intra-regional functional connectivity that comprise functional brain networks. Resting-state functional magnetic resonance imaging (rs-fMRI) has emerged as a promising non-invasive neuroinvestigative tool, measuring spontaneous fluctuations in blood oxygen level dependent (BOLD) signal at rest that reflect baseline neuronal activity. Over the past decade, its application has expanded to infant populations providing unprecedented insight into functional organization of the developing brain, as well as early biomarkers of abnormal states. However, many methodological issues of rs-fMRI analysis need to be resolved prior to standardization of the technique to infant populations. As a primary goal, this methodological manuscript will (1) present a robust methodological protocol to extract and assess resting-state networks in early infancy using independent component analysis (ICA), such that investigators without previous knowledge in the field can implement the analysis and reliably obtain viable results consistent with previous literature; (2) review the current methodological challenges and ethical considerations associated with emerging field of infant rs-fMRI analysis; and (3) discuss the significance of rs-fMRI application in infants for future investigations of neurodevelopment in the context of early life stressors and pathological processes. The overarching goal is to catalyze efforts toward development of robust, infant-specific acquisition, and preprocessing pipelines, as well as promote greater transparency by researchers regarding methods used.
Resting-State Functional Connectivity in the Infant Brain: Methods, Pitfalls, and Potentiality
Mongerson, Chandler R. L.; Jennings, Russell W.; Borsook, David; Becerra, Lino; Bajic, Dusica
2017-01-01
Early brain development is characterized by rapid growth and perpetual reconfiguration, driven by a dynamic milieu of heterogeneous processes. Postnatal brain plasticity is associated with increased vulnerability to environmental stimuli. However, little is known regarding the ontogeny and temporal manifestations of inter- and intra-regional functional connectivity that comprise functional brain networks. Resting-state functional magnetic resonance imaging (rs-fMRI) has emerged as a promising non-invasive neuroinvestigative tool, measuring spontaneous fluctuations in blood oxygen level dependent (BOLD) signal at rest that reflect baseline neuronal activity. Over the past decade, its application has expanded to infant populations providing unprecedented insight into functional organization of the developing brain, as well as early biomarkers of abnormal states. However, many methodological issues of rs-fMRI analysis need to be resolved prior to standardization of the technique to infant populations. As a primary goal, this methodological manuscript will (1) present a robust methodological protocol to extract and assess resting-state networks in early infancy using independent component analysis (ICA), such that investigators without previous knowledge in the field can implement the analysis and reliably obtain viable results consistent with previous literature; (2) review the current methodological challenges and ethical considerations associated with emerging field of infant rs-fMRI analysis; and (3) discuss the significance of rs-fMRI application in infants for future investigations of neurodevelopment in the context of early life stressors and pathological processes. The overarching goal is to catalyze efforts toward development of robust, infant-specific acquisition, and preprocessing pipelines, as well as promote greater transparency by researchers regarding methods used. PMID:28856131
Canalization and Control in Automata Networks: Body Segmentation in Drosophila melanogaster
Marques-Pita, Manuel; Rocha, Luis M.
2013-01-01
We present schema redescription as a methodology to characterize canalization in automata networks used to model biochemical regulation and signalling. In our formulation, canalization becomes synonymous with redundancy present in the logic of automata. This results in straightforward measures to quantify canalization in an automaton (micro-level), which is in turn integrated into a highly scalable framework to characterize the collective dynamics of large-scale automata networks (macro-level). This way, our approach provides a method to link micro- to macro-level dynamics – a crux of complexity. Several new results ensue from this methodology: uncovering of dynamical modularity (modules in the dynamics rather than in the structure of networks), identification of minimal conditions and critical nodes to control the convergence to attractors, simulation of dynamical behaviour from incomplete information about initial conditions, and measures of macro-level canalization and robustness to perturbations. We exemplify our methodology with a well-known model of the intra- and intercellular genetic regulation of body segmentation in Drosophila melanogaster. We use this model to show that our analysis does not contradict any previous findings. But we also obtain new knowledge about its behaviour: a better understanding of the size of its wild-type attractor basin (larger than previously thought), the identification of novel minimal conditions and critical nodes that control wild-type behaviour, and the resilience of these to stochastic interventions. Our methodology is applicable to any complex network that can be modelled using automata, but we focus on biochemical regulation and signalling, towards a better understanding of the (decentralized) control that orchestrates cellular activity – with the ultimate goal of explaining how cells and tissues ‘compute’. PMID:23520449
Canalization and control in automata networks: body segmentation in Drosophila melanogaster.
Marques-Pita, Manuel; Rocha, Luis M
2013-01-01
We present schema redescription as a methodology to characterize canalization in automata networks used to model biochemical regulation and signalling. In our formulation, canalization becomes synonymous with redundancy present in the logic of automata. This results in straightforward measures to quantify canalization in an automaton (micro-level), which is in turn integrated into a highly scalable framework to characterize the collective dynamics of large-scale automata networks (macro-level). This way, our approach provides a method to link micro- to macro-level dynamics--a crux of complexity. Several new results ensue from this methodology: uncovering of dynamical modularity (modules in the dynamics rather than in the structure of networks), identification of minimal conditions and critical nodes to control the convergence to attractors, simulation of dynamical behaviour from incomplete information about initial conditions, and measures of macro-level canalization and robustness to perturbations. We exemplify our methodology with a well-known model of the intra- and intercellular genetic regulation of body segmentation in Drosophila melanogaster. We use this model to show that our analysis does not contradict any previous findings. But we also obtain new knowledge about its behaviour: a better understanding of the size of its wild-type attractor basin (larger than previously thought), the identification of novel minimal conditions and critical nodes that control wild-type behaviour, and the resilience of these to stochastic interventions. Our methodology is applicable to any complex network that can be modelled using automata, but we focus on biochemical regulation and signalling, towards a better understanding of the (decentralized) control that orchestrates cellular activity--with the ultimate goal of explaining how cells and tissues 'compute'.
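As a small illustration of the micro-level notion of logical redundancy only (not the paper's schema redescription), the sketch below enumerates the truth table of a toy Boolean automaton and reports its canalizing inputs, i.e. input values that fix the output regardless of the other inputs.

```python
# Sketch: detect canalizing inputs of a toy Boolean automaton by exhaustive enumeration.
from itertools import product

def rule(a, b, c):                     # toy 3-input update rule (invented for illustration)
    return int(a and (b or c))

def canalizing_inputs(f, k):
    found = []
    for i in range(k):
        for val in (0, 1):
            outs = {f(*inp) for inp in product((0, 1), repeat=k) if inp[i] == val}
            if len(outs) == 1:         # fixing input i to val fixes the output
                found.append((i, val, outs.pop()))
    return found

print(canalizing_inputs(rule, 3))      # here: input 0 set to 0 canalizes the output to 0
```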
Research synergy and drug development: Bright stars in neighboring constellations.
Keserci, Samet; Livingston, Eric; Wan, Lingtian; Pico, Alexander R; Chacko, George
2017-11-01
Drug discovery and subsequent availability of a new breakthrough therapeutic or 'cure' is a compelling example of societal benefit from research advances. These advances are invariably collaborative, involving the contributions of many scientists to a discovery network in which theory and experiment are built upon. To document and understand such scientific advances, data mining of public and commercial data sources coupled with network analysis can be used as a digital methodology to assemble and analyze component events in the history of a therapeutic. This methodology is extensible beyond the history of therapeutics and its use more generally supports (i) efficiency in exploring the scientific history of a research advance (ii) documenting and understanding collaboration (iii) portfolio analysis, planning and optimization (iv) communication of the societal value of research. Building upon prior art, we have conducted a case study of five anti-cancer therapeutics to identify the collaborations that resulted in the successful development of these therapeutics both within and across their respective networks. We have linked the work of over 235,000 authors in roughly 106,000 scientific publications that capture the research crucial for the development of these five therapeutics. Applying retrospective citation discovery, we have identified a core set of publications cited in the networks of all five therapeutics and additional intersections in combinations of networks. We have enriched the content of these networks by annotating them with information on research awards from the US National Institutes of Health (NIH). Lastly, we have mapped these awards to their cognate peer review panels, identifying another layer of collaborative scientific activity that influenced the research represented in these networks.
Theoretical Notes on the Sociological Analysis of School Reform Networks
ERIC Educational Resources Information Center
Ladwig, James G.
2014-01-01
Nearly two decades ago, Ladwig outlined the theoretical and methodological implications of Bourdieu's concept of the social field for sociological analyses of educational policy and school reform. The current analysis extends this work to consider the sociological import of one of the most ubiquitous forms of educational reform found around…
Learning Principal Component Analysis by Using Data from Air Quality Networks
ERIC Educational Resources Information Center
Perez-Arribas, Luis Vicente; Leon-González, María Eugenia; Rosales-Conrado, Noelia
2017-01-01
With the final objective of using computational and chemometrics tools in the chemistry studies, this paper shows the methodology and interpretation of the Principal Component Analysis (PCA) using pollution data from different cities. This paper describes how students can obtain data on air quality and process such data for additional information…
The US EPA’s ToxCast™ program seeks to combine advances in high-throughput screening technology with methodologies from statistics and computer science to develop high-throughput decision support tools for assessing chemical hazard and risk. To develop new methods of analysis of...
Fathollah Bayati, Mohsen; Sadjadi, Seyed Jafar
2017-01-01
In this paper, new Network Data Envelopment Analysis (NDEA) models are developed to evaluate the efficiency of regional electricity power networks. The primary objective of this paper is to consider perturbation in data and develop new NDEA models based on the adaptation of robust optimization methodology. Furthermore, in this paper, the efficiency of the entire networks of electricity power, involving generation, transmission and distribution stages is measured. While DEA has been widely used to evaluate the efficiency of the components of electricity power networks during the past two decades, there is no study to evaluate the efficiency of the electricity power networks as a whole. The proposed models are applied to evaluate the efficiency of 16 regional electricity power networks in Iran and the effect of data uncertainty is also investigated. The results are compared with the traditional network DEA and parametric SFA methods. Validity and verification of the proposed models are also investigated. The preliminary results indicate that the proposed models were more reliable than the traditional Network DEA model.
Sadjadi, Seyed Jafar
2017-01-01
In this paper, new Network Data Envelopment Analysis (NDEA) models are developed to evaluate the efficiency of regional electricity power networks. The primary objective of this paper is to consider perturbation in data and develop new NDEA models based on the adaptation of robust optimization methodology. Furthermore, in this paper, the efficiency of the entire networks of electricity power, involving generation, transmission and distribution stages is measured. While DEA has been widely used to evaluate the efficiency of the components of electricity power networks during the past two decades, there is no study to evaluate the efficiency of the electricity power networks as a whole. The proposed models are applied to evaluate the efficiency of 16 regional electricity power networks in Iran and the effect of data uncertainty is also investigated. The results are compared with the traditional network DEA and parametric SFA methods. Validity and verification of the proposed models are also investigated. The preliminary results indicate that the proposed models were more reliable than the traditional Network DEA model. PMID:28953900
The use of hierarchical clustering for the design of optimized monitoring networks
NASA Astrophysics Data System (ADS)
Soares, Joana; Makar, Paul Andrew; Aklilu, Yayne; Akingunola, Ayodeji
2018-05-01
Associativity analysis is a powerful tool to deal with large-scale datasets by clustering the data on the basis of (dis)similarity and can be used to assess the efficacy and design of air quality monitoring networks. We describe here our use of Kolmogorov-Zurbenko filtering and hierarchical clustering of NO2 and SO2 passive and continuous monitoring data to analyse and optimize air quality networks for these species in the province of Alberta, Canada. The methodology applied in this study assesses dissimilarity between monitoring station time series based on two metrics: 1 - R, R being the Pearson correlation coefficient, and the Euclidean distance; we find that both should be used in evaluating monitoring site similarity. We have combined the analytic power of hierarchical clustering with the spatial information provided by deterministic air quality model results, using the gridded time series of model output as potential station locations, as a proxy for assessing monitoring network design and for network optimization. We demonstrate that clustering results depend on the air contaminant analysed, reflecting the difference in the respective emission sources of SO2 and NO2 in the region under study. Our work shows that much of the signal identifying the sources of NO2 and SO2 emissions resides in shorter timescales (hourly to daily) due to short-term variation of concentrations and that longer-term averages in data collection may lose the information needed to identify local sources. However, the methodology identifies stations mainly influenced by seasonality, if larger timescales (weekly to monthly) are considered. We have performed the first dissimilarity analysis based on gridded air quality model output and have shown that the methodology is capable of generating maps of subregions within which a single station will represent the entire subregion, to a given level of dissimilarity. We have also shown that our approach is capable of identifying different sampling methodologies as well as outliers (stations' time series which are markedly different from all others in a given dataset).
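A small sketch of the clustering step under stated assumptions (simulated station time series, average linkage, an arbitrary cut height): a 1 - R dissimilarity matrix (R being the Pearson correlation) is clustered hierarchically with SciPy; Euclidean distance could be used analogously or alongside it, as the study recommends.

```python
# Sketch: hierarchical clustering of station time series on a 1 - R dissimilarity.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

rng = np.random.default_rng(0)
stations = rng.standard_normal((12, 24 * 30))          # 12 stations x one month of hourly values

R = np.corrcoef(stations)
D = 1.0 - R                                            # dissimilarity metric used in the study
np.fill_diagonal(D, 0.0)

Z = linkage(squareform(D, checks=False), method="average")
labels = fcluster(Z, t=0.5, criterion="distance")      # stations within 0.5 dissimilarity share a cluster
print(labels)                                          # one representative station per cluster suffices
```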
Functional network alterations and their structural substrate in drug-resistant epilepsy
Caciagli, Lorenzo; Bernhardt, Boris C.; Hong, Seok-Jun; Bernasconi, Andrea; Bernasconi, Neda
2014-01-01
The advent of MRI has revolutionized the evaluation and management of drug-resistant epilepsy by allowing the detection of the lesion associated with the region that gives rise to seizures. Recent evidence indicates marked chronic alterations in the functional organization of lesional tissue and large-scale cortico-subcortical networks. In this review, we focus on recent methodological developments in functional MRI (fMRI) analysis techniques and their application to the two most common drug-resistant focal epilepsies, i.e., temporal lobe epilepsy related to mesial temporal sclerosis and extra-temporal lobe epilepsy related to focal cortical dysplasia. We put particular emphasis on methodological developments in the analysis of task-free or “resting-state” fMRI to probe the integrity of intrinsic networks on a regional, inter-regional, and connectome-wide level. In temporal lobe epilepsy, these techniques have revealed disrupted connectivity of the ipsilateral mesiotemporal lobe, together with contralateral compensatory reorganization and striking reconfigurations of large-scale networks. In cortical dysplasia, initial observations indicate functional alterations in lesional, peri-lesional, and remote neocortical regions. While future research is needed to critically evaluate the reliability, sensitivity, and specificity, fMRI mapping promises to lend distinct biomarkers for diagnosis, presurgical planning, and outcome prediction. PMID:25565942
Actor-network theory: a tool to support ethical analysis of commercial genetic testing.
Williams-Jones, Bryn; Graham, Janice E
2003-12-01
Social, ethical and policy analysis of the issues arising from gene patenting and commercial genetic testing is enhanced by the application of science and technology studies, and Actor-Network Theory (ANT) in particular. We suggest the potential for transferring ANT's flexible nature to an applied heuristic methodology for gathering empirical information and for analysing the complex networks involved in the development of genetic technologies. Three concepts are explored in this paper--actor-networks, translation, and drift--and applied to the case of Myriad Genetics and their commercial BRACAnalysis genetic susceptibility test for hereditary breast cancer. Treating this test as an active participant in socio-technical networks clarifies the extent to which it interacts with, shapes and is shaped by people, other technologies, and institutions. Such an understanding enables more sophisticated and nuanced technology assessment, academic analysis, as well as public debate about the social, ethical and policy implications of the commercialization of new genetic technologies.
Inference and Prediction of Metabolic Network Fluxes
Nikoloski, Zoran; Perez-Storey, Richard; Sweetlove, Lee J.
2015-01-01
In this Update, we cover the basic principles of the estimation and prediction of the rates of the many interconnected biochemical reactions that constitute plant metabolic networks. This includes metabolic flux analysis approaches that utilize the rates or patterns of redistribution of stable isotopes of carbon and other atoms to estimate fluxes, as well as constraints-based optimization approaches such as flux balance analysis. Some of the major insights that have been gained from analysis of fluxes in plants are discussed, including the functioning of metabolic pathways in a network context, the robustness of the metabolic phenotype, the importance of cell maintenance costs, and the mechanisms that enable energy and redox balancing at steady state. We also discuss methodologies to exploit 'omic data sets for the construction of tissue-specific metabolic network models and to constrain the range of permissible fluxes in such models. Finally, we consider the future directions and challenges faced by the field of metabolic network flux phenotyping. PMID:26392262
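A toy sketch of the constraints-based (flux balance analysis) side mentioned above, using an invented three-reaction network and SciPy's linear programming: fluxes must satisfy steady-state mass balance S v = 0 and bounds while a "biomass" flux is maximized.

```python
# Toy flux balance analysis: maximize biomass flux subject to S v = 0 and flux bounds.
import numpy as np
from scipy.optimize import linprog

# metabolites: A, B;  reactions: uptake -> A, A -> B, B -> biomass
S = np.array([[ 1, -1,  0],     # mass balance for A
              [ 0,  1, -1]])    # mass balance for B

bounds = [(0, 10), (0, None), (0, None)]   # uptake capped at 10; other fluxes unbounded above
c = np.array([0, 0, -1.0])                 # linprog minimizes, so maximize biomass via -v_biomass

res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
print("optimal fluxes:", res.x)            # expected: all three fluxes at the uptake limit of 10
```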
Analysis of local bus markets : volume I – methodology and findings : final report.
DOT National Transportation Integrated Search
2017-07-04
Despite having an extensive network of public transit, traffic congestion and transportation-related greenhouse gas (GHG) emissions are significant concerns in New Jersey. This research hypothesizes that traffic congestion and air quality concerns in...
Identification of functional modules using network topology and high-throughput data.
Ulitsky, Igor; Shamir, Ron
2007-01-26
With the advent of systems biology, biological knowledge is often represented today by networks. These include regulatory and metabolic networks, protein-protein interaction networks, and many others. At the same time, high-throughput genomics and proteomics techniques generate very large data sets, which require sophisticated computational analysis. Usually, separate and different analysis methodologies are applied to each of the two data types. An integrated investigation of network and high-throughput information together can improve the quality of the analysis by accounting simultaneously for topological network properties alongside intrinsic features of the high-throughput data. We describe a novel algorithmic framework for this challenge. We first transform the high-throughput data into similarity values, (e.g., by computing pairwise similarity of gene expression patterns from microarray data). Then, given a network of genes or proteins and similarity values between some of them, we seek connected sub-networks (or modules) that manifest high similarity. We develop algorithms for this problem and evaluate their performance on the osmotic shock response network in S. cerevisiae and on the human cell cycle network. We demonstrate that focused, biologically meaningful and relevant functional modules are obtained. In comparison with extant algorithms, our approach has higher sensitivity and higher specificity. We have demonstrated that our method can accurately identify functional modules. Hence, it carries the promise to be highly useful in analysis of high throughput data.
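A minimal sketch of the general idea only (not the authors' algorithm): given a simulated interaction network and pairwise expression similarities, a connected module is grown greedily from a seed gene while the module's average pairwise similarity stays above a cut-off.

```python
# Sketch: grow a connected, high-similarity module from a seed node (simulated data).
import numpy as np
import networkx as nx

rng = np.random.default_rng(0)
G = nx.erdos_renyi_graph(60, 0.08, seed=0)             # stand-in interaction network
expr = rng.standard_normal((60, 20))                   # 60 genes x 20 expression conditions
sim = np.corrcoef(expr)                                # pairwise similarity of expression patterns

def avg_similarity(nodes):
    nodes = list(nodes)
    pairs = [(i, j) for a, i in enumerate(nodes) for j in nodes[a + 1:]]
    return np.mean([sim[i, j] for i, j in pairs]) if pairs else 1.0

seed = 0
module = {seed}
while True:
    frontier = {v for u in module for v in G.neighbors(u)} - module
    if not frontier:
        break
    best = max(frontier, key=lambda v: avg_similarity(module | {v}))
    if avg_similarity(module | {best}) < 0.1:          # stop when similarity would drop too low
        break
    module.add(best)

print("module:", sorted(module), "avg similarity:", round(avg_similarity(module), 3))
```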
Complex Dynamics in Information Sharing Networks
NASA Astrophysics Data System (ADS)
Cronin, Bruce
This study examines the roll-out of an electronic knowledge base in a medium-sized professional services firm over a six year period. The efficiency of such implementation is a key business problem in IT systems of this type. Data from usage logs provides the basis for analysis of the dynamic evolution of social networks around the depository during this time. The adoption pattern follows an "s-curve" and usage exhibits something of a power law distribution, both attributable to network effects, and network position is associated with organisational performance on a number of indicators. But periodicity in usage is evident and the usage distribution displays an exponential cut-off. Further analysis provides some evidence of mathematical complexity in the periodicity. Some implications of complex patterns in social network data for research and management are discussed. The study provides a case study demonstrating the utility of the broad methodological approach.
Coevolution of Epidemics, Social Networks, and Individual Behavior: A Case Study
NASA Astrophysics Data System (ADS)
Chen, Jiangzhuo; Marathe, Achla; Marathe, Madhav
This research shows how a limited supply of antivirals can be distributed optimally between the hospitals and the market so that the attack rate is minimized and enough revenue is generated to recover the cost of the antivirals. Results using an individual based model find that prevalence elastic demand behavior delays the epidemic and change in the social contact network induced by isolation reduces the peak of the epidemic significantly. A microeconomic analysis methodology combining behavioral economics and agent-based simulation is a major contribution of this work. In this paper we apply this methodology to analyze the fairness of the stockpile distribution, and the response of human behavior to disease prevalence level and its interaction with the market.
Charge transport network dynamics in molecular aggregates
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jackson, Nicholas E.; Chen, Lin X.; Ratner, Mark A.
2016-07-20
Due to the nonperiodic nature of charge transport in disordered systems, generating insight into static charge transport networks, as well as analyzing the network dynamics, can be challenging. Here, we apply time-dependent network analysis to scrutinize the charge transport networks of two representative molecular semiconductors: a rigid n-type molecule, perylenediimide, and a flexible p-type molecule, bBDT(TDPP)2. Simulations reveal the relevant timescale for local transfer integral decorrelation to be ~100 fs, which is shown to be faster than that of a crystalline morphology of the same molecule. Using a simple graph metric, global network changes are observed over timescales competitive with charge carrier lifetimes. These insights demonstrate that static charge transport networks are qualitatively inadequate, whereas average networks often overestimate network connectivity. Finally, a simple methodology for tracking dynamic charge transport properties is proposed.
Challenges in the estimation of Net SURvival: The CENSUR working survival group.
Giorgi, R
2016-10-01
Net survival, the survival probability that would be observed, in a hypothetical world, where the cancer of interest would be the only possible cause of death, is a key indicator in population-based cancer studies. Accounting for mortality due to other causes, it allows cross-country comparisons or trends analysis and provides a useful indicator for public health decision-making. The objective of this study was to show how the creation and formalization of a network comprising established research teams, which already had substantial and complementary experience in both cancer survival analysis and methodological development, make it possible to meet challenges and thus provide more adequate tools, to improve the quality and the comparability of cancer survival data, and to promote methodological transfers in areas of emerging interest. The Challenges in the Estimation of Net SURvival (CENSUR) working survival group is composed of international researchers highly skilled in biostatistics, methodology, and epidemiology, from different research organizations in France, the United Kingdom, Italy, Slovenia, and Canada, and involved in French (FRANCIM) and European (EUROCARE) cancer registry networks. The expected advantages are an interdisciplinary, international, synergistic network capable of addressing problems in public health, for decision-makers at different levels; tools for those in charge of net survival analyses; a common methodology that makes unbiased cross-national comparisons of cancer survival feasible; transfer of methods for net survival estimations to other specific applications (clinical research, occupational epidemiology); and dissemination of results during an international training course. The formalization of the international CENSUR working survival group was motivated by a need felt by scientists conducting population-based cancer research to discuss, develop, and monitor implementation of a common methodology to analyze net survival in order to provide useful information for cancer control and cancer policy. A "team science" approach is necessary to address new challenges concerning the estimation of net survival.
Identifying and tracking attacks on networks: C3I displays and related technologies
NASA Astrophysics Data System (ADS)
Manes, Gavin W.; Dawkins, J.; Shenoi, Sujeet; Hale, John C.
2003-09-01
Converged network security is extremely challenging for several reasons: expanded system and technology perimeters, unexpected feature interaction, and complex interfaces all conspire to provide hackers with greater opportunities for compromising large networks. Preventive security services and architectures are essential, but in and of themselves do not eliminate all threat of compromise. Attack management systems mitigate this residual risk by facilitating incident detection, analysis and response. There is a wealth of attack detection and response tools for IP networks, but a dearth of such tools for wireless and public telephone networks. Moreover, methodologies and formalisms have yet to be identified that can yield a common model for vulnerabilities and attacks in converged networks. A comprehensive attack management system must coordinate detection tools for converged networks, derive fully-integrated attack and network models, perform vulnerability and multi-stage attack analysis, support large-scale attack visualization, and orchestrate strategic responses to cyber attacks that cross network boundaries. We present an architecture that embodies these principles for attack management. The attack management system described engages a suite of detection tools for various networking domains, feeding real-time attack data to a comprehensive modeling, analysis and visualization subsystem. The resulting early warning system not only provides network administrators with a heads-up cockpit display of their entire network, but also supports guided response and predictive capabilities for multi-stage attacks in converged networks.
Proteomics and Systems Biology: Current and Future Applications in the Nutritional Sciences1
Moore, J. Bernadette; Weeks, Mark E.
2011-01-01
In the last decade, advances in genomics, proteomics, and metabolomics have yielded large-scale datasets that have driven an interest in global analyses, with the objective of understanding biological systems as a whole. Systems biology integrates computational modeling and experimental biology to predict and characterize the dynamic properties of biological systems, which are viewed as complex signaling networks. Whereas the systems analysis of disease-perturbed networks holds promise for identification of drug targets for therapy, equally the identified critical network nodes may be targeted through nutritional intervention in either a preventative or therapeutic fashion. As such, in the context of the nutritional sciences, it is envisioned that systems analysis of normal and nutrient-perturbed signaling networks in combination with knowledge of underlying genetic polymorphisms will lead to a future in which the health of individuals will be improved through predictive and preventative nutrition. Although high-throughput transcriptomic microarray data were initially most readily available and amenable to systems analysis, recent technological and methodological advances in MS have contributed to a linear increase in proteomic investigations. It is now commonplace for combined proteomic technologies to generate complex, multi-faceted datasets, and these will be the keystone of future systems biology research. This review will define systems biology, outline current proteomic methodologies, highlight successful applications of proteomics in nutrition research, and discuss the challenges for future applications of systems biology approaches in the nutritional sciences. PMID:22332076
Cellular neural network-based hybrid approach toward automatic image registration
NASA Astrophysics Data System (ADS)
Arun, Pattathal VijayaKumar; Katiyar, Sunil Kumar
2013-01-01
Image registration is a key component of various image processing operations that involve the analysis of different image data sets. Automatic image registration domains have witnessed the application of many intelligent methodologies over the past decade; however, the inability to properly model object shape and contextual information has limited the attainable accuracy. A framework for accurate feature shape modeling and adaptive resampling using advanced techniques such as vector machines, cellular neural network (CNN), scale invariant feature transform (SIFT), coreset, and cellular automata is proposed. CNN has been found to be effective in improving the feature matching and resampling stages of registration, and the complexity of the approach has been considerably reduced using coreset optimization. The salient features of this work are cellular neural network approach-based SIFT feature point optimization, adaptive resampling, and intelligent object modelling. The developed methodology has been compared with contemporary methods using different statistical measures. Investigations over various satellite images revealed that considerable success was achieved with the approach. This system has dynamically used spectral and spatial information for representing contextual knowledge using a CNN-Prolog approach. This methodology is also illustrated to be effective in providing intelligent interpretation and adaptive resampling.
Efficient Analysis of Complex Structures
NASA Technical Reports Server (NTRS)
Kapania, Rakesh K.
2000-01-01
The various accomplishments achieved during this project are: (1) A survey of Neural Network (NN) applications using the MATLAB NN Toolbox in structural engineering, especially on equivalent continuum models (Appendix A). (2) Application of NN and genetic algorithms (GAs) to simulate and synthesize substructures: 1-D and 2-D beam problems (Appendix B). (3) Development of an equivalent plate-model analysis method (EPA) for static and vibration analysis of general trapezoidal built-up wing structures composed of skins, spars and ribs, with calculation of a range of test cases and comparison with measurements or FEA results (Appendix C). (4) Basic work on using second-order sensitivities for simulating wing modal response, discussion of sensitivity evaluation approaches, and some results (Appendix D). (5) Establishing a general methodology of simulating the modal responses by direct application of NN and by sensitivity techniques, in a design space composed of a number of design points. Comparison is made through examples using these two methods (Appendix E). (6) Establishing a general methodology of efficient analysis of complex wing structures by indirect application of NN: the NN-aided Equivalent Plate Analysis. Training of the neural networks for this purpose in several cases of design spaces, which can be applicable to the actual design of complex wings (Appendix F).
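A minimal sketch of the surrogate idea behind items (5) and (6), assuming synthetic design variables and placeholder "frequencies" rather than the report's wing data or MATLAB NN Toolbox models:

    # Feed-forward network as a surrogate for modal response over a design space.
    # The design variables and the "physics" below are synthetic stand-ins.
    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    X = rng.uniform(0.5, 2.0, size=(300, 4))        # e.g. skin/spar/rib thicknesses, sweep
    Y = np.column_stack([                            # placeholder first three frequencies
        10 + 3 * X[:, 0] - 1.5 * X[:, 1] ** 2,
        25 + 2 * X[:, 1] * X[:, 2],
        40 - 4 * np.sin(X[:, 3]),
    ])

    X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, random_state=0)
    surrogate = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000, random_state=0)
    surrogate.fit(X_tr, Y_tr)
    print("R^2 on held-out design points:", surrogate.score(X_te, Y_te))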
Alamian, Golnoush; Hincapié, Ana-Sofía; Pascarella, Annalisa; Thiery, Thomas; Combrisson, Etienne; Saive, Anne-Lise; Martel, Véronique; Althukov, Dmitrii; Haesebaert, Frédéric; Jerbi, Karim
2017-09-01
Neuroimaging studies provide evidence of disturbed resting-state brain networks in Schizophrenia (SZ). However, untangling the neuronal mechanisms that subserve these baseline alterations requires measurement of their electrophysiological underpinnings. This systematic review specifically investigates the contributions of resting-state Magnetoencephalography (MEG) in elucidating abnormal neural organization in SZ patients. A systematic literature review of resting-state MEG studies in SZ was conducted. This literature is discussed in relation to findings from resting-state fMRI and EEG, as well as to task-based MEG research in the SZ population. Importantly, methodological limitations are considered and recommendations to overcome current limitations are proposed. Resting-state MEG literature in SZ points towards altered local and long-range oscillatory network dynamics in various frequency bands. Critical methodological challenges with respect to experiment design, data collection, and analysis need to be taken into consideration. Spontaneous MEG data show that local and global neural organization is altered in SZ patients. MEG is a highly promising tool to fill in knowledge gaps about the neurophysiology of SZ. However, to reach its fullest potential, basic methodological challenges need to be overcome. MEG-based resting-state power and connectivity findings could be great assets to clinical and translational research in psychiatry, and SZ in particular. Copyright © 2017 International Federation of Clinical Neurophysiology. Published by Elsevier B.V. All rights reserved.
Analysing efficiency of IPv6 packet transmission over 6LoWPAN network
NASA Astrophysics Data System (ADS)
Kozłowski, Adam; Sosnowski, Janusz
2017-08-01
Practical proliferation of the Internet of Things (IoT) concept depends upon communication efficiency in the related network. In the paper we outline basic features of wireless communication protocols used in IoT and concentrate on analysing communication overheads. In particular, we discuss the impact of IPv6 packet length on 6LoWPAN network operation with the physical and MAC layers defined by the IEEE 802.15.4 standard. The presented analysis methodology is useful in estimation of the total goodput (throughput at the application level) and energy consumption within the whole traffic model, which are crucial features of IoT networks.
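A back-of-envelope sketch of the kind of overhead estimate discussed above; the byte counts for MAC overhead and 6LoWPAN fragmentation headers are illustrative assumptions (they vary with addressing mode, security, and header compression), and the calculation ignores the 8-byte fragment-offset alignment rule:

    # Rough 6LoWPAN fragmentation and goodput estimate (not the paper's full model).
    PHY_PAYLOAD = 127      # IEEE 802.15.4 maximum PHY payload
    MAC_OVERHEAD = 23      # assumed MAC header + FCS with short addresses
    FRAG1_HDR, FRAGN_HDR = 4, 5   # 6LoWPAN fragmentation headers (assumed sizes)

    def fragments_and_goodput(ipv6_packet_len, app_payload_len):
        """Return (number of link-layer frames, application goodput fraction)."""
        room_first = PHY_PAYLOAD - MAC_OVERHEAD - FRAG1_HDR
        room_rest = PHY_PAYLOAD - MAC_OVERHEAD - FRAGN_HDR
        if ipv6_packet_len <= PHY_PAYLOAD - MAC_OVERHEAD:
            n_frames, on_air = 1, ipv6_packet_len + MAC_OVERHEAD
        else:
            remaining = ipv6_packet_len - room_first
            n_extra = -(-remaining // room_rest)          # ceiling division
            n_frames = 1 + n_extra
            on_air = (n_frames * MAC_OVERHEAD + FRAG1_HDR
                      + n_extra * FRAGN_HDR + ipv6_packet_len)
        return n_frames, app_payload_len / on_air

    # Example: 600-byte application payload + 8-byte UDP + 40-byte IPv6 headers.
    print(fragments_and_goodput(648, 600))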
Kamal, Noreen; Fels, Sidney
2013-01-01
Positive health behaviour is critical to preventing illness and managing chronic conditions. A user-centred methodology was employed to design an online social network to motivate health behaviour change. The methodology was augmented by utilizing the Appeal, Belonging, Commitment (ABC) Framework, which is based on theoretical models for health behaviour change and use of online social networks. The user-centred methodology included four phases: 1) initial user inquiry on health behaviour and use of online social networks; 2) interview feedback on paper prototypes; 3) laboratory study on a medium fidelity prototype; and 4) a field study on the high fidelity prototype. The points of inquiry through these phases were based on the ABC Framework. This yielded an online social network system that linked to external third party databases to deploy to users via an interactive website.
Ma, Chuang; Xin, Mingming; Feldmann, Kenneth A.; Wang, Xiangfeng
2014-01-01
Machine learning (ML) is an intelligent data mining technique that builds a prediction model based on the learning of prior knowledge to recognize patterns in large-scale data sets. We present an ML-based methodology for transcriptome analysis via comparison of gene coexpression networks, implemented as an R package called machine learning–based differential network analysis (mlDNA) and apply this method to reanalyze a set of abiotic stress expression data in Arabidopsis thaliana. The mlDNA first used a ML-based filtering process to remove nonexpressed, constitutively expressed, or non-stress-responsive “noninformative” genes prior to network construction, through learning the patterns of 32 expression characteristics of known stress-related genes. The retained “informative” genes were subsequently analyzed by ML-based network comparison to predict candidate stress-related genes showing expression and network differences between control and stress networks, based on 33 network topological characteristics. Comparative evaluation of the network-centric and gene-centric analytic methods showed that mlDNA substantially outperformed traditional statistical testing–based differential expression analysis at identifying stress-related genes, with markedly improved prediction accuracy. To experimentally validate the mlDNA predictions, we selected 89 candidates out of the 1784 predicted salt stress–related genes with available SALK T-DNA mutagenesis lines for phenotypic screening and identified two previously unreported genes, mutants of which showed salt-sensitive phenotypes. PMID:24520154
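A rough Python sketch of the two-stage idea (scikit-learn stands in for the mlDNA R package); the gene labels, expression characteristics, and network statistic below are synthetic placeholders, not the Arabidopsis data:

    # Stage 1: filter "noninformative" genes with a classifier trained on expression
    # characteristics of known stress-related genes. Stage 2 (toy): rank retained
    # genes by changes in a simple per-gene network statistic between conditions.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(1)
    n_genes, n_features = 2000, 32
    expr_feats = rng.normal(size=(n_genes, n_features))     # 32 expression characteristics
    known_stress = rng.choice(n_genes, 100, replace=False)  # indices of known stress genes
    labels = np.zeros(n_genes, dtype=int)
    labels[known_stress] = 1

    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(expr_feats, labels)
    informative = np.where(clf.predict_proba(expr_feats)[:, 1] > 0.5)[0]

    ctrl = rng.normal(size=(20, n_genes))                   # control expression matrix
    stress = rng.normal(size=(20, n_genes))                 # stress expression matrix
    deg = lambda m: np.abs(np.corrcoef(m, rowvar=False)).sum(axis=0) - 1  # weighted degree
    delta = np.abs(deg(stress)[informative] - deg(ctrl)[informative])
    candidates = informative[np.argsort(delta)[::-1][:20]]
    print("top candidate gene indices:", candidates)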
Naegle, Kristen M; Welsch, Roy E; Yaffe, Michael B; White, Forest M; Lauffenburger, Douglas A
2011-07-01
Advances in proteomic technologies continue to substantially accelerate capability for generating experimental data on protein levels, states, and activities in biological samples. For example, studies on receptor tyrosine kinase signaling networks can now capture the phosphorylation state of hundreds to thousands of proteins across multiple conditions. However, little is known about the function of many of these protein modifications, or the enzymes responsible for modifying them. To address this challenge, we have developed an approach that enhances the power of clustering techniques to infer functional and regulatory meaning of protein states in cell signaling networks. We have created a new computational framework for applying clustering to biological data in order to overcome the typical dependence on specific a priori assumptions and expert knowledge concerning the technical aspects of clustering. Multiple clustering analysis methodology ('MCAM') employs an array of diverse data transformations, distance metrics, set sizes, and clustering algorithms, in a combinatorial fashion, to create a suite of clustering sets. These sets are then evaluated based on their ability to produce biological insights through statistical enrichment of metadata relating to knowledge concerning protein functions, kinase substrates, and sequence motifs. We applied MCAM to a set of dynamic phosphorylation measurements of the ERBB network to explore the relationships between algorithmic parameters and the biological meaning that could be inferred and report on interesting biological predictions. Further, we applied MCAM to multiple phosphoproteomic datasets for the ERBB network, which allowed us to compare independent and incomplete overlapping measurements of phosphorylation sites in the network. We report specific and global differences of the ERBB network stimulated with different ligands and with changes in HER2 expression. Overall, we offer MCAM as a broadly-applicable approach for analysis of proteomic data which may help increase the current understanding of molecular networks in a variety of biological problems. © 2011 Naegle et al.
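The combinatorial spirit of MCAM can be sketched as follows; note that the real method scores clustering sets by enrichment of biological metadata, whereas this toy uses a silhouette score purely as a placeholder, and the data are synthetic:

    # Sweep data transformations, cluster numbers, and algorithms, then score each
    # resulting clustering of a synthetic "phosphosite dynamics" matrix.
    import numpy as np
    from sklearn.cluster import KMeans, AgglomerativeClustering
    from sklearn.metrics import silhouette_score

    rng = np.random.default_rng(2)
    X = rng.normal(size=(200, 12))                 # 200 sites over 12 conditions (synthetic)

    transforms = {"raw": lambda m: m,
                  "log": lambda m: np.log1p(m - m.min() + 1e-9),
                  "zscore": lambda m: (m - m.mean(0)) / m.std(0)}
    results = []
    for tname, tf in transforms.items():
        Xt = tf(X)
        for k in (3, 5, 8):
            algos = {"kmeans": KMeans(n_clusters=k, n_init=10, random_state=0),
                     "ward": AgglomerativeClustering(n_clusters=k)}
            for aname, algo in algos.items():
                labels = algo.fit_predict(Xt)
                results.append((silhouette_score(Xt, labels), tname, k, aname))

    for score, tname, k, aname in sorted(results, reverse=True)[:5]:
        print(f"{aname:7s} k={k} transform={tname:6s} silhouette={score:.3f}")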
Revealing the hidden language of complex networks.
Yaveroğlu, Ömer Nebil; Malod-Dognin, Noël; Davis, Darren; Levnajic, Zoran; Janjic, Vuk; Karapandza, Rasa; Stojmirovic, Aleksandar; Pržulj, Nataša
2014-04-01
Sophisticated methods for analysing complex networks promise to be of great benefit to almost all scientific disciplines, yet they elude us. In this work, we make fundamental methodological advances to rectify this. We discover that the interaction between a small number of roles, played by nodes in a network, can characterize a network's structure and also provide a clear real-world interpretation. Given this insight, we develop a framework for analysing and comparing networks, which outperforms all existing ones. We demonstrate its strength by uncovering novel relationships between seemingly unrelated networks, such as Facebook, metabolic, and protein structure networks. We also use it to track the dynamics of the world trade network, showing that a country's role as a broker between non-trading countries indicates economic prosperity, whereas peripheral roles are associated with poverty. This result, though intuitive, has escaped all existing frameworks. Finally, our approach translates network topology into everyday language, bringing network analysis closer to domain scientists.
NASA Astrophysics Data System (ADS)
Wang, Ting; Plecháč, Petr
2017-12-01
Stochastic reaction networks that exhibit bistable behavior are common in systems biology, materials science, and catalysis. Sampling of stationary distributions is crucial for understanding and characterizing the long-time dynamics of bistable stochastic dynamical systems. However, simulations are often hindered by the insufficient sampling of rare transitions between the two metastable regions. In this paper, we apply the parallel replica method for a continuous time Markov chain in order to improve sampling of the stationary distribution in bistable stochastic reaction networks. The proposed method uses parallel computing to accelerate the sampling of rare transitions. Furthermore, it can be combined with the path-space information bounds for parametric sensitivity analysis. With the proposed methodology, we study three bistable biological networks: the Schlögl model, the genetic switch network, and the enzymatic futile cycle network. We demonstrate the algorithmic speedup achieved in these numerical benchmarks. More significant acceleration is expected when multi-core or graphics processing unit computer architectures and programming tools such as CUDA are employed.
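For orientation, a plain Gillespie simulation of the Schlögl model (without the parallel-replica acceleration discussed above) is sketched below; the rate constants are commonly used illustrative values, not necessarily those of the paper:

    # Gillespie SSA for the Schlögl model, whose stationary distribution is bimodal.
    import random

    def schlogl_ssa(x0=250, t_end=10.0, seed=0,
                    c1=3e-7, c2=1e-4, c3=1e-3, c4=3.5, n1=1e5, n2=2e5):
        rng = random.Random(seed)
        x, t = x0, 0.0
        while t < t_end:
            a1 = c1 / 2 * n1 * x * (x - 1)        # B1 + 2X -> 3X
            a2 = c2 / 6 * x * (x - 1) * (x - 2)   # 3X -> B1 + 2X
            a3 = c3 * n2                          # B2 -> X
            a4 = c4 * x                           # X -> B2
            a0 = a1 + a2 + a3 + a4
            if a0 == 0:
                break
            t += rng.expovariate(a0)              # time to next reaction
            r = rng.random() * a0                 # pick which reaction fires
            if r < a1:
                x += 1
            elif r < a1 + a2:
                x -= 1
            elif r < a1 + a2 + a3:
                x += 1
            else:
                x -= 1
        return x

    ends = sorted(schlogl_ssa(seed=s) for s in range(12))
    print("end-of-run copy numbers (typically split into two clusters):", ends)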
Biffi, E; Menegon, A; Regalia, G; Maida, S; Ferrigno, G; Pedrocchi, A
2011-08-15
Modern drug discovery for Central Nervous System pathologies has recently focused its attention on in vitro neuronal networks as models for the study of neuronal activities. Micro Electrode Arrays (MEAs), a widely recognized tool for pharmacological investigations, enable the simultaneous study of the spiking activity of discrete regions of a neuronal culture, providing an insight into the dynamics of networks. Taking advantage of MEA features and making the most of cross-correlation analysis to assess internal parameters of a neuronal system, we provide an efficient method for the evaluation of comprehensive neuronal network activity. We developed an intra-network burst correlation algorithm, we evaluated its sensitivity and we explored its potential use in pharmacological studies. Our results demonstrate the high sensitivity of this algorithm and the efficacy of this methodology in pharmacological dose-response studies, with the advantage of analyzing the effect of drugs on the comprehensive correlative properties of integrated neuronal networks. Copyright © 2011 Elsevier B.V. All rights reserved.
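A minimal sketch of the pairwise quantity underlying such analyses, a binned cross-correlogram between two spike trains; this is not the authors' intra-network burst correlation algorithm, and the spike times are synthetic:

    # Binned cross-correlation between two spike trains (train b lags train a by ~10 ms).
    import numpy as np

    rng = np.random.default_rng(3)
    duration, bin_s = 60.0, 0.005                 # 60 s recording, 5 ms bins
    spikes_a = np.sort(rng.uniform(0, duration, 600))
    spikes_b = np.clip(spikes_a + rng.normal(0.01, 0.004, spikes_a.size), 0, duration)

    edges = np.arange(0, duration + bin_s, bin_s)
    a, _ = np.histogram(spikes_a, edges)
    b, _ = np.histogram(spikes_b, edges)

    max_lag_bins = 20
    lags = np.arange(-max_lag_bins, max_lag_bins + 1)
    az, bz = a - a.mean(), b - b.mean()
    cc = np.array([np.dot(az[max(0, -lag):len(az) - max(0, lag)],
                          bz[max(0, lag):len(bz) - max(0, -lag)]) for lag in lags])
    cc = cc / (np.std(a) * np.std(b) * len(a))     # approximate normalization
    print("peak correlation at lag (ms):", lags[cc.argmax()] * bin_s * 1000)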
Lessons from social network analyses for behavioral medicine.
Rosenquist, James N
2011-03-01
This study presents an overview of the rapidly expanding field of social network analysis, with an emphasis placed on work relevant to behavioral health clinicians and researchers. I outline how social network analysis is a distinct empirical methodology within the social sciences that has the potential to deepen our understanding of how mental health and addiction are influenced by social environmental factors. Whereas there have been a number of recent studies in the mental health literature that discuss social influences on mental illness and addiction, and a number of studies looking at how social networks influence health and behaviors, there are still relatively few studies that combine the two. Those that have suggest that mood symptoms as well as alcohol consumption are clustered within, and may travel along, social networks. Social networks appear to have an important influence on a variety of mental health conditions. This avenue of research has the potential to influence both clinical practice and public policy.
Haile, Sarah R; Guerra, Beniamino; Soriano, Joan B; Puhan, Milo A
2017-12-21
Prediction models and prognostic scores have been increasingly popular in both clinical practice and clinical research settings, for example to aid in risk-based decision making or control for confounding. In many medical fields, a large number of prognostic scores are available, but practitioners may find it difficult to choose between them due to lack of external validation as well as lack of comparisons between them. Borrowing methodology from network meta-analysis, we describe an approach to Multiple Score Comparison meta-analysis (MSC) which permits concurrent external validation and comparisons of prognostic scores using individual patient data (IPD) arising from a large-scale international collaboration. We describe the challenges in adapting network meta-analysis to the MSC setting, for instance the need to explicitly include correlations between the scores on a cohort level, and how to deal with many multi-score studies. We propose first using IPD to make cohort-level aggregate discrimination or calibration scores, comparing all to a common comparator. Then, standard network meta-analysis techniques can be applied, taking care to consider correlation structures in cohorts with multiple scores. Transitivity, consistency and heterogeneity are also examined. We provide a clinical application, comparing prognostic scores for 3-year mortality in patients with chronic obstructive pulmonary disease using data from a large-scale collaborative initiative. We focus on the discriminative properties of the prognostic scores. Our results show clear differences in performance, with ADO and eBODE showing higher discrimination with respect to mortality than other considered scores. The assumptions of transitivity and local and global consistency were not violated. Heterogeneity was small. We applied a network meta-analytic methodology to externally validate and concurrently compare the prognostic properties of clinical scores. Our large-scale external validation indicates that the scores with the best discriminative properties to predict 3 year mortality in patients with COPD are ADO and eBODE.
Saramago, Pedro; Woods, Beth; Weatherly, Helen; Manca, Andrea; Sculpher, Mark; Khan, Kamran; Vickers, Andrew J; MacPherson, Hugh
2016-10-06
Network meta-analysis methods, which are an extension of the standard pair-wise synthesis framework, allow for the simultaneous comparison of multiple interventions and consideration of the entire body of evidence in a single statistical model. There are well-established advantages to using individual patient data to perform network meta-analysis and methods for network meta-analysis of individual patient data have already been developed for dichotomous and time-to-event data. This paper describes appropriate methods for the network meta-analysis of individual patient data on continuous outcomes. This paper introduces and describes network meta-analysis of individual patient data models for continuous outcomes using the analysis of covariance framework. Comparisons are made between this approach and change score and final score only approaches, which are frequently used and have been proposed in the methodological literature. A motivating example on the effectiveness of acupuncture for chronic pain is used to demonstrate the methods. Individual patient data on 28 randomised controlled trials were synthesised. Consistency of endpoints across the evidence base was obtained through standardisation and mapping exercises. Individual patient data availability avoided the use of non-baseline-adjusted models, allowing instead for analysis of covariance models to be applied and thus improving the precision of treatment effect estimates while adjusting for baseline imbalance. The network meta-analysis of individual patient data using the analysis of covariance approach is advocated to be the most appropriate modelling approach for network meta-analysis of continuous outcomes, particularly in the presence of baseline imbalance. Further methods developments are required to address the challenge of analysing aggregate level data in the presence of baseline imbalance.
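A single-trial sketch of the ANCOVA building block (with the change-score model shown for comparison), using simulated data rather than the acupuncture IPD; the full method additionally pools such trial-level effects across a network of treatments:

    # ANCOVA (final score adjusted for baseline) versus change-score analysis.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(4)
    n = 200
    baseline = rng.normal(50, 10, n)
    treat = rng.integers(0, 2, n)                       # 0 = control, 1 = active
    final = 0.8 * baseline - 5.0 * treat + rng.normal(0, 8, n)
    df = pd.DataFrame({"final": final, "baseline": baseline, "treat": treat})

    ancova = smf.ols("final ~ baseline + C(treat)", data=df).fit()
    change = smf.ols("I(final - baseline) ~ C(treat)", data=df).fit()
    print("ANCOVA treatment effect:       %.2f (SE %.2f)"
          % (ancova.params["C(treat)[T.1]"], ancova.bse["C(treat)[T.1]"]))
    print("Change-score treatment effect: %.2f (SE %.2f)"
          % (change.params["C(treat)[T.1]"], change.bse["C(treat)[T.1]"]))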
Review of Recent Methodological Developments in Group-Randomized Trials: Part 2-Analysis.
Turner, Elizabeth L; Prague, Melanie; Gallis, John A; Li, Fan; Murray, David M
2017-07-01
In 2004, Murray et al. reviewed methodological developments in the design and analysis of group-randomized trials (GRTs). We have updated that review with developments in analysis of the past 13 years, with a companion article to focus on developments in design. We discuss developments in the topics of the earlier review (e.g., methods for parallel-arm GRTs, individually randomized group-treatment trials, and missing data) and in new topics, including methods to account for multiple-level clustering and alternative estimation methods (e.g., augmented generalized estimating equations, targeted maximum likelihood, and quadratic inference functions). In addition, we describe developments in analysis of alternative group designs (including stepped-wedge GRTs, network-randomized trials, and pseudocluster randomized trials), which require clustering to be accounted for in their design and analysis.
NASA Technical Reports Server (NTRS)
Paul, Arthur S.; Gill, Tepper L.; Maclin, Arlene P.
1989-01-01
A study of NASA's Systems Management Policy (SMP) concluded that the primary methodology being used by the Mission Operations and Data Systems Directorate and its subordinate, the Networks Division, is very effective. Still some unmet needs were identified. This study involved evaluating methodologies, tools, and techniques with the potential for resolving the previously identified deficiencies. Six preselected methodologies being used by other organizations with similar development problems were studied. The study revealed a wide range of significant differences in structure. Each system had some strengths but none will satisfy all of the needs of the Networks Division. Areas for improvement of the methodology being used by the Networks Division are listed with recommendations for specific action.
Cognitive task analysis of network analysts and managers for network situational awareness
NASA Astrophysics Data System (ADS)
Erbacher, Robert F.; Frincke, Deborah A.; Wong, Pak Chung; Moody, Sarah; Fink, Glenn
2010-01-01
The goal of our project is to create a set of next-generation cyber situational-awareness capabilities with applications to other domains in the long term. The situational-awareness capabilities being developed focus on novel visualization techniques as well as data analysis techniques designed to improve the comprehensibility of the visualizations. The objective is to improve the decision-making process to enable decision makers to choose better actions. To this end, we put extensive effort into ensuring we had feedback from network analysts and managers and understanding what their needs truly are. This paper discusses the cognitive task analysis methodology we followed to acquire feedback from the analysts. This paper also provides the details we acquired from the analysts on their processes, goals, concerns, etc. A final result we describe is the generation of a task-flow diagram.
Quantifiable and objective approach to organizational performance enhancement.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scholand, Andrew Joseph; Tausczik, Yla R.
This report describes a new methodology, social language network analysis (SLNA), that combines tools from social language processing and network analysis to identify socially situated relationships between individuals which, though subtle, are highly influential. Specifically, SLNA aims to identify and characterize the nature of working relationships by processing artifacts generated with computer-mediated communication systems, such as instant message texts or emails. Because social language processing is able to identify psychological, social, and emotional processes that individuals are not able to fully mask, social language network analysis can clarify and highlight complex interdependencies between group members, even when these relationships are latent or unrecognized. This report outlines the philosophical antecedents of SLNA, the mechanics of preprocessing, processing, and post-processing stages, and some example results obtained by applying this approach to a 15-month corporate discussion archive.
The Aeronautical Data Link: Taxonomy, Architectural Analysis, and Optimization
NASA Technical Reports Server (NTRS)
Morris, A. Terry; Goode, Plesent W.
2002-01-01
The future Communication, Navigation, and Surveillance/Air Traffic Management (CNS/ATM) System will rely on global satellite navigation, and ground-based and satellite based communications via Multi-Protocol Networks (e.g. combined Aeronautical Telecommunications Network (ATN)/Internet Protocol (IP)) to bring about needed improvements in efficiency and safety of operations to meet increasing levels of air traffic. This paper will discuss the development of an approach that completely describes optimal data link architecture configuration and behavior to meet the multiple conflicting objectives of concurrent and different operations functions. The practical application of the approach enables the design and assessment of configurations relative to airspace operations phases. The approach includes a formal taxonomic classification, an architectural analysis methodology, and optimization techniques. The formal taxonomic classification provides a multidimensional correlation of data link performance with data link service, information protocol, spectrum, and technology mode; and to flight operations phase and environment. The architectural analysis methodology assesses the impact of a specific architecture configuration and behavior on the local ATM system performance. Deterministic and stochastic optimization techniques maximize architectural design effectiveness while addressing operational, technology, and policy constraints.
Vein matching using artificial neural network in vein authentication systems
NASA Astrophysics Data System (ADS)
Noori Hoshyar, Azadeh; Sulaiman, Riza
2011-10-01
Personal identification technology for security systems is developing rapidly. Traditional authentication modes like keys, passwords, and cards are not safe enough because they can be stolen or easily forgotten. Biometrics, as a developed technology, has been applied to a wide range of systems. According to different researchers, the vein biometric is a good candidate among other biometric traits such as fingerprint, hand geometry, voice, and DNA for authentication systems. Vein authentication systems can be designed by different methodologies. All of these methodologies include a matching stage, which is critical for the final verification of the system. A neural network is an effective methodology for matching and recognizing individuals in authentication systems. Therefore, this paper explains and implements a neural network methodology for a finger vein authentication system. The neural network is trained in MATLAB to match the vein features of the authentication system. The network simulation shows a matching quality of 95%, which is good performance for authentication system matching.
Approaching human language with complex networks
NASA Astrophysics Data System (ADS)
Cong, Jin; Liu, Haitao
2014-12-01
The interest in modeling and analyzing human language with complex networks has been on the rise in recent years, and a considerable body of research in this area has already accumulated. We survey three major lines of linguistic research from the complex network approach: 1) characterization of human language as a multi-level system with complex network analysis; 2) linguistic typological research with the application of linguistic networks and their quantitative measures; and 3) relationships between the system-level complexity of human language (determined by the topology of linguistic networks) and microscopic linguistic (e.g., syntactic) features (as the traditional concern of linguistics). We show that the models and quantitative tools of complex networks, when exploited properly, can constitute an operational methodology for linguistic inquiry, which contributes to the understanding of human language and the development of linguistics. We conclude our review with suggestions for future linguistic research from the complex network approach: 1) relationships between the system-level complexity of human language and microscopic linguistic features; 2) expansion of research scope from the global properties to other levels of granularity of linguistic networks; and 3) combination of linguistic network analysis with other quantitative studies of language (such as quantitative linguistics).
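A toy illustration of the first line of research, building a word co-occurrence network and computing two commonly reported global measures; the three-sentence "corpus" is a placeholder for the large text collections used in practice:

    # Word co-occurrence network (adjacent words linked) with basic global measures.
    import networkx as nx

    corpus = [
        "complex networks provide an operational methodology for linguistic inquiry",
        "linguistic networks model human language as a multi level system",
        "the topology of linguistic networks reflects syntactic features of language",
    ]

    G = nx.Graph()
    for sentence in corpus:
        words = sentence.split()
        G.add_edges_from(zip(words, words[1:]))       # link adjacent words

    print("nodes:", G.number_of_nodes(), "edges:", G.number_of_edges())
    print("average clustering:", round(nx.average_clustering(G), 3))
    if nx.is_connected(G):
        print("average shortest path:", round(nx.average_shortest_path_length(G), 3))
    print("top hub words:", sorted(G.degree, key=lambda kv: kv[1], reverse=True)[:5])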
Munksgaard, Rasmus; Demant, Jakob; Branwen, Gwern
2016-09-01
The development of cryptomarkets has gained increasing attention from academics, including a growing scientific literature on the distribution of illegal goods using cryptomarkets. Dolliver's 2015 article "Evaluating drug trafficking on the Tor Network: Silk Road 2, the Sequel" addresses this theme by evaluating drug trafficking on one of the most well-known cryptomarkets, Silk Road 2.0. The research on cryptomarkets in general, and particularly Dolliver's article, poses a number of new questions for methodologies. This commentary is structured around a replication of Dolliver's original study. The replication study is not based on Dolliver's original dataset, but on a second dataset collected applying the same methodology. We have found that the results produced by Dolliver differ greatly from our replicated study. While a margin of error is to be expected, the inconsistencies we found are too great to attribute to anything other than methodological issues. The analysis and conclusions drawn from studies using these methods are promising and insightful. However, based on the replication of Dolliver's study, we suggest that researchers using these methodologies take this into consideration, that datasets be made available to other researchers, and that methodology and dataset metrics (e.g. number of downloaded pages, error logs) be described thoroughly in the context of webometrics and web crawling. Copyright © 2016 Elsevier B.V. All rights reserved.
Gis-Based Accessibility Analysis of Urban Emergency Shelters: the Case of Adana City
NASA Astrophysics Data System (ADS)
Unal, M.; Uslu, C.
2016-10-01
Accessibility analysis of urban emergency shelters can help support urban disaster prevention planning. Pre-disaster emergency evacuation zoning has become a significant topic in disaster prevention and mitigation research. In this study, we assessed the level of serviceability of urban emergency shelters in terms of maximum capacity, usability, sufficiency, and a certain walking time limit by employing the spatial analysis techniques of GIS-Network Analyst. The methodology included the following aspects: the distribution analysis of emergency evacuation demands, the calculation of shelter space accessibility, and the optimization of evacuation destinations. This methodology was applied to Adana, a city in Turkey, which is located within the Alpine-Himalayan orogenic system, the second major earthquake belt after the Pacific Belt. It was found that the proposed methodology could help in understanding the spatial distribution of urban emergency shelters more accurately and in establishing effective future urban disaster prevention planning. Additionally, this research provided a feasible way of supporting emergency management in terms of shelter construction, pre-disaster evacuation drills and rescue operations.
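A simplified stand-in for the GIS-Network Analyst service-area step, using a Dijkstra search with a travel-time cutoff on an assumed grid street network and walking speed:

    # Nodes of a street network reachable from a shelter within a walking-time limit.
    import networkx as nx

    G = nx.grid_2d_graph(30, 30)                 # toy street network, 100 m link spacing
    walk_speed_m_per_min = 80.0                  # assumed walking speed
    for u, v in G.edges:
        G.edges[u, v]["minutes"] = 100.0 / walk_speed_m_per_min

    shelter = (15, 15)
    limit_min = 10.0
    times = nx.single_source_dijkstra_path_length(G, shelter,
                                                   cutoff=limit_min, weight="minutes")
    print(f"{len(times)} of {G.number_of_nodes()} nodes within {limit_min:.0f} min walk")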
Social networks of patients with chronic skin lesions: nursing care.
Bandeira, Luciana Alves; Santos, Maxuel Cruz Dos; Duarte, Êrica Rosalba Mallmann; Bandeira, Andrea Gonçalves; Riquinho, Deise Lisboa; Vieira, Letícia Becker
2018-01-01
To describe the social networks of patients with chronic skin lesions. A qualitative study conducted through semi-structured interviews with nine subjects with chronic skin lesions from June 2016 to March 2017; we used the theoretical-methodological framework of Lia Sanicola's Social Network. The analysis of the relational maps revealed that the primary network was formed mainly by relatives and neighbors; its characteristics, such as reduced size, low density, and few exchanges/relationships, indicate the fragility of these links. The secondary network was essentially described by health services, and the nurse was cited as a linker in the therapeutic process. Faced with the fragility of the links and social isolation, primary health care professionals are a fundamental foundation for the construction of networks of social support and care for patients with chronic skin lesions.
Symptoms of posttraumatic stress disorder in a clinical sample of refugees: a network analysis
Spiller, Tobias R.; Schick, Matthis; Schnyder, Ulrich; Bryant, Richard A.; Nickerson, Angela; Morina, Naser
2017-01-01
Background: Network analysis is an emerging methodology for investigating psychopathological symptoms. Given the unprecedented number of refugees and the increased prevalence of mental disorders such as posttraumatic stress disorder (PTSD) in this population, new methodologies that help us better understand psychopathology in refugees are crucial. Objective: The objective of this study was to explore the network structure and centrality indices of DSM-5 PTSD symptoms in a cross-sectional clinical sample of 151 severely traumatized refugees with and without a formal PTSD diagnosis. Method: The R-packages qgraph and bootnet were used to estimate the structure of a PTSD symptom network and its centrality indices. In addition, robustness and significance analyses for the edge weights and the order of centrality were performed. Results: Three pairs of symptoms showed significantly stronger connections than at least half of the other connections: hypervigilance and exaggerated startle response, intrusion and difficulties falling asleep, and irritability or outbursts of anger and self-destructive or reckless behaviour. Emotional cue reactivity had the highest centrality and trauma-related amnesia the lowest. Conclusion: Although only 51.0% of participants fulfilled criteria for a probable PTSD diagnosis, emotional cue reactivity showed the highest centrality, emphasizing the importance of emotional trauma reminders in severely traumatized refugees attending an outpatient clinic. However, due to the small sample size, the results should be interpreted with care. PMID:29038688
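The study used the R packages qgraph and bootnet; the sketch below shows the same idea in Python, estimating a regularized partial-correlation (Gaussian graphical) network and a strength-centrality index on simulated symptom scores:

    # Regularized partial-correlation symptom network with strength centrality.
    import numpy as np
    from sklearn.covariance import GraphicalLassoCV

    rng = np.random.default_rng(5)
    n_obs, n_symptoms = 151, 20
    latent = rng.normal(size=(n_obs, 4))
    loadings = rng.normal(scale=0.7, size=(4, n_symptoms))
    X = latent @ loadings + rng.normal(size=(n_obs, n_symptoms))   # simulated item scores

    model = GraphicalLassoCV().fit(X)
    precision = model.precision_
    d = np.sqrt(np.diag(precision))
    partial_corr = -precision / np.outer(d, d)      # partial correlations from precision
    np.fill_diagonal(partial_corr, 0.0)

    strength = np.abs(partial_corr).sum(axis=0)     # strength centrality per symptom
    print("most central simulated symptom index:", int(strength.argmax()))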
Toroody, Ahmad Bahoo; Abaei, Mohammad Mahdy; Gholamnia, Reza
2016-12-01
Risk assessment can be classified into two broad categories: traditional and modern. This paper is aimed at contrasting the functional resonance analysis method (FRAM), as a modern approach, with fault tree analysis (FTA), as a traditional method, with respect to assessing the risks of a complex system. The methodology by which the risk assessment is carried out is presented for each approach. Also, a FRAM network is constructed with regard to the nonlinear interaction of human and organizational levels to assess the safety of technological systems. The methodology is implemented for the lifting of structures in deep offshore operations. The main finding of this paper is that the combined application of FTA and FRAM during risk assessment could provide complementary perspectives and may contribute to a more comprehensive understanding of an incident. Finally, it is shown that coupling a FRAM network with a suitable quantitative method will result in a plausible outcome for a predefined accident scenario.
Participatory Design Methods for C2 Systems (Proceedings/Presentation)
2006-01-01
Cognitive Task Analysis (CTA) ... systems to support cognitive work such as is accomplished in a network-centric environment. Cognitive task analysis (CTA) methods are used to ... of cognitive task analysis methodologies exist (Schraagen et al., 2000). However, many of these methods are skeptically viewed by a domain's
Concepts of Connectivity and Human Epileptic Activity
Lemieux, Louis; Daunizeau, Jean; Walker, Matthew C.
2011-01-01
This review attempts to place the concept of connectivity from increasingly sophisticated neuroimaging data analysis methodologies within the field of epilepsy research. We introduce the more principled connectivity terminology developed recently in neuroimaging and review some of the key concepts related to the characterization of propagation of epileptic activity using what may be called traditional correlation-based studies based on EEG. We then show how essentially similar methodologies, and more recently models addressing causality, have been used to characterize whole-brain and regional networks using functional MRI data. Following a discussion of our current understanding of the neuronal system aspects of the onset and propagation of epileptic discharges and seizures, we discuss the most advanced and ambitious framework to attempt to fully characterize epileptic networks based on neuroimaging data. PMID:21472027
Investigating System Dependability Modeling Using AADL
NASA Technical Reports Server (NTRS)
Hall, Brendan; Driscoll, Kevin R.; Madl, Gabor
2013-01-01
This report describes Architecture Analysis & Design Language (AADL) models for a diverse set of fault-tolerant, embedded data networks and describes the methods and tools used to create these models. It also includes error models per the AADL Error Annex. Some networks were modeled using Error Detection Isolation Containment Types (EDICT). This report gives a brief description of each of the networks, a description of its modeling, the model itself, and evaluations of the tools used for creating the models. The methodology includes a naming convention that supports a systematic way to enumerate all of the potential failure modes.
Mapping Learning and Game Mechanics for Serious Games Analysis
ERIC Educational Resources Information Center
Arnab, Sylvester; Lim, Theodore; Carvalho, Maira B.; Bellotti, Francesco; de Freitas, Sara; Louchart, Sandy; Suttie, Neil; Berta, Riccardo; De Gloria, Alessandro
2015-01-01
Although there is a consensus on the instructional potential of Serious Games (SGs), there is still a lack of methodologies and tools not only for design but also to support analysis and assessment. Filling this gap is one of the main aims of the Games and Learning Alliance (http://www.galanoe.eu) European Network of Excellence on Serious Games,…
Fat fractal scaling of drainage networks from a random spatial network model
Karlinger, Michael R.; Troutman, Brent M.
1992-01-01
An alternative quantification of the scaling properties of river channel networks is explored using a spatial network model. Whereas scaling descriptions of drainage networks previously have been presented using a fractal analysis primarily of the channel lengths, we illustrate the scaling of the surface area of the channels defining the network pattern with an exponent which is independent of the fractal dimension but not of the fractal nature of the network. The methodology presented is a fat fractal analysis in which the drainage basin minus the channel area is considered the fat fractal. Random channel networks within a fixed basin area are generated on grids of different scales. The sample channel networks generated by the model have a common outlet of fixed width and a rule of upstream channel narrowing specified by a diameter branching exponent using hydraulic and geomorphologic principles. Scaling exponents are computed for each sample network on a given grid size and are regressed against network magnitude. Results indicate that the sizes of the exponents are related to the magnitude of the networks and generally decrease as network magnitude increases. Cases showing differences in scaling exponents with like magnitudes suggest a direction of future work regarding other topologic basin characteristics as potential explanatory variables.
Visualizing the development of nanomedicine in Mexico.
Robles-Belmont, Eduardo; Gortari-Rabiela, Rebeca de; Galarza-Barrios, Pilar; Siqueiros-García, Jesús Mario; Ruiz-León, Alejandro Arnulfo
2017-01-01
In this article we present a set of different visualizations of Mexico's nanomedicine scientific production data. Visualizations were developed using different methodologies for data analysis and visualization, such as social network analysis, geography of science maps, and complex network community analysis. The results are a multi-dimensional overview of the evolution of nanomedicine in Mexico. Moreover, the visualizations allowed us to identify trends and patterns of collaboration at the national and international level. Trends are also found in the knowledge structure of themes and disciplines. Finally, we identified the scientific communities in Mexico that are responsible for the new knowledge production in this emergent field of science. Copyright: © 2017 Secretaría de Salud
NASA Astrophysics Data System (ADS)
Dai, H.; Chen, X.; Ye, M.; Song, X.; Zachara, J. M.
2016-12-01
Sensitivity analysis has been an important tool in groundwater modeling to identify the influential parameters. Among various sensitivity analysis methods, the variance-based global sensitivity analysis has gained popularity for its model independence characteristic and capability of providing accurate sensitivity measurements. However, the conventional variance-based method only considers uncertainty contribution of single model parameters. In this research, we extended the variance-based method to consider more uncertainty sources and developed a new framework to allow flexible combinations of different uncertainty components. We decompose the uncertainty sources into a hierarchical three-layer structure: scenario, model and parametric. Furthermore, each layer of uncertainty source is capable of containing multiple components. An uncertainty and sensitivity analysis framework was then constructed following this three-layer structure using Bayesian network. Different uncertainty components are represented as uncertain nodes in this network. Through the framework, variance-based sensitivity analysis can be implemented with great flexibility of using different grouping strategies for uncertainty components. The variance-based sensitivity analysis thus is improved to be able to investigate the importance of an extended range of uncertainty sources: scenario, model, and other different combinations of uncertainty components which can represent certain key model system processes (e.g., groundwater recharge process, flow reactive transport process). For test and demonstration purposes, the developed methodology was implemented into a test case of real-world groundwater reactive transport modeling with various uncertainty sources. The results demonstrate that the new sensitivity analysis method is able to estimate accurate importance measurements for any uncertainty sources which were formed by different combinations of uncertainty components. The new methodology can provide useful information for environmental management and decision-makers to formulate policies and strategies.
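A minimal Monte Carlo estimator of first-order variance-based (Sobol') indices on a toy three-parameter function, illustrating the basic quantity that the hierarchical framework above aggregates over scenario, model, and parameter layers; this is not the authors' implementation:

    # First-order Sobol' indices via the Saltelli (2010) estimator on a toy model.
    import numpy as np

    def model(x):                                   # toy stand-in for a groundwater model
        return np.sin(x[:, 0]) + 5.0 * x[:, 1] ** 2 + 0.1 * x[:, 2]

    rng = np.random.default_rng(6)
    n, k = 100_000, 3
    A = rng.uniform(-1, 1, (n, k))
    B = rng.uniform(-1, 1, (n, k))
    fA, fB = model(A), model(B)
    var_y = np.var(np.concatenate([fA, fB]))

    for i in range(k):
        ABi = A.copy()
        ABi[:, i] = B[:, i]                         # replace column i with the B sample
        S1 = np.mean(fB * (model(ABi) - fA)) / var_y
        print(f"first-order index for parameter {i}: {S1:.3f}")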
Parsing Social Network Survey Data from Hidden Populations Using Stochastic Context-Free Grammars
Poon, Art F. Y.; Brouwer, Kimberly C.; Strathdee, Steffanie A.; Firestone-Cruz, Michelle; Lozada, Remedios M.; Kosakovsky Pond, Sergei L.; Heckathorn, Douglas D.; Frost, Simon D. W.
2009-01-01
Background Human populations are structured by social networks, in which individuals tend to form relationships based on shared attributes. Certain attributes that are ambiguous, stigmatized or illegal can create a 'hidden' population, so-called because its members are difficult to identify. Many hidden populations are also at an elevated risk of exposure to infectious diseases. Consequently, public health agencies are presently adopting modern survey techniques that traverse social networks in hidden populations by soliciting individuals to recruit their peers, e.g., respondent-driven sampling (RDS). The concomitant accumulation of network-based epidemiological data, however, is rapidly outpacing the development of computational methods for analysis. Moreover, current analytical models rely on unrealistic assumptions, e.g., that the traversal of social networks can be modeled by a Markov chain rather than a branching process. Methodology/Principal Findings Here, we develop a new methodology based on stochastic context-free grammars (SCFGs), which are well-suited to modeling tree-like structure of the RDS recruitment process. We apply this methodology to an RDS case study of injection drug users (IDUs) in Tijuana, México, a hidden population at high risk of blood-borne and sexually-transmitted infections (i.e., HIV, hepatitis C virus, syphilis). Survey data were encoded as text strings that were parsed using our custom implementation of the inside-outside algorithm in a publicly-available software package (HyPhy), which uses either expectation maximization or direct optimization methods and permits constraints on model parameters for hypothesis testing. We identified significant latent variability in the recruitment process that violates assumptions of Markov chain-based methods for RDS analysis: firstly, IDUs tended to emulate the recruitment behavior of their own recruiter; and secondly, the recruitment of like peers (homophily) was dependent on the number of recruits. Conclusions SCFGs provide a rich probabilistic language that can articulate complex latent structure in survey data derived from the traversal of social networks. Such structure that has no representation in Markov chain-based models can interfere with the estimation of the composition of hidden populations if left unaccounted for, raising critical implications for the prevention and control of infectious disease epidemics. PMID:19738904
Assessing the Climate Resilience of Transport Infrastructure Investments in Tanzania
NASA Astrophysics Data System (ADS)
Hall, J. W.; Pant, R.; Koks, E.; Thacker, S.; Russell, T.
2017-12-01
Whilst there is an urgent need for infrastructure investment in developing countries, there is a risk that poorly planned and built infrastructure will introduce new vulnerabilities. As climate change increases the magnitudes and frequency of natural hazard events, incidences of disruptive infrastructure failures are likely to become more frequent. Therefore, it is important that infrastructure planning and investment is underpinned by climate risk assessment that can inform adaptation planning. Tanzania's rapid economic growth is placing considerable strain on the country's transportation infrastructure (roads, railways, shipping and aviation), especially at the port of Dar es Salaam and its linking transport corridors. A growing number of natural hazard events, in particular flooding, are impacting the reliability of this already over-used network. Here we report on a new methodology to analyse vulnerabilities and risks due to failures of key locations in the intermodal transport network of Tanzania, including strategic connectivity to neighboring countries. To perform the national-scale risk analysis we will utilize a system-of-systems methodology. The main components of this general risk assessment, when applied to transportation systems, include: (1) Assembling data on: spatially coherent extreme hazards and intermodal transportation networks; (2) Intersecting hazards with transport network models to initiate failure conditions that trigger failure propagation across interdependent networks; (3) Quantifying failure outcomes in terms of social impacts (customers/passengers disrupted) and/or macroeconomic consequences (across multiple sectors); and (4) Simulating, testing and collecting multiple failure scenarios to perform an exhaustive risk assessment in terms of probabilities and consequences. The methodology is being used to pinpoint vulnerability and reduce climate risks to transport infrastructure investments.
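Steps (2) and (3) in miniature, assuming a synthetic network and an arbitrary "flooded" edge set rather than the Tanzanian data:

    # Remove flooded links from a toy road network and count origin-destination pairs
    # that lose connectivity.
    import itertools
    import networkx as nx

    G = nx.random_geometric_graph(80, 0.18, seed=7)          # toy road network
    od_pairs = list(itertools.combinations(range(0, 80, 10), 2))

    flooded_edges = [e for i, e in enumerate(G.edges) if i % 9 == 0]   # assumed hazard footprint
    H = G.copy()
    H.remove_edges_from(flooded_edges)

    def disconnected(graph):
        return sum(not nx.has_path(graph, o, d) for o, d in od_pairs)

    print("OD pairs disconnected before flood:", disconnected(G))
    print("OD pairs disconnected after flood: ", disconnected(H))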
Unsupervised user similarity mining in GSM sensor networks.
Shad, Shafqat Ali; Chen, Enhong
2013-01-01
Mobility data has attracted researchers over the past few years because of its rich context and spatiotemporal nature, where this information can be used for potential applications like early warning systems, route prediction, traffic management, advertisement, social networking, and community finding. All the mentioned applications are based on mobility profile building and user trend analysis, where mobility profile building is done through significant places extraction, user's actual movement prediction, and context awareness. However, significant places extraction and user's actual movement prediction for mobility profile building are not trivial tasks. In this paper, we present a user similarity mining-based methodology through user mobility profile building, using the semantic tagging information provided by users and basic GSM network architecture properties, based on an unsupervised clustering approach. As the mobility information is in low-level raw form, our proposed methodology successfully converts it to high-level meaningful information by using the cell-Id location information rather than previously used location capturing methods like GPS, Infrared, and Wi-Fi for profile mining and user similarity mining.
Identifying Rodent Resting-State Brain Networks with Independent Component Analysis
Bajic, Dusica; Craig, Michael M.; Mongerson, Chandler R. L.; Borsook, David; Becerra, Lino
2017-01-01
Rodent models have opened the door to a better understanding of the neurobiology of brain disorders and increased our ability to evaluate novel treatments. Resting-state functional magnetic resonance imaging (rs-fMRI) allows for in vivo exploration of large-scale brain networks with high spatial resolution. Its application in rodents affords researchers a powerful translational tool to directly assess/explore the effects of various pharmacological, lesion, and/or disease states on known neural circuits within highly controlled settings. Integration of animal and human research at the molecular-, systems-, and behavioral-levels using diverse neuroimaging techniques empowers more robust interrogations of abnormal/pathological processes, critical for evolving our understanding of neuroscience. We present a comprehensive protocol to evaluate resting-state brain networks using Independent Component Analysis (ICA) in a rodent model. Specifically, we begin with a brief review of the physiological basis of the rs-fMRI technique and an overview of rs-fMRI studies in rodents to date, following which we provide a robust step-by-step approach for rs-fMRI investigation including data collection, computational preprocessing, and brain network analysis. Pipelines are interwoven with the underlying theory behind each step and summarized methodological considerations, such as alternative methods available and current consensus in the literature for optimal results. The presented protocol is designed in such a way that investigators without previous knowledge in the field can implement the analysis and obtain viable results that reliably detect significant differences in functional connectivity between experimental groups. Our goal is to empower researchers to implement rs-fMRI in their respective fields by incorporating technical considerations to date into a workable methodological framework. PMID:29311770
Ecological Network Analysis for a Low-Carbon and High-Tech Industrial Park
Lu, Yi; Su, Meirong; Liu, Gengyuan; Chen, Bin; Zhou, Shiyi; Jiang, Meiming
2012-01-01
The industrial sector is one of the major contributors to global warming. Although eco-industrial parks (EIPs) appear to be a promising step towards alleviating ecological crises, there is still a lack of definitional clarity and in-depth research on low-carbon industrial parks. In order to reveal the processes of carbon metabolism in a low-carbon high-tech industrial park, we selected the Beijing Development Area (BDA) International Business Park in Beijing, China as a case study, establishing a seven-compartment low-carbon metabolic network model based on the methodology of Ecological Network Analysis (ENA). Integrating Network Utility Analysis (NUA), Network Control Analysis (NCA), and system-wide indicators, we compartmentalized system sectors into an ecological structure and analyzed their degree of dependence and control based on carbon metabolism. The results suggest that indirect flows reveal more mutualism and exploitation relations between system compartments and tend to contribute positively to the stability of the whole system. The ecological structure develops into an approximately pyramidal form, and the carbon metabolism of BDA proves self-mutualistic and sustainable. Construction and waste management were found to be two active sectors impacting carbon metabolism, which was mainly regulated by the internal and external environment. PMID:23365516
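The Network Utility Analysis step can be sketched as follows; the four-compartment flow matrix and external inputs are made up (not the BDA data), and the throughflow and utility definitions follow one common ENA convention:

    # Direct utilities d_ij = (f_ij - f_ji) / T_i and integral utilities U = (I - D)^-1,
    # whose sign pattern classifies pairwise relations (e.g. mutualism, exploitation).
    import numpy as np

    # f[i, j] = carbon flow from compartment i to compartment j (arbitrary units)
    F = np.array([[0.0, 4.0, 1.0, 0.0],
                  [2.0, 0.0, 3.0, 1.0],
                  [0.5, 1.0, 0.0, 2.0],
                  [0.0, 0.5, 1.0, 0.0]])
    inputs = np.array([5.0, 2.0, 1.0, 2.0])         # external inputs to each compartment
    T = F.sum(axis=1) + inputs                      # total throughflow (outflow basis)

    D = (F - F.T) / T[:, None]                      # direct utility matrix
    U = np.linalg.inv(np.eye(4) - D)                # integral utility matrix
    print("integral utility sign matrix:\n", np.sign(np.round(U, 10)))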
Mining the modular structure of protein interaction networks.
Berenstein, Ariel José; Piñero, Janet; Furlong, Laura Inés; Chernomoretz, Ariel
2015-01-01
Cluster-based descriptions of biological networks have received much attention in recent years, fostered by accumulated evidence of the existence of meaningful correlations between topological network clusters and biological functional modules. Several well-performing clustering algorithms exist to infer topological network partitions. However, due to their respective technical idiosyncrasies they might produce dissimilar modular decompositions of a given network. In this contribution, we aimed to analyze how alternative modular descriptions could condition the outcome of follow-up network biology analysis. We considered a human protein interaction network and two paradigmatic cluster recognition algorithms, namely the Clauset-Newman-Moore and Infomap procedures. We analyzed to what extent both methodologies yielded different results in terms of granularity and biological congruency. In addition, taking into account Guimera's cartographic role characterization of network nodes, we explored how the adoption of a given clustering methodology impinged on the ability to highlight relevant network meso-scale connectivity patterns. As a case study we considered a set of aging-related proteins and showed that only the high-resolution modular description provided by Infomap could unveil statistically significant associations between them and inter/intra modular cartographic features. Besides reporting novel biological insights that could be gained from the discovered associations, our contribution warns of possible technical concerns that might affect the tools used to mine for interaction patterns in network biology studies. In particular, our results suggested that sub-optimal partitions from the strict point of view of their modularity levels might still be worth being analyzed when meso-scale features are to be explored in connection with external sources of biological knowledge.
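For illustration, the sketch below compares two partitions of a small benchmark graph; greedy modularity corresponds to the Clauset-Newman-Moore procedure, while label propagation is used here only as a readily available stand-in for Infomap (which is distributed as a separate package):

    # Two community detection algorithms applied to the same network, with their
    # partition agreement quantified by the adjusted Rand index.
    import networkx as nx
    from networkx.algorithms import community
    from sklearn.metrics import adjusted_rand_score

    G = nx.karate_club_graph()
    part_cnm = community.greedy_modularity_communities(G)
    part_lpa = list(community.asyn_lpa_communities(G, seed=0))

    def labels(partition, n):
        lab = [0] * n
        for cid, nodes in enumerate(partition):
            for node in nodes:
                lab[node] = cid
        return lab

    n = G.number_of_nodes()
    print("number of communities (CNM, LPA):", len(part_cnm), len(part_lpa))
    print("partition agreement (ARI):", round(
        adjusted_rand_score(labels(part_cnm, n), labels(part_lpa, n)), 3))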
González-Alcaide, Gregorio; Calafat, Amador; Becoña, Elisardo; Thijs, Bart; Glänzel, Wolfgang
2016-09-01
The purpose of this study is to introduce a new methodology in the field of substance abuse, namely, co-citation analysis, which uses the bibliographic references of publications to establish the main thematic areas being researched and to identify the seminal documents that have contributed to establishing the intellectual foundation of the discipline at the present time. We identified all bibliographic references that were cited in documents published in the substance abuse journals included in the Journal Citation Reports in the 2001-2012 period, generating a co-citation matrix. This matrix was used to perform a co-citation network analysis. The co-citation network analysis led to the identification of 56 prominent research clusters that bring together 698 documents; their subject matter constitutes the foundation of the discipline in the field's journals. Substance abuse research is dominated by a few core topics; chief among them are tools for measuring and diagnosing dependence, as well as therapeutic approaches to treat alcohol abuse and nicotine addiction. Other areas of note include epidemiological studies, research on drug user motivation (particularly among young people), binge drinking, social support mediators and networks, opioid dependence, consumption and effects of cannabis, basic research on brain damage, genetic factors associated with substance use, and the physiological and neurological determinants of abstinence syndrome. The main works of reference that we identified were published in a small number of journals, which establish the intellectual, conceptual, and methodological basis of the discipline.
2016-12-22
assumptions of behavior. This research proposes an information theoretic methodology to discover such complex network structures and dynamics while overcoming...the difficulties historically associated with their study. Indeed, this was the first application of an information theoretic methodology as a tool...
Psychology and social networks: a dynamic network theory perspective.
Westaby, James D; Pfaff, Danielle L; Redding, Nicholas
2014-04-01
Research on social networks has grown exponentially in recent years. However, despite its relevance, the field of psychology has been relatively slow to explain the underlying goal pursuit and resistance processes influencing social networks in the first place. In this vein, this article aims to demonstrate how a dynamic network theory perspective explains the way in which social networks influence these processes and related outcomes, such as goal achievement, performance, learning, and emotional contagion at the interpersonal level of analysis. The theory integrates goal pursuit, motivation, and conflict conceptualizations from psychology with social network concepts from sociology and organizational science to provide a taxonomy of social network role behaviors, such as goal striving, system supporting, goal preventing, system negating, and observing. This theoretical perspective provides psychologists with new tools to map social networks (e.g., dynamic network charts), which can help inform the development of change interventions. Implications for social, industrial-organizational, and counseling psychology as well as conflict resolution are discussed, and new opportunities for research are highlighted, such as those related to dynamic network intelligence (also known as cognitive accuracy), levels of analysis, methodological/ethical issues, and the need to theoretically broaden the study of social networking and social media behavior. (PsycINFO Database Record (c) 2014 APA, all rights reserved).
How can social network analysis contribute to social behavior research in applied ethology?
Makagon, Maja M; McCowan, Brenda; Mench, Joy A
2012-05-01
Social network analysis is increasingly used by behavioral ecologists and primatologists to describe the patterns and quality of interactions among individuals. We provide an overview of this methodology, with examples illustrating how it can be used to study social behavior in applied contexts. Like most kinds of social interaction analyses, social network analysis provides information about direct relationships (e.g. dominant-subordinate relationships). However, it also generates a more global model of social organization that determines how individual patterns of social interaction relate to individual and group characteristics. A particular strength of this approach is that it provides standardized mathematical methods for calculating metrics of sociality across levels of social organization, from the population and group levels to the individual level. At the group level these metrics can be used to track changes in social network structures over time, evaluate the effect of the environment on social network structure, or compare social structures across groups, populations or species. At the individual level, the metrics allow quantification of the heterogeneity of social experience within groups and identification of individuals who may play especially important roles in maintaining social stability or information flow throughout the network.
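As a rough illustration of the group-level and individual-level metrics mentioned above, the sketch below computes network density, in-degree centrality, and betweenness centrality with networkx on a small hypothetical directed interaction network; the animal names and interactions are invented for the example.

```python
# Sketch of group- and individual-level social network metrics,
# using networkx on a hypothetical directed interaction network.
import networkx as nx

interactions = [("cow1", "cow2"), ("cow2", "cow3"),
                ("cow1", "cow3"), ("cow3", "cow1")]
G = nx.DiGraph(interactions)

# Group-level metric: summarizes the social structure as a whole.
density = nx.density(G)

# Individual-level metrics: heterogeneity of social experience within a group.
in_degree = nx.in_degree_centrality(G)       # how often an animal is targeted
betweenness = nx.betweenness_centrality(G)   # potential brokers of information flow

print(density, in_degree, betweenness)
```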
Patterson, Megan S; Goodson, Patricia
2017-05-01
Compulsive exercise, a form of unhealthy exercise often associated with prioritizing exercise and feeling guilty when exercise is missed, is a common precursor to and symptom of eating disorders. College-aged women are at high risk of exercising compulsively compared with other groups. Social network analysis (SNA) is a theoretical perspective and methodology allowing researchers to observe the effects of relational dynamics on the behaviors of people. SNA was used to assess the relationship between compulsive exercise and body dissatisfaction, physical activity, and network variables. Descriptive statistics were computed using SPSS, and quadratic assignment procedure (QAP) analyses were conducted using UCINET. QAP regression analysis revealed a statistically significant model (R² = .375, P < .0001) predicting compulsive exercise behavior. Physical activity, body dissatisfaction, and network variables were statistically significant predictor variables in the QAP regression model. In our sample, women who are connected to "important" or "powerful" people in their network are likely to have higher compulsive exercise scores. This result provides healthcare practitioners with key target points for intervention within similar groups of women. For scholars researching eating disorders and associated behaviors, this study supports looking into group dynamics and network structure in conjunction with body dissatisfaction and exercise frequency.
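The study ran its QAP analyses in UCINET; purely to illustrate the underlying idea, the sketch below implements a basic QAP correlation permutation test in Python. The matrices, sample size, and permutation count are toy assumptions, not the study's data or its exact procedure.

```python
# Minimal sketch of a QAP correlation test: node labels of one matrix are
# permuted many times to build a null distribution for the observed
# dyad-level correlation. Matrices here are tiny illustrative stand-ins.
import numpy as np

rng = np.random.default_rng(0)
n = 6
friendship = rng.integers(0, 2, size=(n, n))      # hypothetical tie matrix
similarity = rng.random((n, n))                    # hypothetical dyadic covariate
np.fill_diagonal(friendship, 0)
np.fill_diagonal(similarity, 0)

def dyadic_corr(a, b):
    mask = ~np.eye(len(a), dtype=bool)             # ignore the diagonal
    return np.corrcoef(a[mask], b[mask])[0, 1]

observed = dyadic_corr(friendship, similarity)

null = []
for _ in range(2000):
    p = rng.permutation(n)                         # permute rows and columns together
    null.append(dyadic_corr(friendship[np.ix_(p, p)], similarity))

p_value = np.mean(np.abs(null) >= abs(observed))
print(observed, p_value)
```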
Ladstätter, Felix; Garrosa, Eva; Moreno-Jiménez, Bernardo; Ponsoda, Vicente; Reales Aviles, José Manuel; Dai, Junming
2016-01-01
Artificial neural networks are sophisticated modelling and prediction tools capable of extracting complex, non-linear relationships between predictor (input) and predicted (output) variables. This study explores this capacity by modelling non-linearities in the hardiness-modulated burnout process with a neural network. Specifically, two multi-layer feed-forward artificial neural networks are concatenated in an attempt to model the composite non-linear burnout process. Sensitivity analysis, a Monte Carlo-based global simulation technique, is then utilised to examine the first-order effects of the predictor variables on the burnout sub-dimensions and consequences. Results show that (1) this concatenated artificial neural network approach is a feasible way to model the burnout process, (2) sensitivity analysis is a fruitful method for studying the relative importance of predictor variables, and (3) the relationships among variables involved in the development of burnout and its consequences are non-linear to different degrees. Many relationships among variables (e.g., stressors and strains) are not linear, yet researchers use linear methods such as Pearson correlation or linear regression to analyse these relationships. Artificial neural network analysis is an innovative method to analyse non-linear relationships and, in combination with sensitivity analysis, is superior to linear methods.
A Security Assessment Mechanism for Software-Defined Networking-Based Mobile Networks.
Luo, Shibo; Dong, Mianxiong; Ota, Kaoru; Wu, Jun; Li, Jianhua
2015-12-17
Software-Defined Networking-based Mobile Networks (SDN-MNs) are considered the future of 5G mobile network architecture. With the evolving cyber-attack threat, security assessments need to be performed in the network management. Due to the distinctive features of SDN-MNs, such as their dynamic nature and complexity, traditional network security assessment methodologies cannot be applied directly to SDN-MNs, and a novel security assessment methodology is needed. In this paper, an effective security assessment mechanism based on attack graphs and an Analytic Hierarchy Process (AHP) is proposed for SDN-MNs. Firstly, this paper discusses the security assessment problem of SDN-MNs and proposes a methodology using attack graphs and AHP. Secondly, to address the diversity and complexity of SDN-MNs, a novel attack graph definition and attack graph generation algorithm are proposed. In order to quantify security levels, the Node Minimal Effort (NME) is defined to quantify attack cost and derive system security levels based on NME. Thirdly, to calculate the NME of an attack graph that takes the dynamic factors of SDN-MN into consideration, we use AHP integrated with the Technique for Order Preference by Similarity to an Ideal Solution (TOPSIS) as the methodology. Finally, we offer a case study to validate the proposed methodology. The case study and evaluation show the advantages of the proposed security assessment mechanism.
A Security Assessment Mechanism for Software-Defined Networking-Based Mobile Networks
Luo, Shibo; Dong, Mianxiong; Ota, Kaoru; Wu, Jun; Li, Jianhua
2015-01-01
Software-Defined Networking-based Mobile Networks (SDN-MNs) are considered the future of 5G mobile network architecture. With the evolving cyber-attack threat, security assessments need to be performed in the network management. Due to the distinctive features of SDN-MNs, such as their dynamic nature and complexity, traditional network security assessment methodologies cannot be applied directly to SDN-MNs, and a novel security assessment methodology is needed. In this paper, an effective security assessment mechanism based on attack graphs and an Analytic Hierarchy Process (AHP) is proposed for SDN-MNs. Firstly, this paper discusses the security assessment problem of SDN-MNs and proposes a methodology using attack graphs and AHP. Secondly, to address the diversity and complexity of SDN-MNs, a novel attack graph definition and attack graph generation algorithm are proposed. In order to quantify security levels, the Node Minimal Effort (NME) is defined to quantify attack cost and derive system security levels based on NME. Thirdly, to calculate the NME of an attack graph that takes the dynamic factors of SDN-MN into consideration, we use AHP integrated with the Technique for Order Preference by Similarity to an Ideal Solution (TOPSIS) as the methodology. Finally, we offer a case study to validate the proposed methodology. The case study and evaluation show the advantages of the proposed security assessment mechanism. PMID:26694409
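One step of the mechanism described above is the AHP weighting of security criteria. The sketch below shows the standard AHP derivation of priority weights from a pairwise comparison matrix via the principal eigenvector, together with a consistency check; the 3x3 comparison values are illustrative and not taken from the paper, and the TOPSIS and attack-graph steps are omitted.

```python
# Sketch of the AHP weighting step: priorities are taken from the principal
# eigenvector of a pairwise comparison matrix (values below are illustrative).
import numpy as np

A = np.array([[1.0, 3.0, 5.0],        # hypothetical pairwise comparisons of
              [1/3, 1.0, 2.0],        # three security criteria
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
principal = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, principal].real)
weights /= weights.sum()               # normalized priority weights

# Consistency ratio check (Saaty's random index is 0.58 for a 3x3 matrix).
lambda_max = eigvals.real[principal]
ci = (lambda_max - len(A)) / (len(A) - 1)
cr = ci / 0.58
print(weights, cr)
```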
Mapping Sustainability Efforts at the Claremont Colleges
ERIC Educational Resources Information Center
Srebotnjak, Tanja; Norgaard, Lee Michelle
2017-01-01
Purpose: The purpose of this study is to map and analyze sustainability activities and relationships at the seven Claremont Colleges and graduate institutions using social network analysis (SNA) to inform sustainability planning and programming. Design/methodology/approach: Online surveys and interviews were conducted among faculty, staff and…
Visualization and Hierarchical Analysis of Flow in Discrete Fracture Network Models
NASA Astrophysics Data System (ADS)
Aldrich, G. A.; Gable, C. W.; Painter, S. L.; Makedonska, N.; Hamann, B.; Woodring, J.
2013-12-01
Flow and transport in low-permeability fractured rock occur primarily in interconnected fracture networks. Prediction and characterization of flow and transport in fractured rock have important implications for underground repositories for hazardous materials (e.g., nuclear and chemical waste), contaminant migration and remediation, groundwater resource management, and hydrocarbon extraction. We have developed methods to explicitly model flow in discrete fracture networks and track flow paths using passive particle tracking algorithms. Visualization and analysis of particle trajectories through the fracture network is important to understanding fracture connectivity, flow patterns, potential contaminant pathways and fast paths through the network. However, occlusion due to the large number of highly tessellated and intersecting fracture polygons precludes the effective use of traditional visualization methods. We also need quantitative analysis methods to characterize the trajectories of a large number of particle paths. We have solved these problems by defining a hierarchical flow network representing the topology of particle flow through the fracture network. This approach allows us to analyze the flow and the dynamics of the system as a whole. We are able to easily query the flow network and use a paint-and-link style framework to filter the fracture geometry and particle traces based on the flow analytics. This allows us to greatly reduce occlusion while emphasizing salient features such as the principal transport pathways. Examples are shown that demonstrate the methodology and highlight how use of this new method allows quantitative analysis and characterization of flow and transport in a number of representative fracture networks.
Cinner, Joshua E.; Bodin, Örjan
2010-01-01
Background Diverse livelihood portfolios are frequently viewed as a critical component of household economies in developing countries. Within the context of natural resources governance in particular, the capacity of individual households to engage in multiple occupations has been shown to influence important issues such as whether fishers would exit a declining fishery, how people react to policy, the types of resource management systems that may be applicable, and other decisions about natural resource use. Methodology/Principal Findings This paper uses network analysis to provide a novel methodological framework for detailed systemic analysis of household livelihood portfolios. Paying particular attention to the role of natural resource-based occupations such as fisheries, we use network analyses to map occupations and their interrelationships, which we refer to as 'livelihood landscapes'. This network approach allows for the visualization of complex information about dependence on natural resources that can be aggregated at different scales. We then examine how the role of natural resource-based occupations changes along spectra of socioeconomic development and population density in 27 communities in 5 western Indian Ocean countries. Network statistics, including in- and out-degree centrality, the density of the network, and the level of network centralization are compared along a multivariate index of community-level socioeconomic development and a gradient of human population density. The combination of network analyses suggests an increase in household-level specialization with development for most occupational sectors, including fishing and farming, but that, at the community level, economies remained diversified. Conclusions/Significance The novel modeling approach introduced here provides for various types of livelihood portfolio analyses at different scales of social aggregation. Our livelihood landscapes approach provides insights into communities' dependence on and use of natural resources, and shows how patterns of occupational interrelationships relate to socioeconomic development and population density. A key question for future analysis is how the reduction in household occupational diversity, alongside the maintenance of community-level diversity, that we see with increasing socioeconomic development influences key aspects of societies' vulnerability to environmental change or disasters. PMID:20711442
Augustin, J; Austermann, J; Erasmi, S
2016-10-18
Background: One of the overall objectives of the legislator is to ensure "homogeneous" and easily accessible medical care for the population. The physician-patient ratio can be used to describe the regional health care situation. However, this method does not provide information concerning the availability of, for instance, the nearest doctor. Therefore, further parameters such as accessibility must be taken into consideration. For this purpose, network analyses are an appropriate method. The objective of this study is to present methodological tools to evaluate the healthcare situation in the metropolitan region of Hamburg, primarily focusing on accessibility, using dermatologists as an example. Methods: Analyzing data from 20 counties, the geographical distribution of N=357 dermatologists and the physician-patient ratio were calculated. In a second step, a network analysis regarding accessibility was performed. In order to calculate accessibility, address data for the physicians were transformed into coordinates; the network consists of defined places (N=303) and restrictions (e.g., speed, turn restrictions). The calculation of population-based accessibility is based on grid cells for the population density. Results: Despite the adequacy of the overall medical care situation, differences in the availability of the nearest dermatologist in the metropolitan region are remarkable, particularly when use of public transport is taken into consideration. In some counties, over 60% of the population require at least one hour to get to the nearest dermatologist using public transportation. Rural regions within the metropolitan area are particularly affected. Conclusion: The network analysis has shown that the choice and availability of transportation in combination with the location (rural/urban) is essential for health care access. Elderly people in rural areas with restricted mobility are at a particular disadvantage. Therefore, modern health care approaches (e.g., telemedicine) are necessary to optimize the health care situation in rural areas. Network analyses can make a valuable methodological contribution to the analysis of regional health care disparities. © Georg Thieme Verlag KG Stuttgart · New York.
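A heavily simplified illustration of the kind of accessibility query described above is given below: shortest travel times on a small weighted network are computed with Dijkstra's algorithm, and the nearest practice location is reported for each place. The road network, travel times, and practice locations are invented; the study's actual GIS network, population grid, and public-transport modeling are not reproduced here.

```python
# Sketch of a travel-time accessibility query: for each populated place, find
# the shortest travel time (in minutes) to any node hosting a dermatologist.
# The network, travel times and practice locations are illustrative.
import networkx as nx

roads = [("A", "B", 12), ("B", "C", 20), ("C", "D", 35), ("A", "D", 70)]
G = nx.Graph()
G.add_weighted_edges_from(roads, weight="minutes")

dermatologist_nodes = {"C"}            # hypothetical practice locations

for place in G.nodes:
    times = nx.single_source_dijkstra_path_length(G, place, weight="minutes")
    nearest = min(times[d] for d in dermatologist_nodes if d in times)
    print(place, nearest)
```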
Application of neural networks and sensitivity analysis to improved prediction of trauma survival.
Hunter, A; Kennedy, L; Henry, J; Ferguson, I
2000-05-01
The performance of trauma departments is widely audited by applying predictive models that assess probability of survival, and examining the rate of unexpected survivals and deaths. Although the TRISS methodology, a logistic regression modelling technique, is still the de facto standard, it is known that neural network models perform better. A key issue when applying neural network models is the selection of input variables. This paper proposes a novel form of sensitivity analysis, which is simpler to apply than existing techniques, and can be used for both numeric and nominal input variables. The technique is applied to the audit survival problem, and used to analyse the TRISS variables. The conclusions discuss the implications for the design of further improved scoring schemes and predictive models.
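The abstract does not spell out the authors' novel sensitivity analysis, so the sketch below shows only a generic permutation-style input-importance procedure on a trained classifier: shuffle one input at a time and measure the drop in discrimination. The data, model, and importance measure are assumptions for illustration, not the paper's method.

```python
# Generic permutation-style sensitivity sketch (NOT the authors' exact method,
# which is not detailed in the abstract): shuffle one input at a time and
# measure the drop in predictive accuracy of a trained model.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
X = rng.random((500, 4))                         # hypothetical trauma predictors
y = (X[:, 0] + 0.5 * X[:, 2] + 0.1 * rng.random(500) > 0.8).astype(int)

model = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000).fit(X, y)
baseline = roc_auc_score(y, model.predict_proba(X)[:, 1])

for j in range(X.shape[1]):
    Xp = X.copy()
    Xp[:, j] = rng.permutation(Xp[:, j])         # destroy information in column j
    drop = baseline - roc_auc_score(y, model.predict_proba(Xp)[:, 1])
    print(f"variable {j}: importance ~ {drop:.3f}")
```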
Amezquita-Sanchez, Juan P; Adeli, Anahita; Adeli, Hojjat
2016-05-15
Mild cognitive impairment (MCI) is a cognitive disorder characterized by memory impairment greater than expected for age. A new methodology is presented to identify MCI patients during a working memory task using MEG signals. The methodology consists of four steps: In step 1, the complete ensemble empirical mode decomposition (CEEMD) is used to decompose the MEG signal into a set of adaptive sub-bands according to its contained frequency information. In step 2, a nonlinear dynamics measure based on permutation entropy (PE) analysis is employed to analyze the sub-bands and detect features to be used for MCI detection. In step 3, analysis of variance (ANOVA) is used for feature selection. In step 4, the enhanced probabilistic neural network (EPNN) classifier is applied to the selected features to distinguish between MCI and healthy patients. The usefulness and effectiveness of the proposed methodology are validated using MEG data obtained experimentally from 18 MCI patients and 19 controls. Copyright © 2016 Elsevier B.V. All rights reserved.
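The permutation entropy measure used in step 2 has a simple, standard formulation (Bandt-Pompe style): count ordinal patterns of order m in the signal and take the Shannon entropy of their distribution. The sketch below implements that measure; the embedding order, delay, and test signals are illustrative assumptions, and the CEEMD, ANOVA, and EPNN steps are not shown.

```python
# Sketch of the permutation entropy (PE) step: count ordinal patterns of
# order m in a signal and take the Shannon entropy of their distribution.
from collections import Counter

import numpy as np

def permutation_entropy(signal, m=3, delay=1):
    patterns = Counter()
    for i in range(len(signal) - (m - 1) * delay):
        window = signal[i:i + m * delay:delay]
        patterns[tuple(np.argsort(window))] += 1   # ordinal pattern of the window
    p = np.array(list(patterns.values()), dtype=float)
    p /= p.sum()
    return float(-np.sum(p * np.log2(p)))

rng = np.random.default_rng(0)
noisy = rng.random(1000)                            # high-complexity stand-in
regular = np.sin(np.linspace(0, 20 * np.pi, 1000))  # low-complexity stand-in
print(permutation_entropy(noisy), permutation_entropy(regular))
```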
Metabolomics analysis: Finding out metabolic building blocks
2017-01-01
In this paper we propose a new methodology for the analysis of metabolic networks. We use the notion of strongly connected components of a graph, called in this context metabolic building blocks. Every strongly connected component is contracted to a single node in such a way that the resulting graph is a directed acyclic graph, called a metabolic DAG, with a considerably reduced number of nodes. The property of being a directed acyclic graph brings out a background graph topology that reveals the connectivity of the metabolic network, as well as bridges, isolated nodes and cut nodes. Altogether, this provides key information for the discovery of functional metabolic relations. Our methodology has been applied to the glycolysis and the purine metabolic pathways for all organisms in the KEGG database, although it is general enough to work on any database. As expected, using the metabolic DAGs formalism, a considerable reduction in the size of the metabolic networks has been obtained, especially in the case of the purine pathway due to its relatively larger size. As a proof of concept, from the information captured by a metabolic DAG and its corresponding metabolic building blocks, we obtain the core of the glycolysis pathway and the core of the purine metabolism pathway and detect some essential metabolic building blocks that reveal the key reactions in both pathways. Finally, the application of our methodology to the glycolysis pathway and the purine metabolism pathway reproduces the tree of life for the whole set of organisms represented in the KEGG database, which supports the utility of this research. PMID:28493998
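The contraction of strongly connected components into a DAG described above corresponds to what networkx calls the condensation of a directed graph. The sketch below shows that operation on a tiny invented reaction graph; the metabolite names and edges are illustrative and not taken from KEGG.

```python
# Sketch of the building-block contraction: strongly connected components of a
# directed reaction graph are collapsed into single nodes, yielding a DAG
# (networkx calls this the condensation). The edges below are illustrative.
import networkx as nx

G = nx.DiGraph([
    ("glucose", "G6P"), ("G6P", "F6P"), ("F6P", "G6P"),   # a reversible block
    ("F6P", "FBP"), ("FBP", "pyruvate"),
])

dag = nx.condensation(G)             # "metabolic DAG" of building blocks
blocks = dag.graph["mapping"]        # original metabolite -> block id

assert nx.is_directed_acyclic_graph(dag)
print(blocks)
print(list(dag.edges))
```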
Ross, Jana; Murphy, Dominic; Armour, Cherie
2018-05-28
Network analysis is a relatively new methodology for studying psychological disorders. It focuses on the associations between individual symptoms, which are hypothesized to interact with one another. The current study represents the first network analysis conducted with treatment-seeking military veterans in the UK. The study aimed to examine the network structure of posttraumatic stress disorder (PTSD) symptoms and four domains of functional impairment by identifying the most central (i.e., important) symptoms of PTSD and by identifying those symptoms of PTSD that are related to functional impairment. Participants were 331 military veterans with probable PTSD. In the first step, a network of PTSD symptoms based on the PTSD Checklist for DSM-5 was estimated. In the second step, functional impairment items were added to the network. The most central symptoms of PTSD were recurrent thoughts, nightmares, negative emotional state, detachment and exaggerated startle response. Functional impairment was related to a number of different PTSD symptoms. Impairments in close relationships were associated primarily with the negative alterations in cognitions and mood symptoms, and impairments in home management were associated primarily with the reexperiencing symptoms. The results are discussed in relation to previous PTSD network studies and include implications for clinical practice. Copyright © 2018 Elsevier Ltd. All rights reserved.
Seismic Hazard Analysis on a Complex, Interconnected Fault Network
NASA Astrophysics Data System (ADS)
Page, M. T.; Field, E. H.; Milner, K. R.
2017-12-01
In California, seismic hazard models have evolved from simple, segmented prescriptive models to much more complex representations of multi-fault and multi-segment earthquakes on an interconnected fault network. During the development of the 3rd Uniform California Earthquake Rupture Forecast (UCERF3), the prevalence of multi-fault ruptures in the modeling was controversial. Yet recent earthquakes, such as the Kaikoura earthquake, as well as new research on the potential of multi-fault ruptures (e.g., Nissen et al., 2016; Sahakian et al., 2017), have validated this approach. For large crustal earthquakes, multi-fault ruptures may be the norm rather than the exception. As datasets improve and we can view the rupture process at a finer scale, the interconnected, fractal nature of faults is revealed even by individual earthquakes. What is the proper way to model earthquakes on a fractal fault network? We show multiple lines of evidence that connectivity even in modern models such as UCERF3 may be underestimated, although clustering in UCERF3 mitigates some modeling simplifications. We need a methodology that can be applied equally well where the fault network is well mapped and where it is not: an extendable methodology that allows us to "fill in" gaps in the fault network and in our knowledge.
A network approach to decentralized coordination of energy production-consumption grids.
Omodei, Elisa; Arenas, Alex
2018-01-01
Energy grids are facing a relatively new paradigm consisting of the formation of local distributed energy sources and loads that can operate in parallel, independently of the main power grid (these are usually called microgrids). One of the main challenges in managing microgrid-like networks is self-adapting to production and demand in a decentralized, coordinated way. Here, we propose a stylized model that allows us to analytically predict the coordination of the elements in the network, depending on the network topology. Surprisingly, almost global coordination is attained when users interact locally, with a small neighborhood, instead of the obvious but more costly all-to-all coordination. We compute analytically the optimal value of coordinated users in random homogeneous networks. The methodology proposed opens a new way of approaching the analysis of energy demand-side management in networked systems.
Corpus Linguistics, Network Analysis and Co-Occurrence Matrices
ERIC Educational Resources Information Center
Stuart, Keith; Botella, Ana
2009-01-01
This article describes research undertaken in order to design a methodology for the reticular representation of knowledge of a specific discourse community. To achieve this goal, a representative corpus of the scientific production of the members of this discourse community (Universidad Politecnica de Valencia, UPV) was created. This article…
ERIC Educational Resources Information Center
Turan, Fikret Korhan; Cetinkaya, Saadet; Ustun, Ceyda
2016-01-01
Building sustainable universities calls for participative management and collaboration among stakeholders. Combining analytic hierarchy and network processes (AHP/ANP) with statistical analysis, this research proposes a framework that can be used in higher education institutions for integrating stakeholder preferences into strategic decisions. The…
Systemic Analysis Approaches for Air Transportation
NASA Technical Reports Server (NTRS)
Conway, Sheila
2005-01-01
Air transportation system designers have had only limited success using traditional operations research and parametric modeling approaches in their analyses of innovations. They need a systemic methodology for modeling of safety-critical infrastructure that is comprehensive, objective, and sufficiently concrete, yet simple enough to be used with reasonable investment. The methodology must also be amenable to quantitative analysis so issues of system safety and stability can be rigorously addressed. However, air transportation has proven itself an extensive, complex system whose behavior is difficult to describe, much less predict. There is a wide range of system analysis techniques available, but some are more appropriate for certain applications than others. Specifically in the area of complex system analysis, the literature suggests that both agent-based models and network analysis techniques may be useful. This paper discusses the theoretical basis for each approach in these applications, and explores their historic and potential further use for air transportation analysis.
Integrated Genomic and Network-Based Analyses of Complex Diseases and Human Disease Network.
Al-Harazi, Olfat; Al Insaif, Sadiq; Al-Ajlan, Monirah A; Kaya, Namik; Dzimiri, Nduna; Colak, Dilek
2016-06-20
A disease phenotype generally reflects various pathobiological processes that interact in a complex network. The highly interconnected nature of the human protein interaction network (interactome) indicates that, at the molecular level, it is difficult to consider diseases as being independent of one another. Recently, genome-wide molecular measurements, data mining and bioinformatics approaches have provided the means to explore human diseases from a molecular basis. The exploration of diseases and a system of disease relationships based on the integration of genome-wide molecular data with the human interactome could offer a powerful perspective for understanding the molecular architecture of diseases. Recently, subnetwork markers have proven to be more robust and reliable than individual biomarker genes selected based on gene expression profiles alone, and achieve higher accuracy in disease classification. We have applied one of these methodologies to idiopathic dilated cardiomyopathy (IDCM) data that we have generated using a microarray and identified significant subnetworks associated with the disease. In this paper, we review the recent endeavours in this direction, and summarize the existing methodologies and computational tools for network-based analysis of complex diseases and molecular relationships among apparently different disorders and human disease network. We also discuss the future research trends and topics of this promising field. Copyright © 2015 Institute of Genetics and Developmental Biology, Chinese Academy of Sciences, and Genetics Society of China. Published by Elsevier Ltd. All rights reserved.
Multidisciplinary Concurrent Design Optimization via the Internet
NASA Technical Reports Server (NTRS)
Woodard, Stanley E.; Kelkar, Atul G.; Koganti, Gopichand
2001-01-01
A methodology is presented which uses commercial design and analysis software and the Internet to perform concurrent multidisciplinary optimization. The methodology provides a means to develop multidisciplinary designs without requiring that all software be accessible from the same local network. The procedures are amenable to design and development teams whose members, expertise and respective software are not geographically located together. This methodology facilitates multidisciplinary teams working concurrently on a design problem of common interest. Partition of design software to different machines allows each constituent software to be used on the machine that provides the most economy and efficiency. The methodology is demonstrated on the concurrent design of a spacecraft structure and attitude control system. Results are compared to those derived from performing the design with an autonomous FORTRAN program.
The relation between global migration and trade networks
NASA Astrophysics Data System (ADS)
Sgrignoli, Paolo; Metulini, Rodolfo; Schiavo, Stefano; Riccaboni, Massimo
2015-01-01
In this paper we develop a methodology to analyze and compare multiple global networks, focusing our analysis on the relation between human migration and trade. First, we identify the subset of products for which the presence of a community of migrants significantly increases trade intensity; to ensure comparability across networks, we apply a hypergeometric filter that identifies those links whose intensity is significantly higher than expected. Next, proposing a new way to define country neighbors based on the most intense links in the trade network, we use spatial econometrics techniques to measure the effect of migration on international trade, while controlling for network interdependencies. Overall, we find that migration significantly boosts trade across countries and we are able to identify product categories for which this effect is particularly strong.
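The hypergeometric filtering step can be sketched as follows: a link weight is kept only if it is significantly larger than expected given the row total, column total, and grand total of the flow matrix. The abstract does not give the exact parameterization, so the version below follows a common convention, and the flow matrix and significance level are illustrative.

```python
# Sketch of a hypergeometric link filter: a link weight W[i, j] is kept only if
# it is significantly larger than expected given the row total, column total,
# and grand total T. The exact parameterization used in the paper is not given
# in the abstract, so this follows a common convention.
import numpy as np
from scipy.stats import hypergeom

W = np.array([[0, 40, 5],
              [10, 0, 30],
              [3, 12, 0]])           # hypothetical integer flow matrix

T = W.sum()
rows, cols = W.sum(axis=1), W.sum(axis=0)

significant = np.zeros_like(W, dtype=bool)
for i in range(W.shape[0]):
    for j in range(W.shape[1]):
        if i == j or W[i, j] == 0:
            continue
        # P(weight >= observed) under random allocation of flows
        p = hypergeom.sf(W[i, j] - 1, T, rows[i], cols[j])
        significant[i, j] = p < 0.01

print(significant)
```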
Application of two neural network paradigms to the study of voluntary employee turnover.
Somers, M J
1999-04-01
Two neural network paradigms--multilayer perceptron and learning vector quantization--were used to study voluntary employee turnover with a sample of 577 hospital employees. The objectives of the study were twofold. The 1st was to assess whether neural computing techniques offered greater predictive accuracy than did conventional turnover methodologies. The 2nd was to explore whether computer models of turnover based on neural network technologies offered new insights into turnover processes. When compared with logistic regression analysis, both neural network paradigms provided considerably more accurate predictions of turnover behavior, particularly with respect to the correct classification of leavers. In addition, these neural network paradigms captured nonlinear relationships that are relevant for theory development. Results are discussed in terms of their implications for future research.
Mitochondrial network complexity emerges from fission/fusion dynamics.
Zamponi, Nahuel; Zamponi, Emiliano; Cannas, Sergio A; Billoni, Orlando V; Helguera, Pablo R; Chialvo, Dante R
2018-01-10
Mitochondrial networks exhibit a variety of complex behaviors, including coordinated cell-wide oscillations of energy states as well as a phase transition (depolarization) in response to oxidative stress. Since functional and structural properties are often intertwined, here we characterized the structure of mitochondrial networks in mouse embryonic fibroblasts using network tools and percolation theory. Subsequently we perturbed the system either by promoting the fusion of mitochondrial segments or by inducing mitochondrial fission. Quantitative analysis of mitochondrial clusters revealed that the structural parameters of healthy mitochondria lay between the extremes of highly fragmented and completely fused networks. We confirmed our results by contrasting our empirical findings with the predictions of a recently described computational model of mitochondrial network emergence based on fission-fusion kinetics. Altogether these results offer not only an objective methodology to parametrize the complexity of this organelle but also support the idea that mitochondrial networks behave as critical systems and undergo structural phase transitions.
Revealing the Hidden Language of Complex Networks
Yaveroğlu, Ömer Nebil; Malod-Dognin, Noël; Davis, Darren; Levnajic, Zoran; Janjic, Vuk; Karapandza, Rasa; Stojmirovic, Aleksandar; Pržulj, Nataša
2014-01-01
Sophisticated methods for analysing complex networks promise to be of great benefit to almost all scientific disciplines, yet they elude us. In this work, we make fundamental methodological advances to rectify this. We discover that the interaction between a small number of roles, played by nodes in a network, can characterize a network's structure and also provide a clear real-world interpretation. Given this insight, we develop a framework for analysing and comparing networks, which outperforms all existing ones. We demonstrate its strength by uncovering novel relationships between seemingly unrelated networks, such as Facebook, metabolic, and protein structure networks. We also use it to track the dynamics of the world trade network, showing that a country's role as a broker between non-trading countries indicates economic prosperity, whereas peripheral roles are associated with poverty. This result, though intuitive, has escaped all existing frameworks. Finally, our approach translates network topology into everyday language, bringing network analysis closer to domain scientists. PMID:24686408
A multi-criteria decision aid methodology to design electric vehicles public charging networks
NASA Astrophysics Data System (ADS)
Raposo, João; Rodrigues, Ana; Silva, Carlos; Dentinho, Tomaz
2015-05-01
This article presents a new multi-criteria decision aid methodology, dynamic-PROMETHEE, here used to design electric vehicle charging networks. When the methodology is applied to a Portuguese city, results suggest that it is effective in designing electric vehicle charging networks, generating time- and policy-based scenarios, and considering supply and demand as well as the city's urban structure. Dynamic-PROMETHEE adds other useful features to PROMETHEE's already known characteristics, such as decision memory over time, versatility and adaptability. The case study, used here to present the dynamic-PROMETHEE, served as the inspiration and basis for creating this new methodology. It can be used to model different problems and scenarios that may present similar requirement characteristics.
Wood texture classification by fuzzy neural networks
NASA Astrophysics Data System (ADS)
Gonzaga, Adilson; de Franca, Celso A.; Frere, Annie F.
1999-03-01
The majority of scientific papers focusing on wood classification for pencil manufacturing take into account defects and visual appearance. Traditional methodologies are based on texture analysis by co-occurrence matrix, by image modeling, or by tonal measures over the plate surface. In this work, we propose to classify plates of wood without biological defects such as insect holes, knots, and cracks by analyzing their texture. In this methodology we divide the plate image into several rectangular windows, or local areas, and reduce the number of gray levels. From each local area, we compute the histogram of differences and extract texture features, giving them as input to a Local Neuro-Fuzzy Network (LNN). These features are taken from the histogram of differences instead of the image pixels due to their better performance and illumination independence. Among several features, such as mean, contrast, second moment, entropy, and IDN, the last three showed better results for network training. Each LNN output is taken as input to a Partial Neuro-Fuzzy Network (PNFN) that classifies a pencil region on the plate. Finally, the outputs from the PNFN are taken as input to a global fuzzy logic stage that performs the plate classification. The classification of each pencil region within the plate takes its quality index into account.
Fluxes through plant metabolic networks: measurements, predictions, insights and challenges.
Kruger, Nicholas J; Ratcliffe, R George
2015-01-01
Although the flows of material through metabolic networks are central to cell function, they are not easy to measure other than at the level of inputs and outputs. This is particularly true in plant cells, where the network spans multiple subcellular compartments and where the network may function either heterotrophically or photoautotrophically. For many years, kinetic modelling of pathways provided the only method for describing the operation of fragments of the network. However, more recently, it has become possible to map the fluxes in central carbon metabolism using the stable isotope labelling techniques of metabolic flux analysis (MFA), and to predict intracellular fluxes using constraints-based modelling procedures such as flux balance analysis (FBA). These approaches were originally developed for the analysis of microbial metabolism, but over the last decade, they have been adapted for the more demanding analysis of plant metabolic networks. Here, the principal features of MFA and FBA as applied to plants are outlined, followed by a discussion of the insights that have been gained into plant metabolic networks through the application of these time-consuming and non-trivial methods. The discussion focuses on how a system-wide view of plant metabolism has increased our understanding of network structure, metabolic perturbations and the provision of reducing power and energy for cell function. Current methodological challenges that limit the scope of plant MFA are discussed and particular emphasis is placed on the importance of developing methods for cell-specific MFA.
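As a minimal illustration of the constraints-based FBA approach mentioned above, the sketch below solves a toy flux balance problem as a linear program: maximize a designated objective flux subject to the steady-state constraint S v = 0 and bounds on each reaction. The three-metabolite network, bounds, and objective are invented for the example and have nothing to do with any particular plant model.

```python
# Toy flux balance analysis (FBA): maximize the biomass flux subject to the
# steady-state constraint S v = 0 and bounds on each reaction. The network
# below (3 metabolites, 4 reactions) is purely illustrative.
import numpy as np
from scipy.optimize import linprog

# Columns: uptake, r1 (A->B), r2 (A->C), biomass (B + C ->)
S = np.array([
    [1, -1, -1,  0],    # metabolite A
    [0,  1,  0, -1],    # metabolite B
    [0,  0,  1, -1],    # metabolite C
])
bounds = [(0, 10), (0, 5), (0, 5), (0, None)]

# linprog minimizes, so negate the objective to maximize biomass (reaction 4).
c = np.array([0, 0, 0, -1])
res = linprog(c, A_eq=S, b_eq=np.zeros(3), bounds=bounds, method="highs")
print(res.x, -res.fun)
```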
Behavioral networks as a model for intelligent agents
NASA Technical Reports Server (NTRS)
Sliwa, Nancy E.
1990-01-01
On-going work at NASA Langley Research Center in the development and demonstration of a paradigm called behavioral networks as an architecture for intelligent agents is described. This work focuses on the need to identify a methodology for smoothly integrating the characteristics of low-level robotic behavior, including actuation and sensing, with intelligent activities such as planning, scheduling, and learning. This work assumes that all these needs can be met within a single methodology, and attempts to formalize this methodology in a connectionist architecture called behavioral networks. Behavioral networks are networks of task processes arranged in a task decomposition hierarchy. These processes are connected by both command/feedback data flow, and by the forward and reverse propagation of weights which measure the dynamic utility of actions and beliefs.
Nonlinear neural control with power systems applications
NASA Astrophysics Data System (ADS)
Chen, Dingguo
1998-12-01
Extensive studies have been undertaken on the transient stability of large interconnected power systems with flexible ac transmission systems (FACTS) devices installed. A variety of control methodologies has been proposed to stabilize the postfault system, which would otherwise eventually lose stability without a proper control. Generally speaking, regular transient stability is well understood, but the mechanism of load-driven voltage instability or voltage collapse is not. The interaction of generator dynamics and load dynamics makes the synthesis of stabilizing controllers even more challenging. There is currently increasing interest in research on neural networks as identifiers and controllers for dealing with dynamic time-varying nonlinear systems. This study focuses on the development of novel artificial neural network architectures for identification and control, with application to dynamic electric power systems, so that the stability of the interconnected power systems, following large disturbances and/or with the inclusion of uncertain loads, can be greatly enhanced and stable operation guaranteed. The latitudinal neural network architecture is proposed for the purpose of system identification. It may be used for identification of nonlinear static/dynamic loads, which can be further used for static/dynamic voltage stability analysis. The properties associated with this architecture are investigated. A neural network methodology is proposed for dealing with load modeling and voltage stability analysis. Based on the neural network models of loads, voltage stability analysis is carried out and modal analysis is performed. Simulation results are also provided. The transient stability problem is studied with consideration of load effects. A hierarchical neural control scheme is developed. A trajectory-following policy is used so that the hierarchical neural controller performs almost as well for non-nominal cases as it does for the nominal cases. An adaptive hierarchical neural control scheme is also proposed to deal with the time-varying nature of loads. Further, adaptive neural control, which is based on the on-line updating of the weights and biases of the neural networks, is studied. Simulations performed on faulted power systems with unknown loads suggest that the proposed adaptive hierarchical neural control schemes should be useful for practical power applications.
Approaching human language with complex networks.
Cong, Jin; Liu, Haitao
2014-12-01
The interest in modeling and analyzing human language with complex networks is on the rise in recent years and a considerable body of research in this area has already been accumulated. We survey three major lines of linguistic research from the complex network approach: 1) characterization of human language as a multi-level system with complex network analysis; 2) linguistic typological research with the application of linguistic networks and their quantitative measures; and 3) relationships between the system-level complexity of human language (determined by the topology of linguistic networks) and microscopic linguistic (e.g., syntactic) features (as the traditional concern of linguistics). We show that the models and quantitative tools of complex networks, when exploited properly, can constitute an operational methodology for linguistic inquiry, which contributes to the understanding of human language and the development of linguistics. We conclude our review with suggestions for future linguistic research from the complex network approach: 1) relationships between the system-level complexity of human language and microscopic linguistic features; 2) expansion of research scope from the global properties to other levels of granularity of linguistic networks; and 3) combination of linguistic network analysis with other quantitative studies of language (such as quantitative linguistics). Copyright © 2014 Elsevier B.V. All rights reserved.
Modular representation of layered neural networks.
Watanabe, Chihiro; Hiramatsu, Kaoru; Kashino, Kunio
2018-01-01
Layered neural networks have greatly improved the performance of various applications including image processing, speech recognition, natural language processing, and bioinformatics. However, it is still difficult to discover or interpret knowledge from the inference provided by a layered neural network, since its internal representation has many nonlinear and complex parameters embedded in hierarchical layers. Therefore, it becomes important to establish a new methodology by which layered neural networks can be understood. In this paper, we propose a new method for extracting a global and simplified structure from a layered neural network. Based on network analysis, the proposed method detects communities or clusters of units with similar connection patterns. We show its effectiveness by applying it to three use cases. (1) Network decomposition: it can decompose a trained neural network into multiple small independent networks thus dividing the problem and reducing the computation time. (2) Training assessment: the appropriateness of a trained result with a given hyperparameter or randomly chosen initial parameters can be evaluated by using a modularity index. And (3) data analysis: in practical data it reveals the community structure in the input, hidden, and output layers, which serves as a clue for discovering knowledge from a trained neural network. Copyright © 2017 Elsevier Ltd. All rights reserved.
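The paper's own decomposition algorithm is not detailed in the abstract; purely as a generic illustration of the idea of grouping units by connection patterns, the sketch below builds a graph whose edge weights are connection-weight magnitudes and applies modularity-based community detection from networkx. The layer sizes and weights are random stand-ins.

```python
# Generic sketch (not the authors' exact algorithm) of grouping units of a
# trained layered network into communities based on connection-weight
# magnitudes, using modularity-based community detection from networkx.
import numpy as np
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

rng = np.random.default_rng(0)
W1 = rng.normal(size=(4, 3))     # hypothetical input -> hidden weights
W2 = rng.normal(size=(3, 2))     # hypothetical hidden -> output weights

G = nx.Graph()
for i in range(W1.shape[0]):
    for j in range(W1.shape[1]):
        G.add_edge(f"in{i}", f"hid{j}", weight=abs(W1[i, j]))
for j in range(W2.shape[0]):
    for k in range(W2.shape[1]):
        G.add_edge(f"hid{j}", f"out{k}", weight=abs(W2[j, k]))

communities = greedy_modularity_communities(G, weight="weight")
print([sorted(c) for c in communities])
```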
Getting the big picture in community science: methods that capture context.
Luke, Douglas A
2005-06-01
Community science has a rich tradition of using theories and research designs that are consistent with its core value of contextualism. However, a survey of empirical articles published in the American Journal of Community Psychology shows that community scientists utilize a narrow range of statistical tools that are not well suited to assess contextual data. Multilevel modeling, geographic information systems (GIS), social network analysis, and cluster analysis are recommended as useful tools to address contextual questions in community science. An argument for increased methodological consilience is presented, where community scientists are encouraged to adopt statistical methodology that is capable of modeling a greater proportion of the data than is typical with traditional methods.
NASA Astrophysics Data System (ADS)
Marconi, S.; Orfanelli, S.; Karagounis, M.; Hemperek, T.; Christiansen, J.; Placidi, P.
2017-02-01
A dedicated power analysis methodology, based on modern digital design tools and integrated with the VEPIX53 simulation framework developed within the RD53 collaboration, is being used to guide vital choices for the design and optimization of the next generation ATLAS and CMS pixel chips and their critical serial powering circuit (shunt-LDO). Power consumption is studied at different stages of the design flow under different operating conditions. Significant effort is put into extensive investigations of dynamic power variations in relation to the decoupling seen by the powering network. Shunt-LDO simulations are also reported to demonstrate reliability at the system level.
Social capital calculations in economic systems: Experimental study
NASA Astrophysics Data System (ADS)
Chepurov, E. G.; Berg, D. B.; Zvereva, O. M.; Nazarova, Yu. Yu.; Chekmarev, I. V.
2017-11-01
The paper describes a social capital study for a system whose actors are engaged in economic activity. The focus is on the analysis of the structural parameters of communications (transactions) between the actors. Comparing the structure of the transaction network graph with that of a random Bernoulli graph of the same dimension and density reveals specific structural features of the economic system under study. The structural analysis is based on social network analysis (SNA) methodology. It is shown that the structural parameter values of the graph formed by agent relationship links may well characterize different aspects of the social capital structure. The research argues that it is useful to distinguish between the social capital of each agent and the social capital of the whole system.
Space station data system analysis/architecture study. Task 3: Trade studies, DR-5, volume 1
NASA Technical Reports Server (NTRS)
1985-01-01
The primary objective of Task 3 is to provide additional analysis and insight necessary to support key design/programmatic decisions for options quantification and selection for system definition. This includes: (1) the identification of key trade study topics; (2) the definition of a trade study procedure for each topic (issues to be resolved, key inputs, criteria/weighting, methodology); (3) the conduct of tradeoff and sensitivity analyses; and (4) the review/verification of results within the context of evolving system design and definition. The trade study topics addressed in this volume include space autonomy and function automation, software transportability, system network topology, communications standardization, onboard local area networking, distributed operating system, software configuration management, and the software development environment facility.
McCarthy, Alun
2011-09-01
Pharmacogenomic Innovative Solutions Ltd (PGXIS) was established in 2007 by a group of pharmacogenomic (PGx) experts to make their expertise available to biotechnology and pharmaceutical companies. PGXIS has subsequently established a network of experts to broaden its access to relevant PGx knowledge and technologies. In addition, it has developed a novel multivariate analysis method called Taxonomy3 which is both a data integration tool and a targeting tool. Together with siRNA methodology from CytoPathfinder Inc., PGXIS now has an extensive range of diverse PGx methodologies focused on enhancing drug development.
Infrared Algorithm Development for Ocean Observations with EOS/MODIS
NASA Technical Reports Server (NTRS)
Brown, Otis B.
1997-01-01
Efforts continue under this contract to develop algorithms for the computation of sea surface temperature (SST) from MODIS infrared measurements. This effort includes radiative transfer modeling, comparison of in situ and satellite observations, development and evaluation of processing and networking methodologies for algorithm computation and data accession, evaluation of surface validation approaches for IR radiances, development of experimental instrumentation, and participation in MODIS (project) related activities. Activities in this contract period have focused on radiative transfer modeling, evaluation of atmospheric correction methodologies, field campaigns, analysis of field data, and participation in MODIS meetings.
NASA Astrophysics Data System (ADS)
Tang, Kai-Yu; Tsai, Chin-Chung
2016-01-01
The main purpose of this paper is to investigate the intellectual structure of the research on educational technology in science education (ETiSE) within the most recent years (2008-2013). Based on the criteria for educational technology research and the citation threshold for educational co-citation analysis, a total of 137 relevant ETiSE papers were identified from the International Journal of Science Education, the Journal of Research in Science Teaching, Science Education, and the Journal of Science Education and Technology. A series of analyses was then performed on all 137 source documents, including document co-citation analysis, social network analysis, and exploratory factor analysis. As a result, 454 co-citation ties were obtained and then graphically visualized as an undirected network, presenting a global structure of the current ETiSE research network. In addition, four major underlying intellectual subfields within the main component of the ETiSE network were extracted and named: (1) technology-enhanced science inquiry, (2) simulation and visualization for understanding, (3) technology-enhanced chemistry learning, and (4) game-based science learning. The most influential co-citation pairs and cross-boundary phenomena were then analyzed and visualized in a co-citation network. This is the very first attempt to illuminate the core ideas underlying ETiSE research by integrating the co-citation method, factor analysis, and the networking visualization technique. The findings of this study provide a platform for scholarly discussion of the dissemination and research trends within the current ETiSE literature.
Vaiman, Daniel; Miralles, Francisco
2016-01-01
Preeclampsia (PE) is a pregnancy disorder defined by hypertension and proteinuria. This disease remains a major cause of maternal and fetal morbidity and mortality. Defective placentation is generally described as being at the root of the disease. The characterization of the transcriptome signature of the preeclamptic placenta has allowed the identification of differentially expressed genes (DEGs). However, we still lack detailed knowledge of how these DEGs impact the function of the placenta. The tools of network biology offer a methodology to explore complex diseases at a systems level. In this study we performed a cross-platform meta-analysis of seven publicly available gene expression datasets comparing non-pathological and preeclamptic placentas. Using the rank product algorithm we identified a total of 369 DEGs consistently modified in PE. The DEGs were used as seeds to build both an extended physical protein-protein interaction network and a transcription factor regulatory network. Topological and clustering analysis was conducted to analyze the connectivity properties of the networks. Finally, both networks were merged into a composite network, which presents an integrated view of the regulatory pathways involved in preeclampsia and the crosstalk between them. This network is a useful tool to explore the relationships between the DEGs and enables hypothesis generation for functional experimentation. PMID:27802351
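The rank product step mentioned above has a simple core: within each study, genes are ranked by the evidence for differential expression, and the geometric mean of those ranks is taken across studies. The sketch below shows only that combination step on invented fold-change values; the gene names are placeholders, and the permutation-based significance assessment used in practice is omitted.

```python
# Sketch of the rank product idea: rank genes within each study (here by
# up-regulation), then combine studies via the geometric mean of the ranks.
# Values are illustrative; the permutation-based significance step is omitted.
import numpy as np

genes = ["gene1", "gene2", "gene3", "gene4", "gene5"]
log_fold_change = np.array([      # rows: studies, columns: genes (hypothetical)
    [2.1, 1.4, 3.0, 0.2, 1.9],
    [1.8, 1.1, 2.5, 0.4, 2.2],
    [2.4, 0.9, 2.8, 0.1, 1.7],
])

# Rank 1 = strongest up-regulation within each study.
ranks = np.argsort(np.argsort(-log_fold_change, axis=1), axis=1) + 1
rank_product = np.exp(np.mean(np.log(ranks), axis=0))   # geometric mean

for g, rp in sorted(zip(genes, rank_product), key=lambda t: t[1]):
    print(g, round(rp, 2))
```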
A multiscale method for a robust detection of the default mode network
NASA Astrophysics Data System (ADS)
Baquero, Katherine; Gómez, Francisco; Cifuentes, Christian; Guldenmund, Pieter; Demertzi, Athena; Vanhaudenhuyse, Audrey; Gosseries, Olivia; Tshibanda, Jean-Flory; Noirhomme, Quentin; Laureys, Steven; Soddu, Andrea; Romero, Eduardo
2013-11-01
The Default Mode Network (DMN) is a resting state network widely used for the analysis and diagnosis of mental disorders. It is normally detected in fMRI data, but for its detection in data corrupted by motion artefacts or low neuronal activity, the use of a robust analysis method is mandatory. In fMRI it has been shown that the signal-to-noise ratio (SNR) and the detection sensitivity of neuronal regions are increased with different smoothing kernel sizes. Here we propose to use a multiscale decomposition based on a linear scale-space representation for the detection of the DMN. Three main points are proposed in this methodology: first, the use of fMRI data at different smoothing scale-spaces; second, detection of independent neuronal components of the DMN at each scale by using standard preprocessing methods and ICA decomposition at scale level; and finally, a weighted contribution of each scale by the Goodness of Fit measurement. This method was applied to a group of control subjects and was compared with a standard preprocessing baseline. The detection of the DMN was improved at the single-subject level and at the group level. Based on these results, we suggest using this methodology to enhance the detection of the DMN in data perturbed with artefacts or applied to subjects with low neuronal activity. Furthermore, the multiscale method could be extended for the detection of other resting state neuronal networks.
Automated and comprehensive link engineering supporting branched, ring, and mesh network topologies
NASA Astrophysics Data System (ADS)
Farina, J.; Khomchenko, D.; Yevseyenko, D.; Meester, J.; Richter, A.
2016-02-01
Link design, while relatively easy in the past, can become quite cumbersome with complex channel plans and equipment configurations. The task of designing optical transport systems and selecting equipment is often performed by an applications or sales engineer using simple tools, such as custom Excel spreadsheets. Eventually, every individual has their own version of the spreadsheet as well as their own methodology for building the network. This approach becomes unmanageable very quickly and leads to mistakes, bending of the engineering rules and installations that do not perform as expected. We demonstrate a comprehensive planning environment, which offers an efficient approach to unify, control and expedite the design process by controlling libraries of equipment and engineering methodologies, automating the process and providing the analysis tools necessary to predict system performance throughout the system and for all channels. In addition to the placement of EDFAs and DCEs, performance analysis metrics are provided at every step of the way. Metrics that can be tracked include power, CD and OSNR, SPM, XPM, FWM and SBS. Automated routine steps assist in design aspects such as equalization, padding and gain setting for EDFAs, the placement of ROADMs and transceivers, and creating regeneration points. DWDM networks consisting of a large number of nodes and repeater huts, interconnected in linear, branched, mesh and ring network topologies, can be designed much faster when compared with conventional design methods. Using flexible templates for all major optical components, our technology-agnostic planning approach supports the constant advances in optical communications.
Analyzing Enterprise Networks Needs: Action Research from the Mechatronics Sector
NASA Astrophysics Data System (ADS)
Cagnazzo, Luca; Taticchi, Paolo; Bidini, Gianni; Baglieri, Enzo
New business models and theories are nowadays developing in the direction of collaborative environments, and many new tools for sustaining the companies involved in these organizations are emerging. Among them, a plethora of methodologies for analyzing needs has already been developed for single companies. Few academic works are available on need analysis for Enterprise Networks (ENs). This paper presents the learning from an action research (AR) study in the mechatronics sector: AR has been used to explore the issue of evaluating network needs and, on that basis, to define, develop, and test a complete framework for network evaluation. Reflection on the story in the light of the experience and the theory is presented, as well as extrapolation to a broader context and articulation of usable knowledge.
Current-mode subthreshold MOS implementation of the Herault-Jutten autoadaptive network
NASA Astrophysics Data System (ADS)
Cohen, Marc H.; Andreou, Andreas G.
1992-05-01
The translinear circuits in subthreshold MOS technology and current-mode design techniques for the implementation of neuromorphic analog network processing are investigated. The architecture, also known as the Herault-Jutten network, performs an independent component analysis and is essentially a continuous-time recursive linear adaptive filter. Analog I/O interface, weight coefficients, and adaptation blocks are all integrated on the chip. A small network with six neurons and 30 synapses was fabricated in a 2-microns n-well double-polysilicon, double-metal CMOS process. Circuit designs at the transistor level yield area-efficient implementations for neurons, synapses, and the adaptation blocks. The design methodology and constraints as well as test results from the fabricated chips are discussed.
EvoluCode: Evolutionary Barcodes as a Unifying Framework for Multilevel Evolutionary Data.
Linard, Benjamin; Nguyen, Ngoc Hoan; Prosdocimi, Francisco; Poch, Olivier; Thompson, Julie D
2012-01-01
Evolutionary systems biology aims to uncover the general trends and principles governing the evolution of biological networks. An essential part of this process is the reconstruction and analysis of the evolutionary histories of these complex, dynamic networks. Unfortunately, the methodologies for representing and exploiting such complex evolutionary histories in large scale studies are currently limited. Here, we propose a new formalism, called EvoluCode (Evolutionary barCode), which allows the integration of different evolutionary parameters (eg, sequence conservation, orthology, synteny …) in a unifying format and facilitates the multilevel analysis and visualization of complex evolutionary histories at the genome scale. The advantages of the approach are demonstrated by constructing barcodes representing the evolution of the complete human proteome. Two large-scale studies are then described: (i) the mapping and visualization of the barcodes on the human chromosomes and (ii) automatic clustering of the barcodes to highlight protein subsets sharing similar evolutionary histories and their functional analysis. The methodologies developed here open the way to the efficient application of other data mining and knowledge extraction techniques in evolutionary systems biology studies. A database containing all EvoluCode data is available at: http://lbgi.igbmc.fr/barcodes.
Structure and function of complex brain networks
Sporns, Olaf
2013-01-01
An increasing number of theoretical and empirical studies approach the function of the human brain from a network perspective. The analysis of brain networks is made feasible by the development of new imaging acquisition methods as well as new tools from graph theory and dynamical systems. This review surveys some of these methodological advances and summarizes recent findings on the architecture of structural and functional brain networks. Studies of the structural connectome reveal several modules or network communities that are interlinked by hub regions mediating communication processes between modules. Recent network analyses have shown that network hubs form a densely linked collective called a “rich club,” centrally positioned for attracting and dispersing signal traffic. In parallel, recordings of resting and task-evoked neural activity have revealed distinct resting-state networks that contribute to functions in distinct cognitive domains. Network methods are increasingly applied in a clinical context, and their promise for elucidating neural substrates of brain and mental disorders is discussed. PMID:24174898
NASA Technical Reports Server (NTRS)
Souza, V. M.; Vieira, L. E. A.; Medeiros, C.; Da Silva, L. A.; Alves, L. R.; Koga, D.; Sibeck, D. G.; Walsh, B. M.; Kanekal, S. G.; Jauer, P. R.;
2016-01-01
Analysis of particle pitch angle distributions (PADs) has been used as a means to comprehend a multitude of different physical mechanisms that lead to flux variations in the Van Allen belts and also to particle precipitation into the upper atmosphere. In this work we developed a neural network-based data clustering methodology that automatically identifies distinct PAD types in an unsupervised way using particle flux data. One can promptly identify and locate three well-known PAD types in both time and radial distance, namely, 90°-peaked, butterfly, and flat-top distributions. In order to illustrate the applicability of our methodology, we used relativistic electron flux data from the whole month of November 2014, acquired from the Relativistic Electron-Proton Telescope instrument on board the Van Allen Probes, but it is emphasized that our approach can also be used with multiplatform spacecraft data. Our PAD classification results are in reasonably good agreement with those obtained by standard statistical fitting algorithms. The proposed methodology has potential use for Van Allen belt monitoring.
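The clustering step can be pictured with a much simpler stand-in: k-means on normalized pitch-angle flux profiles. The authors use a neural-network-based clustering on real Van Allen Probes fluxes, so the sketch below, with synthetic 90°-peaked, butterfly and flat-top profiles, only illustrates the unsupervised workflow, not their model.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
angles = np.linspace(0, 180, 19)                           # pitch-angle bins (deg)

# Synthetic examples of the three canonical PAD shapes
pancake   = np.sin(np.radians(angles)) ** 2                # 90-degree peaked
butterfly = np.sin(np.radians(2 * angles)) ** 2            # minima near 90 degrees
flattop   = np.ones_like(angles)

profiles = np.vstack([shape + 0.05 * rng.standard_normal(angles.size)
                      for shape in (pancake, butterfly, flattop)
                      for _ in range(50)])

# Normalize each PAD so the clustering responds to shape rather than flux level
profiles /= profiles.max(axis=1, keepdims=True)

labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(profiles)
print(np.bincount(labels))                                 # cluster sizes
```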
Alexakis, Dimitrios D.; Mexis, Filippos-Dimitrios K.; Vozinaki, Anthi-Eirini K.; Daliakopoulos, Ioannis N.; Tsanis, Ioannis K.
2017-01-01
A methodology for elaborating multi-temporal Sentinel-1 and Landsat 8 satellite images for estimating topsoil Soil Moisture Content (SMC) to support hydrological simulation studies is proposed. After pre-processing the remote sensing data, backscattering coefficient, Normalized Difference Vegetation Index (NDVI), thermal infrared temperature and incidence angle parameters are assessed for their potential to infer ground measurements of SMC, collected at the top 5 cm. A non-linear approach using Artificial Neural Networks (ANNs) is tested. The methodology is applied in Western Crete, Greece, where a SMC gauge network was deployed during 2015. The performance of the proposed algorithm is evaluated using leave-one-out cross validation and sensitivity analysis. ANNs prove to be the most efficient in SMC estimation yielding R2 values between 0.7 and 0.9. The proposed methodology is used to support a hydrological simulation with the HEC-HMS model, applied at the Keramianos basin which is ungauged for SMC. Results and model sensitivity highlight the contribution of combining Sentinel-1 SAR and Landsat 8 images for improving SMC estimates and supporting hydrological studies. PMID:28635625
Alexakis, Dimitrios D; Mexis, Filippos-Dimitrios K; Vozinaki, Anthi-Eirini K; Daliakopoulos, Ioannis N; Tsanis, Ioannis K
2017-06-21
A methodology for elaborating multi-temporal Sentinel-1 and Landsat 8 satellite images for estimating topsoil Soil Moisture Content (SMC) to support hydrological simulation studies is proposed. After pre-processing the remote sensing data, backscattering coefficient, Normalized Difference Vegetation Index (NDVI), thermal infrared temperature and incidence angle parameters are assessed for their potential to infer ground measurements of SMC, collected at the top 5 cm. A non-linear approach using Artificial Neural Networks (ANNs) is tested. The methodology is applied in Western Crete, Greece, where a SMC gauge network was deployed during 2015. The performance of the proposed algorithm is evaluated using leave-one-out cross validation and sensitivity analysis. ANNs prove to be the most efficient in SMC estimation yielding R² values between 0.7 and 0.9. The proposed methodology is used to support a hydrological simulation with the HEC-HMS model, applied at the Keramianos basin which is ungauged for SMC. Results and model sensitivity highlight the contribution of combining Sentinel-1 SAR and Landsat 8 images for improving SMC estimates and supporting hydrological studies.
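A minimal sketch of the ANN plus leave-one-out workflow, assuming a four-feature input (backscatter, NDVI, thermal temperature, incidence angle) and synthetic soil-moisture targets in place of the Sentinel-1/Landsat 8 derived data used in the study:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import LeaveOneOut
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
n = 40
# Columns: backscatter (dB), NDVI, thermal IR temperature (K), incidence angle (deg)
X = np.column_stack([rng.uniform(-20, -5, n), rng.uniform(0.1, 0.8, n),
                     rng.uniform(285, 320, n), rng.uniform(30, 45, n)])
# Synthetic topsoil SMC (volumetric fraction) loosely tied to the features
y = 0.4 + 0.01 * X[:, 0] - 0.2 * X[:, 1] + 0.02 * rng.standard_normal(n)

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0))

# Leave-one-out cross validation, as in the evaluation described above
preds = np.empty(n)
for train, test in LeaveOneOut().split(X):
    model.fit(X[train], y[train])
    preds[test] = model.predict(X[test])

ss_res = np.sum((y - preds) ** 2)
ss_tot = np.sum((y - y.mean()) ** 2)
print("LOO R^2:", round(1 - ss_res / ss_tot, 3))
```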
NET: a new framework for the vectorization and examination of network data.
Lasser, Jana; Katifori, Eleni
2017-01-01
The analysis of complex networks both in general and in particular as pertaining to real biological systems has been the focus of intense scientific attention in the past and present. In this paper we introduce two tools that provide fast and efficient means for the processing and quantification of biological networks like Drosophila tracheoles or leaf venation patterns: the Network Extraction Tool (NET) to extract data and the Graph-edit-GUI (GeGUI) to visualize and modify networks. NET is especially designed for high-throughput semi-automated analysis of biological datasets containing digital images of networks. The framework starts with the segmentation of the image and then proceeds to vectorization using methodologies from optical character recognition. After a series of steps to clean and improve the quality of the extracted data the framework produces a graph in which the network is represented only by its nodes and neighborhood-relations. The final output contains information about the adjacency matrix of the graph, the width of the edges and the positions of the nodes in space. NET also provides tools for statistical analysis of the network properties, such as the number of nodes or total network length. Other, more complex metrics can be calculated by importing the vectorized network to specialized network analysis packages. GeGUI is designed to facilitate manual correction of non-planar networks as these may contain artifacts or spurious junctions due to branches crossing each other. It is tailored for but not limited to the processing of networks from microscopy images of Drosophila tracheoles. The networks extracted by NET closely approximate the network depicted in the original image. NET is fast, yields reproducible results and is able to capture the full geometry of the network, including curved branches. Additionally, GeGUI allows easy handling and visualization of the networks.
Review of Recent Methodological Developments in Group-Randomized Trials: Part 1—Design
Li, Fan; Gallis, John A.; Prague, Melanie; Murray, David M.
2017-01-01
In 2004, Murray et al. reviewed methodological developments in the design and analysis of group-randomized trials (GRTs). We have highlighted the developments of the past 13 years in design with a companion article to focus on developments in analysis. As a pair, these articles update the 2004 review. We have discussed developments in the topics of the earlier review (e.g., clustering, matching, and individually randomized group-treatment trials) and in new topics, including constrained randomization and a range of randomized designs that are alternatives to the standard parallel-arm GRT. These include the stepped-wedge GRT, the pseudocluster randomized trial, and the network-randomized GRT, which, like the parallel-arm GRT, require clustering to be accounted for in both their design and analysis. PMID:28426295
Review of Recent Methodological Developments in Group-Randomized Trials: Part 1-Design.
Turner, Elizabeth L; Li, Fan; Gallis, John A; Prague, Melanie; Murray, David M
2017-06-01
In 2004, Murray et al. reviewed methodological developments in the design and analysis of group-randomized trials (GRTs). We have highlighted the developments of the past 13 years in design with a companion article to focus on developments in analysis. As a pair, these articles update the 2004 review. We have discussed developments in the topics of the earlier review (e.g., clustering, matching, and individually randomized group-treatment trials) and in new topics, including constrained randomization and a range of randomized designs that are alternatives to the standard parallel-arm GRT. These include the stepped-wedge GRT, the pseudocluster randomized trial, and the network-randomized GRT, which, like the parallel-arm GRT, require clustering to be accounted for in both their design and analysis.
NASA Technical Reports Server (NTRS)
Blakeslee, R. J.; Bailey, J. C.; Pinto, O.; Athayde, A.; Renno, N.; Weidman, C. D.
2003-01-01
A four station Advanced Lightning Direction Finder (ALDF) network was established in the state of Rondonia in western Brazil in 1999 through a collaboration of U.S. and Brazilian participants from NASA, INPE, INMET, and various universities. The network utilizes ALDF IMPACT (Improved Accuracy from Combined Technology) sensors to provide cloud-to-ground lightning observations (i.e., stroke/flash locations, signal amplitude, and polarity) using both time-of-arrival and magnetic direction finding techniques. The observations are collected, processed and archived at a central site in Brasilia and at the NASA/Marshall Space Flight Center in Huntsville, Alabama. Initial, non-quality assured quick-look results are made available in near real-time over the Internet. The network, which is still operational, was deployed to provide ground truth data for the Lightning Imaging Sensor (LIS) on the Tropical Rainfall Measuring Mission (TRMM) satellite that was launched in November 1997. The measurements are also being used to investigate the relationship between the electrical, microphysical and kinematic properties of tropical convection. In addition, the long-time series observations produced by this network will help establish a regional lightning climatological database, supplementing other databases in Brazil that already exist or may soon be implemented. Analytic inversion algorithms developed at the NASA/Marshall Space Flight Center have been applied to the Rondonian ALDF lightning observations to obtain site error corrections and improved location retrievals. The data will also be corrected for the network detection efficiency. The processing methodology and the results from the analysis of four years of network operations will be presented.
Correlation analysis on real-time tab-delimited network monitoring data
Pan, Aditya; Majumdar, Jahin; Bansal, Abhay; ...
2016-01-01
End-to-end performance monitoring in the Internet, also called PingER, is part of SLAC National Accelerator Laboratory's research project. It was created to answer the growing need to monitor the network, both to analyze current performance and to designate resources to optimize execution between research centers and the universities and institutes co-operating on present and future operations. The monitoring support reflects the broad geographical area of the collaborations and requires a comprehensive number of research and financial channels. The data architecture, retrieval and methodology of interpretation have emerged over numerous years. Analyzing this data is the main challenge due to its high volume. Finally, by using correlation analysis, we can make crucial conclusions about how the network data affects the performance of the hosts and how it varies from country to country.
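A hedged sketch of what such a correlation analysis can look like on tab-delimited monitoring data; the column names and values below are invented stand-ins, not the actual PingER export schema.

```python
import io
import numpy as np
import pandas as pd

# Synthetic tab-delimited snapshot standing in for a real monitoring export
raw = io.StringIO(
    "host\trtt_ms\tloss_pct\tthroughput_mbps\n"
    "A\t120\t0.5\t4.1\n"
    "B\t310\t2.0\t1.2\n"
    "C\t80\t0.1\t6.3\n"
    "D\t200\t1.1\t2.5\n"
)
df = pd.read_csv(raw, sep="\t")

# Pairwise correlation matrix over the numeric monitoring metrics
corr = df.select_dtypes("number").corr()
print(corr.round(2))

# Strongest absolute correlations between distinct metrics
mask = ~np.eye(len(corr), dtype=bool)
print(corr.where(mask).abs().stack().sort_values(ascending=False).head(3))
```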
Hajibandeh, Shahab; Hajibandeh, Shahin; Antoniou, George A; Green, Patrick A; Maden, Michelle; Torella, Francesco
2017-04-01
Purpose We aimed to investigate the association between bibliometric parameters and the reporting and methodological quality of vascular and endovascular surgery randomised controlled trials. Methods The most recent 75 and oldest 75 randomised controlled trials published in leading journals over a 10-year period were identified. The reporting quality was analysed using the CONSORT statement, and methodological quality with the Intercollegiate Guidelines Network checklist. We used exploratory univariate and multivariable linear regression analysis to investigate associations. Findings Bibliometric parameters such as type of journal, study design reported in the title, number of pages, external funding, industry sponsorship and number of citations are associated with reporting quality. Moreover, parameters such as type of journal, subject area and study design reported in the title are associated with methodological quality. Conclusions The bibliometric parameters of randomised controlled trials may be independent predictors of their reporting and methodological quality. Moreover, the reporting quality of randomised controlled trials is associated with their methodological quality and vice versa.
Plagianakos, V P; Magoulas, G D; Vrahatis, M N
2006-03-01
Distributed computing is a process through which a set of computers connected by a network is used collectively to solve a single problem. In this paper, we propose a distributed computing methodology for training neural networks for the detection of lesions in colonoscopy. Our approach is based on partitioning the training set across multiple processors using a parallel virtual machine. In this way, interconnected computers of varied architectures can be used for the distributed evaluation of the error function and gradient values, and, thus, training neural networks utilizing various learning methods. The proposed methodology has large granularity and low synchronization, and has been implemented and tested. Our results indicate that the parallel virtual machine implementation of the training algorithms developed leads to considerable speedup, especially when large network architectures and training sets are used.
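The data-parallel idea above (partition the training set, evaluate the error gradient on each partition in parallel, combine the results) can be sketched with Python's multiprocessing in place of a parallel virtual machine; a logistic-regression learner stands in for the lesion-detection network, so this is an illustration of the scheme rather than the authors' implementation.

```python
import numpy as np
from multiprocessing import Pool

def shard_gradient(args):
    """Gradient of the logistic loss evaluated on one data partition."""
    w, X, y = args
    p = 1.0 / (1.0 + np.exp(-X @ w))
    return X.T @ (p - y) / len(y)

def train(X, y, n_workers=4, lr=0.5, epochs=200):
    """Distributed gradient evaluation: each worker handles one shard per epoch."""
    w = np.zeros(X.shape[1])
    X_parts = np.array_split(X, n_workers)
    y_parts = np.array_split(y, n_workers)
    with Pool(n_workers) as pool:
        for _ in range(epochs):
            grads = pool.map(shard_gradient,
                             [(w, Xp, yp) for Xp, yp in zip(X_parts, y_parts)])
            w -= lr * np.mean(grads, axis=0)      # combine shard gradients
    return w

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((800, 5))
    y = (X[:, 0] - 0.5 * X[:, 1] > 0).astype(float)
    w = train(X, y)
    print("training accuracy:", round(float(np.mean(((X @ w) > 0) == y)), 3))
```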
Saha, Sudipto; Dazard, Jean-Eudes; Xu, Hua; Ewing, Rob M.
2013-01-01
Large-scale protein–protein interaction data sets have been generated for several species including yeast and human and have enabled the identification, quantification, and prediction of cellular molecular networks. Affinity purification-mass spectrometry (AP-MS) is the preeminent methodology for large-scale analysis of protein complexes, performed by immunopurifying a specific “bait” protein and its associated “prey” proteins. The analysis and interpretation of AP-MS data sets is, however, not straightforward. In addition, although yeast AP-MS data sets are relatively comprehensive, current human AP-MS data sets only sparsely cover the human interactome. Here we develop a framework for analysis of AP-MS data sets that addresses the issues of noise, missing data, and sparsity of coverage in the context of a current, real world human AP-MS data set. Our goal is to extend and increase the density of the known human interactome by integrating bait–prey and cocomplexed preys (prey–prey associations) into networks. Our framework incorporates a score for each identified protein, as well as elements of signal processing to improve the confidence of identified protein–protein interactions. We identify many protein networks enriched in known biological processes and functions. In addition, we show that integrated bait–prey and prey–prey interactions can be used to refine network topology and extend known protein networks. PMID:22845868
Spike-train spectra and network response functions for non-linear integrate-and-fire neurons.
Richardson, Magnus J E
2008-11-01
Reduced models have long been used as a tool for the analysis of the complex activity taking place in neurons and their coupled networks. Recent advances in experimental and theoretical techniques have further demonstrated the usefulness of this approach. Despite the often gross simplification of the underlying biophysical properties, reduced models can still present significant difficulties in their analysis, with the majority of exact and perturbative results available only for the leaky integrate-and-fire model. Here an elementary numerical scheme is demonstrated which can be used to calculate a number of biologically important properties of the general class of non-linear integrate-and-fire models. Exact results for the first-passage-time density and spike-train spectrum are derived, as well as the linear response properties and emergent states of recurrent networks. Given that the exponential integrate-fire model has recently been shown to agree closely with the experimentally measured response of pyramidal cells, the methodology presented here promises to provide a convenient tool to facilitate the analysis of cortical-network dynamics.
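A Monte-Carlo sketch of the exponential integrate-and-fire model, one member of the non-linear class discussed above. The paper derives first-passage-time densities and spike-train spectra with an exact threshold-integration scheme; the simulation below only illustrates the model and the quantities of interest (interspike intervals as first-passage times), not the authors' numerical method, and all parameter values are illustrative.

```python
import numpy as np

def simulate_eif(T=20.0, dt=1e-4, tau=0.02, EL=-65.0, DT=2.0, VT=-50.0,
                 Vre=-60.0, Vth=-30.0, mu=12.0, sigma=4.0, seed=0):
    """Euler-Maruyama simulation of a noise-driven exponential integrate-and-fire neuron."""
    rng = np.random.default_rng(seed)
    n_steps = int(T / dt)
    noise = rng.standard_normal(n_steps)
    v, spikes = EL, []
    for i in range(n_steps):
        drift = (-(v - EL) + DT * np.exp(min((v - VT) / DT, 20.0)) + mu) / tau
        v += drift * dt + sigma * np.sqrt(dt / tau) * noise[i]
        if v >= Vth:                              # spike: record time and reset
            spikes.append((i + 1) * dt)
            v = Vre
    return np.array(spikes), T

spikes, T = simulate_eif()
isi = np.diff(spikes)                             # interspike intervals ~ first-passage times
print("rate (Hz): %.1f   ISI CV: %.2f" % (len(spikes) / T, isi.std() / isi.mean()))
```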
A network approach to decentralized coordination of energy production-consumption grids
Arenas, Alex
2018-01-01
Energy grids are facing a relatively new paradigm consisting in the formation of local distributed energy sources and loads that can operate in parallel, independently from the main power grid (usually called microgrids). One of the main challenges in the management of microgrid-like networks is that of self-adapting to production and demand in a decentralized, coordinated way. Here, we propose a stylized model that allows the coordination of the elements in the network to be predicted analytically, depending on the network topology. Surprisingly, almost global coordination is attained when users interact locally, with a small neighborhood, instead of the obvious but more costly all-to-all coordination. We compute analytically the optimal value of coordinated users in random homogeneous networks. The methodology proposed opens a new way of confronting the analysis of energy demand-side management in networked systems. PMID:29364962
Modulation of the brain's functional network architecture in the transition from wake to sleep
Larson-Prior, Linda J.; Power, Jonathan D.; Vincent, Justin L.; Nolan, Tracy S.; Coalson, Rebecca S.; Zempel, John; Snyder, Abraham Z.; Schlaggar, Bradley L.; Raichle, Marcus E.; Petersen, Steven E.
2013-01-01
The transition from quiet wakeful rest to sleep represents a period over which attention to the external environment fades. Neuroimaging methodologies have provided much information on the shift in neural activity patterns in sleep, but the dynamic restructuring of human brain networks in the transitional period from wake to sleep remains poorly understood. Analysis of electrophysiological measures and functional network connectivity of these early transitional states shows subtle shifts in network architecture that are consistent with reduced external attentiveness and increased internal and self-referential processing. Further, descent to sleep is accompanied by the loss of connectivity in anterior and posterior portions of the default-mode network and more locally organized global network architecture. These data clarify the complex and dynamic nature of the transitional period between wake and sleep and suggest the need for more studies investigating the dynamics of these processes. PMID:21854969
NASA Astrophysics Data System (ADS)
Various papers on global telecommunications are presented. The general topics addressed include: multiservice integration with optical fibers, multicompany owned telecommunication networks, software quality and reliability, advanced on-board processing, impact of new services and systems on operations and maintenance, analytical studies of protocols for data communication networks, topics in packet radio networking, CCITT No. 7 to support new services, document processing and communication, antenna technology and system aspects in satellite communications. Also considered are: communication systems modelling methodology, experimental integrated local area voice/data nets, spread spectrum communications, motion video at the DS-0 rate, optical and data communications, intelligent work stations, switch performance analysis, novel radio communication systems, wireless local networks, ISDN services, LAN communication protocols, user-system interface, radio propagation and performance, mobile satellite system, software for computer networks, VLSI for ISDN terminals, quality management, man-machine interfaces in switching, and local area network performance.
Spatial modeling of potential woody biomass flow
Woodam Chung; Nathaniel Anderson
2012-01-01
The flow of woody biomass to end users is determined by economic factors, especially the amount available across a landscape and the delivery costs to bioenergy facilities. The objective of this study was to develop a methodology to quantify landscape-level stocks and potential biomass flows using currently available spatial databases and road network analysis tools. We applied this...
Strategic Role of HRM in Turkey: A Three-Country Comparative Analysis
ERIC Educational Resources Information Center
Ozcelik, Ayse Oya; Aydinli, Fulya
2006-01-01
Purpose: To explore the strategic role of human resource management (HRM) in Turkey by comparing Turkish companies to Spanish and German companies. Design/methodology/approach: The questionnaire form of the Cranet-G 1999-2000 Survey (Cranfield Network on Strategic International Human Resource Management) has been used to collect the data. The…
Mathematics Lectures as Narratives: Insights from Network Graph Methodology
ERIC Educational Resources Information Center
Weinberg, Aaron; Wiesner, Emilie; Fukawa-Connelly, Tim
2016-01-01
Although lecture is the traditional method of university mathematics instruction, there has been little empirical research that describes the general structure of lectures. In this paper, we adapt ideas from narrative analysis and apply them to an upper-level mathematics lecture. We develop a framework that enables us to conceptualize the lecture…
Performance modeling of automated manufacturing systems
NASA Astrophysics Data System (ADS)
Viswanadham, N.; Narahari, Y.
A unified and systematic treatment is presented of modeling methodologies and analysis techniques for performance evaluation of automated manufacturing systems. The book is the first treatment of the mathematical modeling of manufacturing systems. Automated manufacturing systems are surveyed and three principal analytical modeling paradigms are discussed: Markov chains, queues and queueing networks, and Petri nets.
Empirical Reference Distributions for Networks of Different Size
Smith, Anna; Calder, Catherine A.; Browning, Christopher R.
2016-01-01
Network analysis has become an increasingly prevalent research tool across a vast range of scientific fields. Here, we focus on the particular issue of comparing network statistics, i.e. graph-level measures of network structural features, across multiple networks that differ in size. Although “normalized” versions of some network statistics exist, we demonstrate via simulation why direct comparison is often inappropriate. We consider normalizing network statistics relative to a simple fully parameterized reference distribution and demonstrate via simulation how this is an improvement over direct comparison, but still sometimes problematic. We propose a new adjustment method based on a reference distribution constructed as a mixture model of random graphs which reflect the dependence structure exhibited in the observed networks. We show that using simple Bernoulli models as mixture components in this reference distribution can provide adjusted network statistics that are relatively comparable across different network sizes but still describe interesting features of networks, and that this can be accomplished at relatively low computational expense. Finally, we apply this methodology to a collection of ecological networks derived from the Los Angeles Family and Neighborhood Survey activity location data. PMID:27721556
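A toy version of the adjustment idea, using a single Bernoulli (Erdős–Rényi) reference distribution rather than the proposed mixture model; it shows how a raw graph statistic can be re-expressed as a z-score against size- and density-matched random graphs so that networks of different sizes become more comparable.

```python
import networkx as nx
import numpy as np

def adjusted_statistic(G, stat=nx.transitivity, n_ref=200, seed=0):
    """Z-score of a graph statistic relative to Erdős-Rényi graphs of matched size/density."""
    rng = np.random.default_rng(seed)
    n, density = G.number_of_nodes(), nx.density(G)
    ref = np.array([stat(nx.gnp_random_graph(n, density, seed=int(rng.integers(10**9))))
                    for _ in range(n_ref)])
    return (stat(G) - ref.mean()) / ref.std()

# Two networks with similar generative structure but very different sizes
small = nx.watts_strogatz_graph(50, 6, 0.1, seed=1)
large = nx.watts_strogatz_graph(500, 6, 0.1, seed=1)
print(round(adjusted_statistic(small), 2), round(adjusted_statistic(large), 2))
```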
Costello, Tracy J; Falk, Catherine T; Ye, Kenny Q
2003-01-01
The Framingham Heart Study data, as well as a related simulated data set, were generously provided to the participants of the Genetic Analysis Workshop 13 in order that newly developed and emerging statistical methodologies could be tested on that well-characterized data set. The impetus driving the development of novel methods is to elucidate the contributions of genes, environment, and interactions between and among them, as well as to allow comparison between and validation of methods. The seven papers that comprise this group used data-mining methodologies (tree-based methods, neural networks, discriminant analysis, and Bayesian variable selection) in an attempt to identify the underlying genetics of cardiovascular disease and related traits in the presence of environmental and genetic covariates. Data-mining strategies are gaining popularity because they are extremely flexible and may have greater efficiency and potential in identifying the factors involved in complex disorders. While the methods grouped together here constitute a diverse collection, some papers asked similar questions with very different methods, while others used the same underlying methodology to ask very different questions. This paper briefly describes the data-mining methodologies applied to the Genetic Analysis Workshop 13 data sets and the results of those investigations. Copyright 2003 Wiley-Liss, Inc.
Logic-based models in systems biology: a predictive and parameter-free network analysis method†
Wynn, Michelle L.; Consul, Nikita; Merajver, Sofia D.
2012-01-01
Highly complex molecular networks, which play fundamental roles in almost all cellular processes, are known to be dysregulated in a number of diseases, most notably in cancer. As a consequence, there is a critical need to develop practical methodologies for constructing and analysing molecular networks at a systems level. Mathematical models built with continuous differential equations are an ideal methodology because they can provide a detailed picture of a network’s dynamics. To be predictive, however, differential equation models require that numerous parameters be known a priori and this information is almost never available. An alternative dynamical approach is the use of discrete logic-based models that can provide a good approximation of the qualitative behaviour of a biochemical system without the burden of a large parameter space. Despite their advantages, there remains significant resistance to the use of logic-based models in biology. Here, we address some common concerns and provide a brief tutorial on the use of logic-based models, which we motivate with biological examples. PMID:23072820
An alternative approach based on artificial neural networks to study controlled drug release.
Reis, Marcus A A; Sinisterra, Rubén D; Belchior, Jadson C
2004-02-01
An alternative methodology based on artificial neural networks is proposed to be a complementary tool to other conventional methods to study controlled drug release. Two systems are used to test the approach; namely, hydrocortisone in a biodegradable matrix and rhodium (II) butyrate complexes in a bioceramic matrix. Two well-established mathematical models are used to simulate different release profiles as a function of fundamental properties; namely, diffusion coefficient (D), saturation solubility (C(s)), drug loading (A), and the height of the device (h). The models were tested, and the results show that these fundamental properties can be predicted after learning the experimental or model data for controlled drug release systems. The neural network results obtained after the learning stage can be considered to quantitatively predict ideal experimental conditions. Overall, the proposed methodology was shown to be efficient for ideal experiments, with a relative average error of <1% in both tests. This approach can be useful for the experimental analysis to simulate and design efficient controlled drug-release systems. Copyright 2004 Wiley-Liss, Inc. and the American Pharmacists Association
Unsupervised User Similarity Mining in GSM Sensor Networks
Shad, Shafqat Ali; Chen, Enhong
2013-01-01
Mobility data has attracted researchers for the past few years because of its rich context and spatiotemporal nature, where this information can be used for potential applications like early warning systems, route prediction, traffic management, advertisement, social networking, and community finding. All the mentioned applications are based on mobility profile building and user trend analysis, where mobility profile building is done through significant places extraction, prediction of the user's actual movement, and context awareness. However, significant places extraction and prediction of the user's actual movement for mobility profile building are not trivial tasks. In this paper, we present a user similarity mining-based methodology through user mobility profile building, using the semantic tagging information provided by the user and basic GSM network architecture properties, based on an unsupervised clustering approach. As the mobility information is in low-level raw form, our proposed methodology successfully converts it to high-level meaningful information by using the cell-Id location information, rather than previously used location capturing methods like GPS, Infrared, and Wifi, for profile mining and user similarity mining. PMID:23576905
Complexity of generic biochemical circuits: topology versus strength of interactions.
Tikhonov, Mikhail; Bialek, William
2016-12-06
The historical focus on network topology as a determinant of biological function is still largely maintained today, illustrated by the rise of structure-only approaches to network analysis. However, biochemical circuits and genetic regulatory networks are defined both by their topology and by a multitude of continuously adjustable parameters, such as the strength of interactions between nodes, also recognized as important. Here we present a class of simple perceptron-based Boolean models within which comparing the relative importance of topology versus interaction strengths becomes a quantitatively well-posed problem. We quantify the intuition that for generic networks, optimization of interaction strengths is a crucial ingredient of achieving high complexity, defined here as the number of fixed points the network can accommodate. We propose a new methodology for characterizing the relative role of parameter optimization for topologies of a given class.
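The central quantity above, the number of fixed points a threshold (perceptron-style) Boolean network can accommodate, can be computed by brute force for small networks. The weights below are random illustrations of "topology only" versus "topology plus heterogeneous interaction strengths", not the paper's optimized parameters.

```python
import itertools
import numpy as np

def count_fixed_points(W, theta):
    """Count states s in {0,1}^N with s == step(W s - theta)."""
    n = W.shape[0]
    count = 0
    for bits in itertools.product((0, 1), repeat=n):
        s = np.array(bits)
        s_next = (W @ s - theta > 0).astype(int)
        count += int(np.array_equal(s, s_next))
    return count

rng = np.random.default_rng(0)
n = 8
topology = (rng.random((n, n)) < 0.3).astype(float)        # fixed wiring diagram
weak   = topology * 1.0                                    # uniform interaction strengths
strong = topology * rng.uniform(0.5, 3.0, (n, n))          # heterogeneous strengths
theta = np.full(n, 0.5)
print(count_fixed_points(weak, theta), count_fixed_points(strong, theta))
```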
Utilizing a structural meta-ontology for family-based quality assurance of the BioPortal ontologies.
Ochs, Christopher; He, Zhe; Zheng, Ling; Geller, James; Perl, Yehoshua; Hripcsak, George; Musen, Mark A
2016-06-01
An Abstraction Network is a compact summary of an ontology's structure and content. In previous research, we showed that Abstraction Networks support quality assurance (QA) of biomedical ontologies. The development of an Abstraction Network and its associated QA methodologies, however, is a labor-intensive process that previously was applicable only to one ontology at a time. To improve the efficiency of the Abstraction-Network-based QA methodology, we introduced a QA framework that uses uniform Abstraction Network derivation techniques and QA methodologies that are applicable to whole families of structurally similar ontologies. For the family-based framework to be successful, it is necessary to develop a method for classifying ontologies into structurally similar families. We now describe a structural meta-ontology that classifies ontologies according to certain structural features that are commonly used in the modeling of ontologies (e.g., object properties) and that are important for Abstraction Network derivation. Each class of the structural meta-ontology represents a family of ontologies with identical structural features, indicating which types of Abstraction Networks and QA methodologies are potentially applicable to all of the ontologies in the family. We derive a collection of 81 families, corresponding to classes of the structural meta-ontology, that enable a flexible, streamlined family-based QA methodology, offering multiple choices for classifying an ontology. The structure of 373 ontologies from the NCBO BioPortal is analyzed and each ontology is classified into multiple families modeled by the structural meta-ontology. Copyright © 2016 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Chanda, Sandip; De, Abhinandan
2016-12-01
This paper proposes a social welfare optimization technique, built on a state-space model and bifurcation analysis, that offers a substantial stability margin even in the most inadvertent states of power system networks. The restoration of the dynamic price equilibrium of the power market is addressed by forming the Jacobian of the sensitivity matrix to regulate the state variables, standardizing the quality of the solution under the worst possible contingencies of the network, even with the co-option of intermittent renewable energy sources. The model has been tested on the IEEE 30-bus system, with particle swarm optimization assisting the fusion of the proposed model and methodology.
Statistical Analysis of Big Data on Pharmacogenomics
Fan, Jianqing; Liu, Han
2013-01-01
This paper discusses statistical methods for estimating complex correlation structure from large pharmacogenomic datasets. We selectively review several prominent statistical methods for estimating large covariance matrix for understanding correlation structure, inverse covariance matrix for network modeling, large-scale simultaneous tests for selecting significantly differently expressed genes and proteins and genetic markers for complex diseases, and high dimensional variable selection for identifying important molecules for understanding molecule mechanisms in pharmacogenomics. Their applications to gene network estimation and biomarker selection are used to illustrate the methodological power. Several new challenges of Big data analysis, including complex data distribution, missing data, measurement error, spurious correlation, endogeneity, and the need for robust statistical methods, are also discussed. PMID:23602905
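One of the reviewed ingredients, sparse inverse-covariance estimation for network modeling, can be sketched with the graphical lasso; the data below are synthetic stand-ins for expression measurements, and the thresholding of the precision matrix is an illustrative choice.

```python
import numpy as np
from sklearn.covariance import GraphicalLassoCV

rng = np.random.default_rng(0)
n, p = 200, 15

# Synthetic "expression" data with a low-rank correlation structure
latent = rng.standard_normal((n, 3))
loadings = rng.standard_normal((3, p)) * (rng.random((3, p)) < 0.4)
X = latent @ loadings + 0.5 * rng.standard_normal((n, p))

# Cross-validated graphical lasso: sparse inverse covariance = conditional-dependence network
model = GraphicalLassoCV().fit(X)
edges = (np.abs(model.precision_) > 1e-3) & ~np.eye(p, dtype=bool)
print("estimated edges:", int(edges.sum()) // 2)
```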
Automated Meteor Detection by All-Sky Digital Camera Systems
NASA Astrophysics Data System (ADS)
Suk, Tomáš; Šimberová, Stanislava
2017-12-01
We have developed a set of methods to detect meteor light traces captured by all-sky CCD cameras. Operating at small automatic observatories (stations), these cameras create a network spread over a large territory. Image data coming from these stations are merged in one central node. Since a vast amount of data is collected by the stations in a single night, robotic storage and analysis are essential to processing. The proposed methodology is adapted to data from a network of automatic stations equipped with digital fish-eye cameras and includes data capturing, preparation, pre-processing, analysis, and finally recognition of objects in time sequences. In our experiments we utilized real observed data from two stations.
Cortical network architecture for context processing in primate brain
Chao, Zenas C; Nagasaka, Yasuo; Fujii, Naotaka
2015-01-01
Context is information linked to a situation that can guide behavior. In the brain, context is encoded by sensory processing and can later be retrieved from memory. How context is communicated within the cortical network in sensory and mnemonic forms is unknown due to the lack of methods for high-resolution, brain-wide neuronal recording and analysis. Here, we report the comprehensive architecture of a cortical network for context processing. Using hemisphere-wide, high-density electrocorticography, we measured large-scale neuronal activity from monkeys observing videos of agents interacting in situations with different contexts. We extracted five context-related network structures including a bottom-up network during encoding and, seconds later, cue-dependent retrieval of the same network with the opposite top-down connectivity. These findings show that context is represented in the cortical network as distributed communication structures with dynamic information flows. This study provides a general methodology for recording and analyzing cortical network neuronal communication during cognition. DOI: http://dx.doi.org/10.7554/eLife.06121.001 PMID:26416139
Methodology for Estimating ton-Miles of Goods Movements for U.S. Freight Mulitimodal Network System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oliveira Neto, Francisco Moraes; Chin, Shih-Miao; Hwang, Ho-Ling
2013-01-01
Ton-miles is a commonly used measure of freight transportation output. Estimation of ton-miles in the U.S. transportation system requires freight flow data at a disaggregated level (either link flows, path flows or origin-destination flows between small geographic areas). However, the sheer magnitude of the freight data system, as well as industrial confidentiality concerns in Census surveys, limit the freight data which is made available to the public. Through the years, the Center for Transportation Analysis (CTA) of the Oak Ridge National Laboratory (ORNL) has been working on the development of comprehensive national and regional freight databases and network flow models. One of the main products of this effort is the Freight Analysis Framework (FAF), a public database released by the ORNL. FAF provides to the general public a multidimensional matrix of freight flows (weight and dollar value) on the U.S. transportation system between states, major metropolitan areas, and remainders of states. Recently, the CTA research team has developed a methodology to estimate ton-miles by mode of transportation between the 2007 FAF regions. This paper describes the data disaggregation methodology. The method relies on the estimation of disaggregation factors that are related to measures of production, attractiveness and average shipment distances by mode of service. Production and attractiveness of counties are captured by the total employment payroll. Likely mileages for shipments between counties are calculated by using a geographic database, i.e. the CTA multimodal network system. Results of validation experiments demonstrate the validity of the method. Moreover, the 2007 FAF ton-miles estimates are consistent with the major freight data programs for rail and water movements.
Continuously updated network meta-analysis and statistical monitoring for timely decision-making
Nikolakopoulou, Adriani; Mavridis, Dimitris; Egger, Matthias; Salanti, Georgia
2016-01-01
Pairwise and network meta-analysis (NMA) are traditionally used retrospectively to assess existing evidence. However, the current evidence often undergoes several updates as new studies become available. In each update recommendations about the conclusiveness of the evidence and the need of future studies need to be made. In the context of prospective meta-analysis future studies are planned as part of the accumulation of the evidence. In this setting, multiple testing issues need to be taken into account when the meta-analysis results are interpreted. We extend ideas of sequential monitoring of meta-analysis to provide a methodological framework for updating NMAs. Based on the z-score for each network estimate (the ratio of effect size to its standard error) and the respective information gained after each study enters NMA we construct efficacy and futility stopping boundaries. A NMA treatment effect is considered conclusive when it crosses an appended stopping boundary. The methods are illustrated using a recently published NMA where we show that evidence about a particular comparison can become conclusive via indirect evidence even if no further trials address this comparison. PMID:27587588
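The monitoring rule can be illustrated with a small calculation: a z-score for one network estimate compared against a boundary that depends on the accrued information fraction. The O'Brien–Fleming-like boundary shape and all numbers used here are assumptions for the sketch, not necessarily the authors' exact specification.

```python
import numpy as np
from scipy.stats import norm

effect, se = -0.32, 0.11           # hypothetical current NMA estimate and its standard error
z = effect / se                    # z-score for this network comparison
info_fraction = 0.6                # accrued information relative to the planned maximum
alpha = 0.05

# Illustrative O'Brien-Fleming-like efficacy boundary at this information fraction
boundary = norm.ppf(1 - alpha / 2) / np.sqrt(info_fraction)
print(f"z = {z:.2f}, boundary = {boundary:.2f}, conclusive: {abs(z) > boundary}")
```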
Brakowski, Janis; Spinelli, Simona; Dörig, Nadja; Bosch, Oliver Gero; Manoliu, Andrei; Holtforth, Martin Grosse; Seifritz, Erich
2017-09-01
The alterations of functional connectivity brain networks in major depressive disorder (MDD) have been subject of a large number of studies. Using different methodologies and focusing on diverse aspects of the disease, research shows heterogeneous results lacking integration. Disrupted network connectivity has been found in core MDD networks like the default mode network (DMN), the central executive network (CEN), and the salience network, but also in cerebellar and thalamic circuitries. Here we review literature published on resting state brain network function in MDD focusing on methodology, and clinical characteristics including symptomatology and antidepressant treatment related findings. There are relatively few investigations concerning the qualitative aspects of symptomatology of MDD, whereas most studies associate quantitative aspects with distinct resting state functional connectivity alterations. Such depression severity associated alterations are found in the DMN, frontal, cerebellar and thalamic brain regions as well as the insula and the subgenual anterior cingulate cortex. Similarly, different therapeutical options in MDD and their effects on brain function showed patchy results. Herein, pharmaceutical treatments reveal functional connectivity alterations throughout multiple brain regions notably the DMN, fronto-limbic, and parieto-temporal regions. Psychotherapeutical interventions show significant functional connectivity alterations in fronto-limbic networks, whereas electroconvulsive therapy and repetitive transcranial magnetic stimulation result in alterations of the subgenual anterior cingulate cortex, the DMN, the CEN and the dorsal lateral prefrontal cortex. While it appears clear that functional connectivity alterations are associated with the pathophysiology and treatment of MDD, future research should also generate a common strategy for data acquisition and analysis, as a least common denominator, to set the basis for comparability across studies and implementation of functional connectivity as a scientifically and clinically useful biomarker. Copyright © 2017 Elsevier Ltd. All rights reserved.
A systematic review protocol: social network analysis of tobacco use.
Maddox, Raglan; Davey, Rachel; Lovett, Ray; van der Sterren, Anke; Corbett, Joan; Cochrane, Tom
2014-08-08
Tobacco use is the single most preventable cause of death in the world. Evidence indicates that behaviours such as tobacco use can influence social networks, and that social network structures can influence behaviours. Social network analysis provides a set of analytic tools to undertake methodical analysis of social networks. We will undertake a systematic review to provide a comprehensive synthesis of the literature regarding social network analysis and tobacco use. The review will answer the following research questions: among participants who use tobacco, does social network structure/position influence tobacco use? Does tobacco use influence peer selection? Does peer selection influence tobacco use? We will follow the Preferred Reporting Items for Systemic Reviews and Meta-Analyses (PRISMA) guidelines and search the following databases for relevant articles: CINAHL (Cumulative Index to Nursing and Allied Health Literature); Informit Health Collection; PsycINFO; PubMed/MEDLINE; Scopus/Embase; Web of Science; and the Wiley Online Library. Keywords include tobacco; smoking; smokeless; cigarettes; cigar and 'social network' and reference lists of included articles will be hand searched. Studies will be included that provide descriptions of social network analysis of tobacco use.Qualitative, quantitative and mixed method data that meets the inclusion criteria for the review, including methodological rigour, credibility and quality standards, will be synthesized using narrative synthesis. Results will be presented using outcome statistics that address each of the research questions. This systematic review will provide a timely evidence base on the role of social network analysis of tobacco use, forming a basis for future research, policy and practice in this area. This systematic review will synthesise the evidence, supporting the hypothesis that social network structures can influence tobacco use. This will also include exploring the relationship between social network structure, social network position, peer selection, peer influence and tobacco use across all age groups, and across different demographics. The research will increase our understanding of social networks and their impact on tobacco use, informing policy and practice while highlighting gaps in the literature and areas for further research.
Passos, Izabel Christina Friche; Vieira, Kelly; Moreira, Laura; Rodrigues, Flávia; Amorim, Margarete; Santos, Cláudia; Abreu, Ana; Gomes, Lucas; Mendes, Luciana; Lima, Isabella; Moura, Francisco; França, Cassandra; Ferraz, Cláudia
This paper presents and discusses the results of an intervention research conducted in Ouro Preto, Brazil from August 2014 to March 2016. The main objective was to contribute to the development of an intersectoral and interdisciplinary network to face psychosocial vulnerabilities of children and teenagers, especially those related to sexual violence and drug use. To achieve this, we identified the difficulties faced by the Sistema de Garantia de Direitos Humanos da Criança e do Adolescente (SGDHCA) implemented by the municipality, which takes care of this population. We also identified protective and promotional factors that empower them. The methodology combines Deleuze and Guattari's cartography, Institutional Analysis and Cross Training, the latter developed by a group of researchers at the Douglas Institute in Montreal, whom we met through scientific co-operation with our laboratory. Through the practical-theoretical and co-participative activities with the professional network of Ouro Preto, we produced a detailed diagnosis of the SGDHCA and a document proposing short, medium and long-term strategies. As a final result, we intend to help the local collective, the Forum Intersetorial da Infância e Juventude, to develop a work plan from the proposed actions. In this paper we concentrate on the potential of the methodology by presenting outcomes from two important moments of the research: the discussion of successful and unsuccessful cases that elucidate the network's operation, and the potentials and difficulties arising from Positional Rotation, an important technique of Cross Training.
Online location of a break in water distribution systems
NASA Astrophysics Data System (ADS)
Liang, Jianwen; Xiao, Di; Zhao, Xinhua; Zhang, Hongwei
2003-08-01
Breaks often occur to urban water distribution systems under severely cold weather, or due to corrosion of pipes, deformation of ground, etc., and the breaks cannot easily be located, especially immediately after the events. This paper develops a methodology to locate a break in a water distribution system by monitoring water pressure online at some nodes in the water distribution system. For the purpose of online monitoring, supervisory control and data acquisition (SCADA) technology can well be used. A neural network-based inverse analysis method is constructed for locating the break based on the variation of water pressure. The neural network is trained by using analytically simulated data from the water distribution system, and validated by using a set of data that have never been used in the training. It is found that the methodology provides a quick, effective, and practical way in which a break in a water distribution system can be located.
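A minimal sketch of the inverse-analysis idea: train a network that maps monitored node pressures to the most likely break location. The "hydraulic model" generating the training data below is a crude synthetic surrogate, not an actual water-distribution simulation, and the node counts and noise levels are invented.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
n_nodes, n_monitors, n_samples = 20, 5, 2000
monitors = rng.choice(n_nodes, n_monitors, replace=False)

# Surrogate sensitivity: how strongly a break at node j depresses pressure at node i
influence = np.exp(-rng.uniform(0.2, 2.0, (n_nodes, n_nodes)))

break_node = rng.integers(0, n_nodes, n_samples)
baseline = 50.0
drop = influence[break_node][:, monitors] * rng.uniform(5, 15, (n_samples, 1))
pressures = baseline - drop + 0.3 * rng.standard_normal((n_samples, n_monitors))

# Neural network trained on simulated break scenarios, then applied to held-out cases
clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=1000, random_state=0)
clf.fit(pressures[:1600], break_node[:1600])
print("held-out localization accuracy:",
      round(clf.score(pressures[1600:], break_node[1600:]), 3))
```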
DOE Office of Scientific and Technical Information (OSTI.GOV)
Young, S.C.
1993-08-01
This report discusses a field demonstration of a methodology for characterizing an aquifer's geohydrology in the detail required to design an optimum network of wells and/or infiltration galleries for bioreclamation systems. The project work was conducted on a 1-hectare test site at Columbus AFB, Mississippi. The technical report is divided into two volumes. Volume I describes the test site and the well network, the assumptions, and the application of equations that define groundwater flow to a well, the results of three large-scale aquifer tests, and the results of 160 single-pump tests. Volume II describes the borehole flowmeter tests, the tracer tests, the geological investigations, the geostatistical analysis and the guidelines for using groundwater models to design bioreclamation systems. Keywords: site characterization, hydraulic conductivity, groundwater flow, geostatistics, geohydrology, monitoring wells.
NASA Astrophysics Data System (ADS)
Hupe, Patrick; Ceranna, Lars; Pilger, Christoph
2018-04-01
The International Monitoring System (IMS) has been established to monitor compliance with the Comprehensive Nuclear-Test-Ban Treaty and comprises four technologies, one of which is infrasound. When fully established, the IMS infrasound network consists of 60 sites uniformly distributed around the globe. Besides its primary purpose of determining explosions in the atmosphere, the recorded data reveal information on other anthropogenic and natural infrasound sources. Furthermore, the almost continuous multi-year recordings of differential and absolute air pressure allow for analysing the atmospheric conditions. In this paper, spectral analysis tools are applied to derive atmospheric dynamics from barometric time series. Based on the solar atmospheric tides, a methodology for performing geographic and temporal variability analyses is presented, which is supposed to serve for upcoming studies related to atmospheric dynamics. The surplus value of using the IMS infrasound network data for such purposes is demonstrated by comparing the findings on the thermal tides with previous studies and the Modern-Era Retrospective analysis for Research and Applications Version 2 (MERRA-2), which represents the solar tides well in its surface pressure fields. Absolute air pressure recordings reveal geographical characteristics of atmospheric tides related to the solar day and even to the lunar day. We therefore claim the chosen methodology of using the IMS infrasound network to be applicable for global and temporal studies on specific atmospheric dynamics. Given the accuracy and high temporal resolution of the barometric data from the IMS infrasound network, interactions with gravity waves and planetary waves can be examined in future for refining the knowledge of atmospheric dynamics, e.g. the origin of tidal harmonics up to 9 cycles per day as found in the barometric data sets. Data assimilation in empirical models of solar tides would be a valuable application of the IMS infrasound data.
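The spectral step can be illustrated by extracting solar tidal harmonics (S1, S2, S3) from a barometric time series with an FFT; a synthetic hourly pressure record stands in for the IMS absolute-pressure data, and the tidal amplitudes used to build it are invented.

```python
import numpy as np

fs_per_day = 24                                    # hourly sampling
days = 60
t = np.arange(days * fs_per_day) / fs_per_day      # time in days

# Synthetic surface pressure: diurnal + semidiurnal tides plus noise (hPa)
p = (1013
     + 0.6 * np.cos(2 * np.pi * 1 * t)
     + 1.1 * np.cos(2 * np.pi * 2 * t + 0.5)
     + 0.8 * np.random.default_rng(0).standard_normal(t.size))

spec = np.fft.rfft(p - p.mean())
freqs = np.fft.rfftfreq(t.size, d=1.0 / fs_per_day)  # cycles per day
amps = 2 * np.abs(spec) / t.size                     # cosine amplitude per harmonic

for k in (1, 2, 3):                                  # harmonics at 1, 2, 3 cycles/day
    idx = int(np.argmin(np.abs(freqs - k)))
    print(f"S{k} amplitude ~ {amps[idx]:.2f} hPa")
```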
Somvanshi, Pramod Rajaram; Venkatesh, K V
2014-03-01
Human physiology is an ensemble of various biological processes spanning from intracellular molecular interactions to the whole body phenotypic response. Systems biology endeavours to decipher these multi-scale biological networks and bridge the link between genotype and phenotype. The structure and dynamic properties of these networks are responsible for controlling and deciding the phenotypic state of a cell. Several cells and various tissues coordinate together to generate an organ level response which further regulates the ultimate physiological state. The overall network embeds a hierarchical regulatory structure, which when unusually perturbed can lead to an undesirable physiological state termed disease. Here, we treat a disease diagnosis problem as analogous to a fault diagnosis problem in engineering systems. Accordingly we review the application of engineering methodologies to address human diseases from a systems biological perspective. The review highlights potential networks and modeling approaches used for analyzing human diseases. The application of such analysis is illustrated in the case of cancer and diabetes. We put forth a concept of cell-to-human framework comprising five modules (data mining, networking, modeling, experimental and validation) for addressing human physiology and diseases based on a paradigm of system level analysis. The review overtly emphasizes the importance of multi-scale biological networks and subsequent modeling and analysis for drug target identification and designing efficient therapies.
Decoding the Regulatory Network for Blood Development from Single-Cell Gene Expression Measurements
Haghverdi, Laleh; Lilly, Andrew J.; Tanaka, Yosuke; Wilkinson, Adam C.; Buettner, Florian; Macaulay, Iain C.; Jawaid, Wajid; Diamanti, Evangelia; Nishikawa, Shin-Ichi; Piterman, Nir; Kouskoff, Valerie; Theis, Fabian J.; Fisher, Jasmin; Göttgens, Berthold
2015-01-01
Here we report the use of diffusion maps and network synthesis from state transition graphs to better understand developmental pathways from single cell gene expression profiling. We map the progression of mesoderm towards blood in the mouse by single-cell expression analysis of 3,934 cells, capturing cells with blood-forming potential at four sequential developmental stages. By adapting the diffusion plot methodology for dimensionality reduction to single-cell data, we reconstruct the developmental journey to blood at single-cell resolution. Using transitions between individual cellular states as input, we develop a single-cell network synthesis toolkit to generate a computationally executable transcriptional regulatory network model that recapitulates blood development. Model predictions were validated by showing that Sox7 inhibits primitive erythropoiesis, and that Sox and Hox factors control early expression of Erg. We therefore demonstrate that single-cell analysis of a developing organ coupled with computational approaches can reveal the transcriptional programs that control organogenesis. PMID:25664528
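A bare-bones diffusion-map sketch (Gaussian kernel, row-normalized Markov matrix, spectral embedding) of the kind adapted above for single-cell data; the real analysis involves additional normalization and the state-transition-graph and network-synthesis steps, which are not reproduced here, and the "cells" below are synthetic.

```python
import numpy as np

def diffusion_map(X, sigma=1.0, n_components=2):
    """Embed rows of X with the leading non-trivial eigenvectors of a diffusion operator."""
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-d2 / (2 * sigma ** 2))                   # Gaussian kernel
    P = K / K.sum(axis=1, keepdims=True)                 # row-normalized Markov matrix
    evals, evecs = np.linalg.eig(P)
    order = np.argsort(-evals.real)[1:n_components + 1]  # skip the trivial eigenvector
    return evecs[:, order].real * evals[order].real

rng = np.random.default_rng(0)
# Synthetic "cells" along a developmental trajectory in a 3-gene expression space
t = np.sort(rng.uniform(0, 1, 100))
X = np.column_stack([t, t ** 2, np.sin(3 * t)]) + 0.02 * rng.standard_normal((100, 3))
coords = diffusion_map(X, sigma=0.3)
print(coords.shape)                                      # (100, 2) diffusion embedding
```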
1992-12-21
Report fragment; recoverable contents: trace-based protocol analysis; summary of important data features; tools related to process model testing and model building; requirements for testing process models using trace-based protocol analysis; definition of trace-based protocol analysis (TBPA).
Clustered Numerical Data Analysis Using Markov Lie Monoid Based Networks
NASA Astrophysics Data System (ADS)
Johnson, Joseph
2016-03-01
We have designed and built an optimal numerical standardization algorithm that links numerical values with their associated units, error level, and defining metadata, thus supporting automated data exchange and new levels of artificial intelligence (AI). The software manages all dimensional and error analysis and computational tracing. Tables of entities versus properties of these generalized numbers (called ``metanumbers'') support a transformation of each table into a network among the entities and another network among their properties, where the network connection matrix is based upon a proximity metric between the two items. We previously proved that every network is isomorphic to the Lie algebra that generates continuous Markov transformations. We have also shown that the eigenvectors of these Markov matrices provide an agnostic clustering of the underlying patterns. We will present this methodology and show how our new work on conversion of scientific numerical data through this process can reveal underlying information clusters ordered by the eigenvalues. We will also show how the linking of clusters from different tables can be used to form a ``supernet'' of all numerical information supporting new initiatives in AI.
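The abstract above links networks to generators of continuous Markov transformations and uses their eigenvectors for clustering. Below is a minimal sketch of one common way to realise that idea: off-diagonal entries of the generator are set to connection strengths and diagonal entries are chosen so each column sums to zero, after which the eigenvectors partition the entities. The toy matrix and the two-way sign split are illustrative assumptions, not the author's metanumber software.

```python
import numpy as np

# Symmetric, non-negative connection matrix for a small illustrative network.
C = np.array([[0.0, 3.0, 2.0, 0.1],
              [3.0, 0.0, 2.5, 0.2],
              [2.0, 2.5, 0.0, 0.1],
              [0.1, 0.2, 0.1, 0.0]])

# Markov generator: off-diagonal = connection strengths, diagonal chosen so that
# each column sums to zero (probability is conserved under expm(t * L)).
L = C.copy()
np.fill_diagonal(L, -C.sum(axis=0))

# Eigen-decomposition of the generator; the eigenvector for the second-largest
# eigenvalue gives a parameter-free ("agnostic") grouping of the entities.
evals, evecs = np.linalg.eigh(L)
order = np.argsort(evals)
splitting_vector = evecs[:, order[-2]]
clusters = splitting_vector > 0          # crude two-way split of the entities
print(clusters)
```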
Sanz-García, Ancor; Vega-Zelaya, Lorena; Pastor, Jesús; Torres, Cristina V.; Sola, Rafael G.; Ortega, Guillermo J.
2016-01-01
Approximately 30% of epilepsy patients are refractory to antiepileptic drugs. In these cases, surgery is the only alternative to eliminate/control seizures. However, a significant minority of patients continues to exhibit post-operative seizures, even in those cases in which the suspected source of seizures has been correctly localized and resected. The protocol presented here combines a clinical procedure routinely employed during the pre-operative evaluation of temporal lobe epilepsy (TLE) patients with a novel technique for network analysis. The method allows for the evaluation of the temporal evolution of mesial network parameters. The bilateral insertion of foramen ovale electrodes (FOE) into the ambient cistern simultaneously records electrocortical activity at several mesial areas in the temporal lobe. Furthermore, network methodology applied to the recorded time series tracks the temporal evolution of the mesial networks both interictally and during the seizures. In this way, the presented protocol offers a unique way to visualize and quantify measures that consider the relationships between several mesial areas instead of a single area. PMID:28060326
Structure and evolution of a European Parliament via a network and correlation analysis
NASA Astrophysics Data System (ADS)
Puccio, Elena; Pajala, Antti; Piilo, Jyrki; Tumminello, Michele
2016-11-01
We present a study of the network of relationships among elected members of the Finnish parliament, based on a quantitative analysis of initiative co-signatures, and its evolution over 16 years. To understand the structure of the parliament, we constructed a statistically validated network of members, based on the similarity between the patterns of initiatives they signed. We looked for communities within the network and characterized them in terms of members' attributes, such as electoral district and party. To gain insight on the nested structure of communities, we constructed a hierarchical tree of members from the correlation matrix. Afterwards, we studied parliament dynamics yearly, with a focus on correlations within and between parties, by also distinguishing between government and opposition. Finally, we investigated the role played by specific individuals at a local level, in particular whether they act as proponents who gather consensus or as signers. Our results provide a quantitative background to current theories in political science. From a methodological point of view, our network approach has proven able to highlight both local and global features of a complex social system.
Tsiouris, Κostas Μ; Pezoulas, Vasileios C; Zervakis, Michalis; Konitsiotis, Spiros; Koutsouris, Dimitrios D; Fotiadis, Dimitrios I
2018-05-17
The electroencephalogram (EEG) is the most prominent means to study epilepsy and capture changes in electrical brain activity that could declare an imminent seizure. In this work, Long Short-Term Memory (LSTM) networks are introduced in epileptic seizure prediction using EEG signals, expanding the use of deep learning algorithms with convolutional neural networks (CNN). A pre-analysis is initially performed to find the optimal architecture of the LSTM network by testing several modules and layers of memory units. Based on these results, a two-layer LSTM network is selected to evaluate seizure prediction performance using four different lengths of preictal windows, ranging from 15 min to 2 h. The LSTM model exploits a wide range of features extracted prior to classification, including time and frequency domain features, cross-correlation between EEG channels, and graph-theoretic features. The evaluation is performed using long-term EEG recordings from the open CHB-MIT Scalp EEG database; the results suggest that the proposed methodology is able to predict all 185 seizures, providing high rates of seizure prediction sensitivity and low false prediction rates (FPR) of 0.11-0.02 false alarms per hour, depending on the duration of the preictal window. The proposed LSTM-based methodology delivers a significant increase in seizure prediction performance compared to both traditional machine learning techniques and convolutional neural networks that have been previously evaluated in the literature. Copyright © 2018 Elsevier Ltd. All rights reserved.
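The abstract above describes a two-layer LSTM classifier over feature sequences extracted from preictal and interictal EEG windows. The sketch below shows the general shape of such a model in TensorFlow/Keras on placeholder data; the layer sizes, window length, feature count, and training settings are illustrative assumptions, not the parameters reported in the study.

```python
import numpy as np
import tensorflow as tf

# Placeholder shapes: 1000 windows, 30 time steps per window, 40 features per step
# (e.g., spectral, cross-correlation and graph-theoretic features per EEG segment).
n_windows, n_steps, n_features = 1000, 30, 40
X = np.random.randn(n_windows, n_steps, n_features).astype("float32")
y = np.random.randint(0, 2, size=n_windows)   # 1 = preictal, 0 = interictal

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(n_steps, n_features)),
    tf.keras.layers.LSTM(64, return_sequences=True),
    tf.keras.layers.LSTM(32),
    tf.keras.layers.Dense(1, activation="sigmoid"),   # probability of preictal state
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=2, batch_size=32, validation_split=0.2, verbose=0)
```

With real data, the predicted probability would be thresholded (and typically smoothed over consecutive windows) to raise a seizure alarm, and sensitivity/FPR would be computed against the annotated seizure onsets.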
Bando, Silvia Yumi; Silva, Filipi Nascimento; Costa, Luciano da Fontoura; Silva, Alexandre V.; Pimentel-Silva, Luciana R.; Castro, Luiz HM.; Wen, Hung-Tzu; Amaro, Edson; Moreira-Filho, Carlos Alberto
2013-01-01
We previously described, based on transcriptional signatures of hippocampal CA3 explants, that febrile (FS) and afebrile (NFS) forms of refractory mesial temporal lobe epilepsy constitute two distinct genomic phenotypes. That network analysis was based on a limited number (hundreds) of differentially expressed genes (DE networks) among a large set of valid transcripts (close to twenty thousand). Here we developed a methodology for complex network visualization (3D) and analysis that allows the categorization of network nodes according to distinct hierarchical levels of gene-gene connections (node degree) and of interconnection between node neighbors (concentric node degree). Hubs are highly connected nodes, VIPs have low node degree but connect only with hubs, and high-hubs have VIP status and a high overall number of connections. Studying the whole set of CA3 valid transcripts we: i) obtained complete transcriptional networks (CO) for FS and NFS phenotypic groups; ii) examined how CO and DE networks are related; iii) characterized genomic and molecular mechanisms underlying FS and NFS phenotypes, identifying potential novel targets for therapeutic interventions. We found that: i) DE hubs and VIPs are evenly distributed inside the CO networks; ii) most DE hubs and VIPs are related to synaptic transmission and neuronal excitability whereas most CO hubs, VIPs and high-hubs are related to neuronal differentiation, homeostasis and neuroprotection, indicating compensatory mechanisms. Complex network visualization and analysis is a useful tool for systems biology approaches to multifactorial diseases. The network centrality observed for hubs, VIPs and high-hubs of CO networks is consistent with the network disease model, where a group of nodes whose perturbation leads to a disease phenotype occupies a central position in the network. Conceivably, the chance of exerting therapeutic effects through the modulation of particular genes will be higher if these genes are highly interconnected in transcriptional networks. PMID:24278214
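The abstract above categorises nodes as hubs, VIPs, and high-hubs from their degree and their neighbours' status. Below is a minimal sketch of one way to operationalise those definitions with networkx on a toy graph; the degree cutoff (top 10% of degrees) and the interpretation of "high-hub" as a hub whose neighbours are all hubs are illustrative assumptions, not the authors' exact criteria.

```python
import networkx as nx

# Toy co-expression graph; in practice edges come from gene-gene correlation thresholds.
G = nx.erdos_renyi_graph(200, 0.05, seed=1)

degrees = dict(G.degree())
hub_cutoff = sorted(degrees.values())[int(0.9 * len(degrees))]  # top ~10% by degree (assumed)
hubs = {n for n, d in degrees.items() if d >= hub_cutoff}

# VIPs: low-degree nodes whose neighbours are all hubs.
vips = {n for n, d in degrees.items()
        if 0 < d < hub_cutoff and all(nb in hubs for nb in G.neighbors(n))}

# High-hubs (assumed reading): hubs that also connect only with other hubs.
high_hubs = {n for n in hubs if all(nb in hubs for nb in G.neighbors(n))}

print(len(hubs), len(vips), len(high_hubs))
```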
NASA Astrophysics Data System (ADS)
Nakagawa, M.; Akano, K.; Kobayashi, T.; Sekiguchi, Y.
2017-09-01
Image-based virtual reality (VR) is a virtual space generated with panoramic images projected onto a primitive model. In image-based VR, realistic VR scenes can be generated with lower rendering cost, and network data can be described as relationships among VR scenes. The camera network data are generated manually or by an automated procedure using camera position and rotation data. When panoramic images are acquired in indoor environments, network data should be generated without Global Navigation Satellite Systems (GNSS) positioning data. Thus, we focused on image-based VR generation using a panoramic camera in indoor environments. We propose a methodology to automate network data generation using panoramic images for an image-based VR space. We verified and evaluated our methodology through five experiments in indoor environments, including a corridor, elevator hall, room, and stairs. We confirmed that our methodology can automatically reconstruct network data using panoramic images for image-based VR in indoor environments without GNSS position data.
Paule‐Mandel estimators for network meta‐analysis with random inconsistency effects
Veroniki, Areti Angeliki; Law, Martin; Tricco, Andrea C.; Baker, Rose
2017-01-01
Network meta-analysis is used to simultaneously compare multiple treatments in a single analysis. However, network meta-analyses may exhibit inconsistency, where direct and different forms of indirect evidence are not in agreement with each other, even after allowing for between-study heterogeneity. Models for network meta-analysis with random inconsistency effects have the dual aim of allowing for inconsistencies and estimating average treatment effects across the whole network. To date, two classical estimation methods for fitting this type of model have been developed: a method of moments that extends DerSimonian and Laird's univariate method and maximum likelihood estimation. However, the Paule and Mandel estimator is another recommended classical estimation method for univariate meta-analysis. In this paper, we extend the Paule and Mandel method so that it can be used to fit models for network meta-analysis with random inconsistency effects. We apply all three estimation methods to a variety of examples that have been used previously and we also examine a challenging new dataset that is highly heterogeneous. We perform a simulation study based on this new example. We find that the proposed Paule and Mandel method performs satisfactorily and generally better than the previously proposed method of moments because it provides more accurate inferences. Furthermore, the Paule and Mandel method possesses some advantages over likelihood-based methods because it is both semiparametric and requires no convergence diagnostics. Although restricted maximum likelihood estimation remains the gold standard, the proposed methodology is a fully viable alternative to this and other estimation methods. PMID:28585257
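The abstract above extends the Paule and Mandel estimator to network meta-analysis. For orientation, here is a minimal sketch of the classical univariate Paule and Mandel estimator that the extension builds on: the between-study variance tau^2 is adjusted until the generalised Q statistic equals its degrees of freedom. The bisection bracket and the made-up effect sizes and variances are illustrative assumptions.

```python
import numpy as np

def paule_mandel(y, v, tol=1e-10, max_iter=1000):
    """Univariate Paule-Mandel estimate of the between-study variance tau^2.

    y : study effect estimates, v : within-study variances.
    Solves Q(tau^2) = sum w_i (y_i - mu_hat)^2 = k - 1, with w_i = 1 / (v_i + tau^2).
    """
    y, v = np.asarray(y, float), np.asarray(v, float)
    k = len(y)

    def q(tau2):
        w = 1.0 / (v + tau2)
        mu = np.sum(w * y) / np.sum(w)
        return np.sum(w * (y - mu) ** 2)

    if q(0.0) <= k - 1:                       # no heterogeneity beyond sampling error
        return 0.0
    lo, hi = 0.0, 10.0 * max(np.var(y), np.max(v))   # assumed bisection bracket
    for _ in range(max_iter):
        mid = 0.5 * (lo + hi)
        if q(mid) > k - 1:
            lo = mid
        else:
            hi = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)

# Hypothetical study effects (e.g., log odds ratios) and within-study variances.
tau2 = paule_mandel([0.2, 0.5, -0.1, 0.8, 0.3], [0.04, 0.09, 0.05, 0.16, 0.06])
print(round(tau2, 4))
```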
NASA Technical Reports Server (NTRS)
Madrid, G. A.; Westmoreland, P. T.
1983-01-01
A progress report is presented on a program to upgrade the existing NASA Deep Space Network in terms of a redesigned computer-controlled data acquisition system for channelling tracking, telemetry, and command data between a California-based control center and three signal processing centers in Australia, California, and Spain. The methodology for the improvements is oriented towards single-subsystem development with consideration for a multi-system and multi-subsystem network of operational software. Details of the existing hardware configurations and data transmission links are provided. The program methodology includes data flow design, interface design and coordination, incremental capability availability, increased inter-subsystem developmental synthesis and testing, system and network level synthesis and testing, and system verification and validation. The software is thus far about 65 percent complete, and the methodology being used to effect the changes, which will permit enhanced tracking of and communication with spacecraft, has proven to feature effective techniques.
Visual behavior characterization for intrusion and misuse detection
NASA Astrophysics Data System (ADS)
Erbacher, Robert F.; Frincke, Deborah
2001-05-01
As computer and network intrusions become more and more of a concern, the need for better capabilities to assist in the detection and analysis of intrusions also increases. System administrators typically rely on log files to analyze usage and detect misuse. However, as a consequence of the amount of data collected by each machine, multiplied by the tens or hundreds of machines under the system administrator's auspices, the entirety of the data available is neither collected nor analyzed. This is compounded by the need to analyze network traffic data as well. We propose a methodology for visually analyzing network and computer log information based on the behavior of the users. Each user's behavior is the key to determining their intent and overriding activity, whether they attempt to hide their actions or not. Proficient hackers will attempt to hide their ultimate activities, which hinders the reliability of log file analysis. Visually analyzing the users' behavior, however, is much more adaptable and difficult to counteract.
NASA Technical Reports Server (NTRS)
Blakeslee, Rich; Bailey, Jeff; Koshak, Bill
1999-01-01
A four station Advanced Lightning Direction Finder (ALDF) network was recently established in the state of Rondonia in western Brazil through a collaboration of U.S. and Brazilian participants from NASA, INPE, INMET, and various universities. The network utilizes ALDF IMPACT (Improved Accuracy from Combined Technology) sensors to provide cloud-to-ground lightning observations (i.e., stroke/flash locations, signal amplitude, and polarity) using both time-of-arrival and magnetic direction finding techniques. The observations are collected, processed and archived at a central site in Brasilia and at the NASA/ Marshall Space Flight Center (MSFC) in Huntsville, Alabama. Initial, non-quality assured quick-look results are made available in near real-time over the internet. The network will remain deployed for several years to provide ground truth data for the Lightning Imaging Sensor (LIS) on the Tropical Rainfall Measuring Mission (TRMM) satellite which was launched in November 1997. The measurements will also be used to investigate the relationship between the electrical, microphysical and kinematic properties of tropical convection. In addition, the long-term observations from this network will contribute in establishing a regional lightning climatological data base, supplementing other data bases in Brazil that already exist or may soon be implemented. Analytic inversion algorithms developed at NASA/Marshall Space Flight Center (MSFC) are now being applied to the Rondonian ALDF lightning observations to obtain site error corrections and improved location retrievals. The processing methodology and the initial results from an analysis of the first 6 months of network operations will be presented.
Node-making process in network meta-analysis of nonpharmacological treatment are poorly reported.
James, Arthur; Yavchitz, Amélie; Ravaud, Philippe; Boutron, Isabelle
2018-05-01
To identify methods to support the node-making process in network meta-analyses (NMAs) of nonpharmacological treatments. We proceeded in two stages. First, we conducted a literature review of guidelines and methodological articles about NMAs to identify methods proposed to lump interventions into nodes. Second, we conducted a systematic review of NMAs of nonpharmacological treatments to extract methods used by authors to support their node-making process. MEDLINE and Google Scholar were searched to identify articles assessing NMA guidelines or methodology intended for NMA authors. MEDLINE, CENTRAL, and EMBASE were searched to identify reports of NMAs including at least one nonpharmacological treatment. Both searches involved articles available from database inception to March 2016. From the methodological review, we identified and extracted methods proposed to lump interventions into nodes. From the systematic review, the reporting of the network was assessed as long as the method described supported the node-making process. Among the 116 articles retrieved in the literature review, 12 (10%) discussed the concept of lumping or splitting interventions in NMAs. No consensual method was identified during the methodological review, and expert consensus was the only method proposed to support the node-making process. Among 5187 references for the systematic review, we included 110 reports of NMAs published between 2007 and 2016. The nodes were described in the introduction section of 88 reports (80%), which suggested that the node content might have been decided a priori, before the systematic review. Nine reports (8.1%) described a specific process or justification to build nodes for the network. Two methods were identified: (1) fit a previously published classification and (2) expert consensus. Despite the importance of NMA in the delivery of evidence when several interventions are available for a single indication, recommendations on the reporting of the node-making process in NMAs are lacking, and its reporting seems insufficient. Copyright © 2017 Elsevier Inc. All rights reserved.
Collagen morphology and texture analysis: from statistics to classification
Mostaço-Guidolin, Leila B.; Ko, Alex C.-T.; Wang, Fei; Xiang, Bo; Hewko, Mark; Tian, Ganghong; Major, Arkady; Shiomi, Masashi; Sowa, Michael G.
2013-01-01
In this study we present an image analysis methodology capable of quantifying morphological changes in tissue collagen fibril organization caused by pathological conditions. Texture analysis based on first-order statistics (FOS) and second-order statistics such as gray level co-occurrence matrix (GLCM) was explored to extract second-harmonic generation (SHG) image features that are associated with the structural and biochemical changes of tissue collagen networks. Based on these extracted quantitative parameters, multi-group classification of SHG images was performed. With combined FOS and GLCM texture values, we achieved reliable classification of SHG collagen images acquired from atherosclerosis arteries with >90% accuracy, sensitivity and specificity. The proposed methodology can be applied to a wide range of conditions involving collagen re-modeling, such as in skin disorders, different types of fibrosis and muscular-skeletal diseases affecting ligaments and cartilage. PMID:23846580
NASA Astrophysics Data System (ADS)
Tsiotas, Dimitrios; Polyzos, Serafeim
2018-02-01
This article studies the topological consistency of spatial networks under node aggregation, examining the changes captured between different network representations that result from grouping nodes and that refer to the same socioeconomic system. The main purpose of this study is to evaluate what kind of topological information remains unaltered under node aggregation and, further, to develop a framework for linking the data of an empirical network with data of its socioeconomic environment, when the latter are available for hierarchically higher levels of aggregation, in an effort to promote interdisciplinary research in the field of complex network analysis. The research question is empirically tested on topological and socioeconomic data extracted from the Greek Maritime Network (GMN), which is modeled as a non-directed multilayer (bilayer) graph consisting of a port-layer, where nodes represent ports, and a prefecture-layer, where nodes represent coastal and insular prefectural groups of ports. The analysis highlights that the connectivity (degree) of the GMN is the most consistent aspect of this multilayer network, preserving both the topological and the socioeconomic information through node aggregation. In terms of spatial analysis and regional science, such effects illustrate the effectiveness of the prefectural administrative division for the functionality of the Greek maritime transportation system. Overall, this approach proposes a methodological framework for studying the grouping effects induced on network topology, providing physical, technical, socioeconomic, strategic or political insights.
NASA Astrophysics Data System (ADS)
Pages, Lucien; Bertel, Evelyne; Joffre, Henri; Sklavenitis, Laodamas
2012-12-01
Even though the United States lacks a national climate policy, significant action has occurred at the local and regional levels. Some of the most aggressive climate change policies have occurred at the state and local levels and in interagency cooperation on specific management issues. While there is a long history of partnerships in dealing with a wide variety of policy issues, the uncertainty and the political debate surrounding climate change have generated new challenges to establishing effective policy networks. This paper investigates the formation of climate policy networks in the State of Nevada. It presents a methodology based on social network analysis for assessing the structure and function of local policy networks across a range of substantive climate-impacted resources (water, landscape management, conservation, forestry and others). It draws from an emerging literature on federalism and climate policy, public sector innovation, and institutional analysis in socio-ecological systems. Comparisons across different policy issue networks in the state are used to highlight the influence of network structure, connectivity, bridging across vertical and horizontal organizational units, organizational diversity, and flows between organizational nodes.
[Social support network and health of elderly individuals with chronic pneumopathies].
Mesquita, Rafael Barreto de; Morano, Maria Tereza Aguiar Pessoa; Landim, Fátima Luna Pinheiro; Collares, Patrícia Moreira Costa; Pinto, Juliana Maria de Sousa
2012-05-01
This study sought to analyze characteristics of the social support network of elderly individuals with chronic pneumopathies, establishing links with health maintenance/rehabilitation. The assumptions of Social Network Analysis (SNA) methodology were used, addressing the social support concept. A questionnaire and semi-structured interviews, both applied to 16 elderly people treated at a public hospital in Fortaleza-CE, were used for data collection. Quantitative data were processed using the UCINET 6.123, NetDraw 2.38 and Microsoft Excel software programs. In the qualitative analysis, the body of material was interpreted based on relevant and current theoretical references. Each informant brought an average of 10.37 individuals into the network. Among the 3 types of social support, there was a predominance of informational support given by health professionals. The importance of reciprocity in providing/receiving social support was also noted, as well as the participation of health professionals and the family functioning as social support. The conclusion reached was that the network of the elderly with pneumopathies is not cohesive, being restricted to the personal network of each individual, and that even so, the informants recognize and are satisfied with the social support it provides.
NASA Astrophysics Data System (ADS)
Sourabh, Nishant; Timbadiya, P. V.
2018-04-01
The hydraulic simulation of an existing sewerage network provides information about critical points, helping to assess deteriorating conditions and supporting the rehabilitation of the existing network and its future expansion. In the present study, hydraulic and condition assessment of the existing network of an educational institute (Sardar Vallabhbhai National Institute of Technology-Surat, Gujarat, India), having an area of 100 ha and ground levels in the range of 5.0-9.0 m above mean sea level, has been carried out by simulating sewage flows for existing and future scenarios using SewerGEMS v8i. The paper describes the features of the 4.79 km long sewerage network of the institute, followed by network model simulation for the aforesaid scenarios and recommendations on improvement of the existing network for future use. The total sewer loads for the present and future scenarios are 1.67 million litres per day (MLD) and 3.62 MLD, considering a peak factor of 3 on the basis of population. The hydraulic simulation of the existing scenario indicated a depth-to-diameter (d/D) ratio in the range of 0.02-0.48 and velocities in the range of 0.08-0.53 m/s for the existing network under the present scenario. For the future scenario, the existing network needs to be modified: a total of 11 conduits (length: 464.8 m) should be replaced with the next higher available diameter, i.e., 350 mm, for utilization of the existing network under the future scenario. The present study provides a methodology for condition assessment of an existing network and its utilization as per the guidelines provided by the Central Public Health and Environmental Engineering Organization, 2013. The methodology presented in this paper can be used by municipal/public health engineers for the assessment of an existing sewerage network for its serviceability and improvement in future.
Multidimensional stock network analysis: An Escoufier's RV coefficient approach
NASA Astrophysics Data System (ADS)
Lee, Gan Siew; Djauhari, Maman A.
2013-09-01
The current practice of stock network analysis is based on the assumption that the time series of closing prices can represent the behaviour of each stock. This assumption leads to the minimal spanning tree (MST) and the sub-dominant ultrametric (SDU) being considered indispensable tools to filter the economic information contained in the network. Recently, researchers have attempted to represent a stock not only as a univariate time series of closing prices but as a bivariate time series of closing price and volume. In this case, they developed the so-called multidimensional MST to filter the important economic information. However, in this paper we show that their approach is applicable to that bivariate time series only. This leads us to introduce a new methodology to construct the MST where each stock is represented by a multivariate time series. An example from the Malaysian stock exchange is presented and discussed to illustrate the advantages of the method.
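The title of the entry above names Escoufier's RV coefficient as the similarity measure between stocks represented as multivariate time series. Below is a minimal sketch of that general idea: compute pairwise RV coefficients, convert them to distances, and filter the resulting network with a minimum spanning tree. The distance transform d = sqrt(2(1 - RV)) and the synthetic data are illustrative assumptions, not necessarily the authors' exact construction.

```python
import numpy as np
import networkx as nx

def rv_coefficient(X, Y):
    """Escoufier's RV coefficient between two column-centred data matrices
    sharing the same number of rows (time points)."""
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    Sxy = Xc.T @ Yc
    Sxx = Xc.T @ Xc
    Syy = Yc.T @ Yc
    return np.trace(Sxy @ Sxy.T) / np.sqrt(np.trace(Sxx @ Sxx) * np.trace(Syy @ Syy))

rng = np.random.default_rng(0)
# 10 hypothetical stocks, each a (time x 2) matrix, e.g. price and volume returns.
stocks = [rng.normal(size=(250, 2)) for _ in range(10)]

G = nx.Graph()
for i in range(len(stocks)):
    for j in range(i + 1, len(stocks)):
        rv = rv_coefficient(stocks[i], stocks[j])
        G.add_edge(i, j, weight=np.sqrt(2 * (1 - rv)))  # assumed distance transform

mst = nx.minimum_spanning_tree(G, weight="weight")
print(sorted(mst.edges()))
```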
Cognitive Task Analysis of Network Analysts and Managers for Network Situational Awareness
DOE Office of Scientific and Technical Information (OSTI.GOV)
Erbacher, Robert; Frincke, Deborah A.; Wong, Pak C.
The goal of the project was to create a set of next generation cyber situational awareness capabilities with applications to other domains in the long term. The goal is to improve the decision making process such that decision makers can choose better actions. To this end, we put extensive effort into ensuring we had feedback from network analysts and managers and understood what their needs truly were. Consequently, this is the focus of this portion of the research. This paper discusses the methodology we followed to acquire this feedback from the analysts, namely a cognitive task analysis. Additionally, this paper provides the details we acquired from the analysts. This essentially provides details on their processes, goals, concerns, the data and meta-data they analyze, etc. A final result we describe is the generation of a task-flow diagram.
Development blocks in innovation networks: The Swedish manufacturing industry, 1970-2007.
Taalbi, Josef
2017-01-01
The notion of development blocks (Dahmén, 1950, 1991) suggests the co-evolution of technologies and industries through complementarities and the overcoming of imbalances. This study proposes and applies a methodology to analyse development blocks empirically. To assess the extent and character of innovational interdependencies between industries the study combines analysis of innovation biographies and statistical network analysis. This is made possible by using data from a newly constructed innovation output database for Sweden. The study finds ten communities of closely related industries in which innovation activity has been prompted by the emergence of technological imbalances or by the exploitation of new technological opportunities. The communities found in the Swedish network of innovation are shown to be stable over time and often characterized by strong user-supplier interdependencies. These findings serve to stress how historical imbalances and opportunities are key to understanding the dynamics of the long-run development of industries and new technologies.
Bae, Youngoh; Yoo, Byeong Wook; Lee, Jung Chan; Kim, Hee Chan
2017-05-01
Detection and diagnosis based on feature extraction and classification of electroencephalography (EEG) signals are being studied vigorously. A network analysis of time-series EEG signal data is one of many techniques that could help study brain functions. In this study, we analyze EEG to diagnose alcoholism. We propose a novel methodology to estimate differences in brain status between normal subjects and alcoholics by computing parameters of effective connectivity networks derived using Granger causality. Among many parameters, only ten were chosen as final candidates. By combining these ten graph-based parameters, our results demonstrate predictable differences between alcoholics and normal subjects. The best-performing support vector machine classifier achieved 90% accuracy, with a sensitivity of 95.3% and a specificity of 82.4%, for differentiating between the two groups.
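The abstract above builds an effective network from pairwise Granger causality between EEG channels, extracts graph-based parameters, and feeds them to a support vector machine. The sketch below illustrates that pipeline on placeholder data with a minimal least-squares Granger F-test; the lag order, significance threshold, and choice of graph features are illustrative assumptions and not the study's exact parameters.

```python
import numpy as np
import networkx as nx
from scipy import stats

def granger_p(y, x, lags=2):
    """p-value of an F-test for 'x Granger-causes y' (minimal least-squares version)."""
    n = len(y)
    Y = y[lags:]
    past_y = [y[lags - k: n - k] for k in range(1, lags + 1)]
    past_x = [x[lags - k: n - k] for k in range(1, lags + 1)]
    Xr = np.column_stack([np.ones(n - lags)] + past_y)            # restricted model
    Xf = np.column_stack([np.ones(n - lags)] + past_y + past_x)   # full model
    rss = lambda M: np.sum((Y - M @ np.linalg.lstsq(M, Y, rcond=None)[0]) ** 2)
    df_num, df_den = lags, (n - lags) - Xf.shape[1]
    F = ((rss(Xr) - rss(Xf)) / df_num) / (rss(Xf) / df_den)
    return stats.f.sf(F, df_num, df_den)

rng = np.random.default_rng(0)
n_channels, n_samples = 6, 500
eeg = rng.normal(size=(n_channels, n_samples))        # placeholder EEG segment

# Directed "effective" network: edge i -> j if channel i Granger-causes channel j.
G = nx.DiGraph()
G.add_nodes_from(range(n_channels))
for i in range(n_channels):
    for j in range(n_channels):
        if i != j and granger_p(eeg[j], eeg[i]) < 0.05:
            G.add_edge(i, j)

# Example graph-based parameters for one subject; collected over many subjects,
# such features form the matrix fed to a classifier, e.g. sklearn.svm.SVC().
features = [nx.density(G),
            float(np.mean([d for _, d in G.out_degree()])),
            nx.average_clustering(G.to_undirected())]
print(features)
```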
Tracking cohesive subgroups over time in inferred social networks
NASA Astrophysics Data System (ADS)
Chin, Alvin; Chignell, Mark; Wang, Hao
2010-04-01
As a first step in the development of community trackers for large-scale online interaction, this paper shows how cohesive subgroup analysis using the Social Cohesion Analysis of Networks (SCAN; Chin and Chignell 2008) and Data-Intensive Socially Similar Evolving Community Tracker (DISSECT; Chin and Chignell 2010) methods can be applied to the problem of identifying cohesive subgroups and tracking them over time. Three case studies are reported, and the findings are used to evaluate how well the SCAN and DISSECT methods work for different types of data. In the largest of the case studies, variations in temporal cohesiveness are identified across a set of subgroups extracted from the inferred social network. Further modifications to the DISSECT methodology are suggested based on the results obtained. The paper concludes with recommendations concerning further research that would be beneficial in addressing the community tracking problem for online data.
NASA Astrophysics Data System (ADS)
Sword-Daniels, V. L.; Rossetto, T.; Wilson, T. M.; Sargeant, S.
2015-05-01
The essential services that support urban living are complex and interdependent, and their disruption in disasters directly affects society. Yet there are few empirical studies to inform our understanding of the vulnerabilities and resilience of complex infrastructure systems in disasters. This research takes a systems thinking approach to explore the dynamic behaviour of a network of essential services, in the presence and absence of volcanic ashfall hazards in Montserrat, West Indies. Adopting a case study methodology and qualitative methods to gather empirical data, we centre the study on the healthcare system and its interconnected network of essential services. We identify different types of relationship between sectors and develop a new interdependence classification system for analysis. Relationships are further categorised by hazard conditions, for use in extensive risk contexts. During heightened volcanic activity, relationships between systems transform in both number and type: connections increase across the network by 41%, and adapt to increase cooperation and information sharing. Interconnections add capacities to the network, increasing the resilience of prioritised sectors. This in-depth and context-specific approach provides a new methodology for studying the dynamics of infrastructure interdependence in an extensive risk context, and can be adapted for use in other hazard contexts.
A statistical method for measuring activation of gene regulatory networks.
Esteves, Gustavo H; Reis, Luiz F L
2018-06-13
Gene expression data analysis is of great importance for modern molecular biology, given our ability to measure the expression profiles of thousands of genes and to conduct studies rooted in systems biology. In this work, we propose a simple statistical model for measuring the activation of gene regulatory networks, instead of the traditional gene co-expression networks. We present the mathematical construction of a statistical procedure for testing hypotheses regarding gene regulatory network activation. The real probability distribution of the test statistic is evaluated by a permutation-based study. To illustrate the functionality of the proposed methodology, we also present a simple example based on a small hypothetical network and the activation measurement of two KEGG networks, both based on gene expression data collected from gastric and esophageal samples. The two KEGG networks were also analyzed for a public database, available through NCBI-GEO, presented as Supplementary Material. This method was implemented in an R package that is available at the BioConductor project website under the name maigesPack.
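The abstract above tests network activation against a permutation-derived null distribution. The sketch below shows the general flavour of such a test on synthetic data: score the genes of a given network by their differential expression and compare the score against random gene sets of the same size. The activation score (mean absolute t-statistic) and the gene-label permutation scheme are illustrative assumptions; the paper's exact statistic and permutation scheme may differ, and the published implementation is the maigesPack R package.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_genes, n_a, n_b = 2000, 10, 10
expr = rng.normal(size=(n_genes, n_a + n_b))          # hypothetical expression matrix
expr[:30, :n_a] += 1.0                                # genes 0..29 up-regulated in group A
groups = np.array([0] * n_a + [1] * n_b)

# Per-gene two-sample t statistics between the two groups.
t, _ = stats.ttest_ind(expr[:, groups == 0], expr[:, groups == 1], axis=1)

network_genes = np.arange(25)                         # hypothetical regulatory network members
observed = np.mean(np.abs(t[network_genes]))          # assumed activation score

# Null distribution: the same score for random gene sets of equal size.
n_perm = 5000
null = np.array([np.mean(np.abs(t[rng.choice(n_genes, size=len(network_genes), replace=False)]))
                 for _ in range(n_perm)])
p_value = (np.sum(null >= observed) + 1) / (n_perm + 1)
print(round(observed, 3), round(p_value, 4))
```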
Zhang, Xiao-Dong; Wu, Hong-Ying; Jin, Jin; Yu, Guang-Yun; He, Xin; Wang, Hao; Shen, Xiu; Zhou, Ze-Wei; Liu, Pei-Xun; Fan, Sai-Jun
2013-01-01
A traditional Chinese medicine (TCM) formula network including 362 TCM formulas was built by using complex network methodologies. The properties of this network were analyzed including network diameter, average distance, clustering coefficient, and average degree. Meanwhile, we built a TCM chemical space and a TCM metabolism room under the theory of chemical space. The properties of chemical space and metabolism room were calculated and analyzed. The properties of the medicine pairs in “eighteen antagonisms and nineteen mutual inhibitors,” an ancient rule for TCM incompatibility, were studied based on the TCM formula network, chemical space, and metabolism room. The results showed that the properties of these incompatible medicine pairs are different from those of the other TCM based on the analysis of the TCM formula network, chemical space, and metabolism room. The lines of evidence derived from our work demonstrated that the ancient rule of TCM incompatibility, “eighteen antagonisms and nineteen mutual inhibitors,” is probably scientifically based. PMID:24369478
Network Ethnography and the "Cyberflâneur": Evolving Policy Sociology in Education
ERIC Educational Resources Information Center
Hogan, Anna
2016-01-01
This paper makes the argument that new global spatialities and new governance structures in education have important implications for how we think about education policy and do education policy analysis. This context necessitates that researchers engage in new methodologies to ensure that there is a suitable link between their research problem and…
The Brain Network for Deductive Reasoning: A Quantitative Meta-Analysis of 28 Neuroimaging Studies
ERIC Educational Resources Information Center
Prado, Jerome; Chadha, Angad; Booth, James R.
2011-01-01
Over the course of the past decade, contradictory claims have been made regarding the neural bases of deductive reasoning. Researchers have been puzzled by apparent inconsistencies in the literature. Some have even questioned the effectiveness of the methodology used to study the neural bases of deductive reasoning. However, the idea that…
Adoption of Agri-Environmental Measures by Organic Farmers: The Role of Interpersonal Communication
ERIC Educational Resources Information Center
Unay Gailhard, Ilkay; Bavorová, Miroslava; Pirscher, Frauke
2015-01-01
Purpose: The purpose of this study is to investigate the impact of interpersonal communication on the adoption of agri-environmental measures (AEM) by organic farmers in Germany. Methodology: The study used the logit model to predict the probability of adoption behaviour, and Social Network Analysis (SNA) was conducted to analyse the question of…
Implementing a Social-Ecological Model of Health in Wales
ERIC Educational Resources Information Center
Rothwell, Heather; Shepherd, Michael; Murphy, Simon; Burgess, Stephen; Townsend, Nick; Pimm, Claire
2010-01-01
Purpose: The purpose of this paper is to assess the implementation of the Welsh Network of Healthy School Schemes (WNHSS) at national, local and school levels, using a systems approach drawing on the Ottawa Charter. Design/methodology/approach: The approach takes the form of a single-case study using data from a documentary analysis, interviews…
DOT National Transportation Integrated Search
1995-01-01
Prepared ca. 1995. This paper illustrates the use of the simulation-optimization technique of response surface methodology (RSM) in traffic signal optimization of urban networks. It also quantifies the gains of using the common random number (CRN) va...
de Dumast, Priscille; Mirabel, Clément; Cevidanes, Lucia; Ruellas, Antonio; Yatabe, Marilia; Ioshida, Marcos; Ribera, Nina Tubau; Michoud, Loic; Gomes, Liliane; Huang, Chao; Zhu, Hongtu; Muniz, Luciana; Shoukri, Brandon; Paniagua, Beatriz; Styner, Martin; Pieper, Steve; Budin, Francois; Vimort, Jean-Baptiste; Pascal, Laura; Prieto, Juan Carlos
2018-07-01
The purpose of this study is to describe the methodological innovations of a web-based system for storage, integration and computation of biomedical data, using a training imaging dataset to remotely compute a deep neural network classifier of temporomandibular joint osteoarthritis (TMJOA). The study imaging dataset consisted of three-dimensional (3D) surface meshes of mandibular condyles constructed from cone beam computed tomography (CBCT) scans. The training dataset consisted of 259 condyles, 105 from control subjects and 154 from patients with a diagnosis of TMJ OA. For the image analysis classification, 34 right and left condyles from 17 patients (39.9 ± 11.7 years), who experienced signs and symptoms of the disease for less than 5 years, were included as the testing dataset. For the integrative statistical model of clinical, biological and imaging markers, the sample consisted of the same 17 test OA subjects and 17 age and sex matched control subjects (39.4 ± 15.4 years), who did not show any sign or symptom of OA. For these 34 subjects, a standardized clinical questionnaire, blood and saliva samples were also collected. The technological methodologies in this study include a deep neural network classifier of 3D condylar morphology (ShapeVariationAnalyzer, SVA), and a flexible web-based system for data storage, computation and integration (DSCI) of high-dimensional imaging, clinical, and biological data. The DSCI system trained and tested the neural network, indicating 5 stages of structural degenerative changes in condylar morphology in the TMJ with 91% close agreement between the clinician consensus and the SVA classifier. The DSCI system also remotely ran a novel statistical analysis, Multivariate Functional Shape Data Analysis, which computed high-dimensional correlations between 3D shape coordinates, clinical pain levels and levels of biological markers, and then graphically displayed the computation results. The findings of this study demonstrate a comprehensive phenotypic characterization of TMJ health and disease at clinical, imaging and biological levels, using novel flexible and versatile open-source tools for a web-based system that provides advanced shape statistical analysis and a neural network based classification of temporomandibular joint osteoarthritis. Published by Elsevier Ltd.
Genetic networks and soft computing.
Mitra, Sushmita; Das, Ranajit; Hayashi, Yoichi
2011-01-01
The analysis of gene regulatory networks provides enormous information on various fundamental cellular processes involving growth, development, hormone secretion, and cellular communication. Their extraction from available gene expression profiles is a challenging problem. Such reverse engineering of genetic networks offers insight into cellular activity toward prediction of adverse effects of new drugs or possible identification of new drug targets. Tasks such as classification, clustering, and feature selection enable efficient mining of knowledge about gene interactions in the form of networks. It is known that biological data is prone to different kinds of noise and ambiguity. Soft computing tools, such as fuzzy sets, evolutionary strategies, and neurocomputing, have been found to be helpful in providing low-cost, acceptable solutions in the presence of various types of uncertainties. In this paper, we survey the role of these soft methodologies and their hybridizations, for the purpose of generating genetic networks.
Bao, Weier; Greenwold, Matthew J; Sawyer, Roger H
2017-11-01
Gene co-expression network analysis has been a research method widely used in systematically exploring gene function and interaction. Using the Weighted Gene Co-expression Network Analysis (WGCNA) approach to construct a gene co-expression network using data from a customized 44K microarray transcriptome of chicken epidermal embryogenesis, we have identified two distinct modules that are highly correlated with scale or feather development traits. Signaling pathways related to feather development were enriched in the traditional KEGG pathway analysis and functional terms relating specifically to embryonic epidermal development were also enriched in the Gene Ontology analysis. Significant enrichment annotations were discovered from customized enrichment tools such as Modular Single-Set Enrichment Test (MSET) and Medical Subject Headings (MeSH). Hub genes in both trait-correlated modules showed strong specific functional enrichment toward epidermal development. Also, regulatory elements, such as transcription factors and miRNAs, were targeted in the significant enrichment result. This work highlights the advantage of this methodology for functional prediction of genes not previously associated with scale- and feather trait-related modules.
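The abstract above uses the R package WGCNA to find trait-correlated co-expression modules. For readers unfamiliar with the workflow, the sketch below reproduces its core steps in generic Python (soft-thresholded correlation adjacency, hierarchical clustering into modules, module eigengene correlated with a trait) on synthetic data; the soft-threshold power, cut height, injected signal, and omission of the topological overlap matrix are all simplifying assumptions rather than the WGCNA defaults used in the study.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

rng = np.random.default_rng(1)
n_samples, n_genes = 30, 200
expr = rng.normal(size=(n_samples, n_genes))           # hypothetical expression matrix
trait = rng.normal(size=n_samples)                     # hypothetical trait score
expr[:, :40] += np.outer(trait, np.ones(40))           # first 40 genes co-vary with the trait

# Weighted adjacency from soft-thresholded absolute correlation (power is illustrative).
corr = np.corrcoef(expr, rowvar=False)
adjacency = np.abs(corr) ** 2

# Hierarchical clustering of genes on a dissimilarity derived from the adjacency
# (the topological overlap matrix used by WGCNA is omitted here for brevity).
dissim = 1.0 - adjacency
np.fill_diagonal(dissim, 0.0)
Z = linkage(squareform(dissim, checks=False), method="average")
modules = fcluster(Z, t=0.9, criterion="distance")     # cut height is illustrative

# Module "eigengene": first principal component of each sufficiently large module.
for m in np.unique(modules):
    sub = expr[:, modules == m]
    if sub.shape[1] < 10:
        continue
    sub = sub - sub.mean(axis=0)
    u, s, _ = np.linalg.svd(sub, full_matrices=False)
    eigengene = u[:, 0] * s[0]
    r = np.corrcoef(eigengene, trait)[0, 1]
    print(f"module {m}: {sub.shape[1]} genes, module-trait correlation {r:.2f}")
```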
The dynamics of information-driven coordination phenomena: A transfer entropy analysis
Borge-Holthoefer, Javier; Perra, Nicola; Gonçalves, Bruno; González-Bailón, Sandra; Arenas, Alex; Moreno, Yamir; Vespignani, Alessandro
2016-01-01
Data from social media provide unprecedented opportunities to investigate the processes that govern the dynamics of collective social phenomena. We consider an information theoretical approach to define and measure the temporal and structural signatures typical of collective social events as they arise and gain prominence. We use the symbolic transfer entropy analysis of microblogging time series to extract directed networks of influence among geolocalized subunits in social systems. This methodology captures the emergence of system-level dynamics close to the onset of socially relevant collective phenomena. The framework is validated against a detailed empirical analysis of five case studies. In particular, we identify a change in the characteristic time scale of the information transfer that flags the onset of information-driven collective phenomena. Furthermore, our approach identifies an order-disorder transition in the directed network of influence between social subunits. In the absence of clear exogenous driving, social collective phenomena can be represented as endogenously driven structural transitions of the information transfer network. This study provides results that can help define models and predictive algorithms for the analysis of societal events based on open source data. PMID:27051875
Burn, Donald H.; Hannaford, Jamie; Hodgkins, Glenn A.; Whitfield, Paul H.; Thorne, Robin; Marsh, Terry
2012-01-01
Reference hydrologic networks (RHNs) can play an important role in monitoring for changes in the hydrological regime related to climate variation and change. Currently, the literature concerning hydrological response to climate variations is complex and confounded by the combinations of many methods of analysis, wide variations in hydrology, and the inclusion of data series that include changes in land use, storage regulation and water use in addition to those of climate. Three case studies that illustrate a variety of approaches to the analysis of data from RHNs are presented and used, together with a summary of studies from the literature, to develop approaches for the investigation of changes in the hydrological regime at a continental or global scale, particularly for international comparison. We present recommendations for an analysis framework and the next steps to advance such an initiative. There is a particular focus on the desirability of establishing standardized procedures and methodologies for both the creation of new national RHNs and the systematic analysis of data derived from a collection of RHNs.
Wen, Dingqiao; Yu, Yun; Hahn, Matthew W.; Nakhleh, Luay
2016-01-01
The role of hybridization and subsequent introgression has been demonstrated in an increasing number of species. Recently, Fontaine et al. (Science, 347, 2015, 1258524) conducted a phylogenomic analysis of six members of the Anopheles gambiae species complex. Their analysis revealed a reticulate evolutionary history and pointed to extensive introgression on all four autosomal arms. The study further highlighted the complex evolutionary signals that the co-occurrence of incomplete lineage sorting (ILS) and introgression can give rise to in phylogenomic analyses. While tree-based methodologies were used in the study, phylogenetic networks provide a more natural model to capture reticulate evolutionary histories. In this work, we reanalyse the Anopheles data using a recently devised framework that combines the multispecies coalescent with phylogenetic networks. This framework allows us to capture ILS and introgression simultaneously, and forms the basis for statistical methods for inferring reticulate evolutionary histories. The new analysis reveals a phylogenetic network with multiple hybridization events, some of which differ from those reported in the original study. To elucidate the extent and patterns of introgression across the genome, we devise a new method that quantifies the use of reticulation branches in the phylogenetic network by each genomic region. Applying the method to the mosquito data set reveals the evolutionary history of all the chromosomes. This study highlights the utility of ‘network thinking’ and the new insights it can uncover, in particular in phylogenomic analyses of large data sets with extensive gene tree incongruence. PMID:26808290
Ma, Zhanshan Sam
2018-05-01
Relatively little methodological progress has been made in differentiating between healthy and diseased microbiomes beyond comparing microbial community diversities with traditional species richness or the Shannon index. Network analysis has increasingly been called upon for the task, but most currently available microbiome datasets only allow for the construction of simple species correlation networks (SCNs). The main results from SCN analysis are a series of network properties such as network degree and modularity, but the metrics for these network properties often produce inconsistent evidence. We propose a simple new network property, the P/N ratio, defined as the ratio of the number of positive links to the number of negative links in the microbial SCN. We postulate that the P/N ratio should reflect the balance between facilitative and inhibitive interactions among microbial species, possibly one of the most important changes occurring in a diseased microbiome. We tested our hypothesis with five datasets representing five major human microbiome sites and discovered that the P/N ratio exhibits contrasting differences between healthy and diseased microbiomes. It may therefore be harnessed as an in silico biomarker for detecting disease-associated changes in the human microbiome and may play an important role in personalized diagnosis of human microbiome-associated diseases.
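The P/N ratio described above is straightforward to compute once a species correlation network is in hand. The sketch below builds a toy SCN from abundance data, keeps links whose correlation magnitude passes a threshold, and takes the ratio of positive to negative links; the Spearman correlation measure and the 0.3 cutoff are illustrative assumptions, not the paper's exact network-construction procedure.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n_samples, n_species = 50, 30
abundance = rng.poisson(lam=20, size=(n_samples, n_species)).astype(float)  # toy abundances

threshold = 0.3                       # assumed cutoff for keeping a link
pos_links, neg_links = 0, 0
for i in range(n_species):
    for j in range(i + 1, n_species):
        rho, _ = stats.spearmanr(abundance[:, i], abundance[:, j])
        if abs(rho) >= threshold:
            if rho > 0:
                pos_links += 1
            else:
                neg_links += 1

pn_ratio = pos_links / neg_links if neg_links else float("inf")
print(pos_links, neg_links, round(pn_ratio, 2))
```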
Challenges to inferring causality from viral information dispersion in dynamic social networks
NASA Astrophysics Data System (ADS)
Ternovski, John
2014-06-01
Understanding the mechanism behind large-scale information dispersion through complex networks has important implications for a variety of industries ranging from cyber-security to public health. With the unprecedented availability of public data from online social networks (OSNs) and the low-cost nature of most OSN outreach, randomized controlled experiments, the "gold standard" of causal inference methodologies, have been used with increasing regularity to study viral information dispersion. And while these studies have dramatically furthered our understanding of how information disseminates through social networks by isolating causal mechanisms, there are still major methodological concerns that need to be addressed in future research. This paper delineates why modern OSNs are markedly different from traditional sociological social networks and why these differences present unique challenges to experimentalists and data scientists. The dynamic nature of OSNs is particularly troublesome for researchers implementing experimental designs, so this paper identifies major sources of bias arising from network mutability and suggests strategies to circumvent and adjust for these biases. This paper also discusses the practical considerations of data quality and collection, which may adversely impact the efficiency of the estimator. The major experimental methodologies used in the current literature on virality are assessed at length, and their strengths and limits identified. Other, as-yet-unsolved threats to the efficiency and unbiasedness of causal estimators, such as missing data, are also discussed. This paper integrates methodologies and learnings from a variety of fields under an experimental and data science framework in order to systematically consolidate and identify current methodological limitations of randomized controlled experiments conducted in OSNs.
Statistical Model Applied to NetFlow for Network Intrusion Detection
NASA Astrophysics Data System (ADS)
Proto, André; Alexandre, Leandro A.; Batista, Maira L.; Oliveira, Isabela L.; Cansian, Adriano M.
Computers and network services have become a guaranteed presence in many places. This has resulted in a growth of illicit events, and computer and network security has therefore become an essential concern in any computing environment. Many methodologies have been created to identify these events; however, with the increase of users and services on the Internet, monitoring a large network environment is difficult. This paper proposes a methodology for event detection in large-scale networks. The proposal approaches anomaly detection using the NetFlow protocol and statistical methods, monitoring the environment within a timeframe suited to the application.
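The abstract above combines NetFlow aggregation with statistical anomaly detection. The sketch below illustrates the simplest version of that idea: aggregate flow counts per minute and flag minutes that deviate strongly from a rolling baseline. The one-hour window and the 3-sigma rule are illustrative assumptions, not the paper's exact statistical model.

```python
import numpy as np

rng = np.random.default_rng(3)
# Flows per minute toward one service over 24 hours (synthetic), with an injected burst.
flows = rng.poisson(lam=120, size=24 * 60).astype(float)
flows[800:815] += 900                              # simulated scan / flood

window = 60                                        # one-hour baseline window (assumed)
alerts = []
for t in range(window, len(flows)):
    baseline = flows[t - window:t]
    mu, sigma = baseline.mean(), baseline.std()
    if sigma > 0 and (flows[t] - mu) / sigma > 3:  # 3-sigma rule (assumed)
        alerts.append(t)

print(f"{len(alerts)} anomalous minutes flagged, first at t={alerts[0] if alerts else None}")
```

In a real deployment the counts would come from parsed NetFlow records grouped by destination port or host, and separate baselines would typically be kept per service and per time of day.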
Small-World Brain Networks Revisited
Bassett, Danielle S.; Bullmore, Edward T.
2016-01-01
It is nearly 20 years since the concept of a small-world network was first quantitatively defined, by a combination of high clustering and short path length; and about 10 years since this metric of complex network topology began to be widely applied to analysis of neuroimaging and other neuroscience data as part of the rapid growth of the new field of connectomics. Here, we review briefly the foundational concepts of graph theoretical estimation and generation of small-world networks. We take stock of some of the key developments in the field in the past decade and we consider in some detail the implications of recent studies using high-resolution tract-tracing methods to map the anatomical networks of the macaque and the mouse. In doing so, we draw attention to the important methodological distinction between topological analysis of binary or unweighted graphs, which have provided a popular but simple approach to brain network analysis in the past, and the topology of weighted graphs, which retain more biologically relevant information and are more appropriate to the increasingly sophisticated data on brain connectivity emerging from contemporary tract-tracing and other imaging studies. We conclude by highlighting some possible future trends in the further development of weighted small-worldness as part of a deeper and broader understanding of the topology and the functional value of the strong and weak links between areas of mammalian cortex. PMID:27655008
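The review above defines small-worldness by the combination of high clustering and short path length relative to a random reference. The sketch below computes the classic binary small-worldness index sigma = (C/C_rand)/(L/L_rand) with networkx on a toy network; using an Erdős–Rényi reference with matched density is a simplification of the degree-preserving null models usually preferred in connectomics, and it does not cover the weighted-graph extensions the review advocates.

```python
import networkx as nx

G = nx.connected_watts_strogatz_graph(200, k=8, p=0.1, seed=0)   # toy "brain-like" network

C = nx.average_clustering(G)
L = nx.average_shortest_path_length(G)

# Random reference with the same number of nodes and edges (simplified null model).
R = nx.gnm_random_graph(G.number_of_nodes(), G.number_of_edges(), seed=0)
if not nx.is_connected(R):
    R = R.subgraph(max(nx.connected_components(R), key=len)).copy()
C_rand = nx.average_clustering(R)
L_rand = nx.average_shortest_path_length(R)

sigma = (C / C_rand) / (L / L_rand)
print(f"C={C:.3f}, L={L:.2f}, small-worldness sigma={sigma:.2f}")
```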
Stochastic Geometric Network Models for Groups of Functional and Structural Connectomes
Friedman, Eric J.; Landsberg, Adam S.; Owen, Julia P.; Li, Yi-Ou; Mukherjee, Pratik
2014-01-01
Structural and functional connectomes are emerging as important instruments in the study of normal brain function and in the development of new biomarkers for a variety of brain disorders. In contrast to single-network studies that presently dominate the (non-connectome) network literature, connectome analyses typically examine groups of empirical networks and then compare these against standard (stochastic) network models. Current practice in connectome studies is to employ stochastic network models derived from social science and engineering contexts as the basis for the comparison. However, these are not necessarily best suited for the analysis of connectomes, which often contain groups of very closely related networks, such as occurs with a set of controls or a set of patients with a specific disorder. This paper studies important extensions of standard stochastic models that make them better adapted for analysis of connectomes, and develops new statistical fitting methodologies that account for inter-subject variations. The extensions explicitly incorporate geometric information about a network based on distances and inter/intra hemispherical asymmetries (to supplement ordinary degree-distribution information), and utilize a stochastic choice of networks' density levels (for fixed threshold networks) to better capture the variance in average connectivity among subjects. The new statistical tools introduced here allow one to compare groups of networks by matching both their average characteristics and the variations among them. A notable finding is that connectomes have high “smallworldness” beyond that arising from geometric and degree considerations alone. PMID:25067815
Concept mapping and network analysis: an analytic approach to measure ties among constructs.
Goldman, Alyssa W; Kane, Mary
2014-12-01
Group concept mapping is a mixed-methods approach that helps a group visually represent its ideas on a topic of interest through a series of related maps. The maps and additional graphics are useful for planning, evaluation and theory development. Group concept maps are typically described, interpreted and utilized through points, clusters and distances, and the implications of these features in understanding how constructs relate to one another. This paper focuses on the application of network analysis to group concept mapping to quantify the strength and directionality of relationships among clusters. The authors outline the steps of this analysis, and illustrate its practical use through an organizational strategic planning example. Additional benefits of this analysis to evaluation projects are also discussed, supporting the overall utility of this supplemental technique to the standard concept mapping methodology. Copyright © 2014 Elsevier Ltd. All rights reserved.
Network analysis applications in hydrology
NASA Astrophysics Data System (ADS)
Price, Katie
2017-04-01
Applied network theory has seen pronounced expansion in recent years, in fields such as epidemiology, computer science, and sociology. Concurrent development of analytical methods and frameworks has increased the possibilities and tools available to researchers seeking to apply network theory to a variety of problems. While water and nutrient fluxes through stream systems clearly demonstrate a directional network structure, the hydrological applications of network theory remain underexplored. This presentation covers a review of network applications in hydrology, followed by an overview of promising network analytical tools that potentially offer new insights into conceptual modeling of hydrologic systems, identifying behavioral transition zones in stream networks and thresholds of dynamical system response. Network applications were tested along an urbanization gradient in two Atlanta, Georgia, USA watersheds: Peachtree Creek and Proctor Creek. Peachtree Creek contains a nest of five long-term USGS streamflow and water quality gages, allowing network application of long-term flow statistics. The watershed spans a range of suburban and heavily urbanized conditions. Summary flow statistics and water quality metrics were analyzed using a suite of network analysis techniques, to test the conceptual modeling and predictive potential of the methodologies. Storm events and low flow dynamics during Summer 2016 were analyzed using multiple network approaches, with an emphasis on tomogravity methods. Results indicate that network theory approaches offer novel perspectives for understanding long-term and event-based hydrological data. Key future directions for network applications include 1) optimizing data collection, 2) identifying "hotspots" of contaminant and overland flow influx to stream systems, 3) defining process domains, and 4) analyzing dynamic connectivity of various system components, including groundwater-surface water interactions.
Artificial Neural Networks: an overview and their use in the analysis of the AMPHORA-3 dataset.
Buscema, Paolo Massimo; Massini, Giulia; Maurelli, Guido
2014-10-01
Artificial Adaptive Systems (AAS) are theories in which generative algebras create artificial models that simulate natural phenomena. Artificial Neural Networks (ANNs) are the most widespread and best-known learning-system models within AAS. This article provides an overview of ANNs, noting their advantages and limitations for analyzing dynamic, complex, non-linear, multidimensional processes. An example of a specific ANN application to alcohol consumption in Spain during 1961-2006, carried out as part of the EU AMPHORA-3 project, is presented. The study's limitations are noted and future research needs using ANN methodologies are suggested.
Classification of Alzheimer's Patients through Ubiquitous Computing.
Nieto-Reyes, Alicia; Duque, Rafael; Montaña, José Luis; Lage, Carmen
2017-07-21
Functional data analysis and artificial neural networks are the building blocks of the proposed methodology, which distinguishes the movement patterns of Alzheimer's patients at different stages of the disease and classifies new patients to their appropriate stage. The movement patterns are obtained from the accelerometer of Android smartphones that the patients carry while moving freely. The proposed methodology is relevant in that it is flexible regarding the type of data to which it is applied. To exemplify this, a novel real three-dimensional functional dataset is analyzed in which each datum is observed over a different time domain. Not only is each datum observed at a different frequency, but the domains also have different lengths. The obtained classification success rate of 83% indicates the potential of the proposed methodology.
Kenett, Dror Y; Tumminello, Michele; Madi, Asaf; Gur-Gershgoren, Gitit; Mantegna, Rosario N; Ben-Jacob, Eshel
2010-12-20
What are the dominant stocks which drive the correlations present among stocks traded in a stock market? Can a correlation analysis provide an answer to this question? In the past, correlation based networks have been proposed as a tool to uncover the underlying backbone of the market. Correlation based networks represent the stocks and their relationships, which are then investigated using different network theory methodologies. Here we introduce a new concept to tackle the above question--the partial correlation network. Partial correlation is a measure of how the correlation between two variables, e.g., stock returns, is affected by a third variable. By using it we define a proxy of stock influence, which is then used to construct partial correlation networks. The empirical part of this study is performed on a specific financial system, namely the set of 300 highly capitalized stocks traded at the New York Stock Exchange, in the time period 2001-2003. By constructing the partial correlation network, unlike the case of standard correlation based networks, we find that stocks belonging to the financial sector and, in particular, to the investment services sub-sector, are the most influential stocks affecting the correlation profile of the system. Using a moving window analysis, we find that the strong influence of the financial stocks is conserved across time for the investigated trading period. Our findings shed a new light on the underlying mechanisms and driving forces controlling the correlation profile observed in a financial market.
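A minimal sketch of the partial-correlation idea described above is given below: for a correlation matrix C, the partial correlation of stocks i and j given a mediator m is (C_ij - C_im C_jm) / sqrt((1 - C_im^2)(1 - C_jm^2)), and a stock's influence can be proxied by how much the pairwise correlations drop once it is held fixed. The averaging used here and the synthetic returns are simplifying assumptions, not the exact estimator of the paper.

```python
# Sketch of a partial-correlation influence measure over a (time x stocks)
# returns array. The influence definition (average drop in correlation once a
# third stock is held fixed) is a simplification for illustration.
import numpy as np

def partial_correlation(C, i, j, m):
    """Correlation between i and j after removing the effect of m."""
    num = C[i, j] - C[i, m] * C[j, m]
    den = np.sqrt((1 - C[i, m] ** 2) * (1 - C[j, m] ** 2))
    return num / den

def influence_scores(returns):
    C = np.corrcoef(returns, rowvar=False)
    n = C.shape[0]
    scores = np.zeros(n)
    for m in range(n):
        drops = [C[i, j] - partial_correlation(C, i, j, m)
                 for i in range(n) for j in range(i + 1, n)
                 if i != m and j != m]
        scores[m] = np.mean(drops)
    return scores  # higher score = stock m mediates more of the correlations

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    driver = rng.normal(size=500)                 # a common "market" factor
    followers = np.column_stack([driver + rng.normal(size=500) * 0.5
                                 for _ in range(6)])
    returns = np.column_stack([driver, followers])  # stock 0 is the driver
    print(influence_scores(returns).round(3))       # stock 0 should score highest
```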
Protein-Protein Interface and Disease: Perspective from Biomolecular Networks.
Hu, Guang; Xiao, Fei; Li, Yuqian; Li, Yuan; Vongsangnak, Wanwipa
Protein-protein interactions are involved in many important biological processes and molecular mechanisms of disease association. Structural studies of interfacial residues in protein complexes provide information on protein-protein interactions. Characterizing protein-protein interfaces, including binding sites and allosteric changes, thus pose an imminent challenge. With special focus on protein complexes, approaches based on network theory are proposed to meet this challenge. In this review we pay attention to protein-protein interfaces from the perspective of biomolecular networks and their roles in disease. We first describe the different roles of protein complexes in disease through several structural aspects of interfaces. We then discuss some recent advances in predicting hot spots and communication pathway analysis in terms of amino acid networks. Finally, we highlight possible future aspects of this area with respect to both methodology development and applications for disease treatment.
NASA Astrophysics Data System (ADS)
Charakopoulos, A. K.; Katsouli, G. A.; Karakasidis, T. E.
2018-04-01
Understanding the underlying processes and extracting detailed characteristics of the spatiotemporal dynamics of ocean and atmosphere, as well as their interaction, is of significant interest but has not yet been thoroughly established. The purpose of this study was to examine the performance of two complementary methodologies for identifying underlying spatiotemporal dynamic characteristics and patterns among atmospheric and oceanic variables from Seawatch buoys in the Aegean and Ionian Seas, provided by the Hellenic Center for Marine Research (HCMR). The first approach involves cross-correlation analysis to investigate time-lagged relationships; further, to identify the direction of interactions between the variables, we applied the Granger causality method. In the second approach, the time series are converted into complex networks and the main topological network properties, such as degree distribution, average path length, diameter, modularity and clustering coefficient, are evaluated. Our results show that complex network analysis of time series can lead to the extraction of hidden spatiotemporal characteristics. Our findings also indicate a high level of positive and negative correlations and causalities among variables, both from the same buoy and between buoys from different stations, which cannot be determined using simple statistical measures.
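The abstract does not spell out how the time series are converted into networks, so the following Python sketch uses one common choice as an assumption: build a graph over variables whose edges connect pairs with absolute cross-correlation above a threshold, then read off topological properties with networkx. The threshold and the synthetic data are illustrative only.

```python
# Hedged sketch: thresholded correlation graph from multivariate time series,
# followed by simple topological summaries.
import numpy as np
import networkx as nx

def correlation_network(series, threshold=0.5):
    """series: (time x variables) array -> thresholded correlation graph."""
    C = np.abs(np.corrcoef(series, rowvar=False))
    n = C.shape[0]
    G = nx.Graph()
    G.add_nodes_from(range(n))
    for i in range(n):
        for j in range(i + 1, n):
            if C[i, j] >= threshold:
                G.add_edge(i, j, weight=C[i, j])
    return G

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    t = np.arange(2000)
    base = np.sin(2 * np.pi * t / 200)          # shared seasonal signal
    series = np.column_stack([base + rng.normal(size=t.size) * s
                              for s in (0.2, 0.3, 0.5, 2.0, 2.0)])
    G = correlation_network(series)
    print("degrees:", dict(G.degree()))
    print("clustering:", round(nx.average_clustering(G), 2))
```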
Reverse Engineering Validation using a Benchmark Synthetic Gene Circuit in Human Cells
Kang, Taek; White, Jacob T.; Xie, Zhen; Benenson, Yaakov; Sontag, Eduardo; Bleris, Leonidas
2013-01-01
Multi-component biological networks are often understood incompletely, in large part due to the lack of reliable and robust methodologies for network reverse engineering and characterization. As a consequence, developing automated and rigorously validated methodologies for unraveling the complexity of biomolecular networks in human cells remains a central challenge to life scientists and engineers. Today, when it comes to experimental and analytical requirements, there exists a great deal of diversity in reverse engineering methods, which renders the independent validation and comparison of their predictive capabilities difficult. In this work we introduce an experimental platform customized for the development and verification of reverse engineering and pathway characterization algorithms in mammalian cells. Specifically, we stably integrate a synthetic gene network in human kidney cells and use it as a benchmark for validating reverse engineering methodologies. The network, which is orthogonal to endogenous cellular signaling, contains a small set of regulatory interactions that can be used to quantify the reconstruction performance. By performing successive perturbations to each modular component of the network and comparing protein and RNA measurements, we study the conditions under which we can reliably reconstruct the causal relationships of the integrated synthetic network. PMID:23654266
Construction of Gene Regulatory Networks Using Recurrent Neural Networks and Swarm Intelligence.
Khan, Abhinandan; Mandal, Sudip; Pal, Rajat Kumar; Saha, Goutam
2016-01-01
We have proposed a methodology for the reverse engineering of biologically plausible gene regulatory networks from temporal genetic expression data. We have used established information and the fundamental mathematical theory for this purpose. We have employed the Recurrent Neural Network formalism to accurately extract the underlying dynamics present in the time series expression data. We have introduced a new hybrid swarm intelligence framework for the accurate training of the model parameters. The proposed methodology was first applied to a small artificial network, and the results obtained suggest that it can produce the best results available in the contemporary literature, to the best of our knowledge. Subsequently, we have implemented our proposed framework on experimental (in vivo) datasets. Finally, we have investigated two medium-sized genetic networks (in silico) extracted from GeneNetWeaver, to understand how the proposed algorithm scales up with network size. Additionally, we have implemented our proposed algorithm with half the number of time points. The results indicate that a 50% reduction in the number of time points does not significantly affect the accuracy of the proposed methodology, with a maximum deterioration of just over 15% in the worst case.
3-D Survey Applied to Industrial Archaeology by Tls Methodology
NASA Astrophysics Data System (ADS)
Monego, M.; Fabris, M.; Menin, A.; Achilli, V.
2017-05-01
This work describes the three-dimensional survey of the "Ex Stazione Frigorifera Specializzata": initially used for agricultural storage, over the years the building was put to different uses until it fell into complete neglect. Its historical relevance and the architectural heritage it represents prompted a recent renovation project and functional restoration. For this purpose a global 3-D survey was necessary, based on the application and integration of different geomatic methodologies (mainly terrestrial laser scanning, classical topography, and GNSS). The point clouds were acquired using different laser scanners, with time-of-flight (TOF) and phase-shift technologies for the distance measurements. The topographic reference network, needed to align the scans in the same system, was measured with a total station. For the complete survey of the building, 122 scans were acquired and 346 targets were measured from 79 vertices of the reference network. Moreover, 3 vertices were measured with GNSS methodology in order to georeference the network. For the detailed survey of the machine room, 14 scans were acquired with 23 targets. The global 3-D model of the building has less than one centimeter of alignment error (for the machine room the alignment error is no greater than 6 mm) and was used to extract products such as longitudinal and transversal sections, plans, architectural perspectives, and virtual scans. Complete spatial knowledge of the building is obtained from the processed data, providing basic information for the restoration project, structural analysis, and industrial and architectural heritage valorization.
Candidate change agent identification among men at risk for HIV infection
Schneider, John A.; McFadden, Rachel B.; Laumann, Edward O.; Kumar, SG Prem; Gandham, Sabitha R.; Oruganti, Ganesh
2012-01-01
Despite limited HIV prevention potency, peer-based programs have become one of the most often used HIV prevention approaches internationally. These programs demonstrate a need for greater specificity in peer change agent (PCA) recruitment and social network evaluation. In the present three-phase study based in India (2009–2010), we first explored the nature of friendship among truck-drivers, a group of men at high risk for HIV infection, in order to develop a thorough understanding of the social forces that contribute to and maintain their personal networks. This was accomplished in the first two study phases, through a combination of focus group discussions (n=5 groups), in-depth qualitative interviews (n=20), and personal network analyses (n=25) of truck-drivers to define friendship and deepen our understanding of friendship across geographic spaces. Measures collected in phases I and II included friend typologies, discussion topics, social network influences, advice-giving, and risk reduction. Outcomes were assessed through an iterative process of qualitative textual analysis and social network analysis. The networks of truck-drivers were found to comprise three typologies: close friends, parking lot friends, and other friends. From these data, we developed an algorithmic approach to the identification of a candidate PCA within a high-risk man’s personal network. In stage III we piloted field-use of this approach to identify and recruit PCAs, and further evaluated their potential for intervention through preliminary analysis of the PCA’s own personal networks. An instrument was developed to translate what social network theory and analysis has taught us about egocentric network dynamics into a real-world methodology for identifying intervention-appropriate peers within an individual’s personal network. Our approach can be tailored to the specifications of any high-risk population, and may serve to enhance current peer-based HIV interventions. PMID:22762951
Design Sensitivity for a Subsonic Aircraft Predicted by Neural Network and Regression Models
NASA Technical Reports Server (NTRS)
Hopkins, Dale A.; Patnaik, Surya N.
2005-01-01
A preliminary methodology was obtained for the design optimization of a subsonic aircraft by coupling NASA Langley Research Center's Flight Optimization System (FLOPS) with NASA Glenn Research Center's design optimization testbed (COMETBOARDS with regression and neural network analysis approximators). The aircraft modeled can carry 200 passengers at a cruise speed of Mach 0.85 over a range of 2500 n mi and can operate on standard 6000-ft takeoff and landing runways. The design simulation was extended to evaluate the optimal airframe and engine parameters for the subsonic aircraft to operate on nonstandard runways. Regression and neural network approximators were used to examine aircraft operation on runways ranging in length from 4500 to 7500 ft.
Mounts, W M; Liebman, M N
1997-07-01
We have developed a method for representing biological pathways and simulating their behavior based on the use of stochastic activity networks (SANs). SANs, an extension of the original Petri net, have traditionally been used to model flow systems including data-communications networks and manufacturing processes. We apply the methodology to the blood coagulation cascade, a biological flow system, and present the representation method as well as results of simulation studies based on published experimental data. In addition to describing the dynamic model, we also present the results of its utilization to perform simulations of clinical states, including hemophilias A and B, as well as sensitivity analysis of individual factors and their impact on thrombin production.
Western and Eastern Views on Social Networks
ERIC Educational Resources Information Center
Ordonez de Pablos, Patricia
2005-01-01
Purpose: The aim of this paper is to examine social networks from a Western and Eastern view. Design/methodology/approach: The paper uses case study methodology to gather evidence of how world pioneering firms from Asia and Europe measure and report their social connections from a Western perspective. Findings: It examined the basic indicators…
Intra-organizational Computation and Complexity
2003-01-01
models. New methodologies, centered on understanding algorithmic complexity, are being developed that may enable us to better handle network data ... tractability of data analysis, and enable more precise theorization. A variety of measures of algorithmic complexity, e.g., Kolmogorov-Chaitin, and a ... variety of proxies exist (which are often turned to for pragmatic reasons) (Lempel and Ziv, 1976). For the most part, social and organizational
ERIC Educational Resources Information Center
Schizas, Dimitrios; Katrana, Evagelia; Stamou, George
2013-01-01
In the present study we used the technique of word association tests to assess students' cognitive structures during the learning period. In particular, we tried to investigate what students living near a protected area in Greece (Dadia forest) knew about the phenomenon of decomposition. Decomposition was chosen as a stimulus word because it…
Analysis of Layered Social Networks
2006-09-01
C. Anderson, O. P. John, D. Keltner, and A. M. Kring. Who attains social status... of action is provided by the following equation, U(c) = Σ_d (P_d E_x), where ... [layout residue: layer labels Religious, Financial, Commercial, Military, Infrastructure] ... Assuming that this methodology can indeed be applied to the transmission of information, the matrix powers (p > 2) actually capture a variety of walks
The Perceived Value of Networking through an EMBA: A Study of Taiwanese Women
ERIC Educational Resources Information Center
Chen, Aurora; Doherty, Noeleen; Vinnicombe, Susan
2012-01-01
Purpose: This paper seeks to explore the perceived value of an executive MBA (EMBA) to the development of knowing-who competency for Taiwanese women managers. Design/methodology/approach: This qualitative research drew on in-depth interviews with a sample of 18 female alumni across three business schools in Taiwan. Analysis, using NVivo 8.0,…
Towards a Methodology for Validation of Centrality Measures in Complex Networks
2014-01-01
Background Living systems are associated with social networks — networks made up of nodes, some of which may be more important in various respects than others. While different quantitative measures labeled "centralities" have previously been used in the network analysis community to find influential nodes in a network, it is debatable how valid the centrality measures actually are. In other words, the research question that remains unanswered is: how exactly do these measures perform in the real world? As an example, if a centrality measure identifies a particular node as important, is the node actually important? Purpose The goal of this paper is not just to perform a traditional social network analysis but rather to evaluate different centrality measures by conducting an empirical study of exactly how network centralities correlate with data from published multidisciplinary network data sets. Method We take standard published network data sets and use a random network to establish a baseline. These data sets included Zachary's Karate Club network, a dolphin social network and a neural network of the nematode Caenorhabditis elegans. Each of the data sets was analyzed in terms of different centrality measures and compared with existing knowledge from associated published articles to review the role of each centrality measure in the determination of influential nodes. Results Our empirical analysis demonstrates that in the chosen network data sets, nodes which had a high Closeness Centrality also had a high Eccentricity Centrality. Likewise, high Degree Centrality correlated closely with high Eigenvector Centrality, whereas Betweenness Centrality varied according to network topology and did not demonstrate any noticeable pattern. In terms of identification of key nodes, we discovered that, compared with other centrality measures, Eigenvector and Eccentricity Centralities were better able to identify important nodes. PMID:24709999
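The comparison described above can be reproduced in outline with networkx on Zachary's Karate Club graph, as in the hedged sketch below; eccentricity centrality is taken here as the reciprocal of eccentricity, and the resulting rank correlations are illustrative only, not the paper's results.

```python
# Compute several centralities on the Karate Club graph and check how strongly
# their node rankings agree (Spearman rank correlation).
import networkx as nx
from scipy.stats import spearmanr

G = nx.karate_club_graph()
centralities = {
    "degree": nx.degree_centrality(G),
    "closeness": nx.closeness_centrality(G),
    "betweenness": nx.betweenness_centrality(G),
    "eigenvector": nx.eigenvector_centrality(G, max_iter=1000),
    "eccentricity": {n: 1.0 / e for n, e in nx.eccentricity(G).items()},
}

nodes = sorted(G.nodes())
names = list(centralities)
for a in range(len(names)):
    for b in range(a + 1, len(names)):
        x = [centralities[names[a]][n] for n in nodes]
        y = [centralities[names[b]][n] for n in nodes]
        rho, _ = spearmanr(x, y)
        print(f"{names[a]:>12} vs {names[b]:<12} Spearman rho = {rho:.2f}")
```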
Yager, Douglas B.; Hofstra, Albert H.; Granitto, Matthew
2012-01-01
This report emphasizes geographic information system analysis and the display of data stored in the legacy U.S. Geological Survey National Geochemical Database for use in mineral resource investigations. Geochemical analyses of soils, stream sediments, and rocks that are archived in the National Geochemical Database provide an extensive data source for investigating geochemical anomalies. A study area in the Egan Range of east-central Nevada was used to develop a geographic information system analysis methodology for two different geochemical datasets involving detailed (Bureau of Land Management Wilderness) and reconnaissance-scale (National Uranium Resource Evaluation) investigations. ArcGIS was used to analyze and thematically map geochemical information at point locations. Watershed-boundary datasets served as a geographic reference to relate potentially anomalous sample sites with hydrologic unit codes at varying scales. The National Hydrography Dataset was analyzed with Hydrography Event Management and ArcGIS Utility Network Analyst tools to delineate potential sediment-sample provenance along a stream network. These tools can be used to track potential upstream-sediment-contributing areas to a sample site. This methodology identifies geochemically anomalous sample sites, watersheds, and streams that could help focus mineral resource investigations in the field.
A methodology aimed at fostering and sustaining the development processes of an IE-based industry
NASA Astrophysics Data System (ADS)
Corallo, Angelo; Errico, Fabrizio; de Maggio, Marco; Giangreco, Enza
In the current competitive scenario, where business relationships are fundamental in building successful business models and inter/intra organizational business processes are progressively digitalized, an end-to-end methodology is required that is capable of guiding business networks through the Internetworked Enterprise (IE) paradigm: a new and innovative organizational model able to leverage Internet technologies to perform real-time coordination of intra and inter-firm activities, to create value by offering innovative and personalized products/services and reduce transaction costs. This chapter presents the TEKNE project Methodology of change that guides business networks, by means of a modular and flexible approach, towards the IE techno-organizational paradigm, taking into account the competitive environment of the network and how this environment influences its strategic, organizational and technological levels. Contingency, the business model, enterprise architecture and performance metrics are the key concepts that form the cornerstone of this methodological framework.
Detecting switching and intermittent causalities in time series
NASA Astrophysics Data System (ADS)
Zanin, Massimiliano; Papo, David
2017-04-01
During the last decade, complex network representations have emerged as a powerful instrument for describing the cross-talk between different brain regions both at rest and as subjects are carrying out cognitive tasks, in healthy brains and neurological pathologies. The transient nature of such cross-talk has nevertheless by and large been neglected, mainly due to the inherent limitations of some metrics, e.g., causality ones, which require a long time series in order to yield statistically significant results. Here, we present a methodology to account for intermittent causal coupling in neural activity, based on the identification of non-overlapping windows within the original time series in which the causality is strongest. The result is a less coarse-grained assessment of the time-varying properties of brain interactions, which can be used to create a high temporal resolution time-varying network. We apply the proposed methodology to the analysis of the brain activity of control subjects and alcoholic patients performing an image recognition task. Our results show that short-lived, intermittent, local-scale causality is better at discriminating both groups than global network metrics. These results highlight the importance of the transient nature of brain activity, at least under some pathological conditions.
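A minimal sketch of the windowing idea is shown below: split two signals into non-overlapping windows, compute a Granger-style F statistic in each, and keep the window where the coupling is strongest. The lag order, window length, and toy signals are assumptions for illustration and do not reproduce the authors' estimator.

```python
# Self-contained windowed Granger-style test using ordinary least squares.
import numpy as np

def granger_f(x, y, lag=2):
    """F statistic for 'x Granger-causes y' using OLS on lagged values."""
    n = len(y) - lag
    Y = y[lag:]
    X_r = np.column_stack([y[lag - k:-k or None] for k in range(1, lag + 1)])
    X_u = np.column_stack([X_r] +
                          [x[lag - k:-k or None] for k in range(1, lag + 1)])
    X_r = np.column_stack([np.ones(n), X_r])      # restricted: past of y only
    X_u = np.column_stack([np.ones(n), X_u])      # unrestricted: plus past of x
    rss = lambda X: np.sum((Y - X @ np.linalg.lstsq(X, Y, rcond=None)[0]) ** 2)
    rss_r, rss_u = rss(X_r), rss(X_u)
    return ((rss_r - rss_u) / lag) / (rss_u / (n - X_u.shape[1]))

def strongest_window(x, y, window=100, lag=2):
    scores = [(granger_f(x[s:s + window], y[s:s + window], lag), s)
              for s in range(0, len(x) - window + 1, window)]
    return max(scores)  # (F statistic, window start)

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    x = rng.normal(size=1000)
    y = rng.normal(size=1000)
    y[500:600] += 0.9 * x[498:598]       # coupling exists only in one stretch
    print(strongest_window(x, y))        # expect the window starting at 500
```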
Source space analysis of event-related dynamic reorganization of brain networks.
Ioannides, Andreas A; Dimitriadis, Stavros I; Saridis, George A; Voultsidou, Marotesa; Poghosyan, Vahe; Liu, Lichan; Laskaris, Nikolaos A
2012-01-01
How the brain works is nowadays synonymous with how different parts of the brain work together and the derivation of mathematical descriptions for the functional connectivity patterns that can be objectively derived from data of different neuroimaging techniques. In most cases static networks are studied, often relying on resting state recordings. Here, we present a quantitative study of dynamic reconfiguration of connectivity for event-related experiments. Our motivation is the development of a methodology that can be used for personalized monitoring of brain activity. In line with this motivation, we use data with visual stimuli from a typical subject that participated in different experiments that were previously analyzed with traditional methods. The earlier studies identified well-defined changes in specific brain areas at specific latencies related to attention, properties of stimuli, and tasks demands. Using a recently introduced methodology, we track the event-related changes in network organization, at source space level, thus providing a more global and complete view of the stages of processing associated with the regional changes in activity. The results suggest the time evolving modularity as an additional brain code that is accessible with noninvasive means and hence available for personalized monitoring and clinical applications.
Bolte, Gabriele; David, Madlen; Dębiak, Małgorzata; Fiedel, Lotta; Hornberg, Claudia; Kolossa-Gehring, Marike; Kraus, Ute; Lätzsch, Rebecca; Paeck, Tatjana; Palm, Kerstin; Schneider, Alexandra
2018-06-01
The comprehensive consideration of sex/gender in health research is essential to increase relevance and validity of research results. Contrary to other areas of health research, there is no systematic summary of the current state of research on the significance of sex/gender in environmental health. Within the interdisciplinary research network Sex/Gender-Environment-Health (GeUmGe-NET) the current state of integration of sex/gender aspects or, respectively, gender theoretical concepts into research was systematically assessed within selected topics of the research areas environmental toxicology, environmental medicine, environmental epidemiology and public health research on environment and health. Knowledge gaps and research needs were identified in all research areas. Furthermore, the potential for methodological advancements by using gender theoretical concepts was depicted. A dialogue between biomedical research, public health research, and gender studies was started with the research network GeUmGe-NET. This dialogue has to be continued particularly regarding a common testing of methodological innovations in data collection and data analysis. Insights of this interdisciplinary research are relevant for practice areas such as environmental health protection, health promotion, environmental justice, and environmental health monitoring.
Sand/cement ratio evaluation on mortar using neural networks and ultrasonic transmission inspection.
Molero, M; Segura, I; Izquierdo, M A G; Fuente, J V; Anaya, J J
2009-02-01
The quality and degradation state of building materials can be determined by nondestructive testing (NDT). These materials are composed of a cementitious matrix and particles or fragments of aggregates. Sand/cement ratio (s/c) provides the final material quality; however, the sand content can mask the matrix properties in a nondestructive measurement. Therefore, s/c ratio estimation is needed in nondestructive characterization of cementitious materials. In this study, a methodology to classify the sand content in mortar is presented. The methodology is based on ultrasonic transmission inspection, data reduction, and features extraction by principal components analysis (PCA), and neural network classification. This evaluation is carried out with several mortar samples, which were made while taking into account different cement types and s/c ratios. The estimated s/c ratio is determined by ultrasonic spectral attenuation with three different broadband transducers (0.5, 1, and 2 MHz). Statistical PCA to reduce the dimension of the captured traces has been applied. Feed-forward neural networks (NNs) are trained using principal components (PCs) and their outputs are used to display the estimated s/c ratios in false color images, showing the s/c ratio distribution of the mortar samples.
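The pipeline described above (spectral features, PCA reduction, feed-forward network classification) can be sketched with scikit-learn as follows. The synthetic "attenuation spectra", the class labels, and the number of principal components are assumptions; only the structure of the pipeline follows the abstract.

```python
# Hedged sketch: PCA feature reduction followed by a feed-forward classifier
# for sand/cement ratio classes, on synthetic spectra.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(3)
ratios = ["1:3", "1:4", "1:5"]             # hypothetical s/c classes
X, y = [], []
for label, shift in zip(ratios, (0.0, 0.5, 1.0)):
    for _ in range(60):
        freq = np.linspace(0.5, 2.0, 128)  # MHz, matching the transducer band
        spectrum = np.exp(-(freq - 1.0 - 0.2 * shift) ** 2 / 0.1)
        X.append(spectrum + rng.normal(size=freq.size) * 0.05)
        y.append(label)
X, y = np.array(X), np.array(y)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = make_pipeline(PCA(n_components=5),
                      MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                                    random_state=0))
model.fit(X_tr, y_tr)
print("held-out accuracy:", round(model.score(X_te, y_te), 2))
```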
DOE Office of Scientific and Technical Information (OSTI.GOV)
Domínguez-Gómez, J. Andrés, E-mail: andres@uhu.es
In the last twenty years, both the increase in academic production and the expansion of professional involvement in Environmental Impact Assessment (EIA) and Social Impact Assessment (SIA) have evidenced growing scientific and business interest in risk and impact analysis. However, this growth has not brought with it parallel progress in addressing the main shortcomings of EIA/SIA, i.e. insufficient integration of environmental and social factors into development project analyses and, in cases where the social aspects are considered, technical-methodological failings in their analysis and assessment. It is clear that these weaknesses carry with them substantial threats to the sustainability (social, environmental and economic) of projects which impact on the environment, and consequently to the local contexts where they are carried out and to the delicate balance of the global ecosystem. This paper argues that, in a sociological context of complexity and dynamism, four conceptual elements should underpin approaches to socio-environmental risk and impact assessment in development projects: a theoretical base in actor–network theory; an ethical grounding in values which are internationally recognized (though not always fulfilled in practice); a (new) epistemological-scientific base; and a methodological foundation in social participation. - Highlights: • A theoretical foundation in actor–network theory • An ethical grounding in values which are internationally recognized, but rarely carried through into practice • A (new) epistemological-scientific base • A methodological foundation in social participation.
Functional connectomics from a "big data" perspective.
Xia, Mingrui; He, Yong
2017-10-15
In the last decade, explosive growth regarding functional connectome studies has been observed. Accumulating knowledge has significantly contributed to our understanding of the brain's functional network architectures in health and disease. With the development of innovative neuroimaging techniques, the establishment of large brain datasets and the increasing accumulation of published findings, functional connectomic research has begun to move into the era of "big data", which generates unprecedented opportunities for discovery in brain science and simultaneously encounters various challenging issues, such as data acquisition, management and analyses. Big data on the functional connectome exhibits several critical features: high spatial and/or temporal precision, large sample sizes, long-term recording of brain activity, multidimensional biological variables (e.g., imaging, genetic, demographic, cognitive and clinic) and/or vast quantities of existing findings. We review studies regarding functional connectomics from a big data perspective, with a focus on recent methodological advances in state-of-the-art image acquisition (e.g., multiband imaging), analysis approaches and statistical strategies (e.g., graph theoretical analysis, dynamic network analysis, independent component analysis, multivariate pattern analysis and machine learning), as well as reliability and reproducibility validations. We highlight the novel findings in the application of functional connectomic big data to the exploration of the biological mechanisms of cognitive functions, normal development and aging and of neurological and psychiatric disorders. We advocate the urgent need to expand efforts directed at the methodological challenges and discuss the direction of applications in this field. Copyright © 2017 Elsevier Inc. All rights reserved.
Space Shuttle RTOS Bayesian Network
NASA Technical Reports Server (NTRS)
Morris, A. Terry; Beling, Peter A.
2001-01-01
With shrinking budgets and the requirements to increase reliability and operational life of the existing orbiter fleet, NASA has proposed various upgrades for the Space Shuttle that are consistent with national space policy. The cockpit avionics upgrade (CAU), a high priority item, has been selected as the next major upgrade. The primary functions of cockpit avionics include flight control, guidance and navigation, communication, and orbiter landing support. Secondary functions include the provision of operational services for non-avionics systems such as data handling for the payloads and caution and warning alerts to the crew. Recently, a process to select the optimal commercial-off-the-shelf (COTS) real-time operating system (RTOS) for the CAU was conducted by United Space Alliance (USA) Corporation, which is a joint venture between Boeing and Lockheed Martin, the prime contractor for space shuttle operations. In order to independently assess the RTOS selection, NASA has used the Bayesian network-based scoring methodology described in this paper. Our two-stage methodology addresses the issue of RTOS acceptability by incorporating functional, performance and non-functional software measures related to reliability, interoperability, certifiability, efficiency, correctness, business, legal, product history, cost and life cycle. The first stage of the methodology involves obtaining scores for the various measures using a Bayesian network. The Bayesian network incorporates the causal relationships between the various and often competing measures of interest while also assisting the inherently complex decision analysis process with its ability to reason under uncertainty. The structure and selection of prior probabilities for the network are extracted from experts in the field of real-time operating systems. Scores for the various measures are computed using Bayesian probability. In the second stage, multi-criteria trade-off analyses are performed between the scores. Using a prioritization of measures from the decision-maker, trade-offs between the scores are used to rank order the available set of RTOS candidates.
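Stage one of the scoring methodology can be illustrated with a toy Bayesian network evaluated by brute-force enumeration, as in the sketch below. The variables, structure, and probability tables are invented for the example and are not taken from the NASA study.

```python
# Toy illustration of Bayesian-network scoring: the posterior probability of a
# "reliability" measure given observed test evidence, by enumeration.
from itertools import product

# Invented probability tables for the example.
p_history = {"strong": 0.7, "weak": 0.3}
p_reliability_given_history = {
    ("high", "strong"): 0.8, ("low", "strong"): 0.2,
    ("high", "weak"): 0.3, ("low", "weak"): 0.7,
}
p_test_given_reliability = {
    ("pass", "high"): 0.9, ("fail", "high"): 0.1,
    ("pass", "low"): 0.4, ("fail", "low"): 0.6,
}

def posterior_reliability(test_result):
    """P(reliability | test) by summing the joint over the hidden variable."""
    joint = {}
    for rel, hist in product(("high", "low"), ("strong", "weak")):
        joint[rel] = joint.get(rel, 0.0) + (
            p_history[hist]
            * p_reliability_given_history[(rel, hist)]
            * p_test_given_reliability[(test_result, rel)]
        )
    total = sum(joint.values())
    return {rel: p / total for rel, p in joint.items()}

print(posterior_reliability("pass"))   # reliability score given a passing test
print(posterior_reliability("fail"))
```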
NASA Astrophysics Data System (ADS)
Ayatollahy Tafti, Tayeb
We develop a new method for integrating information and data from different sources. We also construct a comprehensive workflow for characterizing and modeling a fracture network in unconventional reservoirs using microseismic data. The methodology is based on a combination of several mathematical and artificial intelligence techniques, including geostatistics, fractal analysis, fuzzy logic, and neural networks. The study contributes to the scholarly knowledge base on the characterization and modeling of fractured reservoirs in several ways, including a versatile workflow with novel objective functions. Some of the characteristics of the methods are listed below: 1. The new method is an effective fracture characterization procedure that estimates different fracture properties. Unlike existing methods, the new approach is not dependent on the location of events. It is able to integrate all multi-scaled and diverse fracture information from different methodologies. 2. It offers an improved procedure for creating compressional and shear velocity models as a preamble for delineating anomalies and mapping structures of interest, and for correlating velocity anomalies with fracture swarms and other reservoir properties of interest. 3. It offers an effective way to obtain the fractal dimension of microseismic events and identify the pattern complexity, connectivity, and mechanism of the created fracture network. 4. It offers an innovative method for monitoring the fracture movement in different stages of stimulation that can be used to optimize the process. 5. Our newly developed MDFN approach allows a discrete fracture network model to be created using only microseismic data, with potential cost reduction. It also imposes fractal dimension as a constraint on other fracture modeling approaches, which increases the visual similarity between the modeled networks and the real network over the simulated volume.
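Item 3 above, obtaining the fractal dimension of microseismic events, can be illustrated with a box-counting estimate over event locations. The synthetic event cloud and box sizes in the following sketch are assumptions used only to show the computation, not the workflow's actual implementation.

```python
# Box-counting fractal dimension of a 3-D point cloud of event locations.
import numpy as np

def box_counting_dimension(points, sizes):
    """Slope of log(box count) versus log(1/box size) over the given sizes."""
    points = np.asarray(points, dtype=float)
    points = points - points.min(axis=0)          # shift into positive octant
    counts = []
    for s in sizes:
        cells = np.floor(points / s).astype(int)
        counts.append(len({tuple(c) for c in cells}))
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(sizes)), np.log(counts), 1)
    return slope

if __name__ == "__main__":
    rng = np.random.default_rng(4)
    # Events scattered around a planar fracture: expect a dimension near 2.
    xy = rng.uniform(0, 1000, size=(5000, 2))
    z = rng.normal(scale=5.0, size=5000)
    events = np.column_stack([xy, z])
    print(round(box_counting_dimension(events, sizes=[200, 100, 50, 25]), 2))
```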
Identifying the starting point of a spreading process in complex networks.
Comin, Cesar Henrique; Costa, Luciano da Fontoura
2011-11-01
When dealing with the dissemination of epidemics, one important question that can be asked is the location where the contamination began. In this paper, we analyze three spreading schemes and propose and validate an effective methodology for the identification of the source nodes. The method is based on the calculation of the centrality of the nodes on the sampled network, expressed here by degree, betweenness, closeness, and eigenvector centrality. We show that the source node tends to have the highest measurement values. The potential of the methodology is illustrated with respect to three theoretical complex network models as well as a real-world network, the email network of the University Rovira i Virgili.
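The procedure can be sketched as follows: simulate a simple spreading process on a network, then rank the infected (sampled) subgraph's nodes by centrality and check whether the true seed comes out on top. The spreading model, the network, and the parameters below are illustrative assumptions, not the authors' exact schemes.

```python
# Hedged sketch of source identification via centrality on the sampled subgraph.
import random
import networkx as nx

def simulate_spread(G, seed, steps=4, p=0.4, rand_seed=0):
    rng = random.Random(rand_seed)
    infected = {seed}
    for _ in range(steps):
        new = {nbr for u in infected for nbr in G.neighbors(u)
               if nbr not in infected and rng.random() < p}
        infected |= new
    return infected

G = nx.barabasi_albert_graph(500, 3, seed=1)
true_source = 42
sampled = G.subgraph(simulate_spread(G, true_source))

for name, cent in [("degree", nx.degree_centrality(sampled)),
                   ("closeness", nx.closeness_centrality(sampled)),
                   ("betweenness", nx.betweenness_centrality(sampled)),
                   ("eigenvector", nx.eigenvector_centrality(sampled, max_iter=1000))]:
    best = max(cent, key=cent.get)   # the highest-centrality node is the guess
    print(f"{name:>11}: top node {best} (true source: {true_source})")
```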
Network Effects on Scientific Collaborations
Uddin, Shahadat; Hossain, Liaquat; Rasmussen, Kim
2013-01-01
Background The analysis of co-authorship networks aims at exploring the impact of network structure on the outcome of scientific collaborations and research publications. However, little is known about what network properties are associated with authors who have an increased number of joint publications and are cited highly. Methodology/Principal Findings Measures of social network analysis, for example network centrality and tie strength, have been utilized extensively in the current co-authorship literature to explore different behavioural patterns of co-authorship networks. Using three SNA measures (i.e., degree centrality, closeness centrality and betweenness centrality), we explore scientific collaboration networks to understand factors influencing performance (i.e., citation count) and formation (tie strength between authors) of such networks. A citation count is the number of times an article is cited by other articles. We use a co-authorship dataset of the research field of ‘steel structure’ for the years 2005 to 2009. To measure the strength of scientific collaboration between two authors, we consider the number of articles co-authored by them. In this study, we examine how the citation count of a scientific publication is influenced by different centrality measures of its co-author(s) in a co-authorship network. We further analyze the impact of the network positions of authors on the strength of their scientific collaborations. We use both correlation and regression methods for data analysis leading to statistical validation. We identify that the citation count of a research article is positively correlated with the degree centrality and betweenness centrality values of its co-author(s). Also, we reveal that degree centrality and betweenness centrality values of authors in a co-authorship network are positively correlated with the strength of their scientific collaborations. Conclusions/Significance Authors’ network positions in co-authorship networks influence the performance (i.e., citation count) and formation (i.e., tie strength) of scientific collaborations. PMID:23469021
2011-06-01
alternative technologies early in the product life cycle. Use case 3 reflects SLAD’s response to changes in the way the Army acquires technical...development on the one hand, and to systems evaluated for production and deployment on the other. Together, these three use cases provide the Army...Package E x a m p le P ro b le m Mission based SLVA of networked-enabled small units subject to one or more threats. Mission based early
DOE Office of Scientific and Technical Information (OSTI.GOV)
Matthews, W.
2000-02-22
Modern High Energy Nuclear and Particle Physics (HENP) experiments at Laboratories around the world present a significant challenge to wide area networks. Petabytes (10^15) or exabytes (10^18) of data will be generated during the lifetime of the experiment. Much of this data will be distributed via the Internet to the experiment's collaborators at Universities and Institutes throughout the world for analysis. In order to assess the feasibility of the computing goals of these and future experiments, the HENP networking community is actively monitoring performance across a large part of the Internet used by its collaborators. Since 1995, the pingER project has been collecting data on ping packet loss and round trip times. In January 2000, there are 28 monitoring sites in 15 countries gathering data on over 2,000 end-to-end pairs. HENP labs such as SLAC, Fermi Lab and CERN are using Advanced Network's Surveyor project and monitoring performance from one-way delay of UDP packets. More recently several HENP sites have become involved with NLANR's active measurement program (AMP). In addition SLAC and CERN are part of the RIPE test-traffic project and SLAC is home for a NIMI machine. The large end-to-end performance monitoring infrastructure allows the HENP networking community to chart long term trends and closely examine short term glitches across a wide range of networks and connections. The different methodologies provide opportunities to compare results based on different protocols and statistical samples. Understanding agreement and discrepancies between results provides particular insight into the nature of the network. This paper will highlight the practical side of monitoring by reviewing the special needs of High Energy Nuclear and Particle Physics experiments and provide an overview of the experience of measuring performance across a large number of interconnected networks throughout the world with various methodologies. In particular, results from each project will be compared and disagreement will be analyzed. The goal is to address issues for improving understanding for gathering and analysis of accurate monitoring data, but the outlook for the computing goals of HENP will also be examined.
Henden, Lyndal; Lee, Stuart; Mueller, Ivo; Barry, Alyssa; Bahlo, Melanie
2018-05-01
Identification of genomic regions that are identical by descent (IBD) has proven useful for human genetic studies where analyses have led to the discovery of familial relatedness and fine-mapping of disease critical regions. Unfortunately however, IBD analyses have been underutilized in analysis of other organisms, including human pathogens. This is in part due to the lack of statistical methodologies for non-diploid genomes in addition to the added complexity of multiclonal infections. As such, we have developed an IBD methodology, called isoRelate, for analysis of haploid recombining microorganisms in the presence of multiclonal infections. Using the inferred IBD status at genomic locations, we have also developed a novel statistic for identifying loci under positive selection and propose relatedness networks as a means of exploring shared haplotypes within populations. We evaluate the performance of our methodologies for detecting IBD and selection, including comparisons with existing tools, then perform an exploratory analysis of whole genome sequencing data from a global Plasmodium falciparum dataset of more than 2500 genomes. This analysis identifies Southeast Asia as having many highly related isolates, possibly as a result of both reduced transmission from intensified control efforts and population bottlenecks following the emergence of antimalarial drug resistance. Many signals of selection are also identified, most of which overlap genes that are known to be associated with drug resistance, in addition to two novel signals observed in multiple countries that have yet to be explored in detail. Additionally, we investigate relatedness networks over the selected loci and determine that one of these sweeps has spread between continents while the other has arisen independently in different countries. IBD analysis of microorganisms using isoRelate can be used for exploring population structure, positive selection and haplotype distributions, and will be a valuable tool for monitoring disease control and elimination efforts of many diseases.
Model of community emergence in weighted social networks
NASA Astrophysics Data System (ADS)
Kumpula, J. M.; Onnela, J.-P.; Saramäki, J.; Kertész, J.; Kaski, K.
2009-04-01
Over the years network theory has proven to be a rapidly expanding methodology for investigating various complex systems, and it has turned out to give quite unparalleled insight into their structure, function, and response through data analysis, modeling, and simulation. For social systems in particular, the network approach has empirically revealed a modular structure due to the interplay between the network topology and the link weights between network nodes or individuals. This inspired us to develop a simple network model that could capture some salient features of mesoscopic community and macroscopic topology formation during network evolution. Our model is based on two fundamental mechanisms of network sociology by which individuals find new friends, namely cyclic closure and focal closure, which are mimicked by local search-link-reinforcement and random global attachment mechanisms, respectively. In addition, we included in the model a node deletion mechanism that removes all of a node's links simultaneously, which corresponds to an individual departing from the network. Here we describe in detail the implementation of our model algorithm, which was found to be computationally efficient and to produce many empirically observed features of large-scale social networks. Thus this model opens a new perspective for studying such collective social phenomena as spreading, structure formation, and evolutionary processes.
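A simplified sketch of the model's three mechanisms (local search with link reinforcement, random global attachment, and node deletion) is given below using networkx. Parameter values and some details, such as how the local neighbour is chosen, are assumptions for illustration rather than the exact algorithm of the paper.

```python
# Hedged sketch of a weighted social-network growth model with local
# reinforcement, global attachment, and node departure.
import random
import networkx as nx

def evolve(n=200, steps=20000, p_global=0.001, p_delete=0.0005,
           delta=1.0, seed=0):
    rng = random.Random(seed)
    G = nx.Graph()
    G.add_nodes_from(range(n))
    for _ in range(steps):
        i = rng.randrange(n)
        nbrs = list(G[i])
        if nbrs:
            # Local search: step to a neighbour j, then to one of j's
            # neighbours k; reinforce the links used and close the triangle.
            j = rng.choice(nbrs)
            G[i][j]["weight"] += delta
            second = [k for k in G[j] if k != i]
            if second:
                k = rng.choice(second)
                G[j][k]["weight"] += delta
                if not G.has_edge(i, k):
                    G.add_edge(i, k, weight=1.0)
        if not nbrs or rng.random() < p_global:
            j = rng.randrange(n)            # focal closure: random new contact
            if j != i and not G.has_edge(i, j):
                G.add_edge(i, j, weight=1.0)
        if rng.random() < p_delete:         # node departure: drop all its links
            v = rng.randrange(n)
            G.remove_edges_from(list(G.edges(v)))
    return G

G = evolve()
print(G.number_of_edges(), "edges; mean clustering",
      round(nx.average_clustering(G), 2))
```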
European Healthy Cities evaluation: conceptual framework and methodology.
de Leeuw, Evelyne; Green, Geoff; Dyakova, Mariana; Spanswick, Lucy; Palmer, Nicola
2015-06-01
This paper presents the methodology, programme logic and conceptual framework that drove the evaluation of the Fifth Phase of the WHO European Healthy Cities Network. By the end of the phase, 99 cities had been designated, progressively through the life of the phase (2009-14). The paper establishes the values, systems and aspirations that these cities sign up for, as foundations for the selection of methodology. We assert that a realist synthesis methodology, driven by a wide range of qualitative and quantitative methods, is the most appropriate perspective to address the wide geopolitical, demographic, population and health diversities of these cities. The paper outlines the rationale for a structured multiple case study approach, the deployment of a comprehensive questionnaire, data mining through existing databases including Eurostat and analysis of management information generation tools used throughout the period. Response rates were considered extremely high for this type of research. Non-response analyses are described, which show that data are representative for cities across the spectrum of diversity. This paper provides a foundation for further analysis of specific areas of interest presented in this supplement. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Knopman, Debra S.; Voss, Clifford I.; Garabedian, Stephen P.
1991-01-01
Tests of a one-dimensional sampling design methodology on measurements of bromide concentration collected during the natural gradient tracer test conducted by the U.S. Geological Survey on Cape Cod, Massachusetts, demonstrate its efficacy for field studies of solute transport in groundwater and the utility of one-dimensional analysis. The methodology was applied to design of sparse two-dimensional networks of fully screened wells typical of those often used in engineering practice. In one-dimensional analysis, designs consist of the downstream distances to rows of wells oriented perpendicular to the groundwater flow direction and the timing of sampling to be carried out on each row. The power of a sampling design is measured by its effectiveness in simultaneously meeting objectives of model discrimination, parameter estimation, and cost minimization. One-dimensional models of solute transport, differing in processes affecting the solute and assumptions about the structure of the flow field, were considered for description of tracer cloud migration. When fitting each model using nonlinear regression, additive and multiplicative error forms were allowed for the residuals which consist of both random and model errors. The one-dimensional single-layer model of a nonreactive solute with multiplicative error was judged to be the best of those tested. Results show the efficacy of the methodology in designing sparse but powerful sampling networks. Designs that sample five rows of wells at five or fewer times in any given row performed as well for model discrimination as the full set of samples taken up to eight times in a given row from as many as 89 rows. Also, designs for parameter estimation judged to be good by the methodology were as effective in reducing the variance of parameter estimates as arbitrary designs with many more samples. Results further showed that estimates of velocity and longitudinal dispersivity in one-dimensional models based on data from only five rows of fully screened wells each sampled five or fewer times were practically equivalent to values determined from moments analysis of the complete three-dimensional set of 29,285 samples taken during 16 sampling times.
Using principal component analysis for selecting network behavioral anomaly metrics
NASA Astrophysics Data System (ADS)
Gregorio-de Souza, Ian; Berk, Vincent; Barsamian, Alex
2010-04-01
This work addresses new approaches to behavioral analysis of networks and hosts for the purposes of security monitoring and anomaly detection. Most commonly used approaches simply implement anomaly detectors for one, or a few, simple metrics, and those metrics can exhibit unacceptable false alarm rates. For instance, the anomaly score of network communication is defined as the reciprocal of the likelihood that a given host uses a particular protocol (or destination); this definition may result in an unrealistically high threshold for alerting to avoid being flooded by false positives. We demonstrate that selecting and adapting the metrics and thresholds on a host-by-host or protocol-by-protocol basis can be done by established multivariate analyses such as PCA. We will show how to determine one or more metrics, for each network host, that record the highest available amount of information regarding the baseline behavior and show relevant deviances reliably. We describe the methodology used to pick from a large selection of available metrics, and illustrate a method for comparing the resulting classifiers. Using our approach we are able to reduce the resources required to properly identify misbehaving hosts, protocols, or networks, by dedicating system resources to only those metrics that actually matter in detecting network deviations.
Udrescu, Lucreţia; Sbârcea, Laura; Topîrceanu, Alexandru; Iovanovici, Alexandru; Kurunczi, Ludovic; Bogdan, Paul; Udrescu, Mihai
2016-09-07
Analyzing drug-drug interactions may unravel previously unknown drug action patterns, leading to the development of new drug discovery tools. We present a new approach to analyzing drug-drug interaction networks, based on clustering and topological community detection techniques that are specific to complex network science. Our methodology uncovers functional drug categories along with the intricate relationships between them. Using modularity-based and energy-model layout community detection algorithms, we link the network clusters to 9 relevant pharmacological properties. Out of the 1141 drugs from the DrugBank 4.1 database, our extensive literature survey and cross-checking with other databases such as Drugs.com, RxList, and DrugBank 4.3 confirm the predicted properties for 85% of the drugs. As such, we argue that network analysis offers a high-level grasp on a wide area of pharmacological aspects, indicating possible unaccounted interactions and missing pharmacological properties that can lead to drug repositioning for the 15% of drugs that seem to be inconsistent with the predicted property. Also, by using network centralities, we can rank drugs according to their interaction potential for both simple and complex multi-pathology therapies. Moreover, our clustering approach can be extended for applications such as analyzing drug-target interactions or phenotyping patients in personalized medicine applications.
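As a toy illustration of the modularity-based clustering idea (not the DrugBank analysis itself), the sketch below builds a small interaction graph with networkx, detects communities, and ranks nodes by a simple centrality; all drug names and edges are invented.

```python
# Sketch: modularity-based community detection on a small drug-drug
# interaction graph. Drug names and edges are placeholders, not DrugBank data.
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

G = nx.Graph()
G.add_edges_from([
    ("warfarin", "aspirin"), ("warfarin", "ibuprofen"), ("aspirin", "ibuprofen"),
    ("sertraline", "tramadol"), ("sertraline", "fluoxetine"), ("tramadol", "fluoxetine"),
    ("ibuprofen", "tramadol"),   # a bridge between the two clusters
])

# Clusters stand in for candidate functional drug categories.
for i, community in enumerate(greedy_modularity_communities(G)):
    print(f"cluster {i}: {sorted(community)}")

# Degree centrality as a simple proxy for interaction potential.
ranking = sorted(nx.degree_centrality(G).items(), key=lambda kv: -kv[1])
print("most connected drugs:", ranking[:3])
```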
Lê, Gillian; Mirzoev, Tolib; Orgill, Marsha; Erasmus, Ermin; Lehmann, Uta; Okeyo, Stephen; Goudge, Jane; Maluka, Stephen; Uzochukwu, Benjamin; Aikins, Moses; de Savigny, Don; Tomson, Goran; Gilson, Lucy
2014-10-08
The importance of health policy and systems research and analysis (HPSR+A) has been increasingly recognised, but it is still unclear how most effectively to strengthen the capacity of the different organisations involved in this field. Universities are particularly crucial, but the expansive literature on capacity development has little to offer the unique needs of HPSR+A activity within universities, and often overlooks the pivotal contribution of capacity assessments to capacity strengthening. The Consortium for Health Policy and Systems Analysis in Africa 2011-2015 designed and implemented a new framework for capacity assessment for HPSR+A within universities. The methodology is reported in detail. Our reflections on developing and conducting the assessment generated four lessons for colleagues in the field. Notably, there are currently no published capacity assessment methodologies for HPSR+A that focus solely on universities - we report a first for the field to initiate the dialogue and exchange of experiences with others. Second, in HPSR+A, the unit of assessment can be a challenge, because HPSR+A groups within universities tend to overlap between academic departments and are embedded in different networks. Third, capacity assessment experience can itself be capacity strengthening, even when taking into account that doing such assessments requires capacity. From our experience, we propose that future systematic assessments of HPSR+A capacity need to focus on both capacity assets and needs and assess capacity at individual, organisational, and systems levels, whilst taking into account the networked nature of HPSR+A activity. A genuine partnership process between evaluators and those participating in an assessment can improve the quality of assessment and uptake of results in capacity strengthening.
Geospatial Data Integration for Assessing Landslide Hazard on Engineered Slopes
NASA Astrophysics Data System (ADS)
Miller, P. E.; Mills, J. P.; Barr, S. L.; Birkinshaw, S. J.
2012-07-01
Road and rail networks are essential components of national infrastructures, underpinning the economy, and facilitating the mobility of goods and the human workforce. Earthwork slopes such as cuttings and embankments are primary components, and their reliability is of fundamental importance. However, instability and failure can occur, through processes such as landslides. Monitoring the condition of earthworks is a costly and continuous process for network operators, and currently, geospatial data is largely underutilised. The research presented here addresses this by combining airborne laser scanning and multispectral aerial imagery to develop a methodology for assessing landslide hazard. This is based on the extraction of key slope stability variables from the remotely sensed data. The methodology is implemented through numerical modelling, which is parameterised with the slope stability information, simulated climate conditions, and geotechnical properties. This allows determination of slope stability (expressed through the factor of safety) for a range of simulated scenarios. Regression analysis is then performed in order to develop a functional model relating slope stability to the input variables. The remotely sensed raster datasets are robustly re-sampled to two-dimensional cross-sections to facilitate meaningful interpretation of slope behaviour and mapping of landslide hazard. Results are stored in a geodatabase for spatial analysis within a GIS environment. For a test site located in England, UK, results have shown the utility of the approach in deriving practical hazard assessment information. Outcomes were compared to the network operator's hazard grading data, and show general agreement. The utility of the slope information was also assessed with respect to auto-population of slope geometry, and found to deliver significant improvements over the network operator's existing field-based approaches.
Dos Santos Vasconcelos, Crhisllane Rafaele; de Lima Campos, Túlio; Rezende, Antonio Mauro
2018-03-06
Systematic analysis of a parasite interactome is a key approach to understand different biological processes. It makes it possible to elucidate disease mechanisms, to predict protein functions and to select promising targets for drug development. Currently, several approaches for protein interaction prediction for non-model species incorporate only small fractions of the entire proteomes and their interactions. Based on this perspective, this study presents an integration of computational methodologies, protein network predictions and comparative analysis of the protozoan species Leishmania braziliensis and Leishmania infantum. These parasites cause Leishmaniasis, a worldwide distributed and neglected disease, with limited treatment options using currently available drugs. The predicted interactions were obtained from a meta-approach, applying rigid body docking tests and template-based docking on protein structures predicted by different comparative modeling techniques. In addition, we trained a machine-learning algorithm (Gradient Boosting) using docking information performed on a curated set of positive and negative protein interaction data. Our final model obtained an AUC = 0.88, with recall = 0.69, specificity = 0.88 and precision = 0.83. Using this approach, it was possible to confidently predict 681 protein structures and 6198 protein interactions for L. braziliensis, and 708 protein structures and 7391 protein interactions for L. infantum. The predicted networks were integrated with protein interaction data already available, analyzed using several topological features and used to classify proteins as essential for network stability. The present study demonstrates the importance of integrating different interaction-prediction methodologies to increase the coverage of the protein interactomes of the studied species, and it makes available protein structures and interactions not previously reported.
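The machine-learning step can be sketched as follows with scikit-learn; the features are synthetic stand-ins for docking-derived scores, so the numbers bear no relation to the reported AUC of 0.88.

```python
# Sketch: gradient boosting on synthetic stand-ins for docking-derived features,
# reporting AUC, recall, and precision. Numbers are unrelated to the study.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import precision_score, recall_score, roc_auc_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, n_features=20, n_informative=8,
                           weights=[0.6, 0.4], random_state=1)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)

clf = GradientBoostingClassifier(random_state=1).fit(X_tr, y_tr)
proba = clf.predict_proba(X_te)[:, 1]   # predicted interaction likelihood
pred = clf.predict(X_te)

print("AUC      :", round(roc_auc_score(y_te, proba), 3))
print("recall   :", round(recall_score(y_te, pred), 3))
print("precision:", round(precision_score(y_te, pred), 3))
```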
ERIC Educational Resources Information Center
Sheffield, Jenna Pack; Kimme Hea, Amy C.
2016-01-01
While composition studies researchers have examined the ways social media are impacting our lives inside and outside of the classroom, less attention has been given to the ways in which social media--specifically Social Network Sites (SNSs)--may enhance our own research methods and methodologies by helping to combat research participant attrition…
Das, Abhik; Tyson, Jon; Pedroza, Claudia; Schmidt, Barbara; Gantz, Marie; Wallace, Dennis; Truog, William E; Higgins, Rosemary D
2016-10-01
Impressive advances in neonatology have occurred over the 30 years of life of The Eunice Kennedy Shriver National Institute of Child Health and Human Development Neonatal Research Network (NRN). However, substantial room for improvement remains in investigating and further developing the evidence base for improving outcomes among the extremely premature. We discuss some of the specific methodological challenges in the statistical design and analysis of randomized trials and observational studies in this population. Challenges faced by the NRN include designing trials for unusual or rare outcomes, accounting for and explaining center variations, identifying other subgroup differences, and balancing safety and efficacy concerns between short-term hospital outcomes and longer-term neurodevelopmental outcomes. In conclusion, the constellation of unique patient characteristics in neonates calls for broad understanding and careful consideration of the issues identified in this article for conducting rigorous studies in this population. Copyright © 2016 Elsevier Inc. All rights reserved.
Stefanović, Stefica Cerjan; Bolanča, Tomislav; Luša, Melita; Ukić, Sime; Rogošić, Marko
2012-02-24
This paper describes the development of an ad hoc methodology for determination of inorganic anions in oilfield water, since their composition often significantly differs from the average (concentration of components and/or matrix). Therefore, fast and reliable method development has to be performed in order to ensure the monitoring of desired properties under new conditions. The method development was based on a computer assisted multi-criteria decision making strategy. The criteria used were: maximal value of the objective functions, maximal robustness of the separation method, minimal analysis time, and maximal retention distance between the two nearest components. Artificial neural networks were used for modeling anion retention. The reliability of the developed method was extensively tested by the validation of performance characteristics. Based on validation results, the developed method shows satisfactory performance characteristics, proving the successful application of computer assisted methodology in the described case study. Copyright © 2011 Elsevier B.V. All rights reserved.
Online monitoring of seismic damage in water distribution systems
NASA Astrophysics Data System (ADS)
Liang, Jianwen; Xiao, Di; Zhao, Xinhua; Zhang, Hongwei
2004-07-01
It is shown that water distribution systems can be damaged by earthquakes, and seismic damage cannot easily be located, especially immediately after the events. Earthquake experiences show that accurate and quick location of seismic damage is critical to the emergency response of water distribution systems. This paper develops a methodology to locate seismic damage -- multiple breaks -- in a water distribution system by monitoring water pressure online at a limited number of positions in the system. For the purpose of online monitoring, supervisory control and data acquisition (SCADA) technology can readily be used. A neural network-based inverse analysis method is constructed for locating the seismic damage based on the variation of water pressure. The neural network is trained by using analytically simulated data from the water distribution system, and validated by using a set of data that have never been used in the training. It is found that the methodology provides an effective and practical way in which seismic damage in a water distribution system can be accurately and quickly located.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cameron, Maria K., E-mail: cameron@math.umd.edu
We develop computational tools for spectral analysis of stochastic networks representing energy landscapes of atomic and molecular clusters. Physical meaning and some properties of eigenvalues, left and right eigenvectors, and eigencurrents are discussed. We propose an approach to compute a collection of eigenpairs and corresponding eigencurrents describing the most important relaxation processes taking place in the system on its way to the equilibrium. It is suitable for large and complex stochastic networks where pairwise transition rates, given by the Arrhenius law, vary by orders of magnitude. The proposed methodology is applied to the network representing the Lennard-Jones-38 cluster created by Wales's group. Its energy landscape has a double funnel structure with a deep and narrow face-centered cubic funnel and a shallower and wider icosahedral funnel. However, the complete spectrum of the generator matrix of the Lennard-Jones-38 network has no appreciable spectral gap separating the eigenvalue corresponding to the escape from the icosahedral funnel. We provide a detailed description of the escape process from the icosahedral funnel using the eigencurrent and demonstrate a superexponential growth of the corresponding eigenvalue. The proposed spectral approach is compared to the methodology of the Transition Path Theory. Finally, we discuss whether the Lennard-Jones-38 cluster is metastable from the points of view of a mathematician and a chemical physicist, and make a connection with experimental works.
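The spectral machinery can be illustrated on a toy continuous-time network: build the generator matrix from pairwise rates, diagonalize it, and read relaxation times from the nonzero eigenvalues. The three-state rates below are invented and bear no relation to the Lennard-Jones-38 landscape.

```python
# Sketch: eigenvalues of the generator matrix of a small continuous-time
# stochastic network. The three-state rates are invented, not the LJ38 landscape.
import numpy as np

rates = np.array([
    [0.00, 2.00, 0.10],
    [1.00, 0.00, 0.50],
    [0.05, 0.30, 0.00],
])  # off-diagonal entries are pairwise transition rates k_ij

L = rates.copy()
np.fill_diagonal(L, -rates.sum(axis=1))   # generator rows sum to zero

eigvals = np.linalg.eigvals(L)
# The zero eigenvalue is the equilibrium mode; the others give relaxation times.
for lam in sorted(eigvals.real, reverse=True):
    tau = float("inf") if abs(lam) < 1e-12 else -1.0 / lam
    print(f"eigenvalue {lam: .4f}   relaxation time {tau:.2f}")
```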
A Fault Diagnosis Methodology for Gear Pump Based on EEMD and Bayesian Network
Liu, Zengkai; Liu, Yonghong; Shan, Hongkai; Cai, Baoping; Huang, Qing
2015-01-01
This paper proposes a fault diagnosis methodology for a gear pump based on the ensemble empirical mode decomposition (EEMD) method and the Bayesian network. Essentially, the presented scheme is a multi-source information fusion based methodology. Compared with the conventional fault diagnosis with only EEMD, the proposed method is able to take advantage of all useful information besides sensor signals. The presented diagnostic Bayesian network consists of a fault layer, a fault feature layer and a multi-source information layer. Vibration signals from sensor measurement are decomposed by the EEMD method and the energy of intrinsic mode functions (IMFs) are calculated as fault features. These features are added into the fault feature layer in the Bayesian network. The other sources of useful information are added to the information layer. The generalized three-layer Bayesian network can be developed by fully incorporating faults and fault symptoms as well as other useful information such as naked eye inspection and maintenance records. Therefore, diagnostic accuracy and capacity can be improved. The proposed methodology is applied to the fault diagnosis of a gear pump and the structure and parameters of the Bayesian network is established. Compared with artificial neural network and support vector machine classification algorithms, the proposed model has the best diagnostic performance when sensor data is used only. A case study has demonstrated that some information from human observation or system repair records is very helpful to the fault diagnosis. It is effective and efficient in diagnosing faults based on uncertain, incomplete information. PMID:25938760
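A small Python sketch of the feature-extraction step only: given intrinsic mode functions already produced by some EEMD implementation, compute and normalize per-IMF energies for use as fault features. The signal and the stand-in IMFs are synthetic, and the Bayesian-network fusion itself is not reproduced here.

```python
# Sketch of the feature step only: given IMFs from an EEMD decomposition
# (any EEMD implementation could supply them), compute per-IMF energies and
# normalize them for use as fault features. Signal and IMFs are synthetic.
import numpy as np

fs = 1000.0
t = np.arange(0.0, 1.0, 1.0 / fs)

# Stand-in for EEMD output: a stack of intrinsic mode functions (IMFs).
imfs = np.vstack([
    0.3 * np.sin(2 * np.pi * 200 * t),   # high-frequency mode (e.g. mesh impacts)
    1.0 * np.sin(2 * np.pi * 50 * t),    # dominant rotational component
    0.02 * t,                            # slow residual trend
])

energies = np.sum(imfs ** 2, axis=1)      # E_i = sum of squared samples
features = energies / energies.sum()      # normalized IMF energy distribution
print("normalized IMF energies:", np.round(features, 3))
```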
Improved classification of drainage networks using junction angles and secondary tributary lengths
NASA Astrophysics Data System (ADS)
Jung, Kichul; Marpu, Prashanth R.; Ouarda, Taha B. M. J.
2015-06-01
River networks in different regions have distinct characteristics generated by geological processes. These differences enable classification of drainage networks using several measures with many features of the networks. In this study, we propose a new approach that only uses the junction angles with secondary tributary lengths to directly classify different network types. This methodology is based on observations on 50 predefined channel networks. The cumulative distributions of secondary tributary lengths for different ranges of junction angles are used to obtain the descriptive values that are defined using a power-law representation. The averages of the values for the known networks are used to represent the classes, and any unclassified network can be classified based on the similarity of the representative values to those of the known classes. The methodology is applied to 10 networks in the United Arab Emirates and Oman and five networks in the USA, and the results are validated using the classification obtained with other methods.
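A hypothetical sketch of the descriptor-and-nearest-class idea: fit a power-law exponent to the cumulative distribution of secondary tributary lengths in each junction-angle bin, then assign an unknown network to the class with the closest descriptor vector. The angle bins, length samples, and class descriptors are invented for illustration.

```python
# Sketch: power-law descriptors from cumulative distributions of secondary
# tributary lengths binned by junction angle, then nearest-class assignment.
# Angle bins, length samples, and class descriptors are invented.
import numpy as np

def power_law_exponent(lengths):
    """Fit P(L >= l) ~ l**(-b) on a log-log scale and return b."""
    lengths = np.sort(np.asarray(lengths, dtype=float))
    ccdf = 1.0 - np.arange(len(lengths)) / len(lengths)   # exceedance probability
    slope, _ = np.polyfit(np.log(lengths), np.log(ccdf + 1e-12), 1)
    return -slope

def describe(network):
    """network: dict mapping an angle bin to its secondary tributary lengths."""
    return np.array([power_law_exponent(network[k]) for k in sorted(network)])

# Representative descriptor vectors for known classes (values are made up).
class_means = {"dendritic": np.array([1.2, 1.0, 0.8]),
               "trellis":   np.array([2.0, 1.6, 1.1])}

rng = np.random.default_rng(7)
unknown = {"000-060": rng.lognormal(1.0, 0.6, 200),
           "060-090": rng.lognormal(0.8, 0.6, 200),
           "090-180": rng.lognormal(0.6, 0.6, 200)}

d = describe(unknown)
label = min(class_means, key=lambda c: np.linalg.norm(d - class_means[c]))
print("descriptors:", np.round(d, 2), "-> classified as", label)
```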
MinePath: Mining for Phenotype Differential Sub-paths in Molecular Pathways.
Koumakis, Lefteris; Kanterakis, Alexandros; Kartsaki, Evgenia; Chatzimina, Maria; Zervakis, Michalis; Tsiknakis, Manolis; Vassou, Despoina; Kafetzopoulos, Dimitris; Marias, Kostas; Moustakis, Vassilis; Potamias, George
2016-11-01
Pathway analysis methodologies couple traditional gene expression analysis with knowledge encoded in established molecular pathway networks, offering a promising approach towards the biological interpretation of phenotype differentiating genes. Early pathway analysis methodologies, known as gene set analysis (GSA), view pathways just as plain lists of genes without taking into account either the underlying pathway network topology or the involved gene regulatory relations. These approaches, even if they achieve computational efficiency and simplicity, consider pathways that involve the same genes as equivalent in terms of their gene enrichment characteristics. Most recent pathway analysis approaches take into account the underlying gene regulatory relations by examining their consistency with gene expression profiles and computing a score for each profile. Even with this approach, assessing and scoring single-relations limits the ability to reveal key gene regulation mechanisms hidden in longer pathway sub-paths. We introduce MinePath, a pathway analysis methodology that addresses and overcomes the aforementioned problems. MinePath facilitates the decomposition of pathways into their constituent sub-paths. Decomposition leads to the transformation of single-relations to complex regulation sub-paths. Regulation sub-paths are then matched with gene expression sample profiles in order to evaluate their functional status and to assess phenotype differential power. Assessment of differential power supports the identification of the most discriminant profiles. In addition, MinePath assesses the significance of the pathways as a whole, ranking them by their p-values. Comparison results with state-of-the-art pathway analysis systems are indicative of the soundness and reliability of the MinePath approach. In contrast with many pathway analysis tools, MinePath is a web-based system (www.minepath.org) offering dynamic and rich pathway visualization functionality, with the unique characteristic of coloring regulatory relations between genes and revealing their phenotype inclination. This unique characteristic makes MinePath a valuable tool for in silico molecular biology experimentation as it serves the biomedical researchers' exploratory needs to reveal and interpret the regulatory mechanisms that underlie and putatively govern the expression of target phenotypes.
Cruz-Garza, Jesus G; Hernandez, Zachery R; Tse, Teresa; Caducoy, Eunice; Abibullaev, Berdakh; Contreras-Vidal, Jose L
2015-10-04
Understanding typical and atypical development remains one of the fundamental questions in developmental human neuroscience. Traditionally, experimental paradigms and analysis tools have been limited to constrained laboratory tasks and contexts due to technical limitations imposed by the available set of measuring and analysis techniques and the age of the subjects. These limitations severely restrict the study of developmental neural dynamics and associated neural networks engaged in cognition, perception and action in infants performing "in action and in context". This protocol presents a novel approach to study infants and young children as they freely organize their own behavior, and its consequences in a complex, partly unpredictable and highly dynamic environment. The proposed methodology integrates synchronized high-density active scalp electroencephalography (EEG), inertial measurement units (IMUs), video recording and behavioral analysis to capture brain activity and movement non-invasively in freely-behaving infants. This setup allows for the study of neural network dynamics in the developing brain, in action and context, as these networks are recruited during goal-oriented, exploration and social interaction tasks.
Quantitative 3D analysis of the canal network in cortical bone by micro-computed tomography.
Cooper, D M L; Turinsky, A L; Sensen, C W; Hallgrímsson, B
2003-09-01
Cortical bone is perforated by an interconnected network of porous canals that facilitate the distribution of neurovascular structures throughout the cortex. This network is an integral component of cortical microstructure and, therefore, undergoes continual change throughout life as the cortex is remodeled. To date, the investigation of cortical microstructure, including the canal network, has largely been limited to the two-dimensional (2D) realm due to methodological hurdles. Thanks to continuing improvements in scan resolution, micro-computed tomography (μCT) is the first nondestructive imaging technology capable of resolving cortical canals. Like its application to trabecular bone, μCT provides an efficient means of quantifying aspects of 3D architecture of the canal network. Our aim here is to introduce the use of μCT for this application by providing examples, discussing some of the parameters that can be acquired, and relating these to research applications. Although several parameters developed for the analysis of trabecular microstructure are suitable for the analysis of cortical porosity, the algorithm used to estimate connectivity is not. We adapt existing algorithms based on skeletonization for this task. We believe that 3D analysis of the dimensions and architecture of the canal network will provide novel information relevant to many aspects of bone biology. For example, parameters related to the size, spacing, and volume of the canals may be particularly useful for investigation of the mechanical properties of bone. Alternatively, parameters describing the 3D architecture of the canal network, such as connectivity between the canals, may provide a means of evaluating cumulative remodeling related change. Copyright 2003 Wiley-Liss, Inc.
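As a rough illustration of the 3D quantification step (not the skeletonization-based connectivity algorithm described above), the following sketch segments low-attenuation voxels in a synthetic volume, labels connected canal components, and reports a canal volume fraction; the volume and thresholds are invented.

```python
# Sketch: segment low-attenuation voxels in a synthetic volume, label connected
# canal components, and report a canal volume fraction. This stands in for, but
# does not reproduce, the skeleton-based connectivity analysis described above.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(3)
volume = rng.normal(loc=1000.0, scale=50.0, size=(64, 64, 64))  # fake attenuation

# Carve two synthetic "canals" of low attenuation through the cortex.
volume[20, 10:55, 30] = 200.0
volume[40, 32, 5:60] = 200.0

canals = volume < 500.0                         # binary canal mask
labels, n_components = ndimage.label(canals)    # 6-connectivity by default
print("connected canal components:", n_components)
print("canal volume fraction:", round(float(canals.mean()), 5))
```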
A review of machine learning in obesity.
DeGregory, K W; Kuiper, P; DeSilvio, T; Pleuss, J D; Miller, R; Roginski, J W; Fisher, C B; Harness, D; Viswanath, S; Heymsfield, S B; Dungan, I; Thomas, D M
2018-05-01
Rich sources of obesity-related data arising from sensors, smartphone apps, electronic medical health records and insurance data can bring new insights for understanding, preventing and treating obesity. For such large datasets, machine learning provides sophisticated and elegant tools to describe, classify and predict obesity-related risks and outcomes. Here, we review machine learning methods that predict and/or classify, such as linear and logistic regression, artificial neural networks, deep learning and decision tree analysis. We also review methods that describe and characterize data, such as cluster analysis, principal component analysis, network science and topological data analysis. We introduce each method with a high-level overview followed by examples of successful applications. The algorithms were then applied to the National Health and Nutrition Examination Survey to demonstrate methodology, utility and outcomes. The strengths and limitations of each method were also evaluated. This summary of machine learning algorithms provides a unique overview of the state of data analysis applied specifically to obesity. © 2018 World Obesity Federation.
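A minimal predictive example in the spirit of this review, using scikit-learn on synthetic survey-style variables; the variable names and label rule are invented, and it does not use NHANES data or reproduce any reported result.

```python
# Sketch: simple predictive models on synthetic survey-style variables.
# Variable names and the label rule are invented; this is not NHANES data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(42)
n = 1500
age = rng.uniform(18, 80, n)
activity = rng.uniform(0, 10, n)            # hours of activity per week
intake = rng.normal(2200, 400, n)           # kcal per day
X = np.column_stack([age, activity, intake])

# Synthetic obesity label loosely tied to intake and inactivity.
y = ((intake / 400.0 - activity + rng.normal(0, 1, n)) > 1.0).astype(int)

models = [("logistic regression", LogisticRegression(max_iter=1000)),
          ("decision tree", DecisionTreeClassifier(max_depth=4, random_state=0))]
for name, model in models:
    acc = cross_val_score(model, X, y, cv=5).mean()
    print(f"{name:20s} 5-fold accuracy: {acc:.3f}")
```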
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hughes, Justin Matthew
These are the slides for a graduate presentation at Mississippi State University. It covers the following: the BRL Shaped-Charge Geometry in PAGOSA, mesh refinement study, surrogate modeling using a radial basis function network (RBFN), ruling out parameters using sensitivity analysis (equation of state study), uncertainty quantification (UQ) methodology, and sensitivity analysis (SA) methodology. In summary, a mesh convergence study was used to ensure that solutions were numerically stable by comparing PDV data between simulations. A Design of Experiments (DOE) method was used to reduce the simulation space to study the effects of the Jones-Wilkins-Lee (JWL) Parameters for the Composition B main charge. Uncertainty was quantified by computing the 95% data range about the median of simulation output using a brute force Monte Carlo (MC) random sampling method. Parameter sensitivities were quantified using the Fourier Amplitude Sensitivity Test (FAST) spectral analysis method where it was determined that detonation velocity, initial density, C1, and B1 controlled jet tip velocity.
Efficient Process Migration for Parallel Processing on Non-Dedicated Networks of Workstations
NASA Technical Reports Server (NTRS)
Chanchio, Kasidit; Sun, Xian-He
1996-01-01
This paper presents the design and preliminary implementation of MpPVM, a software system that supports process migration for PVM application programs in a non-dedicated heterogeneous computing environment. New concepts of migration point as well as migration point analysis and necessary data analysis are introduced. In MpPVM, process migrations occur only at previously inserted migration points. Migration point analysis determines appropriate locations to insert migration points, whereas necessary data analysis provides a minimum set of variables to be transferred at each migration point. A new methodology to perform reliable point-to-point data communications in a migration environment is also discussed. Finally, a preliminary implementation of MpPVM and its experimental results are presented, showing the correctness and promising performance of our process migration mechanism in a scalable non-dedicated heterogeneous computing environment. While MpPVM is developed on top of PVM, the process migration methodology introduced in this study is general and can be applied to any distributed software environment.
Foreign currency rate forecasting using neural networks
NASA Astrophysics Data System (ADS)
Pandya, Abhijit S.; Kondo, Tadashi; Talati, Amit; Jayadevappa, Suryaprasad
2000-03-01
Neural networks are increasingly being used as a forecasting tool in many forecasting problems. This paper discusses the application of neural networks in predicting daily foreign exchange rates between the USD, GBP, and DEM. We approach the problem from a time-series analysis framework - where future exchange rates are forecasted solely using past exchange rates. This relies on the belief that past prices and future prices are closely related and interdependent. We present the result of training a neural network with historical USD-GBP data. The methodology used is explained, as well as the training process. We discuss the selection of inputs to the network, and present a comparison of using the actual exchange rates and the exchange rate differences as inputs. Price and rate differences are the preferred way of training neural networks in financial applications. Results of both approaches are presented together for comparison. We show that the network is able to learn the trends in the exchange rate movements correctly, and present the results of the prediction over several periods of time.
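A small sketch of the recommended input encoding: train a neural network on lagged exchange-rate differences and rebuild the forecast rate from the predicted difference. The rate series is simulated, not historical USD-GBP data, and the network size is an arbitrary choice.

```python
# Sketch: a small neural network trained on lagged exchange-rate differences,
# the input encoding recommended above. The rate series is simulated.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
rate = 1.30 + np.cumsum(rng.normal(0.0, 0.002, 1000))   # synthetic daily rate
diff = np.diff(rate)                                    # differences as inputs

lags = 5
X = np.array([diff[i:i + lags] for i in range(len(diff) - lags)])
y = diff[lags:]                                         # next-day difference

split = 800
model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
model.fit(X[:split], y[:split])

pred_diff = model.predict(X[split:])
pred_rate = rate[lags + split:-1] + pred_diff           # rebuild forecast rate
actual_rate = rate[lags + split + 1:]
print("mean absolute error on the rate:", np.abs(pred_rate - actual_rate).mean())
```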
Van Landeghem, Sofie; Van Parys, Thomas; Dubois, Marieke; Inzé, Dirk; Van de Peer, Yves
2016-01-05
Differential networks have recently been introduced as a powerful way to study the dynamic rewiring capabilities of an interactome in response to changing environmental conditions or stimuli. Currently, such differential networks are generated and visualised using ad hoc methods, and are often limited to the analysis of only one condition-specific response or one interaction type at a time. In this work, we present a generic, ontology-driven framework to infer, visualise and analyse an arbitrary set of condition-specific responses against one reference network. To this end, we have implemented novel ontology-based algorithms that can process highly heterogeneous networks, accounting for both physical interactions and regulatory associations, symmetric and directed edges, edge weights and negation. We propose this integrative framework as a standardised methodology that allows a unified view on differential networks and promotes comparability between differential network studies. As an illustrative application, we demonstrate its usefulness on a plant abiotic stress study and we experimentally confirmed a predicted regulator. Diffany is freely available as open-source java library and Cytoscape plugin from http://bioinformatics.psb.ugent.be/supplementary_data/solan/diffany/.
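Stripped of the ontology handling, edge directions, and negation that Diffany supports, the core differential-network comparison can be sketched as follows; the gene names, edges, and weights are invented and the rewiring threshold is arbitrary.

```python
# Sketch: a bare-bones differential comparison of one condition-specific
# network against a reference network. Gene names, edges, and weights are
# invented; ontologies, edge directions, and negation are not handled here.
import networkx as nx

reference = nx.Graph()
reference.add_weighted_edges_from([("geneA", "geneB", 0.9),
                                   ("geneB", "geneC", 0.7),
                                   ("geneC", "geneD", 0.5)])

stress = nx.Graph()
stress.add_weighted_edges_from([("geneA", "geneB", 0.2),    # weakened
                                ("geneB", "geneC", 0.7),    # unchanged
                                ("geneD", "geneE", 0.8)])   # gained under stress

gained = [e for e in stress.edges if not reference.has_edge(*e)]
lost = [e for e in reference.edges if not stress.has_edge(*e)]
rewired = [(u, v) for u, v in reference.edges
           if stress.has_edge(u, v)
           and abs(reference[u][v]["weight"] - stress[u][v]["weight"]) > 0.3]

print("gained:", gained)
print("lost:", lost)
print("strongly rewired:", rewired)
```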
Reasoning and Knowledge Acquisition Framework for 5G Network Analytics.
Sotelo Monge, Marco Antonio; Maestre Vidal, Jorge; García Villalba, Luis Javier
2017-10-21
Autonomic self-management is a key challenge for next-generation networks. This paper proposes an automated analysis framework to infer knowledge in 5G networks with the aim of understanding the network status and predicting potential situations that might disrupt the network operability. The framework is based on the Endsley situational awareness model, and integrates automated capabilities for metrics discovery, pattern recognition, prediction techniques and rule-based reasoning to infer anomalous situations in the current operational context. Those situations should then be mitigated, either proactively or reactively, by a more complex decision-making process. The framework is driven by a use case methodology, where the network administrator is able to customize the knowledge inference rules and operational parameters. The proposal has also been instantiated to prove its adaptability to a real use case. To this end, a reference network traffic dataset was used to identify suspicious patterns and to predict the behavior of the monitored data volume. The preliminary results suggest a good level of accuracy on the inference of anomalous traffic volumes based on a simple configuration.
Fracture mechanics analysis of cracked structures using weight function and neural network method
NASA Astrophysics Data System (ADS)
Chen, J. G.; Zang, F. G.; Yang, Y.; Shi, K. K.; Fu, X. L.
2018-06-01
Stress intensity factors (SIFs) due to thermal-mechanical loads have been established using the weight function method. Two reference stress states were used to determine the coefficients in the weight function. Results were evaluated using data from the literature and show good agreement. Thus, the SIFs for cracks subjected to arbitrary loads can be determined quickly using the obtained weight function, and the presented method can be used for probabilistic fracture mechanics analysis. A probabilistic methodology combining Monte Carlo simulation with a neural network (MCNN) has been developed. The results indicate that an accurate probabilistic characteristic of KI can be obtained using the developed method. The probability of failure increases with increasing load, and the relationship between them is nonlinear.
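A hedged sketch of the probabilistic part: a Monte Carlo estimate of the probability that the mode-I stress intensity factor exceeds the fracture toughness. A textbook K_I = Y * sigma * sqrt(pi * a) stands in for the paper's weight-function solution, and a trained surrogate (such as the neural network mentioned above) could replace it inside the sampling loop; all distributions and constants below are assumptions.

```python
# Sketch: Monte Carlo estimate of the probability that the mode-I stress
# intensity factor exceeds the fracture toughness. A textbook
# K_I = Y * sigma * sqrt(pi * a) stands in for the weight-function solution;
# all distributions and constants below are assumptions.
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

sigma = rng.normal(180.0, 25.0, n)            # applied stress, MPa
a = rng.lognormal(np.log(0.02), 0.3, n)       # crack depth, m
Y = 1.12                                      # geometry factor (assumed)
K_IC = 60.0                                   # fracture toughness, MPa*sqrt(m)

K_I = Y * sigma * np.sqrt(np.pi * a)          # MPa*sqrt(m)
print("estimated probability of failure:", np.mean(K_I > K_IC))
```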
A multi-phase network situational awareness cognitive task analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Erbacher, Robert; Frincke, Deborah A.; Wong, Pak C.
Abstract The goal of our project is to create a set of next-generation cyber situational-awareness capabilities with applications to other domains in the long term. The objective is to improve the decision-making process to enable decision makers to choose better actions. To this end, we put extensive effort into making certain that we had feedback from network analysts and managers and understand what their genuine needs are. This article discusses the cognitive task-analysis methodology that we followed to acquire feedback from the analysts. This article also provides the details we acquired from the analysts on their processes, goals, concerns, and the data and metadata that they analyze. Finally, we describe the generation of a novel task-flow diagram representing the activities of the target user base.
Using Networks To Understand Medical Data: The Case of Class III Malocclusions
Scala, Antonio; Auconi, Pietro; Scazzocchio, Marco; Caldarelli, Guido; McNamara, James A.; Franchi, Lorenzo
2012-01-01
A system of elements that interact or regulate each other can be represented by a mathematical object called a network. While network analysis has been successfully applied to high-throughput biological systems, less has been done regarding their application in more applied fields of medicine; here we show an application based on standard medical diagnostic data. We apply network analysis to Class III malocclusion, one of the most difficult orofacial anomalies to understand and treat. We hypothesize that different interactions of the skeletal components can contribute to pathological disequilibrium; in order to test this hypothesis, we apply network analysis to 532 Class III young female patients. The topology of the Class III malocclusion obtained by network analysis shows a strong co-occurrence of abnormal skeletal features. The pattern of these occurrences influences the vertical and horizontal balance of disharmony in skeletal form and position. Patients with more unbalanced orthodontic phenotypes show preponderance of the pathological skeletal nodes and minor relevance of adaptive dentoalveolar equilibrating nodes. Furthermore, by applying Power Graphs analysis we identify some functional modules among orthodontic nodes. These modules correspond to groups of tightly inter-related features and presumably constitute the key regulators of plasticity and the sites of unbalance of the growing dentofacial Class III system. The data of the present study show that, in their most basic abstraction level, the orofacial characteristics can be represented as graphs using nodes to represent orthodontic characteristics, and edges to represent their various types of interactions. The applications of this mathematical model could improve the interpretation of the quantitative, patient-specific information, and help to better target therapy. Last but not least, the methodology we have applied in analyzing orthodontic features can be applied easily to other fields of medical science. PMID:23028552
Döring, Clemens; Hussein, Mohamed A; Jekle, Mario; Becker, Thomas
2017-08-15
For rye dough structure, it is hypothesised that the presence of arabinoxylan hinders the proteins from forming a coherent network. This hypothesis was investigated using fluorescent-stained antibodies that bind to the arabinoxylan chains. Image analysis proves that the arabinoxylan surrounds the proteins, negatively affecting protein networking. Further, it is hypothesised that the dosing of xylanase and transglutaminase has a positive impact on rye dough and bread characteristics; the findings in this study evidenced that this increases the protein network by up to 38% accompanied by a higher volume rise of 10.67%, compared to standard rye dough. These outcomes combine a product-oriented and physiochemical design of a recipe, targeting structural and functional relationships, and demonstrate a successful methodology for enhancing rye bread quality. Copyright © 2017 Elsevier Ltd. All rights reserved.
Raja, Muhammad Asif Zahoor; Khan, Junaid Ali; Ahmad, Siraj-ul-Islam; Qureshi, Ijaz Mansoor
2012-01-01
A methodology for the solution of Painlevé equation-I is presented using a computational intelligence technique based on neural networks and particle swarm optimization hybridized with an active set algorithm. The mathematical model of the equation is developed with the help of a linear combination of feed-forward artificial neural networks that defines the unsupervised error of the model. This error is minimized subject to the availability of appropriate weights of the networks. The learning of the weights is carried out using a particle swarm optimization algorithm, employed as a viable global search method, hybridized with an active set algorithm for rapid local convergence. The accuracy, convergence rate, and computational complexity of the scheme are analyzed based on a large number of independent runs and their comprehensive statistical analysis. Comparative studies of the results obtained are made with MATHEMATICA solutions, as well as with the variational iteration method and the homotopy perturbation method. PMID:22919371
ERIC Educational Resources Information Center
Gruzd, Anatoliy; Paulin, Drew; Haythornthwaite, Caroline
2016-01-01
In just a short period, social media have altered many aspects of our daily lives, from how we form and maintain social relationships to how we discover, access, and share information online. Now social media are also affecting how we teach and learn. In this paper, we discuss methods that can help researchers and educators evaluate and understand…
Stability of ecological industry chain: an entropy model approach.
Wang, Qingsong; Qiu, Shishou; Yuan, Xueliang; Zuo, Jian; Cao, Dayong; Hong, Jinglan; Zhang, Jian; Dong, Yong; Zheng, Ying
2016-07-01
A novel methodology is proposed in this study to examine the stability of the ecological industry chain network based on entropy theory. This methodology is developed according to the associated dissipative structure characteristics, i.e., complexity, openness, and nonlinearity. As defined in the methodology, the network organization is the object, while the main focus is the identification of core enterprises and core industry chains. It is proposed that the chain network should be established around the core enterprise, while supplementation to the core industry chain helps to improve system stability, which is verified quantitatively. The relational entropy model can be used to identify the core enterprise and the core eco-industry chain. It could determine the core of the network organization and the core eco-industry chain through the link form and direction of node enterprises. Similarly, the conductive mechanism of different node enterprises can be examined quantitatively despite the absence of key data. The structural entropy model can be employed to solve the problem of order degree for the network organization. Results showed that the stability of the entire system could be enhanced by the supplemented chain around the core enterprise in the eco-industry chain network organization. As a result, the sustainability of the entire system could be further improved.
Classification of Alzheimer’s Patients through Ubiquitous Computing †
Nieto-Reyes, Alicia; Duque, Rafael; Montaña, José Luis; Lage, Carmen
2017-01-01
Functional data analysis and artificial neural networks are the building blocks of the proposed methodology that distinguishes the movement patterns among Alzheimer's patients at different stages of the disease and classifies new patients to their appropriate stage of the disease. The movement patterns are obtained by the accelerometer device of android smartphones that the patients carry while moving freely. The proposed methodology is relevant in that it is flexible on the type of data to which it is applied. To exemplify this, a novel real three-dimensional functional dataset is analyzed in which each datum is observed on a different time domain. Not only is each datum observed at a different frequency, but the domain of each datum also has a different length. The obtained classification success rate of 83% indicates the potential of the proposed methodology. PMID:28753975
NASA Astrophysics Data System (ADS)
Wang, Jiang; Yang, Chen; Wang, Ruofan; Yu, Haitao; Cao, Yibin; Liu, Jing
2016-10-01
In this paper, EEG series are applied to construct functional connections with the correlation between different regions in order to investigate the nonlinear characteristic and the cognitive function of the brain with Alzheimer's disease (AD). First, limited penetrable visibility graph (LPVG) and phase space method map single EEG series into networks, and investigate the underlying chaotic system dynamics of AD brain. Topological properties of the networks are extracted, such as average path length and clustering coefficient. It is found that the network topology of AD in several local brain regions differs from that of the control group, with no statistically significant difference existing over the whole brain. Furthermore, in order to detect the abnormality of AD brain as a whole, functional connections among different brain regions are reconstructed based on similarity of clustering coefficient sequence (CCSS) of EEG series in the four frequency bands (delta, theta, alpha, and beta), which exhibit obvious small-world properties. Graph analysis demonstrates that for both methodologies, the functional connections between regions of AD brain decrease, particularly in the alpha frequency band. AD causes the graph index complexity of the functional network to decrease, the small-world properties to weaken, and the vulnerability to increase. The obtained results show that the brain functional network constructed by LPVG and phase space method might be more effective to distinguish AD from the normal control than the analysis of single series, which is helpful for revealing the underlying pathological mechanism of the disease.
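For illustration only, the following sketch builds a functional network from multichannel series by thresholding pairwise correlations and computes the clustering coefficient and average path length; the series are synthetic, and the study itself derives edges from LPVG and phase-space clustering-coefficient similarity rather than raw correlation.

```python
# Sketch: build a functional network from multichannel series by thresholding
# pairwise correlations, then compute the graph metrics discussed above.
# The series are synthetic; the correlation threshold is an assumption.
import numpy as np
import networkx as nx

rng = np.random.default_rng(5)
n_channels, n_samples = 16, 2000
common = rng.normal(size=n_samples)                       # shared activity
eeg = 0.7 * common + rng.normal(size=(n_channels, n_samples))

corr = np.corrcoef(eeg)
G = nx.Graph()
G.add_nodes_from(range(n_channels))
for i in range(n_channels):
    for j in range(i + 1, n_channels):
        if corr[i, j] > 0.2:                  # threshold defines functional links
            G.add_edge(i, j, weight=float(corr[i, j]))

print("edges:", G.number_of_edges())
print("average clustering:", round(nx.average_clustering(G), 3))
if nx.is_connected(G):
    print("average path length:", round(nx.average_shortest_path_length(G), 3))
```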
Duell, L.F.
1987-01-01
A basinwide ideal network and an actual network were designed to identify ambient groundwater quality, trends in groundwater quality, and degree of threat from potential pollution sources in Antelope Valley, California. In general, throughout the valley groundwater quality has remained unchanged, and no specific trends are apparent. The main source of groundwater for the valley is generally suitable for domestic, irrigation, and most industrial uses. Water quality data for selected constituents of some network wells and surface-water sites are presented. The ideal network of 77 sites was selected on the basis of site-specific criteria, geohydrology, and current land use (agricultural, residential, and industrial). These sites were used as a guide in the design of the actual network consisting of 44 existing wells. Wells currently being monitored were selected whenever possible because of budgetary constraints. Of the remaining ideal sites, 20 have existing wells not part of a current water quality network, and 13 are locations where no wells exist. The methodology used for the selection of sites, constituents monitored, and frequency of analysis will enable network users to make appropriate future changes to the monitoring network. (USGS)
Correlations between Community Structure and Link Formation in Complex Networks
Liu, Zhen; He, Jia-Lin; Kapoor, Komal; Srivastava, Jaideep
2013-01-01
Background Links in complex networks commonly represent specific ties between pairs of nodes, such as protein-protein interactions in biological networks or friendships in social networks. However, understanding the mechanism of link formation in complex networks is a long standing challenge for network analysis and data mining. Methodology/Principal Findings Links in complex networks have a tendency to cluster locally and form so-called communities. This widespread phenomenon reflects an underlying mechanism of link formation. To study the correlations between community structure and link formation, we present a general computational framework including a theory for network partitioning and link probability estimation. Our approach enables us to accurately identify missing links in partially observed networks in an efficient way. The links having high connection likelihoods in the communities reveal that links are formed preferentially to create cliques and accordingly promote the clustering level of the communities. The experimental results verify that such a mechanism can be well captured by our approach. Conclusions/Significance Our findings provide a new insight into understanding how links are created in the communities. The computational framework opens a wide range of possibilities to develop new approaches and applications, such as community detection and missing link prediction. PMID:24039818
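One simple instantiation of the idea that links form preferentially inside communities is sketched below: partition a benchmark graph, then score non-adjacent pairs by common neighbours with a bonus for pairs in the same community. The scoring rule is an illustrative assumption, not the estimator proposed in the paper.

```python
# Sketch: score candidate links by common neighbours plus a bonus for pairs in
# the same detected community. The scoring rule is illustrative only.
from itertools import combinations

import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

G = nx.karate_club_graph()
communities = greedy_modularity_communities(G)
member = {v: i for i, c in enumerate(communities) for v in c}

scores = []
for u, v in combinations(G.nodes, 2):
    if G.has_edge(u, v):
        continue
    common = len(list(nx.common_neighbors(G, u, v)))
    bonus = 1.0 if member[u] == member[v] else 0.0   # favour intra-community pairs
    scores.append((common + bonus, u, v))

scores.sort(reverse=True)
print("top candidate missing links:", [(u, v) for _, u, v in scores[:5]])
```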
NASA Astrophysics Data System (ADS)
Dimond, David A.; Burgess, Robert; Barrios, Nolan; Johnson, Neil D.
2000-05-01
Traditionally, to guarantee the network performance of medical image data transmission, imaging traffic was isolated on a separate network. Organizations are depending on a new generation of multi-purpose networks to transport both normal information and image traffic as they expand access to images throughout the enterprise. These organizations want to leverage their existing infrastructure for imaging traffic, but are not willing to accept degradations in overall network performance. To guarantee 'on demand' network performance for image transmissions anywhere at any time, networks need to be designed with the ability to 'carve out' bandwidth for specific applications and to minimize the chances of network failures. This paper will present the methodology Cincinnati Children's Hospital Medical Center (CHMC) used to enhance the physical and logical network design of the existing hospital network to guarantee a class of service for imaging traffic. PACS network designs should utilize the existing enterprise local area network (LAN) infrastructure where appropriate. Logical separation or segmentation provides the application independence from other clinical and administrative applications as required, ensuring bandwidth and service availability.
A Methodology for Phased Array Radar Threshold Modeling Using the Advanced Propagation Model (APM)
2017-10-01
Technical Report 3079, October 2017, Networks Division. Executive summary: this report summarizes the methodology developed to improve radar threshold modeling for a phased array radar configuration using the Advanced Propagation Model (APM).
Solution to the indexing problem of frequency domain simulation experiments
NASA Technical Reports Server (NTRS)
Mitra, Mousumi; Park, Stephen K.
1991-01-01
A frequency domain simulation experiment is one in which selected system parameters are oscillated sinusoidally to induce oscillations in one or more system statistics of interest. A spectral (Fourier) analysis of these induced oscillations is then performed. To perform this spectral analysis, all oscillation frequencies must be referenced to a common, independent variable - an oscillation index. In a discrete-event simulation, the global simulation clock is the most natural choice for the oscillation index. However, past efforts to reference all frequencies to the simulation clock generally yielded unsatisfactory results. The reason for these unsatisfactory results is explained in this paper and a new methodology which uses the simulation clock as the oscillation index is presented. Techniques for implementing this new methodology are demonstrated by performing a frequency domain simulation experiment for a network of queues.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1988-08-01
The objective of this report is to develop a generalized methodology for examining water distribution systems for adjustable speed drive (ASD) applications and to provide an example (the City of Chicago 68th Street Water Pumping Station) using the methodology. The City of Chicago water system was chosen as the candidate for analysis because it has a large service area distribution network with no storage provisions after the distribution pumps. Many industrial motors operate at only one speed or a few speeds. By speeding up or slowing down, ASDs achieve gentle startups and gradual shutdowns thereby providing plant equipment a longer life with fewer breakdowns while minimizing the energy requirements. The test program substantiated that ASDs enhance product quality and increase productivity in many industrial operations, including extended equipment life. 35 figs.
2010-04-01
DRDC Atlantic CR 2010-058. Fragment of the methodological results and details, section 4.1.3.1 (Clock Synchronization, Network & Temporal Resolution), noting drift in computer clock times, especially on laptops.
Kumar, Avishek; Butler, Brandon M.; Kumar, Sudhir; Ozkan, S. Banu
2016-01-01
Summary Sequencing technologies are revealing many new non-synonymous single nucleotide variants (nsSNVs) in each personal exome. To assess their functional impacts, comparative genomics is frequently employed to predict if they are benign or not. However, evolutionary analysis alone is insufficient, because it misdiagnoses many disease-associated nsSNVs, such as those at positions involved in protein interfaces, and because evolutionary predictions do not provide mechanistic insights into functional change or loss. Structural analyses can aid in overcoming both of these problems by incorporating conformational dynamics and allostery in nsSNV diagnosis. Finally, protein-protein interaction networks using systems-level methodologies shed light on disease etiology and pathogenesis. Bridging these network approaches with structurally resolved protein interactions and dynamics will advance genomic medicine. PMID:26684487
Modeling the resilience of critical infrastructure: the role of network dependencies.
Guidotti, Roberto; Chmielewski, Hana; Unnikrishnan, Vipin; Gardoni, Paolo; McAllister, Therese; van de Lindt, John
2016-01-01
Water and wastewater network, electric power network, transportation network, communication network, and information technology network are among the critical infrastructure in our communities; their disruption during and after hazard events greatly affects communities' well-being, economic security, social welfare, and public health. In addition, a disruption in one network may cause disruption to other networks and lead to their reduced functionality. This paper presents a unified theoretical methodology for the modeling of dependent/interdependent infrastructure networks and incorporates it in a six-step probabilistic procedure to assess their resilience. Both the methodology and the procedure are general, can be applied to any infrastructure network and hazard, and can model different types of dependencies between networks. As an illustration, the paper models the direct effects of seismic events on the functionality of a potable water distribution network and the cascading effects of the damage of the electric power network (EPN) on the potable water distribution network (WN). The results quantify the loss of functionality and delay in the recovery process due to dependency of the WN on the EPN. The results show the importance of capturing the dependency between networks in modeling the resilience of critical infrastructure.
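A minimal, hedged sketch of the cascading-dependency idea (not the authors' six-step probabilistic procedure; node names, the dependency map, and the damage scenario are invented) is given below: substation damage in the electric power network removes the pumps that depend on it, and water-network functionality is measured as the share of demand nodes still reachable from the source.

```python
# Damage to electric power nodes propagates to dependent water-network pumps;
# functionality is the share of demand nodes still served.
import networkx as nx

# Water network: source -> pumps -> demand nodes
wn = nx.DiGraph()
wn.add_edges_from([("source", "pump1"), ("source", "pump2"),
                   ("pump1", "d1"), ("pump1", "d2"),
                   ("pump2", "d3"), ("pump2", "d4")])

# Dependency: each pump needs a specific electric substation to operate.
power_for_pump = {"pump1": "sub_A", "pump2": "sub_B"}
damaged_substations = {"sub_A"}          # assumed seismic damage scenario

def served_fraction(wn, power_for_pump, damaged):
    g = wn.copy()
    # Cascading effect: pumps whose substation is damaged are removed.
    g.remove_nodes_from([p for p, s in power_for_pump.items() if s in damaged])
    demand = [n for n in wn if n.startswith("d")]
    reachable = nx.descendants(g, "source") if "source" in g else set()
    return sum(n in reachable for n in demand) / len(demand)

print("WN functionality ignoring dependency:", served_fraction(wn, power_for_pump, set()))
print("WN functionality with EPN damage:   ", served_fraction(wn, power_for_pump, damaged_substations))
```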
Modeling the resilience of critical infrastructure: the role of network dependencies
Guidotti, Roberto; Chmielewski, Hana; Unnikrishnan, Vipin; Gardoni, Paolo; McAllister, Therese; van de Lindt, John
2017-01-01
Water and wastewater network, electric power network, transportation network, communication network, and information technology network are among the critical infrastructure in our communities; their disruption during and after hazard events greatly affects communities’ well-being, economic security, social welfare, and public health. In addition, a disruption in one network may cause disruption to other networks and lead to their reduced functionality. This paper presents a unified theoretical methodology for the modeling of dependent/interdependent infrastructure networks and incorporates it in a six-step probabilistic procedure to assess their resilience. Both the methodology and the procedure are general, can be applied to any infrastructure network and hazard, and can model different types of dependencies between networks. As an illustration, the paper models the direct effects of seismic events on the functionality of a potable water distribution network and the cascading effects of the damage of the electric power network (EPN) on the potable water distribution network (WN). The results quantify the loss of functionality and delay in the recovery process due to dependency of the WN on the EPN. The results show the importance of capturing the dependency between networks in modeling the resilience of critical infrastructure. PMID:28825037
Using expression genetics to study the neurobiology of ethanol and alcoholism.
Farris, Sean P; Wolen, Aaron R; Miles, Michael F
2010-01-01
Recent simultaneous progress in human and animal model genetics and the advent of microarray whole genome expression profiling have produced prodigious data sets on genetic loci, potential candidate genes, and differential gene expression related to alcoholism and ethanol behaviors. However, validated target genes or gene networks functioning in alcoholism remain scarce. Genetical genomics, which combines genetic analysis of both traditional phenotypes and whole genome expression data, offers a potential methodology for characterizing brain gene networks functioning in alcoholism. This chapter will describe concepts, approaches, and recent findings in the field of genetical genomics as it applies to alcohol research. Copyright 2010 Elsevier Inc. All rights reserved.
Thompson-Bean, E; Das, R; McDaid, A
2016-10-31
We present a novel methodology for the design and manufacture of complex biologically inspired soft robotic fluidic actuators. The methodology is applied to the design and manufacture of a prosthetic for the hand. Real human hands are scanned to produce a 3D model of a finger, and pneumatic networks are implemented within it to produce a biomimetic bending motion. The finger is then partitioned into material sections, and a genetic algorithm based optimization, using finite element analysis, is employed to discover the optimal material for each section. This is based on two biomimetic performance criteria. Two sets of optimizations using two material sets are performed. Promising optimized material arrangements are fabricated using two techniques to validate the optimization routine, and the fabricated and simulated results are compared. We find that the optimization is successful in producing biomimetic soft robotic fingers and that fabrication of the fingers is possible. Limitations and paths for development are discussed. This methodology can be applied for other fluidic soft robotic devices.
Structural methodologies for auditing SNOMED.
Wang, Yue; Halper, Michael; Min, Hua; Perl, Yehoshua; Chen, Yan; Spackman, Kent A
2007-10-01
SNOMED is one of the leading health care terminologies being used worldwide. As such, quality assurance is an important part of its maintenance cycle. Methodologies for auditing SNOMED based on structural aspects of its organization are presented. In particular, automated techniques for partitioning SNOMED into smaller groups of concepts based primarily on relationship patterns are defined. Two abstraction networks, the area taxonomy and p-area taxonomy, are derived from the partitions. The high-level views afforded by these abstraction networks form the basis for systematic auditing. The networks tend to highlight errors that manifest themselves as irregularities at the abstract level. They also support group-based auditing, where sets of purportedly similar concepts are focused on for review. The auditing methodologies are demonstrated on one of SNOMED's top-level hierarchies. Errors discovered during the auditing process are reported.
White blood cells identification system based on convolutional deep neural learning networks.
Shahin, A I; Guo, Yanhui; Amin, K M; Sharawi, Amr A
2017-11-16
White blood cells (WBCs) differential counting yields valued information about human health and disease. Currently developed automated cell morphology equipment performs differential counts based on blood smear image analysis. Previous identification systems for WBCs consist of successive dependent stages: pre-processing, segmentation, feature extraction, feature selection, and classification. There is a real need to employ deep learning methodologies so that the performance of previous WBCs identification systems can be increased. Classifying small limited datasets through deep learning systems is a major challenge and should be investigated. In this paper, we propose a novel identification system for WBCs based on deep convolutional neural networks. Two methodologies based on transfer learning are followed: transfer learning based on deep activation features and fine-tuning of existing deep networks. Deep activation features are extracted from several pre-trained networks and employed in a traditional identification system. Moreover, a novel end-to-end convolutional deep architecture called "WBCsNet" is proposed and built from scratch. Finally, a limited balanced WBCs dataset classification is performed through the WBCsNet as a pre-trained network. During our experiments, three different public WBCs datasets (2551 images) have been used, which contain 5 healthy WBCs types. The overall system accuracy achieved by the proposed WBCsNet is 96.1%, which is higher than that of the different transfer learning approaches or even the previous traditional identification systems. We also present feature visualizations for the WBCsNet activations, which reflect a higher response than that of the pre-trained networks. In conclusion, a novel WBCs identification system based on deep learning is proposed, and the high-performance WBCsNet can be employed as a pre-trained network. Copyright © 2017. Published by Elsevier B.V.
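The transfer-learning route described above (deep activation features feeding a traditional classifier) can be illustrated with a short, hedged sketch. The feature extractor below is a random stand-in, not a real pre-trained CNN or the paper's WBCsNet; the dataset shape, class count, and function names are assumptions.

```python
# Deep activation features from a (mocked) pre-trained CNN feed a traditional classifier.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_images, n_features, n_classes = 500, 512, 5     # assumed dataset shape (5 WBC types)

def extract_deep_features(images):                # stand-in for a pre-trained CNN
    return rng.normal(size=(len(images), n_features)) + np.asarray(images)[:, None]

labels = rng.integers(0, n_classes, size=n_images)
X = extract_deep_features(labels.astype(float))   # class-dependent mock features
X_tr, X_te, y_tr, y_te = train_test_split(X, labels, test_size=0.3, random_state=0)

clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
clf.fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
```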
Neural Network and Response Surface Methodology for Rocket Engine Component Optimization
NASA Technical Reports Server (NTRS)
Vaidyanathan, Rajkumar; Papita, Nilay; Shyy, Wei; Tucker, P. Kevin; Griffin, Lisa W.; Haftka, Raphael; Fitz-Coy, Norman; McConnaughey, Helen (Technical Monitor)
2000-01-01
The goal of this work is to compare the performance of response surface methodology (RSM) and two types of neural networks (NN) to aid preliminary design of two rocket engine components. A data set of 45 training points and 20 test points obtained from a semi-empirical model based on three design variables is used for a shear coaxial injector element. Data for the supersonic turbine design are based on six design variables, with 76 training data and 18 test data obtained from simplified aerodynamic analysis. Several RS and NN are first constructed using the training data. The test data are then employed to select the best RS or NN. Quadratic and cubic response surfaces, radial basis neural networks (RBNN) and back-propagation neural networks (BPNN) are compared. Two-layered RBNN are generated using two different training algorithms, namely solverbe and solverb. A two-layered BPNN is generated with a tan-sigmoid transfer function. Various issues related to the training of the neural networks are addressed, including number of neurons, error goals, spread constants and the accuracy of different models in representing the design space. A search for the optimum design is carried out using a standard gradient-based optimization algorithm over the response surfaces represented by the polynomials and trained neural networks. Usually a cubic polynomial performs better than the quadratic polynomial, but exceptions have been noticed. Among the NN choices, the RBNN designed using solverb yields more consistent performance for both engine components considered. The training of RBNN is easier as it requires linear regression. This, coupled with the consistency in performance, makes it a promising optimization strategy for engineering design problems.
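The surrogate-plus-gradient-search workflow can be sketched on synthetic data as below (this is not the rocket-engine dataset or the MATLAB solverbe/solverb routines; the objective function and point counts are stand-ins): fit a quadratic response surface and a radial-basis surrogate, pick whichever generalizes better on the test points, then run a gradient-based search over the chosen surrogate.

```python
# Quadratic response surface vs. radial-basis surrogate, followed by a
# gradient-based optimum search over the better of the two.
import numpy as np
from scipy.interpolate import RBFInterpolator
from scipy.optimize import minimize
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(2)
def objective(x):                       # stand-in for the semi-empirical model
    return np.sum((x - 0.3) ** 2, axis=-1) + 0.1 * np.sin(5 * x).sum(axis=-1)

X_train = rng.uniform(-1, 1, size=(45, 3))     # 45 training points, 3 design variables
X_test = rng.uniform(-1, 1, size=(20, 3))      # 20 test points
y_train, y_test = objective(X_train), objective(X_test)

rsm = make_pipeline(PolynomialFeatures(degree=2), LinearRegression()).fit(X_train, y_train)
rbf = RBFInterpolator(X_train, y_train, kernel="thin_plate_spline")

err_rsm = np.mean((rsm.predict(X_test) - y_test) ** 2)
err_rbf = np.mean((rbf(X_test) - y_test) ** 2)
surrogate = rsm.predict if err_rsm < err_rbf else rbf
print(f"test MSE  RSM={err_rsm:.4f}  RBF={err_rbf:.4f}")

# Gradient-based optimum search over the selected surrogate.
res = minimize(lambda x: float(surrogate(x.reshape(1, -1))[0]),
               x0=np.zeros(3), bounds=[(-1, 1)] * 3)
print("surrogate optimum:", res.x)
```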
An outer approximation method for the road network design problem
2018-01-01
Best investment in the road infrastructure or the network design is perceived as a fundamental and benchmark problem in transportation. Given a set of candidate road projects with associated costs, finding the best subset with respect to a limited budget is known as a bilevel Discrete Network Design Problem (DNDP) of NP-hard computational complexity. We tackle this complexity with a hybrid exact-heuristic methodology based on a two-stage relaxation as follows: (i) the bilevel feature is relaxed to a single-level problem by taking the network performance function of the upper level into the user equilibrium traffic assignment problem (UE-TAP) in the lower level as a constraint. This results in a mixed-integer nonlinear programming (MINLP) problem which is then solved using the Outer Approximation (OA) algorithm; (ii) we further relax the multi-commodity UE-TAP to a single-commodity MILP problem, that is, the multiple OD pairs are aggregated to a single OD pair. This methodology has two main advantages: (i) the method proves highly efficient in solving the DNDP for the large-sized network of Winnipeg, Canada. The results suggest that within a limited number of iterations (as termination criterion), global optimum solutions are quickly reached in most of the cases; otherwise, good solutions (close to global optimum solutions) are found in early iterations. Comparative analysis of the networks of Gao and Sioux-Falls shows that for such a non-exact method the global optimum solutions are found in fewer iterations than in some analytically exact algorithms in the literature. (ii) Integration of the objective function among the constraints provides a commensurate capability to tackle the multi-objective (or multi-criteria) DNDP as well. PMID:29590111
An outer approximation method for the road network design problem.
Asadi Bagloee, Saeed; Sarvi, Majid
2018-01-01
Best investment in the road infrastructure or the network design is perceived as a fundamental and benchmark problem in transportation. Given a set of candidate road projects with associated costs, finding the best subset with respect to a limited budget is known as a bilevel Discrete Network Design Problem (DNDP) of NP-hard computational complexity. We tackle this complexity with a hybrid exact-heuristic methodology based on a two-stage relaxation as follows: (i) the bilevel feature is relaxed to a single-level problem by taking the network performance function of the upper level into the user equilibrium traffic assignment problem (UE-TAP) in the lower level as a constraint. This results in a mixed-integer nonlinear programming (MINLP) problem which is then solved using the Outer Approximation (OA) algorithm; (ii) we further relax the multi-commodity UE-TAP to a single-commodity MILP problem, that is, the multiple OD pairs are aggregated to a single OD pair. This methodology has two main advantages: (i) the method proves highly efficient in solving the DNDP for the large-sized network of Winnipeg, Canada. The results suggest that within a limited number of iterations (as termination criterion), global optimum solutions are quickly reached in most of the cases; otherwise, good solutions (close to global optimum solutions) are found in early iterations. Comparative analysis of the networks of Gao and Sioux-Falls shows that for such a non-exact method the global optimum solutions are found in fewer iterations than in some analytically exact algorithms in the literature. (ii) Integration of the objective function among the constraints provides a commensurate capability to tackle the multi-objective (or multi-criteria) DNDP as well.
Respondent-Driven Sampling: An Assessment of Current Methodology.
Gile, Krista J; Handcock, Mark S
2010-08-01
Respondent-Driven Sampling (RDS) employs a variant of a link-tracing network sampling strategy to collect data from hard-to-reach populations. By tracing the links in the underlying social network, the process exploits the social structure to expand the sample and reduce its dependence on the initial (convenience) sample. The current estimators of population averages make strong assumptions in order to treat the data as a probability sample. We evaluate three critical sensitivities of the estimators: to bias induced by the initial sample, to uncontrollable features of respondent behavior, and to the without-replacement structure of sampling. Our analysis indicates: (1) that the convenience sample of seeds can induce bias, and the number of sample waves typically used in RDS is likely insufficient for the type of nodal mixing required to obtain the reputed asymptotic unbiasedness; (2) that preferential referral behavior by respondents leads to bias; (3) that when a substantial fraction of the target population is sampled the current estimators can have substantial bias. This paper sounds a cautionary note for the users of RDS. While current RDS methodology is powerful and clever, the favorable statistical properties claimed for the current estimates are shown to be heavily dependent on often unrealistic assumptions. We recommend ways to improve the methodology.
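For readers unfamiliar with how degree weighting enters RDS estimation, the hedged sketch below applies the standard Volz-Heckathorn estimator to simulated data (the degree distribution and trait model are invented); it illustrates the estimator's form, not the sensitivity analysis in the paper.

```python
# Volz-Heckathorn RDS estimator: weight each respondent by the inverse of their
# reported network degree to offset the over-sampling of well-connected people.
import numpy as np

rng = np.random.default_rng(3)
n = 400
degree = rng.integers(1, 30, size=n)            # reported personal network sizes
# Trait made (artificially) more common among high-degree respondents,
# mimicking the over-sampling of well-connected people by link tracing.
trait = rng.random(n) < (0.2 + 0.4 * degree / degree.max())

naive = trait.mean()                                        # ignores the design
vh = np.sum(trait / degree) / np.sum(1.0 / degree)          # Volz-Heckathorn estimate
print(f"naive sample mean      : {naive:.3f}")
print(f"degree-weighted (VH)   : {vh:.3f}")
```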
Groenen, Carola J M; van Duijnhoven, Noortje T L; Faber, Marjan J; Koetsenruijter, Jan; Kremer, Jan A M; Vandenbussche, Frank P H A
2017-02-01
To improve Dutch maternity care, professionals start working in interdisciplinary patient-centred networks, which include the patient as a member. The introduction of the case manager is expected to work positively on both the individual and the network level. However, case management is new in Dutch maternity care. The present study aims to define the profession that would be most suitable to fulfil the role of case manager. The maternal care network in the Nijmegen region was determined by using Social Network Analysis (SNA). SNA is a quantitative methodology that measures and analyses patient-related connections between different professionals working in a network. To identify the case manager, we focused on the position, reach, and connections of the maternal care professionals in the network. The study covered maternity healthcare professionals in a single region of the Netherlands with an average of 4,500 births/year. The participants were 214 individual healthcare workers from eight different professions. The total network showed 3948 connections between 214 maternity healthcare professionals with a density of 0.08. Each profession had some central individuals in the network. The 52 community-based midwives were responsible for 51% of all measured connections. The youth health doctors and nurses were mostly situated on the periphery and less connected. Betweenness centrality was highest for obstetricians and community-based midwives. Only the community-based midwives had connections with all other groups of professions. Almost all professionals in the network could reach other professionals in two steps. Copyright © 2016 Elsevier Ltd. All rights reserved.
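The network measures used above (density, betweenness centrality, inter-professional reach) can be reproduced on a fabricated mini-network with a few lines of networkx; the professional labels below are invented and the graph is far smaller than the 214-member Nijmegen network.

```python
# Density, betweenness centrality, and neighbourhood reach on a toy care network.
import networkx as nx

g = nx.Graph()
g.add_edges_from([
    ("midwife_1", "obstetrician_1"), ("midwife_1", "youth_nurse_1"),
    ("midwife_1", "maternity_nurse_1"), ("midwife_2", "obstetrician_1"),
    ("obstetrician_1", "paediatrician_1"), ("midwife_2", "maternity_nurse_1"),
])

print("density:", round(nx.density(g), 3))
betweenness = nx.betweenness_centrality(g)
central = max(betweenness, key=betweenness.get)
print("most central professional:", central, round(betweenness[central], 3))
print("professions reached by", central, ":", sorted(g.neighbors(central)))
```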
NASA Astrophysics Data System (ADS)
Caetano, Marco Antonio Leonel; Yoneyama, Takashi
2015-07-01
This work concerns the study of the effects felt by a network as a whole when a specific node is perturbed. Many real world systems can be described by network models in which the interactions of the various agents can be represented as an edge of a graph. With a graph model in hand, it is possible to evaluate the effect of deleting some of its edges on the architecture and values of nodes of the network. Eventually a node may end up isolated from the rest of the network, and an interesting problem is to have a quantitative measure of the impact of such an event. For instance, in the field of finance, network models are very popular and the proposed methodology allows one to carry out "what if" tests in terms of weakening the links between the economic agents, represented as nodes. The two main concepts employed in the proposed methodology are (i) the vibrational IC-Information Centrality, which can provide a measure of the relative importance of a particular node in a network, and (ii) autocatalytic networks, which can indicate the evolutionary trends of the network. Although these concepts were originally proposed in the context of other fields of knowledge, they were also found to be useful in analyzing financial networks. In order to illustrate the applicability of the proposed methodology, a case study using actual data comprising stock market indices of 12 countries is presented.
Advances in Artificial Neural Networks - Methodological Development and Application
USDA-ARS?s Scientific Manuscript database
Artificial neural networks as a major soft-computing technology have been extensively studied and applied during the last three decades. Research on backpropagation training algorithms for multilayer perceptron networks has spurred development of other neural network training algorithms for other ne...
[Cancer pain management: Systematic review and critical appraisal of clinical practice guidelines].
Martínez-Nicolás, I; Ángel-García, D; Saturno, P J; López-Soriano, F
2016-01-01
Although several clinical practice guidelines have been developed in the last decades, cancer pain management is still deficient. The purpose of this work was to carry out a comprehensive and systematic literature review of current clinical practice guidelines on cancer pain management, and critically appraise their methodology and content in order to evaluate their quality and validity to cope with this public health issue. A systematic review was performed in the main databases, using English, French and Spanish as languages, from 2008 to 2013. Reporting and methodological quality was rated with the Appraisal of Guidelines, Research and Evaluation II (AGREE-II) tool, including an inter-rater reliability analysis. Guideline recommendations were extracted and classified into several categories and levels of evidence, aiming to analyse guideline variability and evidence-based content comprehensiveness. Six guidelines were included. A wide variability was found in both reporting and methodological quality of guidelines, as well as in the content and the level of evidence of their recommendations. The Scottish Intercollegiate Guidelines Network guideline was the best rated using AGREE-II, while the Sociedad Española de Oncología Médica guideline was the worst rated. The Ministry of Health Malaysia guideline was the most comprehensive, and the Scottish Intercollegiate Guidelines Network guideline was the second most comprehensive. The current guidelines on cancer pain management have limited quality and content. We recommend the Ministry of Health Malaysia and Scottish Intercollegiate Guidelines Network guidelines, whilst the Sociedad Española de Oncología Médica guideline still needs to improve. Copyright © 2015 SECA. Published by Elsevier Espana. All rights reserved.
Chan, Chien A; Gygax, André F; Wong, Elaine; Leckie, Christopher A; Nirmalathas, Ampalavanapillai; Kilper, Daniel C
2013-01-02
Internet traffic has grown rapidly in recent years and is expected to continue to expand significantly over the next decade. Consequently, the resulting greenhouse gas (GHG) emissions of telecommunications service-supporting infrastructures have become an important issue. In this study, we develop a set of models for assessing the use-phase power consumption and carbon dioxide emissions of telecom network services to help telecom providers gain a better understanding of the GHG emissions associated with the energy required for their networks and services. Due to the fact that measuring the power consumption and traffic in a telecom network is a challenging task, these models utilize different granularities of available network information. As the granularity of the network measurement information decreases, the corresponding models have the potential to produce larger estimation errors. Therefore, we examine the accuracy of these models under various network scenarios using two approaches: (i) a sensitivity analysis through simulations and (ii) a case study of a deployed network. Both approaches show that the accuracy of the models depends on the network size, the total amount of network service traffic (i.e., for the service under assessment), and the number of network nodes used to process the service.
NASA Astrophysics Data System (ADS)
Curme, Chester
Technological advances have provided scientists with large high-dimensional datasets that describe the behaviors of complex systems: from the statistics of energy levels in complex quantum systems, to the time-dependent transcription of genes, to price fluctuations among assets in a financial market. In this environment, where it may be difficult to infer the joint distribution of the data, network science has flourished as a way to gain insight into the structure and organization of such systems by focusing on pairwise interactions. This work focuses on a particular setting, in which a system is described by multivariate time series data. We consider time-lagged correlations among elements in this system, in such a way that the measured interactions among elements are asymmetric. Finally, we allow these interactions to be characteristically weak, so that statistical uncertainties may be important to consider when inferring the structure of the system. We introduce a methodology for constructing statistically validated networks to describe such a system, extend the methodology to accommodate interactions with a periodic component, and show how consideration of bipartite community structures in these networks can aid in the construction of robust statistical models. An example of such a system is a financial market, in which high frequency returns data may be used to describe contagion, or the spreading of shocks in price among assets. These data provide the experimental testing ground for our methodology. We study NYSE data from both the present day and one decade ago, examine the time scales over which the validated lagged correlation networks exist, and relate differences in the topological properties of the networks to an increasing economic efficiency. We uncover daily periodicities in the validated interactions, and relate our findings to explanations of the Epps Effect, an empirical phenomenon of financial time series. We also study bipartite community structures in networks composed of market returns and news sentiment signals for 40 countries. We compare the degrees to which markets anticipate news, and news anticipate markets, and use the community structures to construct a recommender system for inputs to prediction models. Finally, we complement this work with novel investigations of the exogenous news items that may drive the financial system using topic models. This includes an analysis of how investors and the general public may interact with these news items using Internet search data, and how the diversity of stories in the news both responds to and influences market movements.
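A hedged sketch of one ingredient described above, the statistically validated lagged-correlation network, is shown below on synthetic series (the planted lead-lag link, the lag of one step, and the FDR level are assumptions, not the thesis code): each series at time t is correlated with every other series at time t+1, and only links surviving a false-discovery-rate correction are kept.

```python
# Validated lagged-correlation network: lag-1 cross-correlations thresholded
# by a Benjamini-Hochberg false-discovery-rate correction.
import numpy as np
from scipy.stats import pearsonr
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(4)
T, N = 500, 6
X = rng.normal(size=(T, N))
X[1:, 3] += 0.4 * X[:-1, 0]            # planted lead-lag link: series 0 leads series 3

edges, pvals = [], []
for i in range(N):
    for j in range(N):
        if i == j:
            continue
        r, p = pearsonr(X[:-1, i], X[1:, j])   # i at t versus j at t+1
        edges.append((i, j, r))
        pvals.append(p)

keep = multipletests(pvals, alpha=0.05, method="fdr_bh")[0]
validated = [(i, j, round(r, 2)) for (i, j, r), ok in zip(edges, keep) if ok]
print("validated directed links (leader -> follower):", validated)
```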
A survey of application: genomics and genetic programming, a new frontier.
Khan, Mohammad Wahab; Alam, Mansaf
2012-08-01
The aim of this paper is to provide an introduction to the rapidly developing field of genetic programming (GP). Particular emphasis is placed on the application of GP to genomics. First, the basic methodology of GP is introduced. This is followed by a review of applications in the areas of gene network inference, gene expression data analysis, SNP analysis, epistasis analysis and gene annotation. Finally, the paper concludes by suggesting potential avenues of future research on genetic programming, opportunities to extend the technique, and areas for possible practical applications. Copyright © 2012 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Rossi, V.; Dubois, M.; Ser-Giacomi, E.; Monroy, P.; Lopez, C.; Hernandez-Garcia, E.
2016-02-01
Assessing the spatial structure and dynamics of marine populations is still a major challenge for ecologists. The necessity of managing marine resources from a large-scale perspective and considering the whole ecosystem is now recognized, but the absence of appropriate tools to address these objectives limits the implementation of globally pertinent conservation planning. Inspired by Network Theory, we present a new methodological framework called the Lagrangian Flow Network which allows a systematic characterization of multi-scale dispersal and connectivity of early life history stages of marine organisms. The network is constructed by subdividing the basin into an ensemble of equal-area subregions which are interconnected through the transport of propagules by ocean currents. The present version allows the identification of hydrodynamical provinces and the computation of various connectivity proxies measuring retention and exchange of larvae. Due to our spatial discretization and subsequent network representation, as well as our Lagrangian approach, further methodological improvements are readily accessible. These future developments include a parametrization of habitat patchiness, the implementation of realistic larval traits and the consideration of abiotic variables (e.g. temperature, salinity, planktonic resources...) and their effects on larval production and survival. While the model is potentially tunable to any species whose biological traits and ecological preferences are precisely known, it can also be used in a more generic configuration through efficient computation and analysis of a large number of experiments with relevant ecological parameters. It permits a better characterization of population connectivity at multiple scales and informs its ecological and managerial interpretation.
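The construction step of a Lagrangian Flow Network can be sketched as follows on synthetic one-dimensional trajectories (bin count, particle number, and the advection model are invented, not the ocean-model output used by the authors): start and end positions are binned into equal subregions, transitions are counted into a row-stochastic transport matrix, and its diagonal gives a retention proxy.

```python
# Transport matrix between equal subregions from particle start/end positions;
# the diagonal measures larval retention, off-diagonal entries measure exchange.
import numpy as np

rng = np.random.default_rng(5)
n_bins, n_particles = 10, 20000
start = rng.uniform(0, 1, n_particles)                    # 1-D "basin" for brevity
end = np.clip(start + rng.normal(0.05, 0.08, n_particles), 0, 1 - 1e-9)  # advection + spread

i = np.minimum((start * n_bins).astype(int), n_bins - 1)  # source subregion index
j = np.minimum((end * n_bins).astype(int), n_bins - 1)    # destination subregion index

counts = np.zeros((n_bins, n_bins))
np.add.at(counts, (i, j), 1)                              # transition counts
P = counts / counts.sum(axis=1, keepdims=True)            # row-stochastic transport matrix

retention = np.diag(P)                                    # fraction of propagules retained
print("retention per subregion:", np.round(retention, 2))
print("strongest exchange:", np.unravel_index(np.argmax(P - np.diag(retention)), P.shape))
```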
Guest, James; Harrop, James S; Aarabi, Bizhan; Grossman, Robert G; Fawcett, James W; Fehlings, Michael G; Tator, Charles H
2012-09-01
The North American Clinical Trials Network (NACTN) includes 9 clinical centers funded by the US Department of Defense and the Christopher Reeve Paralysis Foundation. Its purpose is to accelerate clinical testing of promising therapeutics in spinal cord injury (SCI) through the development of a robust interactive infrastructure. This structure includes key committees that serve to provide longitudinal guidance to the Network. These committees include the Executive, Data Management, and Neurological Outcome Assessments Committees, and the Therapeutic Selection Committee (TSC), which is the subject of this manuscript. The NACTN brings unique elements to the SCI field. The Network's stability is not restricted to a single clinical trial. Network members have diverse expertise and include experts in clinical care, clinical trial design and methodology, pharmacology, preclinical and clinical research, and advanced rehabilitation techniques. Frequent systematic communication is assigned a high value, as is democratic process, fairness and efficiency of decision making, and resource allocation. This article focuses on how decision making occurs within the TSC to rank alternative therapeutics according to 2 main variables: quality of the preclinical data set, and fit with the Network's aims and capabilities. This selection process is important because if the Network's resources are committed to a therapeutic, alternatives cannot be pursued. A proposed methodology includes a multicriteria decision analysis that uses a Multi-Attribute Global Inference of Quality matrix to quantify the process. To rank therapeutics, the TSC uses a series of consensus steps designed to reduce individual and group bias and limit subjectivity. Given the difficulties encountered by industry in completing clinical trials in SCI, stable collaborative not-for-profit consortia, such as the NACTN, may be essential to clinical progress in SCI. The evolution of the NACTN also offers substantial opportunity to refine decision making and group dynamics. Making the best possible decisions concerning therapeutics selection for trial testing is a cornerstone of the Network's function.
Xu, W; LeBeau, J M
2018-05-01
We establish a series of deep convolutional neural networks to automatically analyze position averaged convergent beam electron diffraction patterns. The networks first calibrate the zero-order disk size, center position, and rotation without the need for pretreating the data. With the aligned data, additional networks then measure the sample thickness and tilt. The performance of the network is explored as a function of a variety of variables including thickness, tilt, and dose. A methodology to explore the response of the neural network to various pattern features is also presented. Processing patterns at a rate of ∼ 0.1 s/pattern, the network is shown to be orders of magnitude faster than a brute force method while maintaining accuracy. The approach is thus suitable for automatically processing big, 4D STEM data. We also discuss the generality of the method to other materials/orientations as well as a hybrid approach that combines the features of the neural network with least squares fitting for even more robust analysis. The source code is available at https://github.com/subangstrom/DeepDiffraction. Copyright © 2018 Elsevier B.V. All rights reserved.
Chindelevitch, Leonid; Trigg, Jason; Regev, Aviv; Berger, Bonnie
2014-01-01
Constraint-based models are currently the only methodology that allows the study of metabolism at the whole-genome scale. Flux balance analysis is commonly used to analyse constraint-based models. Curiously, the results of this analysis vary with the software being run, a situation that we show can be remedied by using exact rather than floating-point arithmetic. Here we introduce MONGOOSE, a toolbox for analysing the structure of constraint-based metabolic models in exact arithmetic. We apply MONGOOSE to the analysis of 98 existing metabolic network models and find that the biomass reaction is surprisingly blocked (unable to sustain non-zero flux) in nearly half of them. We propose a principled approach for unblocking these reactions and extend it to the problems of identifying essential and synthetic lethal reactions and minimal media. Our structural insights enable a systematic study of constraint-based metabolic models, yielding a deeper understanding of their possibilities and limitations. PMID:25291352
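The notion of a blocked reaction can be illustrated with a short floating-point sketch (MONGOOSE itself works in exact arithmetic, and the toy network below is invented, not one of the 98 models): a reaction is blocked when both its maximum and minimum flux under the steady-state constraint S v = 0 and the flux bounds are zero.

```python
# Flux-variability check for blocked reactions in a toy stoichiometric model.
import numpy as np
from scipy.optimize import linprog

# Columns: R1 (-> A), R2 (A -> B), R3 (B ->), R4 (A -> C)  with C a dead end.
S = np.array([[ 1, -1,  0, -1],   # A
              [ 0,  1, -1,  0],   # B
              [ 0,  0,  0,  1]])  # C (nothing consumes C, so R4 must be blocked)
n_rxn = S.shape[1]
bounds = [(0, 10)] * n_rxn         # irreversible reactions with an upper bound

def flux_range(j):
    lo = linprog(c=np.eye(n_rxn)[j], A_eq=S, b_eq=np.zeros(3), bounds=bounds).fun
    hi = -linprog(c=-np.eye(n_rxn)[j], A_eq=S, b_eq=np.zeros(3), bounds=bounds).fun
    return lo, hi

for j in range(n_rxn):
    lo, hi = flux_range(j)
    print(f"R{j + 1}: flux range [{lo:.3f}, {hi:.3f}]", "BLOCKED" if abs(lo) + abs(hi) < 1e-9 else "")
```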
An evaluation of the directed flow graph methodology
NASA Technical Reports Server (NTRS)
Snyder, W. E.; Rajala, S. A.
1984-01-01
The applicability of the Directed Graph Methodology (DGM) to the design and analysis of special purpose image and signal processing hardware was evaluated. A special purpose image processing system was designed and described using DGM. The design, suitable for very large scale integration (VLSI), implements a region labeling technique. Two computer chips were designed, both using metal-nitride-oxide-silicon (MNOS) technology, as well as a functional system utilizing those chips to perform real-time region labeling. The system is described in terms of DGM primitives. As it is currently implemented, DGM is inappropriate for describing synchronous, tightly coupled, special purpose systems. The nature of the DGM formalism lends itself more readily to modeling networks of general purpose processors.
Network analysis for a network disorder: The emerging role of graph theory in the study of epilepsy.
Bernhardt, Boris C; Bonilha, Leonardo; Gross, Donald W
2015-09-01
Recent years have witnessed a paradigm shift in the study and conceptualization of epilepsy, which is increasingly understood as a network-level disorder. An emblematic case is temporal lobe epilepsy (TLE), the most common drug-resistant epilepsy that is electroclinically defined as a focal epilepsy and pathologically associated with hippocampal sclerosis. In this review, we will summarize histopathological, electrophysiological, and neuroimaging evidence supporting the concept that the substrate of TLE is not limited to the hippocampus alone, but rather is broadly distributed across multiple brain regions and interconnecting white matter pathways. We will introduce basic concepts of graph theory, a formalism to quantify topological properties of complex systems that has recently been widely applied to study networks derived from brain imaging and electrophysiology. We will discuss converging graph theoretical evidence indicating that networks in TLE show marked shifts in their overall topology, providing insight into the neurobiology of TLE as a network-level disorder. Our review will conclude by discussing methodological challenges and future clinical applications of this powerful analytical approach. Copyright © 2015 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Abdi, Abdi M.; Szu, Harold H.
2003-04-01
With the growing rate of interconnection among computer systems, network security is becoming a real challenge. An Intrusion Detection System (IDS) is designed to protect the availability, confidentiality and integrity of critical network information systems. Today's approach to network intrusion detection involves the use of rule-based expert systems to identify indications of known attacks or anomalies. However, these techniques are less successful in identifying today's attacks. Hackers are perpetually inventing new and previously unanticipated techniques to compromise information infrastructure. This paper proposes a dynamic way of detecting network intruders on time series data. The proposed approach consists of a two-step process. Firstly, an efficient multi-user detection method is obtained, employing the recently introduced complexity minimization approach as a generalization of standard ICA. Secondly, an unsupervised learning neural network architecture based on Kohonen's Self-Organizing Map is identified for potential functional clustering. These two steps working together adaptively will provide a pseudo-real-time novelty detection attribute to supplement the current statistical intrusion detection methodology.
Actor-Network Theory and methodology: Just what does it mean to say that nonhumans have agency?
Sayes, Edwin
2014-02-01
Actor-Network Theory is a controversial social theory. In no respect is this more so than the role it 'gives' to nonhumans: nonhumans have agency, as Latour provocatively puts it. This article aims to interrogate the multiple layers of this declaration to understand what it means to assert with Actor-Network Theory that nonhumans exercise agency. The article surveys a wide corpus of statements by the position's leading figures and emphasizes the wider methodological framework in which these statements are embedded. With this work done, readers will then be better placed to reject or accept the Actor-Network position - understanding more precisely what exactly is at stake in this decision.
Scalable analysis of nonlinear systems using convex optimization
NASA Astrophysics Data System (ADS)
Papachristodoulou, Antonis
In this thesis, we investigate how convex optimization can be used to analyze different classes of nonlinear systems at various scales algorithmically. The methodology is based on the construction of appropriate Lyapunov-type certificates using sum of squares techniques. After a brief introduction on the mathematical tools that we will be using, we turn our attention to robust stability and performance analysis of systems described by Ordinary Differential Equations. A general framework for constrained systems analysis is developed, under which stability of systems with polynomial or non-polynomial vector fields and of switching systems, as well as estimation of the region of attraction and the L2 gain, can be treated in a unified manner. We apply our results to examples from biology and aerospace. We then consider systems described by Functional Differential Equations (FDEs), i.e., time-delay systems. Their main characteristic is that they are infinite dimensional, which complicates their analysis. We first show how the complete Lyapunov-Krasovskii functional can be constructed algorithmically for linear time-delay systems. Then, we concentrate on delay-independent and delay-dependent stability analysis of nonlinear FDEs using sum of squares techniques. An example from ecology is given. The scalable stability analysis of congestion control algorithms for the Internet is investigated next. The models we use result in an arbitrary interconnection of FDE subsystems, for which we require that stability holds for arbitrary delays, network topologies and link capacities. Through a constructive proof, we develop a Lyapunov functional for FAST---a recently developed network congestion control scheme---so that the Lyapunov stability properties scale with the system size. We also show how other network congestion control schemes can be analyzed in the same way. Finally, we concentrate on systems described by Partial Differential Equations. We show that axially constant perturbations of the Navier-Stokes equations for Hagen-Poiseuille flow are globally stable, even though the background noise is amplified as R^3, where R is the Reynolds number, giving a 'robust yet fragile' interpretation. We also propose a sum of squares methodology for the analysis of systems described by parabolic PDEs. We conclude this work with an account of directions for future research.
Short-term forecasting of turbidity in trunk main networks.
Meyers, Gregory; Kapelan, Zoran; Keedwell, Edward
2017-11-01
Water discolouration is an increasingly important and expensive issue due to rising customer expectations, tighter regulatory demands and ageing Water Distribution Systems (WDSs) in the UK and abroad. This paper presents a new turbidity forecasting methodology capable of aiding operational staff and enabling proactive management strategies. The turbidity forecasting methodology developed here is completely data-driven and does not require a hydraulic or water quality network model, which is expensive to build and maintain. The methodology is tested and verified on a real trunk main network with observed turbidity measurement data. Results obtained show that the methodology can detect if discolouration material is mobilised, estimate if sufficient turbidity will be generated to exceed a preselected threshold and approximate how long the material will take to reach the downstream meter. Classification-based forecasts of turbidity can be reliably made up to 5 h ahead, although at the expense of increased false alarm rates. The methodology presented here could be used as an early warning system that can enable a multitude of cost beneficial proactive management strategies to be implemented as an alternative to expensive trunk mains cleaning programs. Copyright © 2017 Elsevier Ltd. All rights reserved.
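A hedged sketch of the data-driven classification idea is given below (simulated signals; the lag structure, threshold, and feature choices are assumptions, not the authors' trunk-main model): lagged flow features at time t are used to classify whether a downstream turbidity threshold will be exceeded a few hours ahead.

```python
# Threshold-exceedance forecasting from lagged flow features, several hours ahead.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(6)
T, lead = 3000, 5                               # hourly samples, 5-hour forecast horizon
flow = np.abs(rng.normal(1.0, 0.3, T))
# Discolouration material mobilises when flow steps up; turbidity appears 'lead' hours later.
mobilised = np.maximum(np.diff(flow, prepend=flow[0]), 0)
turbidity = np.roll(2.0 * mobilised, lead) + rng.normal(0, 0.05, T)
exceed = (turbidity > 0.5).astype(int)          # assumed preselected threshold

# Features at time t: current flow and its last few changes; target: exceedance at t+lead.
lags = 3
rows = range(lags, T - lead)
X = np.array([[flow[t]] + [flow[t - k] - flow[t - k - 1] for k in range(lags)] for t in rows])
y = exceed[[t + lead for t in rows]]

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, shuffle=False)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("classification accuracy,", lead, "h ahead:", round(clf.score(X_te, y_te), 3))
```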
Dimitriadis, S I; Sun, Yu; Kwok, K; Laskaris, N A; Bezerianos, A
2013-01-01
The association of functional connectivity patterns with particular cognitive tasks has long been a topic of interest in neuroscience, e.g., studies of functional connectivity have demonstrated its potential use for decoding various brain states. However, the high-dimensionality of the pairwise functional connectivity limits its usefulness in some real-time applications. In the present study, the methodology of tensor subspace analysis (TSA) is used to reduce the initial high-dimensionality of the pairwise coupling in the original functional connectivity network to a space of condensed descriptive power, which would significantly decrease the computational cost and facilitate the differentiation of brain states. We assess the feasibility of the proposed method on EEG recordings in which the subject performed mental arithmetic tasks that differ only in difficulty level (easy 1-digit additions vs. 3-digit additions). Two different cortical connective networks were detected, and by comparing the functional connectivity networks in different work states, it was found that task difficulty is best reflected in the connectivity structure of sub-graphs extending over parietooccipital sites. Incorporating this data-driven information within the original TSA methodology, we succeeded in predicting the difficulty level from connectivity patterns in an efficient way that can be implemented so as to work in real-time.
Modern contact investigation methods for enhancing tuberculosis control in aboriginal communities.
Cook, Victoria J; Shah, Lena; Gardy, Jennifer
2012-05-25
The Aboriginal communities in Canada are challenged by a disproportionate burden of TB infection and disease. Contact investigation (CI) guidelines exist but these strategies do not take into account the unique social structure of different populations. Because of the limitations of traditional CI, new approaches are under investigation and include the use of social network analysis, geographic information systems and genomics, in addition to the widespread use of genotyping to better understand TB transmission. Guidelines for the routine use of network methods and other novel methodologies for TB CI and outbreak investigation do not exist despite the gathering evidence that these approaches can positively impact TB control efforts, even in Aboriginal communities. The feasibility and efficacy of these novel approaches to CI in Aboriginal communities requires further investigation. The successful integration of these novel methodologies will require community involvement, capacity building and ongoing support at every level. The outcome will not only be the systematic collection, analysis, and interpretation of CI data in high-burden communities to assess transmission but the prioritization of contacts who are candidates for treatment of LTBI which will break the cycle of transmission. Ultimately, the measure of success will be a clear and sustained decline in TB incidence in Aboriginal communities.
Abbott, Katherine M; Bettger, Janet Prvu; Hampton, Keith N; Kohler, Hans-Peter
2015-03-01
Studies indicate that social integration has a significant influence on physical and mental health. Older adults experience an increased risk of social isolation as their social networks decline with fewer traditional opportunities to add new social relationships. Deaths of similar aged friends, cognitive and functional impairments, and relocating to a nursing home (NH) or assisted-living (AL) facility contribute to difficulties in maintaining one's social network. Due to the paucity of research examining the social networks of people residing in AL and NH, this study was designed to develop and test the feasibility of using a combination of methodological approaches to capture social network data among older adults living in AL and a dementia special care unit NH. Social network analysis of both egocentric and sociocentric networks was conducted to visualize the social networks of 15 residents of an AL neighborhood and 12 residents of a dementia special care unit NH and to calculate measures of network size, centrality, and reciprocity. The combined egocentric and sociocentric method was feasible and provided a robust indicator of resident social networks, highlighting individuals who were socially integrated as well as isolated. © The Author(s) 2013 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.
Assessment of Cancer Education Seminars for Appalachian Populations
Pennell, Michael L.; Dignan, Mark B.; Paskett, Electra D.
2013-01-01
Cancer education seminars for Appalachian populations were conducted to: (1) increase knowledge of existing cancer disparities, (2) disseminate findings from Appalachian community-based participatory research (CBPR) projects, and (3) foster CBPR capacity building among community members by promoting social networking. Evaluation of the seminars was completed by: (1) using pre–post-surveys to assess changes in knowledge and attitudes at three regional and one national seminar and (2) measuring a change in the social network patterns of participants at a national seminar by analyzing the names of individuals known at the beginning and at the end of the seminar by each participant. Among participants, there was a significant increase in knowledge of Appalachian cancer disparities at two seminars [national, t(145)=3.41, p=0.001; Pennsylvania, t(189)=3.00, p=0.003] and a change in attitudes about Appalachia at one seminar [Ohio t(193)=−2.80, p=0.006]. Social network analysis, operationally defined for this study as familiarity with individuals attending the conference, showed participation in the national seminar fostered capacity building for future CBPR by the development of new network ties. Findings indicate that short-term outcomes of the seminars were accomplished. Future educational seminars should consider using social network analysis as a new evaluation methodology. PMID:22131064
NASA Astrophysics Data System (ADS)
Guruprasad, R.; Behera, B. K.
2015-10-01
Quantitative prediction of fabric mechanical properties is an essential requirement for design engineering of textile and apparel products. In this work, the possibility of prediction of bending rigidity of cotton woven fabrics has been explored with the application of Artificial Neural Network (ANN) and two hybrid methodologies, namely Neuro-genetic modeling and Adaptive Neuro-Fuzzy Inference System (ANFIS) modeling. For this purpose, a set of cotton woven grey fabrics was desized, scoured and relaxed. The fabrics were then conditioned and tested for bending properties. With the database thus created, a neural network model was first developed using back propagation as the learning algorithm. The second model was developed by applying a hybrid learning strategy, in which genetic algorithm was first used as a learning algorithm to optimize the number of neurons and connection weights of the neural network. The Genetic algorithm optimized network structure was further allowed to learn using back propagation algorithm. In the third model, an ANFIS modeling approach was attempted to map the input-output data. The prediction performances of the models were compared and a sensitivity analysis was reported. The results show that the prediction by neuro-genetic and ANFIS models were better in comparison with that of back propagation neural network model.
Software development for teleroentgenogram analysis
NASA Astrophysics Data System (ADS)
Goshkoderov, A. A.; Khlebnikov, N. A.; Obabkov, I. N.; Serkov, K. V.; Gajniyarov, I. M.; Aliev, A. A.
2017-09-01
A framework for the analysis and calculation of teleroentgenograms was developed. Software development was carried out in the Department of Children's Dentistry and Orthodontics at Ural State Medical University. The software calculates the teleroentgenogram using an original method developed in that department. The program also allows users to design their own methods for calculating teleroentgenograms. It is planned to incorporate machine learning (neural networks) into the software, which will simplify the calculation of teleroentgenograms because methodological landmark points will be placed automatically.
The Accounting Network: How Financial Institutions React to Systemic Crisis
Puliga, Michelangelo; Flori, Andrea; Pappalardo, Giuseppe; Chessa, Alessandro; Pammolli, Fabio
2016-01-01
The role of Network Theory in the study of the financial crisis has been widely spotted in the latest years. It has been shown how the network topology and the dynamics running on top of it can trigger the outbreak of large systemic crisis. Following this methodological perspective we introduce here the Accounting Network, i.e. the network we can extract through vector similarities techniques from companies’ financial statements. We build the Accounting Network on a large database of worldwide banks in the period 2001–2013, covering the onset of the global financial crisis of mid-2007. After a careful data cleaning, we apply a quality check in the construction of the network, introducing a parameter (the Quality Ratio) capable of trading off the size of the sample (coverage) and the representativeness of the financial statements (accuracy). We compute several basic network statistics and check, with the Louvain community detection algorithm, for emerging communities of banks. Remarkably enough sensible regional aggregations show up with the Japanese and the US clusters dominating the community structure, although the presence of a geographically mixed community points to a gradual convergence of banks into similar supranational practices. Finally, a Principal Component Analysis procedure reveals the main economic components that influence communities’ heterogeneity. Even using the most basic vector similarity hypotheses on the composition of the financial statements, the signature of the financial crisis clearly arises across the years around 2008. We finally discuss how the Accounting Networks can be improved to reflect the best practices in the financial statement analysis. PMID:27736865
The Accounting Network: How Financial Institutions React to Systemic Crisis.
Puliga, Michelangelo; Flori, Andrea; Pappalardo, Giuseppe; Chessa, Alessandro; Pammolli, Fabio
2016-01-01
The role of Network Theory in the study of the financial crisis has been widely spotted in the latest years. It has been shown how the network topology and the dynamics running on top of it can trigger the outbreak of large systemic crisis. Following this methodological perspective we introduce here the Accounting Network, i.e. the network we can extract through vector similarities techniques from companies' financial statements. We build the Accounting Network on a large database of worldwide banks in the period 2001-2013, covering the onset of the global financial crisis of mid-2007. After a careful data cleaning, we apply a quality check in the construction of the network, introducing a parameter (the Quality Ratio) capable of trading off the size of the sample (coverage) and the representativeness of the financial statements (accuracy). We compute several basic network statistics and check, with the Louvain community detection algorithm, for emerging communities of banks. Remarkably enough sensible regional aggregations show up with the Japanese and the US clusters dominating the community structure, although the presence of a geographically mixed community points to a gradual convergence of banks into similar supranational practices. Finally, a Principal Component Analysis procedure reveals the main economic components that influence communities' heterogeneity. Even using the most basic vector similarity hypotheses on the composition of the financial statements, the signature of the financial crisis clearly arises across the years around 2008. We finally discuss how the Accounting Networks can be improved to reflect the best practices in the financial statement analysis.
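The construction step described in both records above can be sketched as follows (synthetic balance-sheet vectors, an assumed similarity cut-off, and networkx's Louvain routine, which requires networkx 2.8 or later, stand in for the paper's data and pipeline): statement vectors are compared by cosine similarity, strong similarities become weighted edges, and communities are extracted.

```python
# Accounting-network sketch: cosine similarity of financial-statement vectors,
# thresholded into weighted edges, then Louvain community detection.
import numpy as np
import networkx as nx
from sklearn.metrics.pairwise import cosine_similarity

rng = np.random.default_rng(7)
n_banks, n_items = 40, 12
profiles = np.vstack([rng.normal(0.0, 1.0, (20, n_items)),      # two synthetic "regions"
                      rng.normal(1.5, 1.0, (20, n_items))])
sim = cosine_similarity(profiles)

g = nx.Graph()
g.add_nodes_from(range(n_banks))
threshold = 0.5                                   # assumed similarity cut-off
for i in range(n_banks):
    for j in range(i + 1, n_banks):
        if sim[i, j] > threshold:
            g.add_edge(i, j, weight=sim[i, j])

communities = nx.community.louvain_communities(g, weight="weight", seed=0)
print("number of communities:", len(communities))
print("community sizes:", sorted(len(c) for c in communities))
```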
NASA Astrophysics Data System (ADS)
Mitra, Ashis; Majumdar, Prabal Kumar; Bannerjee, Debamalya
2013-03-01
This paper presents a comparative analysis of two modeling methodologies for the prediction of air permeability of plain woven handloom cotton fabrics. Four basic fabric constructional parameters, namely ends per inch, picks per inch, warp count and weft count, have been used as inputs for artificial neural network (ANN) and regression models. Out of the four regression models tried, the interaction model showed very good prediction performance with a meager mean absolute error of 2.017 %. However, ANN models demonstrated superiority over the regression models both in terms of correlation coefficient and mean absolute error. The ANN model with 10 nodes in the single hidden layer showed very good correlation coefficients of 0.982 and 0.929 and mean absolute errors of only 0.923 and 2.043 % for training and testing data respectively.
Analysis of biofluids in aqueous environment based on mid-infrared spectroscopy.
Fabian, Heinz; Lasch, Peter; Naumann, Dieter
2005-01-01
In this study we describe a semiautomatic Fourier transform infrared spectroscopic methodology for the analysis of liquid serum samples, which combines simple sample introduction with high sample throughput. The applicability of this new infrared technology to the analysis of liquid serum samples from a cohort of cattle naturally infected with bovine spongiform encephalopathy and from controls was explored in comparison to the conventional approach based on transmission infrared spectroscopy of dried serum films. Artificial neural network analysis of the infrared data was performed to differentiate between bovine spongiform encephalopathy-negative controls and animals in the late stage of the disease. After training of artificial neural network classifiers, infrared spectra of sera from an independent external validation data set were analyzed. In this way, sensitivities between 90 and 96% and specificities between 84 and 92% were achieved, respectively, depending upon the strategy of data collection and data analysis. Based on these results, the advantages and limitations of the liquid sample technique and the dried film approach for routine analysis of biofluids are discussed. 2005 Society of Photo-Optical Instrumentation Engineers.
NASA Astrophysics Data System (ADS)
Núñez, M.; Robie, T.; Vlachos, D. G.
2017-10-01
Kinetic Monte Carlo (KMC) simulation provides insights into catalytic reactions unobtainable with either experiments or mean-field microkinetic models. Sensitivity analysis of KMC models assesses the robustness of the predictions to parametric perturbations and identifies rate determining steps in a chemical reaction network. Stiffness in the chemical reaction network, a ubiquitous feature, demands lengthy run times for KMC models and renders efficient sensitivity analysis based on the likelihood ratio method unusable. We address the challenge of efficiently conducting KMC simulations and performing accurate sensitivity analysis in systems with unknown time scales by employing two acceleration techniques: rate constant rescaling and parallel processing. We develop statistical criteria that ensure sufficient sampling of non-equilibrium steady state conditions. Our approach provides the twofold benefit of accelerating the simulation itself and enabling likelihood ratio sensitivity analysis, which provides further speedup relative to finite difference sensitivity analysis. As a result, the likelihood ratio method can be applied to real chemistry. We apply our methodology to the water-gas shift reaction on Pt(111).
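A minimal Gillespie-type sketch of the rate-constant rescaling idea is given below (a toy A <-> B <-> C reaction set, not the water-gas shift network on Pt(111), and with none of the likelihood-ratio machinery): scaling down the fast, quasi-equilibrated pair preserves its equilibrium ratio while reaching the slow product-forming step with far fewer simulated events.

```python
# Stochastic simulation (Gillespie) with rescaling of a fast reversible pair.
import numpy as np

rng = np.random.default_rng(8)
stoich = np.array([[-1,  1, 0],    # A -> B   (fast)
                   [ 1, -1, 0],    # B -> A   (fast)
                   [ 0, -1, 1]])   # B -> C   (slow, rate-determining)

def ssa(x0, k, t_end):
    x, t, events = np.array(x0, dtype=float), 0.0, 0
    while t < t_end:
        a = np.array([k[0] * x[0], k[1] * x[1], k[2] * x[1]])   # propensities
        a0 = a.sum()
        if a0 <= 0:
            break
        t += rng.exponential(1.0 / a0)
        x += stoich[rng.choice(3, p=a / a0)]
        events += 1
    return x, events

k = np.array([50.0, 45.0, 1.0])
k_scaled = k.copy()
k_scaled[:2] /= 10.0                 # rescale the fast pair; equilibrium ratio is preserved

for label, rates in [("original", k), ("rescaled", k_scaled)]:
    x, events = ssa([100, 0, 0], rates, t_end=2.0)
    print(f"{label:9s} final [A, B, C] = {x}, KMC events = {events}")
```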
Violence on Canadian Television Networks
Paquette, Guy
2004-01-01
Introduction Over the past twenty years, the question of the effects of violence on television has figured prominently in public opinion and hundreds of studies have been devoted to this subject. Many researchers have determined that violence has a negative impact on behavior. The public, broadcasters and political figures all support the idea of reducing the total amount of violence on television - in particular in shows for children. A thousand programs aired between 1993 and 2001 on major non-specialty television networks in Canada were analyzed: TVA, TQS, as well as CTV and Global, private French and English networks, as well as the English CBC Radio and French Radio-Canada for the public networks. Method The methodology consists of a classic analysis of content where an act of violence constitutes a unit of analysis. Results The data collected revealed that the amount of violence has increased regularly since 1993 despite the stated willingness on the part of broadcasters to produce programs with less violence. The total number of violent acts, as well as the number of violent acts per hour, is increasing. Private networks deliver three times more violence than public networks. Researchers have also noted that a high proportion of violence occurs in programs airing before 21:00 hours, thereby exposing a large number of children to this violence. Conclusion Psychological violence is taking on a more significant role in Canadian Television. PMID:19030148
Violence on Canadian television networks.
Paquette, Guy
2004-02-01
Over the past twenty years, the question of the effects of violence on television has figured prominently in public opinion and hundreds of studies have been devoted to this subject. Many researchers have determined that violence has a negative impact on behavior. The public, broadcasters and political figures all support the idea of reducing the total amount of violence on television - in particular in shows for children. A thousand programs aired between 1993 and 2001 on major non-specialty television networks in Canada were analyzed: TVA, TQS, as well as CTV and Global, private French and English networks, as well as the English CBC Radio and French Radio-Canada for the public networks. The methodology consists of a classic analysis of content where an act of violence constitutes a unit of analysis. The data collected revealed that the amount of violence has increased regularly since 1993 despite the stated willingness on the part of broadcasters to produce programs with less violence. The total number of violent acts, as well as the number of violent acts per hour, is increasing. Private networks deliver three times more violence than public networks. Researchers have also noted that a high proportion of violence occurs in programs airing before 21:00 hours, thereby exposing a large number of children to this violence. Psychological violence is taking on a more significant role in Canadian Television.
Morel, Carlos Medicis; Serruya, Suzanne Jacob; Penna, Gerson Oliveira; Guimarães, Reinaldo
2009-01-01
Background New approaches and tools were needed to support the strategic planning, implementation and management of a Program launched by the Brazilian Government to fund research, development and capacity building on neglected tropical diseases with strong focus on the North, Northeast and Center-West regions of the country where these diseases are prevalent. Methodology/Principal Findings Based on demographic, epidemiological and burden of disease data, seven diseases were selected by the Ministry of Health as targets of the initiative. Publications on these diseases by Brazilian researchers were retrieved from international databases, analyzed and processed with text-mining tools in order to standardize author and institution names and addresses. Co-authorship networks based on these publications were assembled, visualized and analyzed with social network analysis software packages. Network visualization and analysis generated new information, allowing better design and strategic planning of the Program, enabling decision makers to characterize network components by area of work, identify institutions as well as authors playing major roles as central hubs or located at critical network cut-points, and readily detect authors or institutions participating in large international scientific collaborating networks. Conclusions/Significance Traditional criteria used to monitor and evaluate research proposals or R&D Programs, such as researchers' productivity and impact factor of scientific publications, are of limited value when addressing research areas of low productivity or involving institutions from endemic regions where human resources are limited. Network analysis was found to generate new and valuable information relevant to the strategic planning, implementation and monitoring of the Program. It afforded the funding agencies a more proactive role in relation to public health and equity goals and to scientific capacity-building objectives, and a more consistent engagement of institutions and authors from endemic regions based on innovative criteria and parameters anchored in objective scientific data. PMID:19688044
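A minimal sketch of the kind of co-authorship analysis described above: publications are turned into a weighted collaboration graph, and central hubs and cut-points are located with standard network measures. The author names and papers are hypothetical placeholders, and networkx stands in for whichever social network analysis package was actually used.

```python
# Sketch: building a co-authorship network from a publication list and locating
# hubs and cut-points. Authors and papers are hypothetical placeholders.
import itertools
import networkx as nx

papers = [
    {"authors": ["Silva", "Costa", "Almeida"]},
    {"authors": ["Silva", "Pereira"]},
    {"authors": ["Pereira", "Costa", "Souza"]},
    {"authors": ["Souza", "Lima"]},
]

G = nx.Graph()
for paper in papers:
    for a, b in itertools.combinations(paper["authors"], 2):
        if G.has_edge(a, b):
            G[a][b]["weight"] += 1          # repeated collaborations strengthen ties
        else:
            G.add_edge(a, b, weight=1)

# Central hubs: authors with the highest betweenness centrality.
centrality = nx.betweenness_centrality(G)
hubs = sorted(centrality, key=centrality.get, reverse=True)[:3]
print("central authors:", hubs)

# Cut-points: authors whose removal disconnects parts of the network.
print("cut-point authors:", list(nx.articulation_points(G)))
```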
HOLA: Human-like Orthogonal Network Layout.
Kieffer, Steve; Dwyer, Tim; Marriott, Kim; Wybrow, Michael
2016-01-01
Over the last 50 years a wide variety of automatic network layout algorithms have been developed. Some are fast heuristic techniques suitable for networks with hundreds of thousands of nodes while others are multi-stage frameworks for higher-quality layout of smaller networks. However, despite decades of research, no current algorithm produces layouts of comparable quality to those of a human. We give a new "human-centred" methodology for automatic network layout algorithm design that is intended to overcome this deficiency. User studies are first used to identify the aesthetic criteria algorithms should encode; then an algorithm is developed that is informed by these criteria; and finally, a follow-up study evaluates the algorithm's output. We have used this new methodology to develop an automatic orthogonal network layout method, HOLA, that achieves measurably better (by user study) layout than the best available orthogonal layout algorithm and which produces layouts of comparable quality to those produced by hand.
Bolia, Robert S; Nelson, W Todd
2007-05-01
The recently promulgated doctrine of network-centric warfare suggests that increases in shared situation awareness and self-synchronization will be emergent properties of densely connected military networks. What it fails to say is how these enhancements are to be measured. The present article frames the discussion as a question of how to characterize team performance, and considers such performance in the context of its hypothetical components: situation awareness, workload, and error. This examination concludes that reliable measures of these constructs are lacking for teams, even when they exist for individual operators, and that this is due to philosophical and/or methodological flaws in their conceptual development. Additional research is recommended to overcome these deficiencies, as well as consideration of novel multidisciplinary approaches that draw on methodologies employed in the social, physical, and biological sciences.
Willis, Cameron; Kernoghan, Alison; Riley, Barbara; Popp, Janice; Best, Allan; Milward, H Brinton
2015-11-19
We conducted a mixed methods study from June 2014 to March 2015 to assess the perspectives of stakeholders in networks that adopt a population approach for chronic disease prevention (CDP). The purpose of the study was to identify important and feasible outcome measures for monitoring network performance. Participants from CDP networks in Canada completed an online concept mapping exercise, which was followed by interviews with network stakeholders to further understand the findings. Nine concepts were considered important outcomes of CDP networks: enhanced learning, improved use of resources, enhanced or increased relationships, improved collaborative action, network cohesion, improved system outcomes, improved population health outcomes, improved practice and policy planning, and improved intersectoral engagement. Three themes emerged from participant interviews related to measurement of the identified concepts: the methodological difficulties in measuring network outcomes, the dynamic nature of network evolution and function and implications for outcome assessment, and the challenge of measuring multisectoral engagement in CDP networks. Results from this study provide initial insights into concepts that can be used to describe the outcomes of networks for CDP and may offer foundations for strengthening network outcome-monitoring strategies and methodologies.
Kernoghan, Alison; Riley, Barbara; Popp, Janice; Best, Allan; Milward, H. Brinton
2015-01-01
Introduction We conducted a mixed methods study from June 2014 to March 2015 to assess the perspectives of stakeholders in networks that adopt a population approach for chronic disease prevention (CDP). The purpose of the study was to identify important and feasible outcome measures for monitoring network performance. Methods Participants from CDP networks in Canada completed an online concept mapping exercise, which was followed by interviews with network stakeholders to further understand the findings. Results Nine concepts were considered important outcomes of CDP networks: enhanced learning, improved use of resources, enhanced or increased relationships, improved collaborative action, network cohesion, improved system outcomes, improved population health outcomes, improved practice and policy planning, and improved intersectoral engagement. Three themes emerged from participant interviews related to measurement of the identified concepts: the methodological difficulties in measuring network outcomes, the dynamic nature of network evolution and function and implications for outcome assessment, and the challenge of measuring multisectoral engagement in CDP networks. Conclusion Results from this study provide initial insights into concepts that can be used to describe the outcomes of networks for CDP and may offer foundations for strengthening network outcome-monitoring strategies and methodologies. PMID:26583571
Sanz-Cabanillas, Juan Luis; Ruano, Juan; Gomez-Garcia, Francisco; Alcalde-Mellado, Patricia; Gay-Mimbrera, Jesus; Aguilar-Luque, Macarena; Maestre-Lopez, Beatriz; Gonzalez-Padilla, Marcelino; Carmona-Fernandez, Pedro J; Velez Garcia-Nieto, Antonio; Isla-Tejera, Beatriz
2017-01-01
Moderate-to-severe psoriasis is associated with significant comorbidity, an impaired quality of life, and increased medical costs, including those associated with treatments. Systematic reviews (SRs) and meta-analyses (MAs) of randomized clinical trials are considered two of the best approaches to the summarization of high-quality evidence. However, methodological bias can reduce the validity of conclusions from these types of studies and subsequently impair the quality of decision making. As co-authorship is among the most well-documented forms of research collaboration, the present study aimed to explore whether authors' collaboration methods might influence the methodological quality of SRs and MAs of psoriasis. Methodological quality was assessed by two raters who extracted information from full articles. After calculating total and per-item Assessment of Multiple Systematic Reviews (AMSTAR) scores, reviews were classified as low (0-4), medium (5-8), or high (9-11) quality. Article metadata and journal-related bibliometric indices were also obtained. A total of 741 authors from 520 different institutions and 32 countries published 220 reviews that were classified as high (17.2%), moderate (55%), or low (27.7%) methodological quality. The high methodological quality subnetwork was larger but had a lower connection density than the low and moderate methodological quality subnetworks; specifically, the former contained relatively fewer nodes (authors and reviews), reviews by authors, and collaborators per author. Furthermore, the high methodological quality subnetwork was highly compartmentalized, with several modules representing few poorly interconnected communities. In conclusion, structural differences in the author-paper affiliation network may influence the methodological quality of SRs and MAs on psoriasis. As the author-paper affiliation network structure affects study quality in this research field, authors who maintain an appropriate balance between scientific quality and productivity are more likely to develop higher-quality reviews.
Mehri, M
2012-12-01
An artificial neural network (ANN) approach was used to develop feed-forward multilayer perceptron models to estimate the nutritional requirements of digestible lysine (dLys), methionine (dMet), and threonine (dThr) in broiler chicks. Sixty data lines representing response of the broiler chicks during 3 to 16 d of age to dietary levels of dLys (0.88-1.32%), dMet (0.42-0.58%), and dThr (0.53-0.87%) were obtained from literature and used to train the networks. The prediction values of ANN were compared with those of response surface methodology to evaluate the fitness of these 2 methods. The models were tested using R(2), mean absolute deviation, mean absolute percentage error, and absolute average deviation. The random search algorithm was used to optimize the developed ANN models to estimate the optimal values of dietary dLys, dMet, and dThr. The ANN models were used to assess the relative importance of each dietary input on the bird performance using sensitivity analysis. The statistical evaluations revealed the higher accuracy of ANN to predict the bird performance compared with response surface methodology models. The optimization results showed that the maximum BW gain may be obtained with dietary levels of 1.11, 0.51, and 0.78% of dLys, dMet, and dThr, respectively. Minimum feed conversion ratio may be achieved with dietary levels of 1.13, 0.54, 0.78% of dLys, dMet, and dThr, respectively. The sensitivity analysis on the models indicated that dietary Lys is the most important variable in the growth performance of the broiler chicks, followed by dietary Thr and Met. The results of this research revealed that the experimental data of a response-surface-methodology design could be successfully used to develop the well-designed ANN for pattern recognition of bird growth and optimization of nutritional requirements. The comparison between the 2 methods also showed that the statistical methods may have little effect on the ideal ratios of dMet and dThr to dLys in broiler chicks using multivariate optimization.
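A hedged sketch of the ANN-plus-random-search workflow described above: a small feed-forward network is fitted to diet/response data and a random search over the feasible nutrient ranges locates the diet with the highest predicted gain. The synthetic response surface, noise level, and network size are assumptions, not the paper's data or model.

```python
# Sketch: fitting a feed-forward ANN to diet/response data and using random
# search over the input space to locate the diet that maximises predicted gain.
# The toy response surface and nutrient ranges are illustrative assumptions.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
bounds = np.array([[0.88, 1.32],    # digestible lysine, %
                   [0.42, 0.58],    # digestible methionine, %
                   [0.53, 0.87]])   # digestible threonine, %

# Toy quadratic response surface standing in for the literature data (60 lines).
X = rng.uniform(bounds[:, 0], bounds[:, 1], size=(60, 3))
optimum = np.array([1.11, 0.51, 0.78])
y = 400 - 800 * np.sum((X - optimum) ** 2, axis=1) + rng.normal(0, 2, 60)  # BW gain, g

ann = make_pipeline(StandardScaler(),
                    MLPRegressor(hidden_layer_sizes=(8,), solver="lbfgs",
                                 max_iter=20000, random_state=0)).fit(X, y)

# Random search over the feasible region for the diet maximising predicted gain.
candidates = rng.uniform(bounds[:, 0], bounds[:, 1], size=(100000, 3))
best = candidates[np.argmax(ann.predict(candidates))]
print("estimated optimal dLys/dMet/dThr (%):", np.round(best, 3))
```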
Assessment of SRS ambient air monitoring network
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abbott, K.; Jannik, T.
Three methodologies have been used to assess the effectiveness of the existing ambient air monitoring system in place at the Savannah River Site in Aiken, SC. Effectiveness was measured using two metrics that have been utilized in previous quantification of air-monitoring network performance: frequency of detection (a measurement of how frequently a minimum number of samplers within the network detect an event), and network intensity (a measurement of how consistent each sampler within the network is at detecting events). In addition to determining the effectiveness of the current system, the objective of performing this assessment was to determine what, if any, changes could make the system more effective. Methodologies included 1) the Waite method of determining sampler distribution, 2) the CAP88-PC annual dose model, and 3) a puff/plume transport model used to predict air concentrations at sampler locations. Data collected from air samplers at SRS in 2015 compared with predicted data resulting from the methodologies determined that the frequency of detection for the current system is 79.2%, with sampler efficiencies ranging from 5% to 45% and a mean network intensity of 21.5%. One of the air monitoring stations had an efficiency of less than 10% and detected releases during just one sampling period of the entire year, adding little to the overall network intensity. By moving or removing this sampler, the mean network intensity increased to about 23%. Further work in increasing the network intensity and simulating accident scenarios to further test the ambient air system at SRS is planned.
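The two metrics can be made concrete with a short sketch: given a boolean detection matrix (sampling periods by samplers), frequency of detection is the fraction of events seen by at least a minimum number of stations, and network intensity is the mean per-sampler detection efficiency. The matrix, station count, and threshold below are illustrative assumptions, not SRS data.

```python
# Sketch: frequency of detection and network intensity computed from a boolean
# detection matrix (rows = release events, columns = samplers).
# The matrix, sampler count, and threshold are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n_events, n_samplers = 52, 14                    # e.g. weekly periods, 14 stations
detected = rng.random((n_events, n_samplers)) < rng.uniform(0.05, 0.45, n_samplers)

min_samplers = 1   # minimum number of samplers that must see an event

# Frequency of detection: fraction of events seen by at least `min_samplers` stations.
freq_detection = np.mean(detected.sum(axis=1) >= min_samplers)

# Sampler efficiency: fraction of events each station detects;
# network intensity: the mean efficiency across stations.
efficiency = detected.mean(axis=0)
network_intensity = efficiency.mean()

print(f"frequency of detection: {100 * freq_detection:.1f}%")
print("sampler efficiencies (%):", np.round(100 * efficiency, 1))
print(f"network intensity: {100 * network_intensity:.1f}%")
```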
Kumar, Avishek; Butler, Brandon M; Kumar, Sudhir; Ozkan, S Banu
2015-12-01
Sequencing technologies are revealing many new non-synonymous single nucleotide variants (nsSNVs) in each personal exome. To assess their functional impacts, comparative genomics is frequently employed to predict if they are benign or not. However, evolutionary analysis alone is insufficient, because it misdiagnoses many disease-associated nsSNVs, such as those at positions involved in protein interfaces, and because evolutionary predictions do not provide mechanistic insights into functional change or loss. Structural analyses can aid in overcoming both of these problems by incorporating conformational dynamics and allostery in nsSNV diagnosis. Finally, protein-protein interaction networks using systems-level methodologies shed light on disease etiology and pathogenesis. Bridging these network approaches with structurally resolved protein interactions and dynamics will advance genomic medicine. Copyright © 2015 Elsevier Ltd. All rights reserved.
Dual Arm Work Package performance estimates and telerobot task network simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Draper, J.V.; Blair, L.M.
1997-02-01
This paper describes the methodology and results of a network simulation study of the Dual Arm Work Package (DAWP), to be employed for dismantling the Argonne National Laboratory CP-5 reactor. The development of the simulation model was based upon the results of a task analysis for the same system. This study was performed by the Oak Ridge National Laboratory (ORNL), in the Robotics and Process Systems Division. Funding was provided by the US Department of Energy's Office of Technology Development, Robotics Technology Development Program (RTDP). The RTDP is developing methods of computer simulation to estimate telerobotic system performance. Data were collected to provide point estimates to be used in a task network simulation model. Three skilled operators performed six repetitions of a pipe cutting task representative of typical teleoperation cutting operations.
Exploring Educational and Cultural Adaptation through Social Networking Sites
ERIC Educational Resources Information Center
Ryan, Sherry D.; Magro, Michael J.; Sharp, Jason H.
2011-01-01
Social networking sites have seen tremendous growth and are widely used around the world. Nevertheless, the use of social networking sites in educational contexts is an under explored area. This paper uses a qualitative methodology, autoethnography, to investigate how social networking sites, specifically Facebook[TM], can help first semester…
Controlling allosteric networks in proteins
NASA Astrophysics Data System (ADS)
Dokholyan, Nikolay
2013-03-01
We present a novel methodology based on graph theory and discrete molecular dynamics simulations for delineating allosteric pathways in proteins. We use this methodology to uncover the structural mechanisms responsible for coupling of distal sites on proteins and utilize it for allosteric modulation of proteins. We will present examples where inference of allosteric networks and their rewiring allows us to "rescue" cystic fibrosis transmembrane conductance regulator (CFTR), a protein associated with the fatal genetic disease cystic fibrosis. We also use our methodology to control protein function allosterically. We design a novel protein domain that can be inserted into an identified allosteric site of a target protein. Using a drug that binds to our domain, we alter the function of the target protein. We successfully tested this methodology in vitro, in living cells and in zebrafish. We further demonstrate transferability of our allosteric modulation methodology to other systems and extend it to become light-activatable.
Source-reconstruction of the sensorimotor network from resting-state macaque electrocorticography.
Hindriks, R; Micheli, C; Bosman, C A; Oostenveld, R; Lewis, C; Mantini, D; Fries, P; Deco, G
2018-06-07
The discovery of hemodynamic (BOLD-fMRI) resting-state networks (RSNs) has brought about a fundamental shift in our thinking about the role of intrinsic brain activity. The electrophysiological underpinnings of RSNs remain largely elusive and it has been shown only recently that electric cortical rhythms are organized into the same RSNs as hemodynamic signals. Most electrophysiological studies into RSNs use magnetoencephalography (MEG) or scalp electroencephalography (EEG), which limits the spatial resolution with which electrophysiological RSNs can be observed. Due to their close proximity to the cortical surface, electrocorticographic (ECoG) recordings can potentially provide a more detailed picture of the functional organization of resting-state cortical rhythms, albeit at the expense of spatial coverage. In this study we propose using source-space spatial independent component analysis (spatial ICA) for identifying generators of resting-state cortical rhythms as recorded with ECoG and for reconstructing their functional connectivity. Network structure is assessed by two kinds of connectivity measures: instantaneous correlations between band-limited amplitude envelopes and oscillatory phase-locking. By simulating rhythmic cortical generators, we find that the reconstruction of oscillatory phase-locking is more challenging than that of amplitude correlations, particularly for low signal-to-noise levels. Specifically, phase-lags can both be over- and underestimated, which troubles the interpretation of lag-based connectivity measures. We illustrate the methodology on somatosensory beta rhythms recorded from a macaque monkey using ECoG. The methodology decomposes the resting-state sensorimotor network into three cortical generators, distributed across primary somatosensory and primary and higher-order motor areas. The generators display significant and reproducible amplitude correlations and phase-locking values with non-zero lags. Our findings illustrate the level of spatial detail attainable with source-projected ECoG and motivates wider use of the methodology for studying resting-state as well as event-related cortical dynamics in macaque and human. Copyright © 2018. Published by Elsevier Inc.
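The two connectivity measures discussed above, band-limited amplitude-envelope correlation and phase locking, can be sketched for a pair of simulated beta-band signals as follows; the sampling rate, band edges, and coupling are assumptions.

```python
# Sketch: amplitude-envelope correlation and phase-locking value for two
# simulated beta-band signals. Sampling rate, band, and coupling are assumptions.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs, dur = 1000, 10.0
t = np.arange(0, dur, 1 / fs)
rng = np.random.default_rng(0)

# Two noisy 20 Hz oscillations with a fixed phase offset (to induce coupling).
x = np.sin(2 * np.pi * 20 * t) + 0.5 * rng.standard_normal(t.size)
y = np.sin(2 * np.pi * 20 * t - 0.3) + 0.5 * rng.standard_normal(t.size)

# Band-pass filter to the beta band (13-30 Hz) and take the analytic signal.
b, a = butter(4, [13, 30], btype="bandpass", fs=fs)
ax_, ay_ = hilbert(filtfilt(b, a, x)), hilbert(filtfilt(b, a, y))

# Amplitude-envelope correlation.
env_corr = np.corrcoef(np.abs(ax_), np.abs(ay_))[0, 1]

# Phase-locking value and mean phase lag.
dphi = np.angle(ax_) - np.angle(ay_)
plv = np.abs(np.mean(np.exp(1j * dphi)))
mean_lag = np.angle(np.mean(np.exp(1j * dphi)))

print(f"envelope correlation: {env_corr:.2f}")
print(f"phase-locking value: {plv:.2f}, mean phase lag: {mean_lag:.2f} rad")
```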
Hybrid analysis for indicating patients with breast cancer using temperature time series.
Silva, Lincoln F; Santos, Alair Augusto S M D; Bravo, Renato S; Silva, Aristófanes C; Muchaluat-Saade, Débora C; Conci, Aura
2016-07-01
Breast cancer is the most common cancer among women worldwide. Diagnosis and treatment in early stages increase cure chances. The temperature of cancerous tissue is generally higher than that of healthy surrounding tissues, making thermography an option to be considered in screening strategies of this cancer type. This paper proposes a hybrid methodology for analyzing dynamic infrared thermography in order to indicate patients with risk of breast cancer, using unsupervised and supervised machine learning techniques, which characterizes the methodology as hybrid. The dynamic infrared thermography monitors or quantitatively measures temperature changes on the examined surface, after a thermal stress. In the dynamic infrared thermography execution, a sequence of breast thermograms is generated. In the proposed methodology, this sequence is processed and analyzed by several techniques. First, the region of the breasts is segmented and the thermograms of the sequence are registered. Then, temperature time series are built and the k-means algorithm is applied on these series using various values of k. Clustering formed by k-means algorithm, for each k value, is evaluated using clustering validation indices, generating values treated as features in the classification model construction step. A data mining tool was used to solve the combined algorithm selection and hyperparameter optimization (CASH) problem in classification tasks. Besides the classification algorithm recommended by the data mining tool, classifiers based on Bayesian networks, neural networks, decision rules and decision tree were executed on the data set used for evaluation. Test results support that the proposed analysis methodology is able to indicate patients with breast cancer. Among 39 tested classification algorithms, K-Star and Bayes Net presented 100% classification accuracy. Furthermore, among the Bayes Net, multi-layer perceptron, decision table and random forest classification algorithms, an average accuracy of 95.38% was obtained. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
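A hedged sketch of the unsupervised half of the hybrid pipeline: per-pixel temperature time series are clustered with k-means for several values of k, and clustering-validity indices become the features later handed to a supervised classifier. The synthetic series and the choice of indices are assumptions.

```python
# Sketch: clustering temperature time series with k-means for several k values
# and using clustering-validity indices as features for later classification.
# The synthetic series and the chosen indices are illustrative assumptions.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import davies_bouldin_score, silhouette_score

rng = np.random.default_rng(0)
# 500 pixels x 20 time points: a warmer group that rewarms slowly, a cooler one faster.
slow = 34.0 + 0.5 * rng.standard_normal((250, 1)) + np.linspace(0, 0.5, 20)
fast = 32.0 + 0.5 * rng.standard_normal((250, 1)) + np.linspace(0, 1.5, 20)
series = np.vstack([slow, fast])

features = {}
for k in range(2, 6):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(series)
    features[f"silhouette_k{k}"] = silhouette_score(series, labels)
    features[f"davies_bouldin_k{k}"] = davies_bouldin_score(series, labels)

# These index values would then feed a supervised classifier (e.g. Bayes net,
# random forest) trained to flag patients at risk.
for name, value in features.items():
    print(f"{name}: {value:.3f}")
```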
The complexity and robustness of metro networks
NASA Astrophysics Data System (ADS)
Derrible, Sybil; Kennedy, Christopher
2010-09-01
Transportation systems, being real-life examples of networks, are particularly interesting to analyze from the viewpoint of the new and rapidly emerging field of network science. Two concepts seem to be particularly relevant: scale-free patterns and small-worlds. By looking at 33 metro systems in the world, this paper adapts network science methodologies to the transportation literature, and offers one application to the robustness of metros; here, metro refers to urban rail transit with exclusive right-of-way, whether it is underground, at grade or elevated. We find that most metros are indeed scale-free (with scaling factors ranging from 2.10 to 5.52) and small-worlds; they show atypical behaviors, however, with increasing size. In particular, the presence of transfer-hubs (stations hosting more than three lines) results in relatively large scaling factors. The analysis provides insights/recommendations for increasing the robustness of metro networks. Smaller networks should focus on creating transfer stations, thus generating cycles to offer alternative routes. For larger networks, a few stations seem to hold a near-monopoly on transfers; it is therefore important to create additional transfer points, possibly at the periphery of city centers. The Tokyo system seems to incorporate these properties remarkably well.
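A small sketch of the small-world and scale-free diagnostics used above, applied to a synthetic stand-in network rather than a real metro; the graph model, hub threshold, and the crude log-log degree fit are assumptions.

```python
# Sketch: small-world and scale-free diagnostics for a toy transit-like graph.
# The example network and thresholds are synthetic stand-ins, not metro data.
import numpy as np
import networkx as nx

G = nx.connected_watts_strogatz_graph(n=120, k=4, p=0.1, seed=0)  # stand-in "metro"

# Small-world indicators: high clustering together with short average path length.
print("average clustering:", round(nx.average_clustering(G), 3))
print("average shortest path:", round(nx.average_shortest_path_length(G), 2))

# Crude scale-free check: slope of the degree distribution on log-log axes.
degrees = np.array([d for _, d in G.degree()])
values, counts = np.unique(degrees, return_counts=True)
slope, _ = np.polyfit(np.log(values), np.log(counts), 1)
print("log-log degree-distribution slope (≈ -gamma):", round(slope, 2))

# Candidate transfer hubs: stations with degree above a chosen threshold.
hubs = [node for node, d in G.degree() if d >= 5]
print("candidate transfer hubs:", hubs[:10])
```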
Jacobs, Wura; Goodson, Patricia; Barry, Adam E; McLeroy, Kenneth R
2016-05-01
Despite previous research indicating that adolescents' alcohol, tobacco, and other drug (ATOD) use is dependent upon their sex and the sex composition of their social network, few social network studies consider sex differences and network sex composition as determinants of adolescents' ATOD use behavior. This systematic literature review of how social network analytic studies examine adolescent ATOD use behavior is guided by the following research questions: (1) How do studies conceptualize sex and network sex composition? (2) What types of network affiliations are employed to characterize adolescent networks? (3) What is the methodological quality of included studies? After searching several electronic databases (PsycINFO, EBSCO, and Communication Abstracts) and applying our inclusion/exclusion criteria, 48 studies were included in the review. Overall, few studies considered the sex composition of the networks in which adolescents are embedded as a determinant of adolescent ATOD use. Although the included studies all exhibited high methodological quality, the majority only used friendship networks to characterize adolescent social networks and subsequently failed to capture the influence of other network types, such as romantic networks. School-based prevention programs could be strengthened by (1) selecting and targeting peer leaders based on sex, and (2) leveraging other types of social networks beyond simply friendships. © 2016, American School Health Association.
Social network analysis of child and adult interorganizational connections.
Davis, Maryann; Koroloff, Nancy; Johnsen, Matthew
2012-01-01
Because most programs serve either children and their families or adults, a critical component of service and treatment continuity in mental health and related services for individuals transitioning into adulthood (ages 14-25) is coordination across programs on either side of the adult age divide. This study was conducted in Clark County, Washington, a community that had received a Partnership for Youth Transition grant from the Federal Center for Mental Health Services. Social Network Analysis methodology was used to describe the strength and direction of each organization's relationship to other organizations in the transition network. Interviews were conducted before grant implementation (n=103) and again four years later (n=99). The findings of the study revealed significant changes in the nature of relationships between organizations over time. While the overall density of the transition service network remained stable, specific ways of connecting did change. Some activities became more decentralized while others became more inclusive as evidenced by the increase in size of the largest K-core. This was particularly true for the activity of "receiving referrals." These changes reflected more direct contact between child and adult serving organizations. The two separate child and adult systems identified at baseline appeared more integrated by the end of the grant period. Having greater connectivity among all organizations regardless of ages served should benefit youth and young adults of transition age. This study provides further evidence that Social Network Analysis is a useful method for measuring change in service system integration over time.
Bianconi, Fortunato; Baldelli, Elisa; Ludovini, Vienna; Petricoin, Emanuel F; Crinò, Lucio; Valigi, Paolo
2015-10-19
The study of cancer therapy is a key issue in the field of oncology research and the development of target therapies is one of the main problems currently under investigation. This is particularly relevant in different types of tumor where traditional chemotherapy approaches often fail, such as lung cancer. We started from the general definition of robustness introduced by Kitano and applied it to the analysis of dynamical biochemical networks, proposing a new algorithm based on moment independent analysis of input/output uncertainty. The framework utilizes novel computational methods which enable evaluating the model fragility with respect to quantitative performance measures and parameters such as reaction rate constants and initial conditions. The algorithm generates a small subset of parameters that can be used to act on complex networks and to obtain the desired behaviors. We have applied the proposed framework to the EGFR-IGF1R signal transduction network, a crucial pathway in lung cancer, as an example of Cancer Systems Biology application in drug discovery. Furthermore, we have tested our framework on a pulse generator network as an example of Synthetic Biology application, thus proving the suitability of our methodology to the characterization of the input/output synthetic circuits. The achieved results are of immediate practical application in computational biology, and while we demonstrate their use in two specific examples, they can in fact be used to study a wider class of biological systems.
Sensitivity of surface meteorological analyses to observation networks
NASA Astrophysics Data System (ADS)
Tyndall, Daniel Paul
A computationally efficient variational analysis system for two-dimensional meteorological fields is developed and described. This analysis approach is most efficient when the number of analysis grid points is much larger than the number of available observations, such as for large domain mesoscale analyses. The analysis system is developed using MATLAB software and can take advantage of multiple processors or processor cores. A version of the analysis system has been exported as a platform independent application (i.e., can be run on Windows, Linux, or Macintosh OS X desktop computers without a MATLAB license) with input/output operations handled by commonly available internet software combined with data archives at the University of Utah. The impact of observation networks on the meteorological analyses is assessed by utilizing a percentile ranking of individual observation sensitivity and impact, which is computed by using the adjoint of the variational surface assimilation system. This methodology is demonstrated using a case study of the analysis from 1400 UTC 27 October 2010 over the entire contiguous United States domain. The sensitivity of this approach to the dependence of the background error covariance on observation density is examined. Observation sensitivity and impact provide insight on the influence of observations from heterogeneous observing networks as well as serve as objective metrics for quality control procedures that may help to identify stations with significant siting, reporting, or representativeness issues.
Network Meta-Analysis Using R: A Review of Currently Available Automated Packages
Neupane, Binod; Richer, Danielle; Bonner, Ashley Joel; Kibret, Taddele; Beyene, Joseph
2014-01-01
Network meta-analysis (NMA) – a statistical technique that allows comparison of multiple treatments in the same meta-analysis simultaneously – has become increasingly popular in the medical literature in recent years. The statistical methodology underpinning this technique and software tools for implementing the methods are evolving. Both commercial and freely available statistical software packages have been developed to facilitate the statistical computations using NMA with varying degrees of functionality and ease of use. This paper aims to introduce the reader to three R packages, namely, gemtc, pcnetmeta, and netmeta, which are freely available software tools implemented in R. Each automates the process of performing NMA so that users can perform the analysis with minimal computational effort. We present, compare and contrast the availability and functionality of different important features of NMA in these three packages so that clinical investigators and researchers can determine which R packages to implement depending on their analysis needs. Four summary tables detailing (i) data input and network plotting, (ii) modeling options, (iii) assumption checking and diagnostic testing, and (iv) inference and reporting tools, are provided, along with an analysis of a previously published dataset to illustrate the outputs available from each package. We demonstrate that each of the three packages provides a useful set of tools, and combined provide users with nearly all functionality that might be desired when conducting a NMA. PMID:25541687
Network meta-analysis using R: a review of currently available automated packages.
Neupane, Binod; Richer, Danielle; Bonner, Ashley Joel; Kibret, Taddele; Beyene, Joseph
2014-01-01
Network meta-analysis (NMA)--a statistical technique that allows comparison of multiple treatments in the same meta-analysis simultaneously--has become increasingly popular in the medical literature in recent years. The statistical methodology underpinning this technique and software tools for implementing the methods are evolving. Both commercial and freely available statistical software packages have been developed to facilitate the statistical computations using NMA with varying degrees of functionality and ease of use. This paper aims to introduce the reader to three R packages, namely, gemtc, pcnetmeta, and netmeta, which are freely available software tools implemented in R. Each automates the process of performing NMA so that users can perform the analysis with minimal computational effort. We present, compare and contrast the availability and functionality of different important features of NMA in these three packages so that clinical investigators and researchers can determine which R packages to implement depending on their analysis needs. Four summary tables detailing (i) data input and network plotting, (ii) modeling options, (iii) assumption checking and diagnostic testing, and (iv) inference and reporting tools, are provided, along with an analysis of a previously published dataset to illustrate the outputs available from each package. We demonstrate that each of the three packages provides a useful set of tools, and combined provide users with nearly all functionality that might be desired when conducting a NMA.
Deformable image registration using convolutional neural networks
NASA Astrophysics Data System (ADS)
Eppenhof, Koen A. J.; Lafarge, Maxime W.; Moeskops, Pim; Veta, Mitko; Pluim, Josien P. W.
2018-03-01
Deformable image registration can be time-consuming and often needs extensive parameterization to perform well on a specific application. We present a step towards a registration framework based on a three-dimensional convolutional neural network. The network directly learns transformations between pairs of three-dimensional images. The outputs of the network are three maps for the x, y, and z components of a thin plate spline transformation grid. The network is trained on synthetic random transformations, which are applied to a small set of representative images for the desired application. Training therefore does not require manually annotated ground truth deformation information. The methodology is demonstrated on public data sets of inspiration-expiration lung CT image pairs, which come with annotated corresponding landmarks for evaluation of the registration accuracy. Advantages of this methodology are its fast registration times and its minimal parameterization.
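A minimal sketch of the kind of network described above: a tiny 3-D convolutional net (written here in PyTorch as an assumption about tooling) maps a fixed/moving image pair to three parameter maps for the x, y, and z components on a coarser grid. Layer sizes and grid resolution are illustrative, not the paper's architecture.

```python
# Sketch: a minimal 3-D CNN mapping an image pair to three displacement-parameter
# maps (x, y, z). Layer sizes and grid resolution are illustrative assumptions.
import torch
import torch.nn as nn

class TinyRegNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(2, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv3d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AvgPool3d(2),                       # coarser grid for the spline parameters
            nn.Conv3d(32, 32, kernel_size=3, padding=1), nn.ReLU(),
        )
        self.head = nn.Conv3d(32, 3, kernel_size=1)   # three maps: x, y, z components

    def forward(self, fixed, moving):
        x = torch.cat([fixed, moving], dim=1)      # channel-wise concatenation of the pair
        return self.head(self.features(x))

# Usage: a random 64^3 image pair produces a 3-channel 32^3 parameter grid.
net = TinyRegNet()
fixed = torch.randn(1, 1, 64, 64, 64)
moving = torch.randn(1, 1, 64, 64, 64)
print(net(fixed, moving).shape)   # torch.Size([1, 3, 32, 32, 32])
```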
Laser dynamics: The system dynamics and network theory of optoelectronic integrated circuit design
NASA Astrophysics Data System (ADS)
Tarng, Tom Shinming-T. K.
Laser dynamics is the system dynamics, communication and network theory for the design of opto-electronic integrated circuits (OEICs). Combining optical network theory and optical communication theory, the system analysis and design of the OEIC fundamental building blocks is considered. These building blocks include direct current modulation, injected-light modulation, the wideband filter, the super-gain optical amplifier, E/O and O/O optical bistability and the current-controlled optical oscillator. Based on the rate equations, phase-diagram and phase-portrait analysis is applied in the theoretical studies and numerical simulations. OEIC system design methodologies are developed for OEIC design. Stimulating-field-dependent rate equations are used to model the line-width narrowing/broadening mechanism for the CW mode and the frequency chirp of semiconductor lasers. The momentary spectra are carrier-density-dependent. Furthermore, phase-portrait analysis and the nonlinear refractive index are used to simulate the single-mode frequency chirp. The average spectra of chaos, period doubling, period pulsing, multi-loops and analog modulation are generated and analyzed. A bifurcation-chirp design chart with modulation depth and modulation frequency as parameters is provided for design purposes.
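A hedged sketch of rate-equation integration and phase-portrait analysis in the spirit of the work above, using a common dimensionless two-variable laser model rather than the author's stimulating-field-dependent equations; the pump level and lifetime ratio are assumptions.

```python
# Sketch: integrating a dimensionless pair of laser rate equations and tracing
# the carrier/photon phase portrait. The normalisation, pump level, and
# photon-to-carrier lifetime ratio are illustrative assumptions.
import numpy as np
from scipy.integrate import solve_ivp

gamma = 1e-3   # photon-to-carrier lifetime ratio (assumed)
pump = 2.0     # pump level relative to threshold (assumed)

def rate_equations(t, y):
    n, s = y                                  # normalised carrier and photon densities
    dn = gamma * (pump - n * (1.0 + s))       # carrier: pumping minus recombination
    ds = (n - 1.0) * s                        # photons: net gain above threshold
    return [dn, ds]

sol = solve_ivp(rate_equations, (0, 20000), [1.0, 0.5], max_step=1.0)
n_traj, s_traj = sol.y

# Relaxation oscillations settle toward the steady state n -> 1, s -> pump - 1.
print("final carrier density:", round(n_traj[-1], 3))
print("final photon density:", round(s_traj[-1], 3))
# The phase portrait is simply the (n, s) trajectory, e.g. plt.plot(n_traj, s_traj).
```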
Toward a standardized soil carbon database platform in the US Critical Zone Observatory Network
NASA Astrophysics Data System (ADS)
Filley, T. R.; Marini, L.; Todd-Brown, K. E.; Malhotra, A.; Harden, J. W.; Kumar, P.
2017-12-01
Within the soil carbon community of the US Critical Zone Observatory (CZO) Network, efforts are underway to promote network-level data syntheses and modeling projects and to identify barriers to data intercomparability. This represents a challenging goal given the diversity of soil carbon sampling methodologies, spatial and vertical resolution, carbon pool isolation protocols, subsequent measurement techniques, and matrix terminology. During the last annual meeting of the CZO SOC Working Group, Dec 11, 2016, it was decided that integration with, and potentially adoption of, a widely used, active, and mature data aggregation, archival, and visualization platform was the easiest route to achieve this ultimate goal. Additionally, to assess the state of deep and shallow soil C data among the CZO sites it was recommended that a comprehensive survey must be undertaken to identify data gaps and catalog the various soil sampling and analysis methodologies. The International Soil Carbon Network (ISCN) has a long history of leadership in the development of soil C data aggregation, archiving, and visualization tools and currently houses data for over 70,000 soil cores contributed from international soil carbon community. Over the past year, members of the CZO network and the ISCN have met to discuss logistics of adopting the ISCN template within the CZO. Collaborative efforts among all of the CZO site data managers, led by the Intensively Managed Landscapes CZO, will evaluate feasibility of adoption of the ISCN template, or some modification thereof, and distribution to the appropriate soil scientists for data upload and aggregation. Partnering with ISCN also ensures that soil characteristics from the US CZO are placed in a developing global soil context and paves the way for future integration of data from other international CZO networks. This poster will provide an update of this overall effort along with a summary of data products, partnering networks, and recommendations for data language template and the future CZO APIs.
Characterizing Aeroallergens by Infrared Spectroscopy of Fungal Spores and Pollen
Zimmermann, Boris; Tkalčec, Zdenko; Mešić, Armin; Kohler, Achim
2015-01-01
Background Fungal spores and plant pollen cause respiratory diseases in susceptible individuals, such as asthma, allergic rhinitis and hypersensitivity pneumonitis. Aeroallergen monitoring networks are an important part of treatment strategies, but unfortunately traditional analysis is time consuming and expensive. We have explored the use of infrared spectroscopy of pollen and spores for an inexpensive and rapid characterization of aeroallergens. Methodology The study is based on measurement of spore and pollen samples by single reflectance attenuated total reflectance Fourier transform infrared spectroscopy (SR-ATR FTIR). The experimental set includes 71 spore (Basidiomycota) and 121 pollen (Pinales, Fagales and Poales) samples. Along with fresh basidiospores, the study has been conducted on the archived samples collected within the last 50 years. Results The spectroscopic-based methodology enables clear spectral differentiation between pollen and spores, as well as the separation of confamiliar and congeneric species. In addition, the analysis of the scattering signals inherent in the infrared spectra indicates that the FTIR methodology offers indirect estimation of morphology of pollen and spores. The analysis of fresh and archived spores shows that chemical composition of spores is well preserved even after decades of storage, including the characteristic taxonomy-related signals. Therefore, biochemical analysis of fungal spores by FTIR could provide economical, reliable and timely methodologies for improving fungal taxonomy, as well as for fungal identification and monitoring. This proof of principle study shows the potential for using FTIR as a rapid tool in aeroallergen studies. In addition, the presented method is ready to be immediately implemented in biological and ecological studies for direct measurement of pollen and spores from flowers and sporocarps. PMID:25867755
New generation of elastic network models.
López-Blanco, José Ramón; Chacón, Pablo
2016-04-01
The intrinsic flexibility of proteins and nucleic acids can be grasped from remarkably simple mechanical models of particles connected by springs. In recent decades, Elastic Network Models (ENMs) combined with Normal Mode Analysis widely confirmed their ability to predict biologically relevant motions of biomolecules and soon became a popular methodology to reveal large-scale dynamics in multiple structural biology scenarios. The simplicity, robustness, low computational cost, and relatively high accuracy are the reasons behind the success of ENMs. This review focuses on recent advances in the development and application of ENMs, paying particular attention to combinations with experimental data. Successful application scenarios include large macromolecular machines, structural refinement, docking, and evolutionary conservation. Copyright © 2015 Elsevier Ltd. All rights reserved.
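As a rough illustration of the ENM/NMA idea, the sketch below builds an anisotropic-network-style Hessian from toy Cα-like coordinates and extracts the softest internal modes; the coordinates, cutoff, and spring constant are assumptions.

```python
# Sketch: an anisotropic-network-style elastic network model built from toy
# Cα-like coordinates, with normal modes from the Hessian eigendecomposition.
# Coordinates, cutoff, and spring constant are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
coords = np.cumsum(rng.normal(0, 2.0, size=(30, 3)), axis=0)   # toy 30-residue chain
cutoff, k_spring = 8.0, 1.0                                    # distance cutoff, spring constant

n = len(coords)
hessian = np.zeros((3 * n, 3 * n))
for i in range(n):
    for j in range(i + 1, n):
        d = coords[j] - coords[i]
        dist = np.linalg.norm(d)
        if dist > cutoff:
            continue
        block = -k_spring * np.outer(d, d) / dist**2   # 3x3 super-element
        hessian[3*i:3*i+3, 3*j:3*j+3] = block
        hessian[3*j:3*j+3, 3*i:3*i+3] = block
        hessian[3*i:3*i+3, 3*i:3*i+3] -= block
        hessian[3*j:3*j+3, 3*j:3*j+3] -= block

eigvals, eigvecs = np.linalg.eigh(hessian)
# The first six (near-zero) eigenvalues are rigid-body modes; the next ones are
# the softest internal motions that ENM/NMA uses to describe intrinsic flexibility.
print("lowest internal-mode eigenvalues:", np.round(eigvals[6:12], 4))
```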
Architecture for networked electronic patient record systems.
Takeda, H; Matsumura, Y; Kuwata, S; Nakano, H; Sakamoto, N; Yamamoto, R
2000-11-01
There have been two major approaches to the development of networked electronic patient record (EPR) architecture. One uses object-oriented methodologies for constructing the model, which include the GEHR project, Synapses, HL7 RIM and so on. The second approach uses document-oriented methodologies, as applied in examples of HL7 PRA. It is practically beneficial to take advantage of both approaches and to add solution technologies for network security such as PKI. In recognition of the similarity with electronic commerce, a certificate authority as a trusted third party will be organised for establishing the networked EPR system. This paper describes a Japanese functional model that has been developed, and proposes a document-object-oriented architecture, which is compared with other existing models.
Del Giudice, G; Padulano, R; Siciliano, D
2016-01-01
The lack of geometrical and hydraulic information about sewer networks often excludes the adoption of in-depth modeling tools to obtain prioritization strategies for funds management. The present paper describes a novel statistical procedure for defining the prioritization scheme for preventive maintenance strategies based on a small sample of failure data collected by the Sewer Office of the Municipality of Naples (IT). Novelty issues involve, among others, considering sewer parameters as continuous statistical variables and accounting for their interdependences. After a statistical analysis of maintenance interventions, the most important available factors affecting the process are selected and their mutual correlations identified. Then, after a Box-Cox transformation of the original variables, a methodology is provided for the evaluation of a vulnerability map of the sewer network by adopting a joint multivariate normal distribution with different parameter sets. The goodness-of-fit is eventually tested for each distribution by means of a multivariate plotting position. The developed methodology is expected to assist municipal engineers in identifying critical sewers, prioritizing sewer inspections in order to fulfill rehabilitation requirements.
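A hedged sketch of the statistical core described above: each sewer attribute is Box-Cox transformed toward normality, a joint multivariate normal is fitted to the transformed failure sample, and new pipes are scored by their density under that distribution. The attributes, sample, and the direct use of the density as a vulnerability score are assumptions.

```python
# Sketch: Box-Cox-transforming sewer attributes and scoring each pipe with a
# fitted joint multivariate normal density. Attributes and data are assumptions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Attributes of sewers where failures were recorded (small sample, all positive).
failures = np.column_stack([
    rng.gamma(6.0, 8.0, 80),      # age, years
    rng.gamma(3.0, 150.0, 80),    # diameter, mm
    rng.gamma(2.0, 1.2, 80),      # depth, m
])

# Box-Cox transform each attribute toward normality, keeping the fitted lambdas.
transformed, lambdas = [], []
for col in failures.T:
    z, lam = stats.boxcox(col)
    transformed.append(z)
    lambdas.append(lam)
transformed = np.column_stack(transformed)

# Fit a joint multivariate normal (mean and covariance capture interdependences).
mvn = stats.multivariate_normal(mean=transformed.mean(axis=0),
                                cov=np.cov(transformed, rowvar=False))

def boxcox_value(value, lam):
    """Apply the Box-Cox transform with a known lambda to a single value."""
    return np.log(value) if abs(lam) < 1e-12 else (value**lam - 1.0) / lam

# Score a new pipe: higher density under the "failure" distribution means it
# resembles past failures, so it ranks higher for preventive inspection.
new_pipe = [45.0, 400.0, 2.5]
new_z = [boxcox_value(v, lam) for v, lam in zip(new_pipe, lambdas)]
print("vulnerability score (relative density):", mvn.pdf(new_z))
```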
NASA Technical Reports Server (NTRS)
Kocher, Joshua E; Gilliam, David P.
2005-01-01
Secure computing is a necessity in the hostile environment that the internet has become. Protection from nefarious individuals and organizations requires a solution that is more a methodology than a one-time fix. One aspect of this methodology is having the knowledge of which network ports a computer has open to the world. These network ports are essentially the doorways from the internet into the computer. An assessment method which uses the nmap software to scan ports has been developed to aid System Administrators (SAs) with analysis of open ports on their system(s). Additionally, baselines for several operating systems have been developed so that SAs can compare their open ports to a baseline for a given operating system. Further, the tool is deployed on a website where SAs and users can request a port scan of their computer. The results are then emailed to the requestor. This tool aids users, SAs, and security professionals by providing an overall picture of what services are running, what ports are open, potential Trojan programs or backdoors, and what ports can be closed.
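A minimal sketch of the open-port inventory such an assessment builds on, using a plain TCP connect scan rather than nmap (which the actual tool wraps); the target host, port list, and baseline set are assumptions, and scans should only be run against hosts you are authorised to test.

```python
# Sketch: a minimal TCP connect scan of a handful of well-known ports, followed
# by a comparison against an expected baseline. Host, ports, and baseline are
# illustrative assumptions; use only against hosts you are authorised to test.
import socket

target = "127.0.0.1"
ports = [21, 22, 25, 80, 110, 143, 443, 3306, 8080]

open_ports = []
for port in ports:
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        sock.settimeout(0.5)
        if sock.connect_ex((target, port)) == 0:    # 0 means the connect succeeded
            open_ports.append(port)

print(f"open ports on {target}: {open_ports or 'none detected'}")

# A baseline comparison is then a set difference against the expected port list.
baseline = {22, 80, 443}
print("unexpected open ports:", set(open_ports) - baseline)
```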