Iterative cross section sequence graph for handwritten character segmentation.
Dawoud, Amer
2007-08-01
The iterative cross section sequence graph (ICSSG) is an algorithm for handwritten character segmentation. It expands the cross section sequence graph concept by applying it iteratively at equally spaced thresholds. The iterative thresholding reduces the effect of information loss associated with image binarization. ICSSG preserves the characters' skeletal structure by preventing the interference of pixels that causes flooding of adjacent characters' segments. Improving the structural quality of the characters' skeleton facilitates better feature extraction and classification, which improves the overall performance of optical character recognition (OCR). Experimental results showed significant improvements in OCR recognition rates compared to other well-established segmentation algorithms.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hirdt, J.A.; Brown, D.A., E-mail: dbrown@bnl.gov
The EXFOR library contains the largest collection of experimental nuclear reaction data available as well as the data's bibliographic information and experimental details. We text-mined the REACTION and MONITOR fields of the ENTRYs in the EXFOR library in order to identify understudied reactions and quantities. Using the results of the text-mining, we created an undirected graph from the EXFOR datasets with each graph node representing a single reaction and quantity and graph links representing the various types of connections between these reactions and quantities. This graph is an abstract representation of the connections in EXFOR, similar to graphs of social networks, authorship networks, etc. We use various graph theoretical tools to identify important yet understudied reactions and quantities in EXFOR. Although we identified a few cross sections relevant for shielding applications and isotope production, mostly we identified charged particle fluence monitor cross sections. As a side effect of this work, we learn that our abstract graph is typical of other real-world graphs.
NASA Astrophysics Data System (ADS)
Hirdt, J. A.; Brown, D. A.
2016-01-01
The EXFOR library contains the largest collection of experimental nuclear reaction data available as well as the data's bibliographic information and experimental details. We text-mined the REACTION and MONITOR fields of the ENTRYs in the EXFOR library in order to identify understudied reactions and quantities. Using the results of the text-mining, we created an undirected graph from the EXFOR datasets with each graph node representing a single reaction and quantity and graph links representing the various types of connections between these reactions and quantities. This graph is an abstract representation of the connections in EXFOR, similar to graphs of social networks, authorship networks, etc. We use various graph theoretical tools to identify important yet understudied reactions and quantities in EXFOR. Although we identified a few cross sections relevant for shielding applications and isotope production, mostly we identified charged particle fluence monitor cross sections. As a side effect of this work, we learn that our abstract graph is typical of other real-world graphs.
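The graph-and-centrality analysis described above can be illustrated with a small sketch. The snippet below builds a toy undirected reaction/quantity graph with networkx and flags nodes that are structurally central but have few measurements; the node names, edges and entry counts are invented placeholders, not EXFOR data, and betweenness centrality stands in for whichever graph-theoretic tools the authors actually used.

# Sketch: rank reaction/quantity nodes by betweenness centrality and flag
# nodes that are structurally important but have few measurements.
# The edge list and measurement counts below are made-up placeholders.
import networkx as nx

edges = [
    ("Fe-56(n,g)", "Au-197(n,g)"),   # hypothetical MONITOR link
    ("Fe-56(n,g)", "U-235(n,f)"),
    ("Al-27(p,x)Na-24", "Cu-63(p,2n)"),
    ("Au-197(n,g)", "U-235(n,f)"),
    ("Cu-63(p,2n)", "U-235(n,f)"),
]
n_measurements = {  # hypothetical counts of EXFOR ENTRYs per node
    "Fe-56(n,g)": 120, "Au-197(n,g)": 300, "U-235(n,f)": 500,
    "Al-27(p,x)Na-24": 4, "Cu-63(p,2n)": 2,
}

G = nx.Graph(edges)
centrality = nx.betweenness_centrality(G)

# "Important yet understudied": high centrality but few measurements.
for node in sorted(G, key=centrality.get, reverse=True):
    if n_measurements.get(node, 0) < 10:
        print(f"understudied but central: {node} "
              f"(centrality={centrality[node]:.2f}, "
              f"entries={n_measurements.get(node, 0)})")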
Pre-Service Science Teachers' Interpretations of Graphs: A Cross-Sectional Study
ERIC Educational Resources Information Center
Çil, Emine; Kar, Hazel
2015-01-01
This study focuses on pre-service science teachers' interpretations of graphs. First, the paper presents data about the freshman and senior pre-service teachers' interpretations of graphs. Then it discusses the effects of pre-service science teacher training program on student teachers' interpretations of graphs. The participants in the study were…
The Crossing Number of Graphs: Theory and Computation
NASA Astrophysics Data System (ADS)
Mutzel, Petra
This survey concentrates on selected theoretical and computational aspects of the crossing number of graphs. Starting with its introduction by Turán, we will discuss known results for complete and complete bipartite graphs. Then we will focus on some historical confusion on the crossing number that has been brought up by Pach and Tóth as well as Székely. A connection to computational geometry is made in the section on the geometric version, namely the rectilinear crossing number. We will also mention some applications of the crossing number to geometrical problems. This review ends with recent results on approximation and exact computations.
Surface Hold Advisor Using Critical Sections
NASA Technical Reports Server (NTRS)
Law, Caleb Hoi Kei (Inventor); Hsiao, Thomas Kun-Lung (Inventor); Mittler, Nathan C. (Inventor); Couluris, George J. (Inventor)
2013-01-01
The Surface Hold Advisor Using Critical Sections is a system and method for providing hold advisories to surface controllers to prevent gridlock and resolve crossing and merging conflicts among vehicles traversing a vertex-edge graph representing a surface traffic network on an airport surface. The Advisor performs pair-wise comparisons of current position and projected path of each vehicle with other surface vehicles to detect conflicts, determine critical sections, and provide hold advisories to traffic controllers recommending vehicles stop at entry points to protected zones around identified critical sections. A critical section defines a segment of the vertex-edge graph where vehicles are in crossing or merging or opposite direction gridlock contention. The Advisor detects critical sections without reference to scheduled, projected or required times along assigned vehicle paths, and generates hold advisories to prevent conflicts without requiring network path direction-of-movement rules and without requiring rerouting, rescheduling or other network optimization solutions.
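As a rough illustration of the pair-wise comparison idea in this abstract, the sketch below treats each vehicle's projected path as a walk on the vertex-edge graph and reports the edges shared by two paths as candidate critical sections. The vertex labels, paths and the simple shared-edge test are illustrative assumptions, not the patented method.

# Sketch: detect merging/crossing contention between two projected paths on a
# vertex-edge graph by finding shared edges (treated as undirected segments).
# Vertex labels and paths are hypothetical.
def path_edges(path):
    """Return the set of undirected edges traversed by a vertex path."""
    return {frozenset(pair) for pair in zip(path, path[1:])}

def critical_sections(path_a, path_b):
    """Edges used by both vehicles -> candidate critical sections."""
    return path_edges(path_a) & path_edges(path_b)

vehicle_1 = ["A", "B", "C", "D"]          # projected taxi route of vehicle 1
vehicle_2 = ["E", "C", "B", "F"]          # vehicle 2 traverses B-C in opposition

conflicts = critical_sections(vehicle_1, vehicle_2)
if conflicts:
    print("hold advisory: protect segments", [tuple(e) for e in conflicts])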
Data mining the EXFOR database
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, David A.; Hirdt, John; Herman, Michal
2013-12-13
The EXFOR database contains the largest collection of experimental nuclear reaction data available as well as this data's bibliographic information and experimental details. We created an undirected graph from the EXFOR datasets with graph nodes representing single observables and graph links representing the connections of various types between these observables. This graph is an abstract representation of the connections in EXFOR, similar to graphs of social networks, authorship networks, etc. Analysing this abstract graph, we are able to address very specific questions such as 1) what observables are being used as reference measurements by the experimental community? 2) are these observables given the attention needed by various standards organisations? 3) are there classes of observables that are not connected to these reference measurements? In addressing these questions, we propose several (mostly cross section) observables that should be evaluated and made into reaction reference standards.
Using data logging to measure Young’s modulus
NASA Astrophysics Data System (ADS)
Richardson, David
2018-03-01
Historically, the Young's modulus of a material is measured by increasing the force applied to a wire and measuring the extension. The cross-sectional area and original length allow this to be plotted as a graph of stress versus strain. This article describes how data logging sensors can be used to measure how the force changes with extension, allowing a strain versus stress graph to be plotted into the region of plastic deformation.
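A worked sketch of the computation the article describes: convert logged force-extension pairs to stress and strain using the wire's cross-sectional area and original length, then take the slope of the elastic region as the Young's modulus. The wire geometry and the data points below are invented for illustration.

# Sketch: estimate Young's modulus from logged force/extension data.
# Wire geometry and the data points are invented for illustration.
import numpy as np

diameter = 0.28e-3            # m, wire diameter (assumed)
length_0 = 2.0                # m, original length (assumed)
area = np.pi * (diameter / 2) ** 2   # cross-sectional area

force = np.array([0.0, 5.0, 10.0, 15.0, 20.0])                 # N (logged)
extension = np.array([0.0, 0.8e-3, 1.6e-3, 2.4e-3, 3.2e-3])    # m (logged)

stress = force / area          # Pa
strain = extension / length_0  # dimensionless

# Young's modulus = slope of stress vs strain in the elastic region.
E = np.polyfit(strain, stress, 1)[0]
print(f"E ≈ {E/1e9:.0f} GPa")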
Analysis of (n,2n) cross-section measurements for nuclei up to mass 238
DOE Office of Scientific and Technical Information (OSTI.GOV)
Davey, W.G.; Goin, R.W.; Ross, J.R.
All suitable measurements of the energy dependence of (n,2n) cross sections of all isotopes up to mass 238 have been analyzed. The objectives were to display the quality of the measured data for each isotope and to examine the systematic dependence of the (n,2n) cross section upon N, Z, and A. Graphs and tables are presented of the ratio of the asymptotic (n,2n) cross section to the nonelastic cross section as a function of the neutron-asymmetry parameter (N-Z)/A. Similar data are presented for the derived nuclear temperature, T, and level-density parameter, α, as a function of N, Z, and A. This analysis of the results of over 145 experiments on 61 isotopes is essentially a complete review of the current status of (n,2n) cross-section measurements.
X-Ray Form Factor, Attenuation and Scattering Tables
National Institute of Standards and Technology Data Gateway
SRD 66 X-Ray Form Factor, Attenuation and Scattering Tables (Web, free access) This database collects tables and graphs of the form factors, the photoabsorption cross section, and the total attenuation coefficient for any element (Z <= 92).
Positron total scattering cross-sections for alkali atoms
NASA Astrophysics Data System (ADS)
Sinha, Nidhi; Singh, Suvam; Antony, Bobby
2018-01-01
Positron-impact total scattering cross-sections for Li, Na, K, Rb, Cs and Fr atoms are calculated in the energy range from 5 to 5000 eV employing the modified spherical complex optical potential formalism. The main aim of this work is to apply this formalism to the less studied positron-target collision systems. The results are compared with previous theoretical and experimental data, wherever available. In general, the present data show overall agreement and consistency with other results. Furthermore, we have done a comparative study of the results to investigate the effect of atomic size on the cross-sections as we descend through the group in the periodic table. We have also plotted a correlation graph of the present total cross-sections with polarizability and number of target electrons. The two correlation plots confirm the credibility and consistency of the present results. Besides, this is the first theoretical attempt to report positron-impact total cross-sections of alkali atoms over such a wide energy range.
Optimal graph based segmentation using flow lines with application to airway wall segmentation.
Petersen, Jens; Nielsen, Mads; Lo, Pechin; Saghir, Zaigham; Dirksen, Asger; de Bruijne, Marleen
2011-01-01
This paper introduces a novel optimal graph construction method that is applicable to multi-dimensional, multi-surface segmentation problems. Such problems are often solved by refining an initial coarse surface within the space given by graph columns. Conventional columns are not well suited for surfaces with high curvature or complex shapes but the proposed columns, based on properly generated flow lines, which are non-intersecting, guarantee solutions that do not self-intersect and are better able to handle such surfaces. The method is applied to segment human airway walls in computed tomography images. Comparison with manual annotations on 649 cross-sectional images from 15 different subjects shows significantly smaller contour distances and larger area of overlap than are obtained with recently published graph based methods. Airway abnormality measurements obtained with the method on 480 scan pairs from a lung cancer screening trial are reproducible and correlate significantly with lung function.
An instrument for monitoring stump oedema and shrinkage in amputees.
Fernie, G R; Holliday, P J; Lobb, R J
1978-08-01
A new system for measuring the cross-sectional area profiles of amputation stumps and whole limbs has been designed at the Amputee Research Centre. The instrument consists of a cylindrical tank supported on an elevator. The tank is raised to the height of the amputation stump and filled with water. A graph of the cross-sectional area profile of the amputation stump is generated by a mini-computer as the elevator descends. The cross-sectional area (A) is calculated from an expression (see text) in which Hw = height of water in the tank, He = height of the elevator, and Ac = a constant related to the size of the measuring tank. This paper describes the instrument, which may find application in many other areas where there is a need to study shape.
MadDM: Computation of dark matter relic abundance
NASA Astrophysics Data System (ADS)
Backović, Mihailo; Kong, Kyoungchul; McCaskey, Mathew
2017-12-01
MadDM computes dark matter relic abundance and dark matter nucleus scattering rates in a generic model. The code is based on the existing MadGraph 5 architecture and as such is easily integrable into any MadGraph collider study. A simple Python interface offers a level of user-friendliness characteristic of MadGraph 5 without sacrificing functionality. MadDM is able to calculate the dark matter relic abundance in models which include a multi-component dark sector, resonance annihilation channels and co-annihilations. The direct detection module of MadDM calculates spin independent / spin dependent dark matter-nucleon cross sections and differential recoil rates as a function of recoil energy, angle and time. The code provides a simplified simulation of detector effects for a wide range of target materials and volumes.
Maciel, Alfredo; Presbítero, Gerardo; Piña, Cristina; del Pilar Gutiérrez, María; Guzmán, José; Munguía, Nadia
2015-01-01
A clear understanding of the factors on which the mechanical properties of bone depend remains a task not fully achieved. In order to estimate the mechanical properties of bone for implants, pore cross-section area, calcium content, and apparent density were measured in trabecular bone samples for human implants. Samples of fresh and defatted bone tissue, extracted from one-year-old bovines, were cut in the longitudinal and transversal orientations of the trabeculae. Pore cross-section area was measured with an image analyzer. Compression tests were conducted on rectangular prisms. The elastic modulus shows a linear tendency as a function of pore cross-section area, calcium content and apparent density, regardless of the trabecular orientation. The best variable for estimating the elastic modulus of trabecular bone for implants was pore cross-section area, and, according to the bone mechanical properties, the Nukbone process is considered appropriate for marrow extraction from trabecular bone for implantation purposes. Considering the stress-strain curves, defatted bone is stiffer than fresh bone. The number of pores plotted against pore cross-section area shows an exponential decay, consistent across all samples. These graphs are also useful for predicting the elastic properties of trabecular samples of young bovines for implants.
A Graphical Examination of Uranium and Plutonium Fissility
ERIC Educational Resources Information Center
Reed, B. Cameron
2008-01-01
The issue of why only particular isotopes of uranium and plutonium are suitable for use in nuclear weapons is analyzed with the aid of graphs and semiquantitative discussions of parameters such as excitation energies, fission barriers, reaction cross-sections, and the role of processes such as [alpha]-decay and spontaneous fission. The goal is to…
A Collection of Features for Semantic Graphs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eliassi-Rad, T; Fodor, I K; Gallagher, B
2007-05-02
Semantic graphs are commonly used to represent data from one or more data sources. Such graphs extend traditional graphs by imposing types on both nodes and links. This type information defines permissible links among specified nodes and can be represented as a graph commonly referred to as an ontology or schema graph. Figure 1 depicts an ontology graph for data from National Association of Securities Dealers. Each node type and link type may also have a list of attributes. To capture the increased complexity of semantic graphs, concepts derived for standard graphs have to be extended. This document explains briefly features commonly used to characterize graphs, and their extensions to semantic graphs. This document is divided into two sections. Section 2 contains the feature descriptions for static graphs. Section 3 extends the features for semantic graphs that vary over time.
Photoionization of sodium atoms and electron scattering from ionized sodium
NASA Technical Reports Server (NTRS)
Dasgupta, A.; Bhatia, A. K.
1985-01-01
The polarized-orbital method of Temkin (1957) is applied using polarized orbitals determined from Sternheimer's equation to compute the photoionization cross sections of Na atoms from threshold to about 60 eV. The approximations involved in the analysis are explained in detail; the explicit forms of the integrals and matrix expressions are given in appendices; and the results are presented in tables and graphs. Good agreement is found with the results of Chang and Kelly (1975), and the possibility that small amounts of molecular vapor in Na-photoionization experiments are responsible for the discrepancies between calculated and measured cross sections is considered.
Understanding Conic Sections Using Alternate Graph Paper
ERIC Educational Resources Information Center
Brown, Elizabeth M.; Jones, Elizabeth
2006-01-01
This article describes two alternative coordinate systems and their use in graphing conic sections. This alternative graph paper helps students explore the idea of eccentricity using the definitions of the conic sections.
Garcia-Ramos, Camille; Lin, Jack J; Kellermann, Tanja S; Bonilha, Leonardo; Prabhakaran, Vivek; Hermann, Bruce P
2016-01-01
The recent revision of the classification of the epilepsies released by the ILAE Commission on Classification and Terminology (2005–2009) has been a major development in the field. Papers in this section of the special issue were charged with examining the relevance of other techniques and approaches to examining, categorizing and classifying cognitive and behavioral comorbidities. In that light, we investigate the applicability of graph theory to understand the impact of epilepsy on cognition compared to controls, and then the patterns of cognitive development in normally developing children which would set the stage for prospective comparisons of children with epilepsy and controls. The overall goal is to examine the potential utility of other analytic tools and approaches to conceptualize the cognitive comorbidities in epilepsy. Given that the major cognitive domains representing cognitive function are interdependent, the associations between the neuropsychological abilities underlying these domains can be referred to as a cognitive network. Therefore, the architecture of this cognitive network can be quantified and assessed using graph theory methods, rendering a novel approach to the characterization of cognitive status. In this article we provide fundamental information about graph theory procedures, followed by application of these techniques to cross-sectional analysis of neuropsychological data in children with epilepsy compared to controls, finalizing with prospective analysis of neuropsychological development in younger and older healthy controls. PMID:27017326
Automated event generation for loop-induced processes
Hirschi, Valentin; Mattelaer, Olivier
2015-10-22
We present the first fully automated implementation of cross-section computation and event generation for loop-induced processes. This work is integrated in the MadGraph5_aMC@NLO framework. We describe the optimisations implemented at the level of the matrix element evaluation, phase space integration and event generation allowing for the simulation of large multiplicity loop-induced processes. Along with some selected differential observables, we illustrate our results with a table showing inclusive cross-sections for all loop-induced hadronic scattering processes with up to three final states in the SM as well as for some relevant 2 → 4 processes. Furthermore, many of these are computed here for the first time.
ERIC Educational Resources Information Center
Hill, Matthew; Sharma, Manjula Devi
2015-01-01
To succeed within scientific disciplines, using representations, including those based on words, graphs, equations, and diagrams, is important. Research indicates that the use of discipline specific representations (sometimes referred to as expert generated representations), as well as multi-representational use, is critical for problem solving…
ERIC Educational Resources Information Center
Zhu, Zheng; Chen, Peijie; Zhuang, Jie
2013-01-01
Purpose: The purpose of this study was to develop and cross-validate an equation based on ActiGraph accelerometer GT3X output to predict children and youth's energy expenditure (EE) of physical activity (PA). Method: Participants were 367 Chinese children and youth (179 boys and 188 girls, aged 9 to 17 years old) who wore 1 ActiGraph GT3X…
Unsupervised Metric Fusion Over Multiview Data by Graph Random Walk-Based Cross-View Diffusion.
Wang, Yang; Zhang, Wenjie; Wu, Lin; Lin, Xuemin; Zhao, Xiang
2017-01-01
Learning an ideal metric is crucial to many tasks in computer vision. Diverse feature representations can address this problem from different aspects, since visual data objects described by multiple features can be decomposed into multiple views that often provide complementary information. In this paper, we propose a cross-view fusion algorithm that leads to a similarity metric for multiview data by systematically fusing multiple similarity measures. Unlike existing paradigms, we focus on learning a distance measure by exploiting a graph structure of data samples, where an input similarity matrix can be improved through a propagation of graph random walk. In particular, we construct multiple graphs with each one corresponding to an individual view, and a cross-view fusion approach based on graph random walk is presented to derive an optimal distance measure by fusing multiple metrics. Our method is scalable to a large amount of data by enforcing sparsity through an anchor graph representation. To adaptively control the effects of different views, we dynamically learn view-specific coefficients, which are leveraged into the graph random walk to balance the multiple views. However, such a strategy may lead to an over-smooth similarity metric in which affinities between dissimilar samples are enlarged by excessive cross-view fusion. Thus, we devise a heuristic approach to controlling the iteration number in the fusion process in order to avoid over-smoothness. Extensive experiments conducted on real-world data sets validate the effectiveness and efficiency of our approach.
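A schematic sketch of the style of cross-view diffusion described above: each view's similarity matrix is row-normalized into a random-walk transition matrix and repeatedly propagated using the average of the other views' similarities, with a small fixed iteration count to avoid over-smoothing. The toy data, the normalization and the exact update rule are simplifying assumptions, not the authors' algorithm.

# Sketch: cross-view similarity fusion by graph random walk style diffusion.
# S_v <- P_v @ mean(S_other_views) @ P_v.T, repeated a few iterations.
# Toy data; the update rule is a simplified stand-in for the paper's method.
import numpy as np

rng = np.random.default_rng(0)
n, n_views = 6, 3
views = [rng.random((n, n)) for _ in range(n_views)]
views = [(S + S.T) / 2 for S in views]          # symmetric similarity matrices

def row_normalize(S):
    return S / S.sum(axis=1, keepdims=True)

P = [row_normalize(S) for S in views]            # transition matrices
S = [row_normalize(S) for S in views]            # diffused similarities

n_iter = 5   # small, fixed iteration count to avoid over-smoothing
for _ in range(n_iter):
    S_new = []
    for v in range(n_views):
        others = np.mean([S[u] for u in range(n_views) if u != v], axis=0)
        S_new.append(P[v] @ others @ P[v].T)
    S = S_new

fused = np.mean(S, axis=0)   # final fused similarity metric
print(np.round(fused, 3))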
Garcia-Ramos, Camille; Lin, Jack J; Kellermann, Tanja S; Bonilha, Leonardo; Prabhakaran, Vivek; Hermann, Bruce P
2016-11-01
The recent revision of the classification of the epilepsies released by the ILAE Commission on Classification and Terminology (2005-2009) has been a major development in the field. Papers in this section of the special issue explore the relevance of other techniques to examine, categorize, and classify cognitive and behavioral comorbidities in epilepsy. In this review, we investigate the applicability of graph theory to understand the impact of epilepsy on cognition compared with controls and, then, the patterns of cognitive development in normally developing children which would set the stage for prospective comparisons of children with epilepsy and controls. The overall goal is to examine the potential utility of this analytic tool and approach to conceptualize the cognitive comorbidities in epilepsy. Given that the major cognitive domains representing cognitive function are interdependent, the associations between neuropsychological abilities underlying these domains can be referred to as a cognitive network. Therefore, the architecture of this cognitive network can be quantified and assessed using graph theory methods, rendering a novel approach to the characterization of cognitive status. We first provide fundamental information about graph theory procedures, followed by application of these techniques to cross-sectional analysis of neuropsychological data in children with epilepsy compared with that of controls, concluding with prospective analysis of neuropsychological development in younger and older healthy controls. This article is part of a Special Issue entitled "The new approach to classification: Rethinking cognition and behavior in epilepsy". Copyright © 2016 Elsevier Inc. All rights reserved.
Cellular automata model for urban road traffic flow considering pedestrian crossing street
NASA Astrophysics Data System (ADS)
Zhao, Han-Tao; Yang, Shuo; Chen, Xiao-Xu
2016-11-01
In order to analyze the effect of pedestrians crossing the street on vehicle flows, we investigated the traffic characteristics of vehicles and pedestrians. Based on that, rules of lane changing, acceleration, deceleration, randomization and update are modified. We then established two urban two-lane cellular automata models of traffic flow, one for sections with a non-signalized crosswalk and the other for uncontrolled sections with pedestrians crossing the street at random. MATLAB is used for numerical simulation of the different traffic conditions; meanwhile, space-time diagrams and relational graphs of traffic flow parameters are generated and then comparatively analyzed. Simulation results indicate that when vehicle density is lower than around 25 vehs/(km lane), pedestrians have a modest impact on traffic flow, whereas when vehicle density is higher than about 60 vehs/(km lane), traffic speed and volume decrease significantly, especially on sections with a non-signal-controlled crosswalk. The results illustrate that the proposed models reproduce the characteristics of traffic flow when pedestrians are crossing the street and can provide some practical reference for urban traffic management.
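A toy illustration of this modeling style: a single-lane Nagel-Schreckenberg-style cellular automaton with one hypothetical crossing cell at which vehicles must stop whenever a pedestrian is present. The lane count, rules and parameters are simplified assumptions and do not reproduce the authors' two-lane model.

# Sketch: single-lane CA traffic model with a pedestrian crossing cell.
# Rules (accelerate, decelerate to gap, randomize, move) follow the classic
# Nagel-Schreckenberg scheme; the crossing behaviour is a simplified assumption.
import random

L, V_MAX, P_SLOW = 100, 5, 0.3
CROSSING_CELL, P_PEDESTRIAN = 50, 0.2          # hypothetical crossing location

random.seed(1)
cars = {i: random.randint(0, V_MAX) for i in range(0, L, 10)}  # pos -> speed

def gap_ahead(pos, occupied):
    for d in range(1, L):
        if (pos + d) % L in occupied:
            return d - 1
    return L

for _ in range(100):
    pedestrian = random.random() < P_PEDESTRIAN   # someone on the crosswalk?
    new_cars = {}
    occupied = set(cars)
    for pos, v in cars.items():
        v = min(v + 1, V_MAX)                     # acceleration
        v = min(v, gap_ahead(pos, occupied))      # deceleration to gap
        if pedestrian and pos < CROSSING_CELL <= pos + v:
            v = max(CROSSING_CELL - pos - 1, 0)   # stop before the crosswalk
        if v > 0 and random.random() < P_SLOW:    # randomization
            v -= 1
        new_cars[(pos + v) % L] = v               # movement / update
    cars = new_cars

print("mean speed:", sum(cars.values()) / len(cars))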
Chatrchyan, Serguei
2014-02-06
The production of a Z boson, decaying into two leptons and produced in association with one or more b jets, is studied using proton-proton collisions delivered by the LHC at a centre-of-mass energy of 7 TeV. The data were recorded in 2011 with the CMS detector and correspond to an integrated luminosity of 5 fb⁻¹. The Z(ℓℓ) + b-jets cross sections (where ℓℓ = μμ or ee) are measured separately for a Z boson produced with exactly one b jet and with at least two b jets. In addition, a cross section ratio is extracted for a Z boson produced with at least one b jet, relative to a Z boson produced with at least one jet. The measured cross sections are compared to various theoretical predictions, and the data favour the predictions in the five-flavour scheme, where b quarks are assumed massless. The kinematic properties of the reconstructed particles are compared with the predictions from the MadGraph event generator using the pythia parton shower simulation.
Radiological health risks for exploratory class missions in space
NASA Technical Reports Server (NTRS)
Nachtwey, D. Stuart; Yang, Tracy Chui-Hsu
1991-01-01
The radiation risks to crewmembers on missions to the moon and Mars are studied. A graph is presented of the cross section as a function of linear energy transfer (LET) for cell inactivation and neoplastic cell transformation. Alternatives to conventional approaches to radiation protection using dose and Q are presented with attention given to a hybrid of the conventional system for particles with LET less than 100 keV/micron.
TreeNetViz: revealing patterns of networks over tree structures.
Gou, Liang; Zhang, Xiaolong Luke
2011-12-01
Network data often contain important attributes from various dimensions such as social affiliations and areas of expertise in a social network. If such attributes exhibit a tree structure, visualizing a compound graph consisting of tree and network structures becomes complicated. How to visually reveal patterns of a network over a tree has not been fully studied. In this paper, we propose a compound graph model, TreeNet, to support visualization and analysis of a network at multiple levels of aggregation over a tree. We also present a visualization design, TreeNetViz, to offer the multiscale and cross-scale exploration and interaction of a TreeNet graph. TreeNetViz uses a Radial, Space-Filling (RSF) visualization to represent the tree structure, a circle layout with novel optimization to show aggregated networks derived from TreeNet, and an edge bundling technique to reduce visual complexity. Our circular layout algorithm reduces both total edge-crossings and edge length and also considers hierarchical structure constraints and edge weight in a TreeNet graph. Experiments illustrate that the algorithm can reduce visual cluttering in TreeNet graphs. Our case study also shows that TreeNetViz has the potential to support the analysis of a compound graph by revealing multiscale and cross-scale network patterns. © 2011 IEEE
New methods for analyzing semantic graph based assessments in science education
NASA Astrophysics Data System (ADS)
Vikaros, Lance Steven
This research investigated how the scoring of semantic graphs (known by many as concept maps) could be improved and automated in order to address issues of inter-rater reliability and scalability. As part of the NSF funded SENSE-IT project to introduce secondary school science students to sensor networks (NSF Grant No. 0833440), semantic graphs illustrating how temperature change affects water ecology were collected from 221 students across 16 schools. The graphing task did not constrain students' use of terms, as is often done with semantic graph based assessment due to coding and scoring concerns. The graphing software used provided real-time feedback to help students learn how to construct graphs, stay on topic and effectively communicate ideas. The collected graphs were scored by human raters using assessment methods expected to boost reliability, which included adaptations of traditional holistic and propositional scoring methods, use of expert raters, topical rubrics, and criterion graphs. High levels of inter-rater reliability were achieved, demonstrating that vocabulary constraints may not be necessary after all. To investigate a new approach to automating the scoring of graphs, thirty-two different graph features characterizing graphs' structure, semantics, configuration and process of construction were then used to predict human raters' scoring of graphs in order to identify feature patterns correlated to raters' evaluations of graphs' topical accuracy and complexity. Results led to the development of a regression model able to predict raters' scoring with 77% accuracy, with 46% accuracy expected when used to score new sets of graphs, as estimated via cross-validation tests. Although such performance is comparable to other graph and essay based scoring systems, cross-context testing of the model and methods used to develop it would be needed before it could be recommended for widespread use. Still, the findings suggest techniques for improving the reliability and scalability of semantic graph based assessments without requiring constraint of how ideas are expressed.
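The scoring-prediction step described above can be sketched as a cross-validated regression of rater scores on graph features. The snippet below uses scikit-learn with randomly generated stand-in features and scores; the feature set, model choice and data are assumptions for illustration only.

# Sketch: predict rater scores from graph features with cross-validated
# linear regression. Features and scores are random placeholders.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
n_graphs, n_features = 221, 32          # sizes echo the study description
X = rng.random((n_graphs, n_features))  # structural/semantic graph features
y = X @ rng.random(n_features) + rng.normal(0, 0.5, n_graphs)  # fake scores

model = LinearRegression()
fit_r2 = model.fit(X, y).score(X, y)                  # in-sample fit
cv_r2 = cross_val_score(model, X, y, cv=5).mean()     # cross-validated estimate
print(f"in-sample R^2={fit_r2:.2f}, cross-validated R^2={cv_r2:.2f}")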
NASA Astrophysics Data System (ADS)
Tower, M. M.; Haight, C. H.
1984-03-01
The development status of a single-pulse distributed-energy-source electromagnetic railgun (ER) based on the design of Tower (1982) is reviewed. The five-stage ER is 3.65 m long, with energy inputs every 30 cm starting at the breech and a 12.7-mm-square bore cross section, and is powered by a 660-kJ, 6-kV modular capacitor bank. Lexan cubes weighing 2.5 grams have been accelerated to velocities up to 8.5 km/sec at 500 kA and conversion efficiencies up to 20 percent. The design goal for a 20-mm-square-cross-section ER is acceleration of a 60-g projectile to 3-4 km/sec at 35-percent efficiency. Drawings, photographs, and graphs of performance are provided.
Empirical Determination of Pattern Match Confidence in Labeled Graphs
2014-02-07
Graph types explored included Erdős–Rényi [6] random graphs, Barabási–Albert [2] preferential attachment graphs, and Watts–Strogatz [18] small-world graphs. [Figure residue removed: plot of search limit (1 to 100,000) for direct matches and matches within 2 and 4 nodes, by graph type.] Barabási–Albert (BA, crosses) and Watts–Strogatz (WS, triangles) graphs were generated with sizes ranging from 50 to 2500 nodes, and labeled…
NASA Astrophysics Data System (ADS)
Sharma, Harshita; Zerbe, Norman; Heim, Daniel; Wienert, Stephan; Lohmann, Sebastian; Hellwich, Olaf; Hufnagl, Peter
2016-03-01
This paper describes a novel graph-based method for efficient representation and subsequent classification in histological whole slide images of gastric cancer. Her2/neu immunohistochemically stained and haematoxylin and eosin stained histological sections of gastric carcinoma are digitized. Immunohistochemical staining is used in practice by pathologists to determine extent of malignancy, however, it is laborious to visually discriminate the corresponding malignancy levels in the more commonly used haematoxylin and eosin stain, and this study attempts to solve this problem using a computer-based method. Cell nuclei are first isolated at high magnification using an automatic cell nuclei segmentation strategy, followed by construction of cell nuclei attributed relational graphs of the tissue regions. These graphs represent tissue architecture comprehensively, as they contain information about cell nuclei morphology as vertex attributes, along with knowledge of neighborhood in the form of edge linking and edge attributes. Global graph characteristics are derived and ensemble learning is used to discriminate between three types of malignancy levels, namely, non-tumor, Her2/neu positive tumor and Her2/neu negative tumor. Performance is compared with state of the art methods including four texture feature groups (Haralick, Gabor, Local Binary Patterns and Varma Zisserman features), color and intensity features, and Voronoi diagram and Delaunay triangulation. Texture, color and intensity information is also combined with graph-based knowledge, followed by correlation analysis. Quantitative assessment is performed using two cross validation strategies. On investigating the experimental results, it can be concluded that the proposed method provides a promising way for computer-based analysis of histopathological images of gastric cancer.
Analyzing cross-college course enrollments via contextual graph mining
Liu, Xiaozhong; Chen, Yan
2017-01-01
The ability to predict what courses a student may enroll in the coming semester plays a pivotal role in the allocation of learning resources, which is a hot topic in the domain of educational data mining. In this study, we propose an innovative approach to characterize students’ cross-college course enrollments by leveraging a novel contextual graph. Specifically, different kinds of variables, such as students, courses, colleges and diplomas, as well as various types of variable relations, are utilized to depict the context of each variable, and then a representation learning algorithm node2vec is applied to extracting sophisticated graph-based features for the enrollment analysis. In this manner, the relations between any pair of variables can be measured quantitatively, which enables the variable type to transform from nominal to ratio. These graph-based features are examined by the random forest algorithm, and experiments on 24,663 students, 1,674 courses and 417,590 enrollment records demonstrate that the contextual graph can successfully improve analyzing the cross-college course enrollments, where three of the graph-based features have significantly stronger impacts on prediction accuracy than the others. Besides, the empirical results also indicate that the student’s course preference is the most important factor in predicting future course enrollments, which is consistent to the previous studies that acknowledge the course interest is a key point for course recommendations. PMID:29186171
Analyzing cross-college course enrollments via contextual graph mining.
Wang, Yongzhen; Liu, Xiaozhong; Chen, Yan
2017-01-01
The ability to predict what courses a student may enroll in the coming semester plays a pivotal role in the allocation of learning resources, which is a hot topic in the domain of educational data mining. In this study, we propose an innovative approach to characterize students' cross-college course enrollments by leveraging a novel contextual graph. Specifically, different kinds of variables, such as students, courses, colleges and diplomas, as well as various types of variable relations, are utilized to depict the context of each variable, and then a representation learning algorithm node2vec is applied to extracting sophisticated graph-based features for the enrollment analysis. In this manner, the relations between any pair of variables can be measured quantitatively, which enables the variable type to transform from nominal to ratio. These graph-based features are examined by the random forest algorithm, and experiments on 24,663 students, 1,674 courses and 417,590 enrollment records demonstrate that the contextual graph can successfully improve analyzing the cross-college course enrollments, where three of the graph-based features have significantly stronger impacts on prediction accuracy than the others. Besides, the empirical results also indicate that the student's course preference is the most important factor in predicting future course enrollments, which is consistent to the previous studies that acknowledge the course interest is a key point for course recommendations.
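A minimal sketch of the graph-based-feature idea: build a small heterogeneous student-course-college graph and derive a student-course relatedness feature for a random forest. Note that personalized PageRank from networkx is used here as a simpler stand-in for the node2vec embeddings used in the paper, and all nodes, edges and labels are toy data.

# Sketch: graph-based features for enrollment prediction. The paper uses
# node2vec embeddings; here, personalized PageRank from networkx is used as a
# simpler stand-in for measuring student-course relatedness. All data is toy.
import networkx as nx
from sklearn.ensemble import RandomForestClassifier

G = nx.Graph()
G.add_edges_from([
    ("student:1", "course:algebra"), ("student:1", "college:science"),
    ("student:2", "course:poetry"),  ("student:2", "college:arts"),
    ("course:algebra", "college:science"), ("course:calculus", "college:science"),
    ("course:poetry", "college:arts"),     ("course:drama", "college:arts"),
])

def relatedness(student, course):
    """Graph-based feature: PageRank mass reaching the course from the student."""
    pr = nx.pagerank(G, personalization={student: 1.0})
    return pr[course]

# Toy training set: (student, candidate course) -> enrolled next term?
examples = [("student:1", "course:calculus", 1), ("student:1", "course:drama", 0),
            ("student:2", "course:drama", 1), ("student:2", "course:calculus", 0)]
X = [[relatedness(s, c)] for s, c, _ in examples]
y = [label for _, _, label in examples]

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
print(clf.predict([[relatedness("student:1", "course:calculus")]]))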
Evaluation of the MyWellness Key accelerometer.
Herrmann, S D; Hart, T L; Lee, C D; Ainsworth, B E
2011-02-01
Objective: To examine the concurrent validity of the Technogym MyWellness Key accelerometer against objective and subjective physical activity (PA) measures. Design: Randomised, cross-sectional design with two phases. The laboratory phase compared the MyWellness Key with the ActiGraph GT1M and the Yamax SW200 Digiwalker pedometer during graded treadmill walking, increasing speed each minute. The free-living phase compared the MyWellness Key with the ActiGraph, Digiwalker, Bouchard Activity Record (BAR) and Global Physical Activity Questionnaire (GPAQ) for seven continuous days. Data were analysed using Spearman rank-order correlation coefficients for all comparisons. Setting: Laboratory and free-living phases. Participants: Sixteen participants randomly stratified from 41 eligible respondents by sex (n=8 men; n=8 women) and PA levels (n=4 low, n=8 middle and n=4 high active). Results: There was a strong association between the MyWellness Key and the ActiGraph accelerometer during controlled graded treadmill walking (r=0.91, p<0.01) and in free-living settings (r=0.73-0.76 for light to vigorous PA, respectively, p<0.01). No associations were observed between the MyWellness Key and the BAR and GPAQ (p>0.05). Conclusion: The MyWellness Key has a high concurrent validity with the ActiGraph accelerometer to detect PA in both controlled laboratory and free-living settings.
Khachatryan, V.; Sirunyan, A. M.; Tumasyan, A.; ...
2016-11-24
Here, a measurement is presented of the cross section for the electroweak production of a W boson in association with two jets in proton-proton collisions at a center-of-mass energy of 8 TeV. The data set was collected with the CMS detector and corresponds to an integrated luminosity of 19.3 fb⁻¹. The measured fiducial cross section for W bosons decaying to electrons or muons and for p_T^{j1} > 60 GeV, p_T^{j2} > 50 GeV, |η^j| < 4.7, and m_jj > 1000 GeV is 0.42 ± 0.04 (stat) ± 0.09 (syst) ± 0.01 (lumi) pb. This result is consistent with the standard model leading-order prediction of 0.50 ± 0.02 (scale) ± 0.02 (PDF) pb obtained with MadGraph5_amc@nlo 2.1 interfaced to pythia 6.4. This is the first cross section measurement for this process.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Khachatryan, V.; Sirunyan, A. M.; Tumasyan, A.
Here, a measurement is presented of the cross section for the electroweak production of a W boson in association with two jets in proton-proton collisions at a center-of-mass energy of 8 TeV. The data set was collected with the CMS detector and corresponds to an integrated luminosity of 19.3 fb⁻¹. The measured fiducial cross section for W bosons decaying to electrons or muons and for p_T^{j1} > 60 GeV, p_T^{j2} > 50 GeV, |η^j| < 4.7, and m_jj > 1000 GeV is 0.42 ± 0.04 (stat) ± 0.09 (syst) ± 0.01 (lumi) pb. This result is consistent with the standard model leading-order prediction of 0.50 ± 0.02 (scale) ± 0.02 (PDF) pb obtained with MadGraph5_amc@nlo 2.1 interfaced to pythia 6.4. This is the first cross section measurement for this process.
A one-dimensional model of flow in a junction of thin channels, including arterial trees
NASA Astrophysics Data System (ADS)
Kozlov, V. A.; Nazarov, S. A.
2017-08-01
We study a Stokes flow in a junction of thin channels (of diameter O(h)) for fixed flows of the fluid at the inlet cross-sections and fixed peripheral pressure at the outlet cross-sections. On the basis of the idea of the pressure drop matrix, apart from Neumann conditions (fixed flow) and Dirichlet conditions (fixed pressure) at the outer vertices, the ordinary one-dimensional Reynolds equations on the edges of the graph are equipped with transmission conditions containing a small parameter h at the inner vertices, which are transformed into the classical Kirchhoff conditions as h → +0. We establish that the pre-limit transmission conditions ensure an exponentially small error O(e^{-ρ/h}), ρ > 0, in the calculation of the three-dimensional solution, but the Kirchhoff conditions only give polynomially small error. For the arterial tree, under the assumption that the walls of the blood vessels are rigid, for every bifurcation node a (2×2) pressure drop matrix appears, and its influence on the transmission conditions is taken into account by means of small variations of the lengths of the graph and by introducing effective lengths of the one-dimensional description of blood vessels whilst keeping the Kirchhoff conditions and exponentially small approximation errors. We discuss concrete forms of arterial bifurcation and available generalizations of the results, in particular, the Navier-Stokes system of equations. Bibliography: 59 titles.
Gomez, Carlos; Poza, Jesus; Gomez-Pilar, Javier; Bachiller, Alejandro; Juan-Cruz, Celia; Tola-Arribas, Miguel A; Carreres, Alicia; Cano, Monica; Hornero, Roberto
2016-08-01
The aim of this pilot study was to analyze spontaneous electroencephalography (EEG) activity in Alzheimer's disease (AD) by means of Cross-Sample Entropy (Cross-SampEn) and two local measures derived from graph theory: the clustering coefficient (CC) and the characteristic path length (PL). Five minutes of EEG activity were recorded from 37 patients with dementia due to AD and 29 elderly controls. Our results showed that Cross-SampEn values were lower in the AD group than in the control one for all the interactions among EEG channels. This finding indicates that EEG activity in AD is characterized by a lower statistical dissimilarity among channels. Significant differences were found mainly for fronto-central interactions (p < 0.01, permutation test). Additionally, the application of graph theory measures revealed diverse neural network changes, i.e. lower CC and higher PL values in the AD group, leading to a less efficient brain organization. This study suggests the usefulness of our approach to provide further insights into the underlying brain dynamics associated with AD.
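For reference, a small implementation of the Cross-Sample Entropy measure named above, following the standard definition: count template matches of length m and m+1 between the two signals (Chebyshev distance within a tolerance r) and take the negative log of their ratio. The synthetic signals and parameter choices are illustrative; none of the study's EEG preprocessing is included.

# Sketch: Cross-Sample Entropy between two equal-length signals.
# Cross-SampEn(m, r) = -ln(A / B), where B counts template matches of length m
# between the two series and A counts matches of length m + 1 (Chebyshev
# distance <= tolerance). Signals here are synthetic.
import numpy as np

def cross_sample_entropy(x, y, m=2, r=0.2):
    x, y = np.asarray(x, float), np.asarray(y, float)
    tol = r * np.std(np.concatenate([x, y]))   # common tolerance scaling
    n = min(len(x), len(y)) - m                # same template count for m and m+1

    def count_matches(length):
        xt = np.array([x[i:i + length] for i in range(n)])
        yt = np.array([y[i:i + length] for i in range(n)])
        # Chebyshev distance between every x-template and every y-template.
        d = np.max(np.abs(xt[:, None, :] - yt[None, :, :]), axis=2)
        return np.sum(d <= tol)

    B, A = count_matches(m), count_matches(m + 1)
    return -np.log(A / B) if A > 0 and B > 0 else np.inf

rng = np.random.default_rng(0)
sig = rng.standard_normal(300)
print(cross_sample_entropy(sig, sig + 0.1 * rng.standard_normal(300)))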
Cross over of recurrence networks to random graphs and random geometric graphs
NASA Astrophysics Data System (ADS)
Jacob, Rinku; Harikrishnan, K. P.; Misra, R.; Ambika, G.
2017-02-01
Recurrence networks are complex networks constructed from the time series of chaotic dynamical systems where the connection between two nodes is limited by the recurrence threshold. This condition makes the topology of every recurrence network unique with the degree distribution determined by the probability density variations of the representative attractor from which it is constructed. Here we numerically investigate the properties of recurrence networks from standard low-dimensional chaotic attractors using some basic network measures and show how the recurrence networks are different from random and scale-free networks. In particular, we show that all recurrence networks can cross over to random geometric graphs by adding a sufficient amount of noise to the time series, and to classical random graphs by increasing the range of interaction to the system size. We also highlight the effectiveness of a combined plot of characteristic path length and clustering coefficient in capturing the small changes in the network characteristics.
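A minimal sketch of the construction described above: delay-embed a scalar time series, link state vectors whose distance falls below the recurrence threshold, and compute the two measures highlighted in the abstract with networkx. The embedding parameters, threshold fraction and the logistic-map test signal are arbitrary illustrative choices.

# Sketch: recurrence network from a time series. Nodes are embedded state
# vectors; an edge links two nodes whose distance is below the recurrence
# threshold. Embedding parameters and threshold are illustrative choices.
import numpy as np
import networkx as nx

def recurrence_network(series, dim=3, delay=2, threshold=0.1):
    n = len(series) - (dim - 1) * delay
    states = np.array([series[i:i + dim * delay:delay] for i in range(n)])
    dist = np.linalg.norm(states[:, None, :] - states[None, :, :], axis=2)
    eps = threshold * (series.max() - series.min())   # threshold as a fraction
    adjacency = (dist <= eps) & ~np.eye(n, dtype=bool)
    return nx.from_numpy_array(adjacency.astype(int))

# Logistic map in the chaotic regime as a test signal.
x = np.empty(1500)
x[0] = 0.4
for i in range(1, len(x)):
    x[i] = 4.0 * x[i - 1] * (1.0 - x[i - 1])

G = recurrence_network(x[500:], dim=3, delay=1, threshold=0.1)
G = G.subgraph(max(nx.connected_components(G), key=len))  # largest component
print("clustering coefficient:", nx.average_clustering(G))
print("characteristic path length:", nx.average_shortest_path_length(G))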
47 CFR 80.761 - Conversion graphs.
Code of Federal Regulations, 2010 CFR
2010-10-01
Title 47 (Telecommunication), vol. 5, 2010-10-01 edition, Maritime Services, Standards for Computing Public Coast Station VHF Coverage, § 80.761 Conversion graphs: The following graphs must be employed where conversion from one to the other of the indicated types of units is…
47 CFR 80.761 - Conversion graphs.
Code of Federal Regulations, 2011 CFR
2011-10-01
Title 47 (Telecommunication), vol. 5, 2011-10-01 edition, Maritime Services, Standards for Computing Public Coast Station VHF Coverage, § 80.761 Conversion graphs: The following graphs must be employed where conversion from one to the other of the indicated types of units is…
An algorithm for finding a similar subgraph of all Hamiltonian cycles
NASA Astrophysics Data System (ADS)
Wafdan, R.; Ihsan, M.; Suhaimi, D.
2018-01-01
This paper discusses an algorithm to find a similar subgraph, called the findSimSubG algorithm. A similar subgraph is a subgraph with a maximum number of edges that contains no isolated vertex and is contained in every Hamiltonian cycle of a Hamiltonian graph. The algorithm runs only on Hamiltonian graphs with at least two Hamiltonian cycles. The algorithm works by examining whether the initial subgraph of the first Hamiltonian cycle is a subgraph of the comparison graphs. If the initial subgraph is not in the comparison graphs, the algorithm removes the edges and vertices of the initial subgraph that are not in the comparison graphs. There are two main processes in the algorithm: changing a Hamiltonian cycle into a cycle graph, and removing edges and vertices of the initial subgraph that are not in the comparison graphs. The findSimSubG algorithm can find the similar subgraph without using a backtracking method. The similar subgraph cannot be found in certain graphs, such as the n-antiprism graph, complete bipartite graph, complete graph, 2n-crossed prism graph, n-crown graph, n-Möbius ladder, prism graph, and wheel graph. The complexity of this algorithm is O(m|V|), where m is the number of Hamiltonian cycles and |V| is the number of vertices of a Hamiltonian graph.
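The similar subgraph defined above is the set of edges common to every Hamiltonian cycle (with isolated vertices dropped). The sketch below illustrates that definition directly by intersecting edge sets, assuming the Hamiltonian cycles are already enumerated; it is not the findSimSubG algorithm, which avoids backtracking and works from the first cycle.

# Sketch: given the Hamiltonian cycles of a graph (each as an ordered list of
# vertices returning to the start), compute the subgraph of edges contained in
# every cycle. This illustrates the definition, not the findSimSubG algorithm.
def cycle_edges(cycle):
    """Undirected edge set of a Hamiltonian cycle given as v0, v1, ..., v0."""
    return {frozenset(pair) for pair in zip(cycle, cycle[1:])}

def similar_subgraph(hamiltonian_cycles):
    common = cycle_edges(hamiltonian_cycles[0])
    for cycle in hamiltonian_cycles[1:]:          # compare with the other cycles
        common &= cycle_edges(cycle)
    return sorted(tuple(sorted(e)) for e in common)

# Hypothetical graph with two Hamiltonian cycles sharing the edges (1,2), (3,4).
cycles = [[1, 2, 3, 4, 5, 1],
          [1, 2, 5, 3, 4, 1]]
print(similar_subgraph(cycles))   # -> [(1, 2), (3, 4)]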
Rodríguez, Vanessa; Andrade, Allen D; García-Retamero, Rocio; Anam, Ramanakumar; Rodríguez, Remberto; Lisigurski, Miriam; Sharit, Joseph; Ruiz, Jorge G
2013-01-01
Studies reveal high levels of inadequate health literacy and numeracy in African Americans and older veterans. The authors aimed to investigate the distribution of health literacy, numeracy, and graph literacy in these populations. They conducted a cross-sectional survey of veterans receiving outpatient care and measured health literacy, numeracy, graph literacy, shared decision making, and trust in physicians. In addition, the authors compared subgroups of veterans using analyses of covariance. Participants were 502 veterans (22-82 years). Low, marginal, and adequate health literacy were found in, respectively, 29%, 26%, and 45% of the veterans. The authors found a significant main effect of race qualified by an age and race interaction. Inadequate health literacy was more common in African Americans than in Whites. Younger African Americans had lower health literacy (p <.001), graph literacy (p <.001), and numeracy (p <.001) than did Whites, even after the authors adjusted for covariates. Older and younger participants did not differ in health literacy, objective numeracy, or graph literacy after adjustment. The authors found no health literacy or age-related differences regarding preferences for shared decision making. African Americans expressed dissatisfaction with their current role in decision making (p =.03). Older participants trusted their physicians more than younger participants (p =.01). In conclusion, African Americans may be at a disadvantage when reviewing patient education materials, potentially affecting health care outcomes.
Tumor evolutionary directed graphs and the history of chronic lymphocytic leukemia.
Wang, Jiguang; Khiabanian, Hossein; Rossi, Davide; Fabbri, Giulia; Gattei, Valter; Forconi, Francesco; Laurenti, Luca; Marasca, Roberto; Del Poeta, Giovanni; Foà, Robin; Pasqualucci, Laura; Gaidano, Gianluca; Rabadan, Raul
2014-12-11
Cancer is a clonal evolutionary process, caused by successive accumulation of genetic alterations providing milestones of tumor initiation, progression, dissemination, and/or resistance to certain therapeutic regimes. To unravel these milestones we propose a framework, tumor evolutionary directed graphs (TEDG), which is able to characterize the history of genetic alterations by integrating longitudinal and cross-sectional genomic data. We applied TEDG to a chronic lymphocytic leukemia (CLL) cohort of 70 patients spanning 12 years and show that: (a) the evolution of CLL follows a time-ordered process represented as a global flow in TEDG that proceeds from initiating events to late events; (b) there are two distinct and mutually exclusive evolutionary paths of CLL evolution; (c) higher fitness clones are present in later stages of the disease, indicating a progressive clonal replacement with more aggressive clones. Our results suggest that TEDG may constitute an effective framework to recapitulate the evolutionary history of tumors.
NASA Astrophysics Data System (ADS)
Awasarmol, V. V.; Gaikwad, D. K.; Raut, S. D.; Pawar, P. P.
The mass attenuation coefficients (μm) of organic nonlinear optical materials, measured at photon energies of 122-1330 keV, were investigated on the basis of the mixture rule and compared with values obtained from the WinXCOM program. A good agreement between the theoretical and experimental values of the samples is observed. All samples were irradiated with six radioactive sources (57Co, 133Ba, 22Na, 137Cs, 54Mn and 60Co) using a transmission arrangement. Effective atomic numbers and electron densities (Zeff and Neff), the molar extinction coefficient (ε), the mass energy absorption coefficient (μen/ρ) and the effective atomic energy absorption cross section (σa,en) were determined experimentally and theoretically using the obtained μm values for the investigated samples, and graphs have been plotted. The graphs show that these quantities decrease with increasing photon energy for all samples.
Fiber tracking of brain white matter based on graph theory.
Lu, Meng
2015-01-01
Brain white matter tractography is reconstructed via diffusion-weighted magnetic resonance images. Due to the complex structure of brain white matter fiber bundles, fiber crossings and fiber branchings are abundant in the human brain, and regular methods based on diffusion tensor imaging (DTI) cannot accurately handle them; this is one of the biggest problems in brain tractography. Therefore, this paper presents a novel brain white matter tractography method based on graph theory, so that the fiber tracking between two voxels is transformed into locating the shortest path in a graph. In addition, the presented method uses Q-ball imaging (QBI) as the source data instead of DTI, because QBI can provide accurate information about multiple fiber crossings and branchings in one voxel using the orientation distribution function (ODF). Experiments showed that the presented method can accurately handle the problem of brain white matter fiber crossing and branching, and reconstruct brain tractography both in phantom data and in real brain data.
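A toy sketch of the central idea, fiber tracking as a shortest-path search on a voxel graph: neighbouring voxels are connected with weights that are small when the step direction agrees with the local principal fiber direction. The 2D grid, the synthetic direction field and the cosine-based cost are illustrative assumptions, not the paper's QBI/ODF-derived weights.

# Sketch: fiber tracking as a shortest path in a voxel graph. Each voxel is a
# node; neighbouring voxels are linked with a weight that is small when the
# step agrees with the voxel's principal fiber direction. The 2D grid and the
# cosine-based cost are illustrative stand-ins for the paper's ODF-based cost.
import numpy as np
import networkx as nx

shape = (10, 10)
fiber_dir = np.tile(np.array([1.0, 1.0]) / np.sqrt(2), shape + (1,))  # diagonal field

G = nx.Graph()
for x in range(shape[0]):
    for y in range(shape[1]):
        for dx, dy in [(1, 0), (0, 1), (1, 1), (1, -1)]:
            nx_, ny_ = x + dx, y + dy
            if 0 <= nx_ < shape[0] and 0 <= ny_ < shape[1]:
                step = np.array([dx, dy]) / np.hypot(dx, dy)
                align = abs(step @ fiber_dir[x, y])     # 1 = parallel, 0 = orthogonal
                G.add_edge((x, y), (nx_, ny_), weight=1.0 - 0.9 * align)

track = nx.shortest_path(G, source=(0, 0), target=(9, 9), weight="weight")
print(track)   # follows the diagonal fiber direction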
Brain white matter fiber estimation and tractography using Q-ball imaging and Bayesian MODEL.
Lu, Meng
2015-01-01
Diffusion tensor imaging allows for the non-invasive in vivo mapping of brain tractography. However, fiber bundles have complex structures, such as fiber crossings, fiber branchings and fibers with large curvatures, that diffusion tensor imaging (DTI) cannot accurately handle. This study presents a novel brain white matter tractography method using Q-ball imaging (QBI) as the data source instead of DTI, because QBI can provide accurate information about multiple fiber crossings and branchings in a single voxel using an orientation distribution function (ODF). The presented method also uses graph theory to construct a Bayesian model-based graph, so that the fiber tracking between two voxels can be represented as the shortest path in a graph. Our experiments showed that the new method can accurately handle brain white matter fiber crossings and branchings, and reconstruct brain tractography both in phantom data and in real brain data.
Airola, Antti; Pyysalo, Sampo; Björne, Jari; Pahikkala, Tapio; Ginter, Filip; Salakoski, Tapio
2008-11-19
Automated extraction of protein-protein interactions (PPI) is an important and widely studied task in biomedical text mining. We propose a graph kernel based approach for this task. In contrast to earlier approaches to PPI extraction, the introduced all-paths graph kernel has the capability to make use of full, general dependency graphs representing the sentence structure. We evaluate the proposed method on five publicly available PPI corpora, providing the most comprehensive evaluation done for a machine learning based PPI-extraction system. We additionally perform a detailed evaluation of the effects of training and testing on different resources, providing insight into the challenges involved in applying a system beyond the data it was trained on. Our method is shown to achieve state-of-the-art performance with respect to comparable evaluations, with 56.4 F-score and 84.8 AUC on the AImed corpus. We show that the graph kernel approach performs on state-of-the-art level in PPI extraction, and note the possible extension to the task of extracting complex interactions. Cross-corpus results provide further insight into how the learning generalizes beyond individual corpora. Further, we identify several pitfalls that can make evaluations of PPI-extraction systems incomparable, or even invalid. These include incorrect cross-validation strategies and problems related to comparing F-score results achieved on different evaluation resources. Recommendations for avoiding these pitfalls are provided.
Structural network efficiency is associated with cognitive impairment in small-vessel disease.
Lawrence, Andrew J; Chung, Ai Wern; Morris, Robin G; Markus, Hugh S; Barrick, Thomas R
2014-07-22
Objective: To characterize brain network connectivity impairment in cerebral small-vessel disease (SVD) and its relationship with MRI disease markers and cognitive impairment. Methods: A cross-sectional design applied graph-based efficiency analysis to deterministic diffusion tensor tractography data from 115 patients with lacunar infarction and leukoaraiosis and 50 healthy individuals. Structural connectivity was estimated between 90 cortical and subcortical brain regions and efficiency measures of resulting graphs were analyzed. Networks were compared between SVD and control groups, and associations between efficiency measures, conventional MRI disease markers, and cognitive function were tested. Results: Brain diffusion tensor tractography network connectivity was significantly reduced in SVD: networks were less dense, connection weights were lower, and measures of network efficiency were significantly disrupted. The degree of brain network disruption was associated with MRI measures of disease severity and cognitive function. In multiple regression models controlling for confounding variables, associations with cognition were stronger for network measures than other MRI measures including conventional diffusion tensor imaging measures. A total mediation effect was observed for the association between fractional anisotropy and mean diffusivity measures and executive function and processing speed. Conclusions: Brain network connectivity in SVD is disturbed, this disturbance is related to disease severity, and within a mediation framework fully or partly explains previously observed associations between MRI measures and SVD-related cognitive dysfunction. These cross-sectional results highlight the importance of network disruption in SVD and provide support for network measures as a disease marker in treatment studies. © 2014 American Academy of Neurology.
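A minimal sketch of a network efficiency measure of the kind used in this study: treat the weighted structural connectivity matrix as a graph with edge lengths equal to the reciprocal of connection weight, and average the inverse shortest-path lengths over all region pairs (global efficiency). The random connectivity matrix stands in for tractography-derived data, and the specific efficiency variant used by the authors may differ.

# Sketch: global efficiency of a weighted structural connectivity network.
# Edge "length" is taken as 1 / connection weight, and global efficiency is the
# mean inverse shortest-path length over all node pairs. Connectivity is random.
import numpy as np
import networkx as nx

rng = np.random.default_rng(0)
n = 90                                        # 90 cortical/subcortical regions
W = rng.random((n, n)) * (rng.random((n, n)) < 0.1)   # sparse random weights
W = np.triu(W, 1)
W = W + W.T                                   # symmetric, zero diagonal

G = nx.Graph()
for i in range(n):
    for j in range(i + 1, n):
        if W[i, j] > 0:
            G.add_edge(i, j, length=1.0 / W[i, j])   # strong links = short paths

lengths = dict(nx.all_pairs_dijkstra_path_length(G, weight="length"))
inv = [1.0 / lengths[i][j] for i in G for j in G
       if i != j and j in lengths[i]]
global_efficiency = sum(inv) / (n * (n - 1))
print(f"global efficiency: {global_efficiency:.3f}")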
Structural network efficiency is associated with cognitive impairment in small-vessel disease
Chung, Ai Wern; Morris, Robin G.; Markus, Hugh S.; Barrick, Thomas R.
2014-01-01
Objective: To characterize brain network connectivity impairment in cerebral small-vessel disease (SVD) and its relationship with MRI disease markers and cognitive impairment. Methods: A cross-sectional design applied graph-based efficiency analysis to deterministic diffusion tensor tractography data from 115 patients with lacunar infarction and leukoaraiosis and 50 healthy individuals. Structural connectivity was estimated between 90 cortical and subcortical brain regions and efficiency measures of resulting graphs were analyzed. Networks were compared between SVD and control groups, and associations between efficiency measures, conventional MRI disease markers, and cognitive function were tested. Results: Brain diffusion tensor tractography network connectivity was significantly reduced in SVD: networks were less dense, connection weights were lower, and measures of network efficiency were significantly disrupted. The degree of brain network disruption was associated with MRI measures of disease severity and cognitive function. In multiple regression models controlling for confounding variables, associations with cognition were stronger for network measures than other MRI measures including conventional diffusion tensor imaging measures. A total mediation effect was observed for the association between fractional anisotropy and mean diffusivity measures and executive function and processing speed. Conclusions: Brain network connectivity in SVD is disturbed, this disturbance is related to disease severity, and within a mediation framework fully or partly explains previously observed associations between MRI measures and SVD-related cognitive dysfunction. These cross-sectional results highlight the importance of network disruption in SVD and provide support for network measures as a disease marker in treatment studies. PMID:24951477
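For readers unfamiliar with the efficiency measures used in these two records, the sketch below shows one common way to compute global efficiency on a weighted structural connectivity graph (average inverse shortest-path length, with connection strengths inverted into lengths). It is a generic illustration, not the authors' tractography pipeline; the toy weights are invented.

```python
import networkx as nx

def global_efficiency_weighted(G, weight="weight"):
    """Average inverse shortest-path length over all node pairs.

    Edge weights are interpreted as connection strengths, so path
    lengths are computed on their reciprocals (stronger = shorter).
    """
    H = G.copy()
    for u, v, d in H.edges(data=True):
        d["length"] = 1.0 / d[weight]
    n = H.number_of_nodes()
    total = 0.0
    for u, dists in nx.all_pairs_dijkstra_path_length(H, weight="length"):
        total += sum(1.0 / d for v, d in dists.items() if v != u)
    return total / (n * (n - 1))

# Toy 4-region "connectome" with streamline-count weights.
G = nx.Graph()
G.add_weighted_edges_from([(0, 1, 10), (1, 2, 5), (2, 3, 2), (0, 3, 1)])
print(global_efficiency_weighted(G))
```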
Parasol: An Architecture for Cross-Cloud Federated Graph Querying
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lieberman, Michael; Choudhury, Sutanay; Hughes, Marisa
2014-06-22
Large scale data fusion of multiple datasets can often provide insights that examining datasets individually cannot. However, when these datasets reside in different data centers and cannot be collocated due to technical, administrative, or policy barriers, a unique set of problems arise that hamper querying and data fusion. To address these problems, a system and architecture named Parasol is presented that enables federated queries over graph databases residing in multiple clouds. Parasol's design is flexible and requires only minimal assumptions for participant clouds. Query optimization techniques are also described that are compatible with Parasol's lightweight architecture. Experiments on a prototype implementation of Parasol indicate its suitability for cross-cloud federated graph queries.
Graph Drawing Aesthetics-Created by Users, Not Algorithms.
Purchase, H C; Pilcher, C; Plimmer, B
2012-01-01
Prior empirical work on layout aesthetics for graph drawing algorithms has concentrated on the interpretation of existing graph drawings. We report on experiments which focus on the creation and layout of graph drawings: participants were asked to draw graphs based on adjacency lists, and to lay them out "nicely." Two interaction methods were used for creating the drawings: a sketch interface which allows for easy, natural hand movements, and a formal point-and-click interface similar to a typical graph editing system. We find, in common with many other studies, that removing edge crossings is the most significant aesthetic, but also discover that aligning nodes and edges to an underlying grid is important. We observe that the aesthetics favored by participants during creation of a graph drawing are often not evident in the final product and that the participants did not make a clear distinction between the processes of creation and layout. Our results suggest that graph drawing systems should integrate automatic layout with the user's manual editing process, and provide facilities to support grid-based graph creation.
Garrett, W.B.; van de Vanter, E.K.; Graf, J.B.
1993-01-01
The U.S. Geological Survey collected streamflow and sediment-transport data at 5 streamflow-gaging stations on the Colorado River between Glen Canyon Dam and Lake Mead as a part of an interagency environmental study. The data were collected for about 6 mo in 1983 and about 4 mo in 1985-86; data also were collected at 3 sites on tributary streams in 1983. The data were used for development of unsteady flow-routing and sediment-transport models, sand-load rating curves, and evaluation of channel changes. For the 1983 sampling period, 1,076 composite cross-section suspended-sediment samples were analyzed; 809 of these samples were collected on the main stem of the Colorado River and 267 samples were from the tributaries. Bed-material samples were obtained at 1,988 verticals; 161 samples of material in transport near the bed (bedload) were collected to define the location of sand, gravel, and bed rock in the channel cross section; and 664 discharge measurements were made. For the 1985-86 sampling period, 765 composite cross-section suspended-sediment samples and 887 individual vertical samples from cross sections were analyzed. Bed-material samples were obtained at 531 verticals, 159 samples of bedload were collected, and 218 discharge measurements were made. All data are presented in tabular form. Some types of data also are presented in graphs to better show trends or variations. (USGS)
Rodríguez, Vanessa; Andrade, Allen D.; García-Retamero, Rocio; Anam, Ramanakumar; Rodríguez, Remberto; Lisigurski, Miriam; Sharit, Joseph; Ruiz, Jorge G.
2013-01-01
Studies reveal high levels of inadequate health literacy and numeracy in African Americans and older veterans. The authors aimed to investigate the distribution of health literacy, numeracy, and graph literacy in these populations. They conducted a cross-sectional survey of veterans receiving outpatient care and measured health literacy, numeracy, graph literacy, shared decision making, and trust in physicians. In addition, the authors compared subgroups of veterans using analyses of covariance. Participants were 502 veterans (22–82 years). Low, marginal, and adequate health literacy were found in, respectively, 29%, 26%, and 45% of the veterans. The authors found a significant main effect of race qualified by an age and race interaction. Inadequate health literacy was more common in African Americans than in Whites. Younger African Americans had lower health literacy (p < .001), graph literacy (p < .001), and numeracy (p < .001) than did Whites, even after the authors adjusted for covariates. Older and younger participants did not differ in health literacy, objective numeracy, or graph literacy after adjustment. The authors found no health literacy or age-related differences regarding preferences for shared decision making. African Americans expressed dissatisfaction with their current role in decision making (p = .03). Older participants trusted their physicians more than younger participants (p = .01). In conclusion, African Americans may be at a disadvantage when reviewing patient education materials, potentially affecting health care outcomes. PMID:24093361
NASA Astrophysics Data System (ADS)
Xie, Huimin
The following sections are included: * Definition of Dynamical Languages * Distinct Excluded Blocks * Definition and Properties * L and L″ in Chomsky Hierarchy * A Natural Equivalence Relation * Symbolic Flows * Symbolic Flows and Dynamical Languages * Subshifts of Finite Type * Sofic Systems * Graphs and Dynamical Languages * Graphs and Shannon-Graphs * Transitive Languages * Topological Entropy
X-ray fluorescence cross sections for K and L x rays of the elements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Krause, M.O.; Nestor, C.W. Jr.; Sparks, C.J. Jr.
1978-06-01
X-ray fluorescence cross sections are calculated for the major x rays of the K series (5 ≤ Z ≤ 101) and the three L series (12 ≤ Z ≤ 101) in the energy range 1 to 200 keV. This calculation uses Scofield's theoretical partial photoionization cross sections, Krause's evaluation of fluorescence and Coster-Kronig yields, and Scofield's theoretical radiative rates. Values are presented in table and graph format, and an estimate of their accuracy is made. The following x rays are considered: Kα1, Kα1,2, Kβ1, Kβ1,3, Lα1, Lα1,2, Lβ1, Lβ2,15, Lβ3, Ll, Lγ1, Lγ4, and L1 → L2,3. For use in x-ray fluorescence analysis, Kα and Lα fluorescence cross sections are presented at specific energies: TiK ≡ 4.55 keV, CrK ≡ 5.46 keV, CoK ≡ 7.00 keV, CuK ≡ 8.13 keV, MoKα ≡ 17.44 keV, AgK ≡ 22.5 keV, DyK ≡ 47.0 keV, and 241Am ≡ 59.54 keV. Supplementary material includes fluorescence and Coster-Kronig yields, fractional radiative rates, fractional fluorescence yields, total L-shell fluorescence cross sections, fluorescence and Coster-Kronig yields in condensed matter, effective fluorescence yields, average L-shell fluorescence yield, L-subshell photoionization cross section ratios, and conversion factors from barns per atom to square centimeters per gram.
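The quantities named above combine, in the standard textbook way, into fluorescence cross sections roughly as follows (my schematic notation, not a transcription of the report's formulas):

```latex
% K-series fluorescence cross section at photon energy E:
\sigma_{K\alpha_1}(E) \;=\; \sigma_K(E)\,\omega_K\,\frac{\Gamma_{K\alpha_1}}{\Gamma_K}
% sigma_K : K-shell photoionization cross section (Scofield)
% omega_K : K-shell fluorescence yield (Krause)
% Gamma_{K alpha_1}/Gamma_K : fractional radiative rate of the K alpha_1 line

% For an L subshell, Coster-Kronig transitions feed vacancies from the subshells above, e.g.
\sigma_{L\alpha}(E) \;=\; \bigl[\sigma_{L_3}(E) + f_{23}\,\sigma_{L_2}(E)
      + (f_{13} + f_{12}f_{23})\,\sigma_{L_1}(E)\bigr]\,\omega_3\,\frac{\Gamma_{L\alpha}}{\Gamma_3}
```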
GRAPhEME: a setup to measure (n, xn γ) reaction cross sections
DOE Office of Scientific and Technical Information (OSTI.GOV)
Henning, Greg; Bacquias, A.; Capdevielle, O.
2015-07-01
Most nuclear reactor developments rely on evaluated databases for numerical simulations. However, the considered databases still present large uncertainties and disagreements. To improve their level of precision, new measurements are needed, in particular for (n, xn) reactions, which are of great importance as they modify the neutron spectrum and the neutron population, and produce radioactive species. The IPHC group started an experimental program to measure (n, xn γ) reaction cross sections using prompt gamma spectroscopy and neutron energy determination by time of flight. Measurements of (n, xn γ) cross sections have been performed for 235,238U, 232Th, nat,182,183,184,186W, and natZr. The experimental setup is installed at the neutron beam at GELINA (Geel, Belgium). The setup has recently been upgraded with the addition of a highly segmented 36-pixel planar HPGe detector. Significant efforts have been made to reduce radiation background and electromagnetic perturbations. The setup is equipped with a high-rate digital acquisition system. The analysis of the segmented detector data requires a specific procedure to account for cross signals between pixels. Overall attention is paid to the precision of the measurement. The setup characteristics and the analysis procedure will be presented along with the acquisition and analysis challenges. Examples of results and their impact on models will be discussed. (authors)
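For context, the neutron energy determination by time of flight mentioned above rests on the standard relativistic relation below (generic symbols; the flight-path length L and flight time t are what the setup measures):

```latex
\beta = \frac{L}{c\,t}, \qquad
E_n = m_n c^2\left(\frac{1}{\sqrt{1-\beta^2}} - 1\right)
```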
Dalton, Melinda S.
2006-01-01
This report presents hydrologic data for selected reaches of the Chattahoochee River within the Chattahoochee River National Recreation Area (CRNRA). Data about transect location, width, depth, and velocity of flow for selected reaches of the river are presented in tabular form. The tables contain measurements collected from shoal and run habitats identified as critical sites for the CRNRA. In shoal habitats, measurements were collected while wading using a digital flowmeter and laser range finder. In run habitats, measurements were collected using acoustic Doppler current profiling. Fifty-three transects were established in six reaches throughout the CRNRA; 24 in shoal habitat, 26 in run habitat, and 3 in pool habitat. Illustrations in this report contain information about study area location, hydrology, transect locations, and cross-sectional information. A study area location figure is followed by figures identifying locations of transects within each individual reach. Cross-sectional information is presented for each transect, by reach, in a series of graphs. The data presented herein can be used to complete preliminary habitat assessments for the Chattahoochee River within the CRNRA. These preliminary assessments can be used to identify reaches of concern for future impacts associated with continual development in the Metropolitan Atlanta area and potential water allocation agreements between Georgia, Florida, and Alabama.
Measurements of velocity and discharge, Grand Canyon, Arizona, May 1994
Oberg, Kevin A.; Fisk, Gregory G.; ,
1995-01-01
The U.S. Geological Survey (USGS) evaluated the feasibility of utilizing an acoustic Doppler current profiler (ADCP) to collect velocity and discharge data in the Colorado River in Grand Canyon, Arizona, in May 1994. An ADCP is an instrument that can be used to measure water velocity and discharge from a moving boat. Measurements of velocity and discharge were made with an ADCP at 54 cross sections along the Colorado River between the Little Colorado River and Diamond Creek. Concurrent measurements of discharge with an ADCP and a Price-AA current meter were made at three U.S. Geological Survey streamflow-gaging stations: Colorado River above the Little Colorado River near Desert View, Colorado River near Grand Canyon, and Colorado River above Diamond Creek near Peach Springs. Discharges measured with an ADCP were within 3 percent of the rated discharge at each streamflow-gaging station. Discharges measured with the ADCP were within 4 percent of discharges measured with a Price-AA meter, except at the Colorado River above Diamond Creek. Vertical velocity profiles were measured with the ADCP from a stationary position at four cross sections along the Colorado River. Graphs of selected vertical velocity profiles collected in a cross section near National Canyon show considerable temporal variation among profiles.
Aaboud, M.; Aad, G.; Abbott, B.; ...
2017-11-10
A measurement of b-hadron pair production is presented, based on a data set corresponding to an integrated luminosity of 11.4 fb⁻¹ of proton-proton collisions recorded at √s = 8 TeV with the ATLAS detector at the LHC. Events are selected in which a b-hadron is reconstructed in a decay channel containing J/ψ → μμ, and a second b-hadron is reconstructed in a decay channel containing a muon. Results are presented in a fiducial volume defined by kinematic requirements on three muons based on those used in the analysis. The fiducial cross section is measured to be 17.7 ± 0.1 (stat.) ± 2.0 (syst.) nb. A number of normalised differential cross sections are also measured, and compared to predictions from the Pythia8, Herwig++, MadGraph5_aMC@NLO+Pythia8 and Sherpa event generators, providing new constraints on heavy flavour production.
Aaboud, M.; Aad, G.; Abbott, B.; ...
2017-11-10
Here, a measurement of b-hadron pair production is presented, based on a data set corresponding to an integrated luminosity of 11.4 fb⁻¹ of proton-proton collisions recorded at √s = 8 TeV with the ATLAS detector at the LHC. Events are selected in which a b-hadron is reconstructed in a decay channel containing J/ψ → μμ, and a second b-hadron is reconstructed in a decay channel containing a muon. Results are presented in a fiducial volume defined by kinematic requirements on three muons based on those used in the analysis. The fiducial cross section is measured to be 17.7 ± 0.1 (stat.) ± 2.0 (syst.) nb. A number of normalised differential cross sections are also measured, and compared to predictions from the Pythia8, Herwig++, MadGraph5_aMC@NLO+Pythia8 and Sherpa event generators, providing new constraints on heavy flavour production.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aaboud, M.; Aad, G.; Abbott, B.
A measurement of b-hadron pair production is presented, based on a data set corresponding to an integrated luminosity of 11.4 fb⁻¹ of proton-proton collisions recorded at √s = 8 TeV with the ATLAS detector at the LHC. Events are selected in which a b-hadron is reconstructed in a decay channel containing J/ψ → μμ, and a second b-hadron is reconstructed in a decay channel containing a muon. Results are presented in a fiducial volume defined by kinematic requirements on three muons based on those used in the analysis. The fiducial cross section is measured to be 17.7 ± 0.1 (stat.) ± 2.0 (syst.) nb. A number of normalised differential cross sections are also measured, and compared to predictions from the Pythia8, Herwig++, MadGraph5_aMC@NLO+Pythia8 and Sherpa event generators, providing new constraints on heavy flavour production.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aaboud, M.; Aad, G.; Abbott, B.
Here, a measurement of b-hadron pair production is presented, based on a data set corresponding to an integrated luminosity of 11.4 fb⁻¹ of proton-proton collisions recorded at √s = 8 TeV with the ATLAS detector at the LHC. Events are selected in which a b-hadron is reconstructed in a decay channel containing J/ψ → μμ, and a second b-hadron is reconstructed in a decay channel containing a muon. Results are presented in a fiducial volume defined by kinematic requirements on three muons based on those used in the analysis. The fiducial cross section is measured to be 17.7 ± 0.1 (stat.) ± 2.0 (syst.) nb. A number of normalised differential cross sections are also measured, and compared to predictions from the Pythia8, Herwig++, MadGraph5_aMC@NLO+Pythia8 and Sherpa event generators, providing new constraints on heavy flavour production.
Layer-by-layer assembly of graphene oxide on thermosensitive liposomes for photo-chemotherapy.
Hashemi, Mohadeseh; Omidi, Meisam; Muralidharan, Bharadwaj; Tayebi, Lobat; Herpin, Matthew J; Mohagheghi, Mohammad Ali; Mohammadi, Javad; Smyth, Hugh D C; Milner, Thomas E
2018-01-01
Stimuli responsive polyelectrolyte nanoparticles have been developed for chemo-photothermal destruction of breast cancer cells. This novel system, called layer by layer Lipo-graph (LBL Lipo-graph), is composed of alternate layers of graphene oxide (GO) and graphene oxide conjugated poly (l-lysine) (GO-PLL) deposited on cationic liposomes encapsulating doxorubicin. Various concentrations of GO and GO-PLL were examined and the optimal LBL Lipo-graph was found to have a particle size of 267.9 ± 13 nm, zeta potential of +43.9 ± 6.9 mV and encapsulation efficiency of 86.4 ± 4.7%. The morphology of LBL Lipo-graph was examined by cryogenic-transmission electron microscopy (Cryo-TEM), atomic force microscopy (AFM) and scanning electron microscopy (SEM). The buildup of LBL Lipo-graph was confirmed via ultraviolet-visible (UV-Vis) spectrophotometry, thermogravimetric analysis (TGA) and differential scanning calorimetry (DSC) analysis. Infra-red (IR) response suggests that four layers are sufficient to induce a gel-to-liquid phase transition in response to near infra-red (NIR) laser irradiation. Light-matter interaction of LBL Lipo-graph was studied by calculating the absorption cross section in the frequency domain by utilizing Fourier analysis. Drug release assay indicates that the LBL Lipo-graph releases much faster in an acidic environment than a liposome control. A cytotoxicity assay was conducted to prove the efficacy of LBL Lipo-graph to destroy MDA-MB-231 cells in response to NIR laser emission. Also, image stream flow cytometry and two-photon microscopy provide supportive data for the potential application of LBL Lipo-graph for photothermal therapy. Study results suggest the novel dual-sensitive nanoparticles allow intracellular doxorubicin delivery and respond to either acidic environments or NIR excitation. Stimuli sensitive hybrid nanoparticles have been synthesized using a layer-by-layer technique and demonstrated for dual chemo-photothermal destruction of breast cancer cells. The hybrid nanoparticles are composed of alternating layers of graphene oxide and graphene oxide conjugated poly-l-lysine coating the surface of a thermosensitive cationic liposome containing doxorubicin as a core. Data suggests that the hybrid nanoparticles may offer many advantages for chemo-photothermal therapy. Advantages include a decrease of the initial burst release which may result in the reduction in systemic toxicity, increase in pH responsivity around the tumor environment and improved NIR light absorption. Copyright © 2017 Acta Materialia Inc. Published by Elsevier Ltd. All rights reserved.
O'Neill, B; McDonough, S M; Wilson, J J; Bradbury, I; Hayes, K; Kirk, A; Kent, L; Cosgrove, D; Bradley, J M; Tully, M A
2017-01-14
There are challenges for researchers and clinicians to select the most appropriate physical activity tool, and a balance between precision and feasibility is needed. Currently it is unclear which physical activity tool should be used to assess physical activity in Bronchiectasis. The aim of this research is to compare assessment methods (pedometer and IPAQ) to our criterion method (ActiGraph) for the measurement of physical activity dimensions in Bronchiectasis (BE), and to assess their feasibility and acceptability. Patients in this analysis were enrolled in a cross-sectional study. The ActiGraph and pedometer were worn for seven consecutive days and the IPAQ was completed for the same period. Statistical analyses were performed using SPSS 20 (IBM). Descriptive statistics were used; the percentage agreement between ActiGraph and the other measures was calculated using limits of agreement. Feedback about the feasibility of the activity monitors and the IPAQ was obtained. There were 55 (22 male) data sets available. For step count there was no significant difference between the ActiGraph and pedometer; however, total physical activity time (mins) as recorded by the ActiGraph was significantly higher than the pedometer (mean ± SD, 232 (75) vs. 63 (32)). Levels of agreement between the two devices were very good for step count (97% agreement), and variation in the levels of agreement was within accepted limits of ±2 standard deviations from the mean value. IPAQ reported more bouted moderate-vigorous physical activity (MVPA) [mean, SD; 167 (170) vs 6 (9) mins/day], and significantly less sedentary time than ActiGraph [mean, SD; 362 (115) vs 634 (76) mins/day]. There were low levels of agreement between the two tools (57% sedentary behaviour; 0% MVPA 10+), with IPAQ under-reporting sedentary behaviour and over-reporting MVPA 10+ compared to ActiGraph. The monitors were found to be feasible and acceptable by participants and researchers; while the IPAQ was acceptable to use, most patients required assistance to complete it. Accurate measurement of physical activity is feasible in BE and will be valuable for future trials of therapeutic interventions. ActiGraph or pedometer could be used to measure simple daily step counts, but ActiGraph was superior as it measured intensity of physical activity and was a more precise measure of time spent walking. The IPAQ does not appear to represent an accurate measure of physical activity in this population. Clinical Trials Registration Number NCT01569009: Physical Activity in Bronchiectasis.
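The limits-of-agreement comparison used above can be sketched in a few lines; this is a generic Bland-Altman computation with invented step counts, not the study data or the authors' SPSS analysis.

```python
import numpy as np

def limits_of_agreement(a, b):
    """Bland-Altman mean difference and 95% limits of agreement
    between two measurement methods applied to the same subjects."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    diff = a - b
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Illustrative daily step counts from two devices (not study data).
actigraph = [8200, 5400, 10100, 7300, 6200]
pedometer = [8050, 5600,  9800, 7500, 6000]
bias, (lo, hi) = limits_of_agreement(actigraph, pedometer)
print(f"bias={bias:.0f} steps, 95% LoA=({lo:.0f}, {hi:.0f})")
```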
The "No Crossing Constraint" in Autosegmental Phonology.
ERIC Educational Resources Information Center
Coleman, John; Local, John
A discussion of autosegmental phonology (AP), a theory of phonological representation that uses graphs rather than strings as the central data structure, considers its principal constraint, the "No Crossing Constraint" (NCC). The NCC is the statement that in a well-formed autosegmental diagram, lines of association may not cross. After…
NASA Astrophysics Data System (ADS)
Petrovszki, Judit; Timár, Gábor; Molnár, Gábor
2014-05-01
The multi-variable connection between the channel slope, bankfull discharge and sinuosity values was analysed to get a mathematical formula which describes the responses of the rivers and gives the probable sinuosity values for every slope and discharge value. Timár (2003) merged two planar diagrams into a quasi 3D graph. One of them displayed how the river pattern changes according to the slope and bankfull discharge values (Leopold and Wolman, 1957; Ackers and Charlton, 1971); the other, based on flume experiments, gives a connection between the slope and sinuosity (Schumm and Khan, 1972). The resulting graph suggests that the slope-sinuosity connection also works along natural rivers, for every discharge value. The aim of this work was to prove this relation and describe it numerically. The sinuosity values were calculated along the natural, meandering river beds, using historical maps (2nd Military Survey of the Habsburg Empire, from the 19th century). The available slope and discharge values were imported from a database measured after the main river control works, at the beginning of the 20th century (Viczián, 1905). Analysing the reports of the river control works, the natural slope could be computed for every river section. The mean discharges were also converted to bankfull discharges. Neither long time series nor cross-sectional areas were obtainable, so another method was used to generate the bankfull discharge. After the above mentioned corrections a quadratic polynomial surface was fitted onto these points with least squares regression. The cross section of this surface follows the theoretical slope-sinuosity graph, verifying that the flume experiments and natural rivers behave similarly. The differences between the fitted surface and the original points were caused by other river parameters which also affect the natural rivers (e.g. the sediment discharge). Furthermore, this graph confirms the connection between the slope and sinuosity, so the sinuosity is a usable parameter to detect the changing slope. The research is made in the frame of project OTKA-NK83400 (SourceSink Hungary). The European Union and the European Social Fund also have provided financial support to the project under the grant agreement no. TÁMOP 4.2.1./B-09/1/KMR-2010-0003. References: Ackers, P., Charlton, F. G. (1971): The slope and resistance of small meandering channels. Inst. Civil Engineers Proc. Supp. XV, Paper 73625. Leopold, L. B., Wolman, M. G. (1957): River channel patterns; braided, meandering and straight. USGS Prof. Paper 282B: 1-73. Schumm, S. A., Khan, H. R. (1972): Experimental study of channel patterns. Geol. Soc. Am. Bull. 83:1755-1770. Timár, G. (2003): Controls on channel sinuosity changes: a case study of the Tisza River, the Great Hungarian Plain. Quaternary Science Reviews 22: 2199-2207. Viczián E. (1905): Magyarország vízierői. Pallas, Budapest, 349 o.
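The least-squares fit of a quadratic polynomial surface described above can be sketched as follows; the variable choice (log-transformed slope and discharge) and the sample values are my assumptions for illustration, not the study data.

```python
import numpy as np

def fit_quadratic_surface(x, y, z):
    """Least-squares fit of z = a + b*x + c*y + d*x^2 + e*x*y + f*y^2."""
    x, y, z = map(np.asarray, (x, y, z))
    A = np.column_stack([np.ones_like(x), x, y, x**2, x*y, y**2])
    coeffs, *_ = np.linalg.lstsq(A, z, rcond=None)
    return coeffs

# Placeholder samples: log10(slope), log10(bankfull discharge), sinuosity.
slope     = np.log10([1e-4, 2e-4, 5e-5, 3e-4, 8e-5, 1.5e-4, 2.5e-4])
discharge = np.log10([120., 300., 80., 450., 60., 200., 350.])
sinuosity = np.array([1.3, 1.6, 1.2, 1.5, 1.15, 1.45, 1.55])
print(fit_quadratic_surface(slope, discharge, sinuosity))
```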
Asres, Yihunie Hibstie; Mathuthu, Manny; Birhane, Marelgn Derso
2018-04-22
This study provides current evidence about cross-section production processes from theoretical and experimental results of neutron-induced reactions on a uranium isotope over the projectile energy range of 1-100 MeV, in order to improve the reliability of nuclear simulation. In such fission reactions of 235U within nuclear reactors, a large amount of energy is released, enough to help satisfy worldwide energy needs without the polluting processes associated with other sources. The main objective of this work is to convey related knowledge on the neutron-induced fission reactions of 235U by describing, analyzing and interpreting the theoretical cross sections obtained from the computer code COMPLET and comparing them with the experimental data obtained from EXFOR. The cross-section values of 235U(n,2n)234U, 235U(n,3n)233U, 235U(n,γ)236U and 235U(n,f) were obtained using the computer code COMPLET, and the corresponding experimental values were retrieved from EXFOR (IAEA). The theoretical results are compared with the experimental data taken from the EXFOR Data Bank. The computer code COMPLET has been used for the analysis with the same set of input parameters, and the graphs were plotted with the help of spreadsheet and Origin-8 software. The quantification of uncertainties stemming from both the experimental data and the computer code calculation plays a significant role in the final evaluated results. The calculated total cross sections were compared with the experimental data taken from EXFOR in the literature, and good agreement was found between the experimental and theoretical data. This comparison of the calculated data is analyzed and interpreted with tabulated and graphical descriptions, and the results are briefly discussed within the text of this research work. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.
Brundage, Michael D; Smith, Katherine C; Little, Emily A; Bantug, Elissa T; Snyder, Claire F
2015-10-01
Patient-reported outcomes (PROs) promote patient-centered care by using PRO research results ("group-level data") to inform decision making and by monitoring individual patient's PROs ("individual-level data") to inform care. We investigated the interpretability of current PRO data presentation formats. This cross-sectional mixed-methods study randomized purposively sampled cancer patients and clinicians to evaluate six group-data or four individual-data formats. A self-directed exercise assessed participants' interpretation accuracy and ratings of ease-of-understanding and usefulness (0 = least to 10 = most) of each format. Semi-structured qualitative interviews explored helpful and confusing format attributes. We reached thematic saturation with 50 patients (44 % < college graduate) and 20 clinicians. For group-level data, patients rated simple line graphs highest for ease-of-understanding and usefulness (median 8.0; 33 % selected for easiest to understand/most useful) and clinicians rated simple line graphs highest for ease-of-understanding and usefulness (median 9.0, 8.5) but most often selected line graphs with confidence limits or norms (30 % for each format for easiest to understand/most useful). Qualitative results support that clinicians value confidence intervals, norms, and p values, but patients find them confusing. For individual-level data, both patients and clinicians rated line graphs highest for ease-of-understanding (median 8.0 patients, 8.5 clinicians) and usefulness (median 8.0, 9.0) and selected them as easiest to understand (50, 70 %) and most useful (62, 80 %). The qualitative interviews supported highlighting scores requiring clinical attention and providing reference values. This study has identified preferences and opportunities for improving on current formats for PRO presentation and will inform development of best practices for PRO presentation. Both patients and clinicians prefer line graphs across group-level data and individual-level data formats, but clinicians prefer greater detail (e.g., statistical details) for group-level data.
NASA Astrophysics Data System (ADS)
Aragón, C.; Aguilera, J. A.
2015-07-01
The authors regret that Tables 2 and 5 provided in the concerned paper contain erroneous values for the line cross section σl. The correct data are listed in the following tables. Also, the σl values in Tables 4 and 6 are calculated for T = 14,000 K and Ne = 2.5×10¹⁷ cm⁻³, instead of the T, Ne values indicated in the table footnotes. These corrections do not change the results presented in the manuscript J. Quant. Spectrosc. Radiat. Transf. 149 (2014) 90-102.
Kitty field, Campbell County, Wyoming
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clark, C.R.
1970-09-01
Statistical data, a table of cumulative production, a production graph, and several maps and cross sections make up this report. This is a stratigraphic trap oil field with facies and permeability barriers within the Cretaceous Muddy Sandstone reservoir. The field produces from 2 sandstones with an average pay thickness of 28 ft. Cumulative production through the first 3 mo. of 1970 is 6,757,207 bbl of oil and 11,684,884 Mcf of gas from 127 wells and from proved productive area of 13,600 acres. Ultimate primary recovery reserves are estimated to be 23,000,000 bbl of oil and 50,000,000 Mcf gas.
NASA Technical Reports Server (NTRS)
Hsiao, C.; Libove, C.
1972-01-01
Analysis and numerical results are presented for the elastic shear stiffness of a corrugated shear web with a certain type of discrete attachments at the ends of the trough lines of the corrugations, namely point attachments to a rigid flange which interferes with the deformations of the end cross sections by preventing downward movement but permitting upward (lifting off) movement. The analysis is based on certain assumed modes of deformation of the cross sections in conjunction with the method of minimum total potential energy and the calculus of variations in order to obtain equations for the manner in which the assumed modes of deformation vary along the length of the corrugation. The numerical results are restricted to the case of equal-width crests and troughs but otherwise apply to a wide variety of geometries. They are in the form of graphs which give the overall shear stiffness as a fraction of the overall shear stiffness that could be obtained by having continuous attachment at the ends of the corrugations.
Semantic web for integrated network analysis in biomedicine.
Chen, Huajun; Ding, Li; Wu, Zhaohui; Yu, Tong; Dhanapalan, Lavanya; Chen, Jake Y
2009-03-01
The Semantic Web technology enables integration of heterogeneous data on the World Wide Web by making the semantics of data explicit through formal ontologies. In this article, we survey the feasibility and state of the art of utilizing the Semantic Web technology to represent, integrate and analyze the knowledge in various biomedical networks. We introduce a new conceptual framework, semantic graph mining, to enable researchers to integrate graph mining with ontology reasoning in network data analysis. Through four case studies, we demonstrate how semantic graph mining can be applied to the analysis of disease-causal genes, Gene Ontology category cross-talks, drug efficacy analysis and herb-drug interactions analysis.
Colson, B.E.; Ming, C.O.; Arcement, George J.
1979-01-01
Floodflow data that will provide a base for evaluating digital models relating to open-channel flow were obtained at 22 sites on streams in Alabama, Louisiana, and Mississippi. Thirty-five floods were measured. Analysis of the data indicated methods currently in use would be inaccurate where densely vegetated flood plains are crossed by highway embankments and single-opening bridges. This atlas presents flood information at the site on West Fork Amite River near Liberty, MS. Water depths, velocities, and discharges through bridge openings on West Fork Amite River near Liberty, MS for floods of December 6, 1971, and March 25, 1973, are shown, together with peak water-surface elevations along embankments and along cross sections. Manning's roughness coefficient values in different parts of the flood plain are shown on maps, and flood-frequency relations are shown on a graph. (USGS).
Offdiagonal complexity: A computationally quick complexity measure for graphs and networks
NASA Astrophysics Data System (ADS)
Claussen, Jens Christian
2007-02-01
A vast variety of biological, social, and economical networks shows topologies drastically differing from random graphs; yet the quantitative characterization remains unsatisfactory from a conceptual point of view. Motivated from the discussion of small scale-free networks, a biased link distribution entropy is defined, which takes an extremum for a power-law distribution. This approach is extended to the node-node link cross-distribution, whose nondiagonal elements characterize the graph structure beyond link distribution, cluster coefficient and average path length. From here a simple (and computationally cheap) complexity measure can be defined. This offdiagonal complexity (OdC) is proposed as a novel measure to characterize the complexity of an undirected graph, or network. While both for regular lattices and fully connected networks OdC is zero, it takes a moderately low value for a random graph and shows high values for apparently complex structures as scale-free networks and hierarchical trees. The OdC approach is applied to the Helicobacter pylori protein interaction network and randomly rewired surrogates.
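A minimal sketch of one plausible reading of the OdC construction described above (entropy of the link distribution over offdiagonals of the degree-degree link matrix); readers should consult the paper for the exact normalization, which may differ from this illustration.

```python
import numpy as np
import networkx as nx

def offdiagonal_complexity(G):
    """One plausible reading of OdC: entropy of the distribution of
    links over offdiagonals of the degree-degree link matrix c[k, l]."""
    deg = dict(G.degree())
    kmax = max(deg.values())
    c = np.zeros((kmax + 1, kmax + 1))
    for u, v in G.edges():
        k, l = sorted((deg[u], deg[v]))
        c[k, l] += 1
    # Sum each offdiagonal n = l - k (n = 0 is the diagonal itself).
    a = np.array([np.trace(c, offset=n) for n in range(kmax + 1)])
    p = a / a.sum()
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

print(offdiagonal_complexity(nx.cycle_graph(10)))                 # regular: 0
print(offdiagonal_complexity(nx.barabasi_albert_graph(200, 2)))   # scale-free: larger
```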
Patterns and Practices for Future Architectures
2014-08-01
Subject terms: computing architecture, graph algorithms, high-performance computing, big data, GPU.
Math Description Engine Software Development Kit
NASA Technical Reports Server (NTRS)
Shelton, Robert O.; Smith, Stephanie L.; Dexter, Dan E.; Hodgson, Terry R.
2010-01-01
The Math Description Engine Software Development Kit (MDE SDK) can be used by software developers to make computer-rendered graphs more accessible to blind and visually-impaired users. The MDE SDK generates alternative graph descriptions in two forms: textual descriptions and non-verbal sound renderings, or sonification. It also enables display of an animated trace of a graph sonification on a visual graph component, with color and line-thickness options for users having low vision or color-related impairments. A set of accessible graphical user interface widgets is provided for operation by end users and for control of accessible graph displays. Version 1.0 of the MDE SDK generates text descriptions for 2D graphs commonly seen in math and science curriculum (and practice). The mathematically rich text descriptions can also serve as a virtual math and science assistant for blind and sighted users, making graphs more accessible for everyone. The MDE SDK has a simple application programming interface (API) that makes it easy for programmers and Web-site developers to make graphs accessible with just a few lines of code. The source code is written in Java for cross-platform compatibility and to take advantage of Java's built-in support for building accessible software application interfaces. Compiled-library and NASA Open Source versions are available with API documentation and Programmer's Guide at http://prime.jsc.nasa.gov.
Consistent initial conditions for the Saint-Venant equations in river network modeling
NASA Astrophysics Data System (ADS)
Yu, Cheng-Wei; Liu, Frank; Hodges, Ben R.
2017-09-01
Initial conditions for flows and depths (cross-sectional areas) throughout a river network are required for any time-marching (unsteady) solution of the one-dimensional (1-D) hydrodynamic Saint-Venant equations. For a river network modeled with several Strahler orders of tributaries, comprehensive and consistent synoptic data are typically lacking and synthetic starting conditions are needed. Because of underlying nonlinearity, poorly defined or inconsistent initial conditions can lead to convergence problems and long spin-up times in an unsteady solver. Two new approaches are defined and demonstrated herein for computing flows and cross-sectional areas (or depths). These methods can produce an initial condition data set that is consistent with modeled landscape runoff and river geometry boundary conditions at the initial time. These new methods are (1) the pseudo time-marching method (PTM) that iterates toward a steady-state initial condition using an unsteady Saint-Venant solver and (2) the steady-solution method (SSM) that makes use of graph theory for initial flow rates and solution of a steady-state 1-D momentum equation for the channel cross-sectional areas. The PTM is shown to be adequate for short river reaches but is significantly slower and has occasional non-convergent behavior for large river networks. The SSM approach is shown to provide a rapid solution of consistent initial conditions for both small and large networks, albeit with the requirement that additional code must be written rather than applying an existing unsteady Saint-Venant solver.
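The graph-theory step of the SSM can be illustrated by a simple downstream accumulation of steady lateral inflows over a tree-shaped network in topological order. This sketch is my illustration of that idea under simplifying assumptions (steady state, tree topology), not the authors' code.

```python
import networkx as nx

def initial_flows(network, lateral_inflow):
    """Steady initial discharge for each reach of a tree-shaped river
    network: its own lateral inflow plus everything draining into it.

    network        : DiGraph with edges pointing downstream (reach -> reach).
    lateral_inflow : dict reach -> local runoff inflow (m^3/s).
    """
    Q = dict(lateral_inflow)
    for reach in nx.topological_sort(network):
        for downstream in network.successors(reach):
            Q[downstream] += Q[reach]
    return Q

# Toy network: two headwater reaches joining into an outlet reach.
net = nx.DiGraph([("trib_A", "main"), ("trib_B", "main")])
print(initial_flows(net, {"trib_A": 2.0, "trib_B": 3.0, "main": 0.5}))
# {'trib_A': 2.0, 'trib_B': 3.0, 'main': 5.5}
```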
ERIC Educational Resources Information Center
Hsu, P.-S.; Van Dyke, M.; Chen, Y.; Smith, T. J.
2016-01-01
The purpose of this mixed-methods study was to explore how seventh graders in a suburban school in the United States and sixth graders in an urban school in Taiwan developed argumentation skills and science knowledge in a project-based learning environment that incorporated a graph-oriented, computer-assisted application (GOCAA). A total of 42…
Data Mining Meets HCI: Making Sense of Large Graphs
2012-07-01
graph algorithms, won the Open Source Software World Challenge, Silver Award. We have released Pegasus as free, open-source software, downloaded by...METIS [77], spectral clustering [108], and the parameter-free "Cross-associations" (CA) [26]. Belief Propagation can also be used for clustering, as...number of tools have been developed to support "landscape" views of information. These include WebBook and WebForager [23], which use a book metaphor
Geographical Influences of an Emerging Network of Gang Rivalries
2011-03-17
Hollenbeck in the N × M environment grid. The semi-permeable boundaries encoded in the model are displayed in the center image. The shades of gray of...intensity. Light shades of gray correspond to high density values near one and dark shades correspond to low densities near zero. The boundary crossing...Threshold Graphs (GTG) For comparison to the networks produced by our simulations, we constructed an instance of a Geographical Threshold Graph (GTG
Electroweak and QCD corrections to top-pair hadroproduction in association with heavy bosons
Frixione, Stefano; Hirschi, V.; Pagani, D.; ...
2015-06-26
Here, we compute the contribution of order α_S²α² to the cross section of a top-antitop pair in association with at least one heavy Standard Model boson — Z, W±, and Higgs — by including all effects of QCD, QED, and weak origin and by working in the automated MadGraph5_aMC@NLO framework. Furthermore, this next-to-leading order contribution is then combined with that of order α_S³α, and with the two dominant lowest-order ones, α_S²α and α_Sα², to obtain phenomenological results relevant to 8, 13, and 100 TeV pp colliders.
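Schematically, the bookkeeping of coupling orders described above can be written as follows (my notation; Σi,j denotes the partial cross section multiplying α_S^i α^j):

```latex
\sigma_{t\bar t V} \;\simeq\;
   \underbrace{\alpha_S^{2}\alpha\,\Sigma_{2,1} + \alpha_S\alpha^{2}\,\Sigma_{1,2}}_{\text{dominant lowest-order terms}}
 \;+\;
   \underbrace{\alpha_S^{3}\alpha\,\Sigma_{3,1}}_{\text{NLO QCD}}
 \;+\;
   \underbrace{\alpha_S^{2}\alpha^{2}\,\Sigma_{2,2}}_{\text{contribution computed here}}
```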
State transfer in highly connected networks and a quantum Babinet principle
NASA Astrophysics Data System (ADS)
Tsomokos, D. I.; Plenio, M. B.; de Vega, I.; Huelga, S. F.
2008-12-01
The transfer of a quantum state between distant nodes in two-dimensional networks is considered. The fidelity of state transfer is calculated as a function of the number of interactions in networks that are described by regular graphs. It is shown that perfect state transfer is achieved in a network of size N, whose structure is that of an (N/2)-cross polytope graph, if N is a multiple of 4. The result is reminiscent of the Babinet principle of classical optics. A quantum Babinet principle is derived, which allows for the identification of complementary graphs leading to the same fidelity of state transfer, in analogy with complementary screens providing identical diffraction patterns.
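In the single-excitation picture commonly used for such networks, the transfer amplitude between two nodes follows from the matrix exponential of the adjacency matrix; the sketch below checks the N = 8 cross-polytope case numerically, taking the graph as the complete multipartite graph K_{2,2,2,2} and the target as the antipodal (non-adjacent) partner of the source. This is an illustration under those assumptions, not the authors' derivation.

```python
import numpy as np
import networkx as nx
from scipy.linalg import expm

def transfer_amplitude(G, source, target, t):
    """|<target| exp(-i A t) |source>| with A the adjacency matrix,
    i.e. single-excitation state transfer under an XY-type coupling."""
    A = nx.to_numpy_array(G, nodelist=sorted(G.nodes()))
    U = expm(-1j * A * t)
    return abs(U[target, source])

# (N/2)-cross-polytope graph for N = 8: complete 4-partite graph K_{2,2,2,2}.
G = nx.complete_multipartite_graph(2, 2, 2, 2)
ts = np.linspace(0.0, 2 * np.pi, 800)
best = max(ts, key=lambda t: transfer_amplitude(G, 0, 1, t))
print(best, transfer_amplitude(G, 0, 1, best))  # amplitude approaches 1 near t = pi/2
```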
Customized Corneal Cross-Linking-A Mathematical Model.
Caruso, Ciro; Epstein, Robert L; Ostacolo, Carmine; Pacente, Luigi; Troisi, Salvatore; Barbaro, Gaetano
2017-05-01
To improve the safety, reproducibility, and depth of effect of corneal cross-linking with the ultraviolet A (UV-A) exposure time and fluence customized according to the corneal thickness. Twelve human corneas were used for the experimental protocol. They were soaked using a transepithelial (EPI-ON) technique using riboflavin with the permeation enhancer vitamin E-tocopheryl polyethylene glycol succinate. The corneas were then placed on microscope slides and irradiated at 3 mW/cm² for 30 minutes. The UV-A output parameters were measured to build a new equation describing the time-dependent loss of endothelial protection induced by riboflavin during cross-linking, as well as a pachymetry-dependent and exposure time-dependent prescription for input UV-A fluence. The proposed equation was used to establish graphs prescribing the maximum UV-A fluence input versus exposure time that always maintains corneal endothelium exposure below toxicity limits. Analysis modifying the Lambert-Beer law for riboflavin oxidation leads to graphs of the maximum safe level of UV-A radiation fluence versus the time applied and thickness of the treated cornea. These graphs prescribe UV-A fluence levels below 1.8 mW/cm² for corneas of thickness 540 μm down to 1.2 mW/cm² for corneas of thickness 350 μm. Irradiation times are typically below 15 minutes. The experimental and mathematical analyses establish the basis for graphs that prescribe maximum safe fluence and UV-A exposure time for corneas of different thicknesses. Because this clinically tested protocol specifies a corneal surface clear of shielding riboflavin on the corneal surface during UV-A irradiation, it allows for shorter UV-A irradiation time and lower fluence than in the Dresden protocol.
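For orientation, the unmodified Beer-Lambert attenuation that the authors' analysis builds on can be written as below; this is textbook background with generic symbols, not the modified equation derived in the paper.

```latex
% Attenuation of UV-A irradiance with depth z in a riboflavin-soaked stroma:
I(z) \;=\; I_0\,e^{-\mu z},
\qquad \mu \;=\; \mu_{\text{cornea}} + \ln(10)\,\varepsilon\, c_{\text{riboflavin}}
% Endothelial safety then requires the irradiance transmitted through a cornea of
% thickness d to stay below the toxicity threshold:
I_0\,e^{-\mu d} \;\le\; I_{\text{endothelium}}^{\max}
```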
XLWrap - Querying and Integrating Arbitrary Spreadsheets with SPARQL
NASA Astrophysics Data System (ADS)
Langegger, Andreas; Wöß, Wolfram
In this paper a novel approach is presented for generating RDF graphs of arbitrary complexity from various spreadsheet layouts. Currently, none of the available spreadsheet-to-RDF wrappers supports cross tables and tables where data is not aligned in rows. Similar to RDF123, XLWrap is based on template graphs where fragments of triples can be mapped to specific cells of a spreadsheet. Additionally, it features a full expression algebra based on the syntax of OpenOffice Calc and various shift operations, which can be used to repeat similar mappings in order to wrap cross tables including multiple sheets and spreadsheet files. The set of available expression functions includes most of the native functions of OpenOffice Calc and can be easily extended by users of XLWrap.
Wear Detection of Drill Bit by Image-based Technique
NASA Astrophysics Data System (ADS)
Sukeri, Maziyah; Zulhilmi Paiz Ismadi, Mohd; Rahim Othman, Abdul; Kamaruddin, Shahrul
2018-03-01
Image processing for computer vision plays an essential role in the manufacturing industries for tool condition monitoring. This study proposes a dependable direct measurement method for tool wear using image-based analysis. Segmentation and thresholding techniques were used to filter and convert the colour image to binary datasets. Then, the edge detection method was applied to characterize the edge of the drill bit. Using the cross-correlation method, the edges of the original and worn drill bits were correlated with each other. The cross-correlation graphs were able to detect the worn edge despite the small difference between the graphs. Future development will focus on quantifying the worn profile as well as enhancing the sensitivity of the technique.
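A minimal sketch of the cross-correlation step (my illustration with synthetic edge profiles, not the authors' implementation): a normalized cross-correlation peak below 1 signals that the worn edge no longer matches the reference edge.

```python
import numpy as np

def normalized_cross_correlation(profile_a, profile_b):
    """Normalized cross-correlation of two 1-D edge profiles
    (e.g. cutting-edge height vs. position for new and worn bits)."""
    a = np.asarray(profile_a, float)
    b = np.asarray(profile_b, float)
    a = (a - a.mean()) / (a.std() * len(a))
    b = (b - b.mean()) / b.std()
    return np.correlate(a, b, mode="full")

# Synthetic edge profiles: the "worn" edge is the original with a local defect.
x = np.linspace(0, 1, 200)
new_edge = np.sin(2 * np.pi * x)
worn_edge = new_edge.copy()
worn_edge[90:110] -= 0.4           # simulated wear notch
ncc = normalized_cross_correlation(new_edge, worn_edge)
print(ncc.max())                   # < 1: the edges no longer match perfectly
```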
Solving Multi-variate Polynomial Equations in a Finite Field
2013-06-01
Algebraic Background In this section, some algebraic definitions and basics are discussed as they pertain to this research. For a more detailed...definitions and basics are discussed as they pertain to this research. For a more detailed treatment, consult a graph theory text such as [10]. A graph G...graph if V(G) can be partitioned into k subsets V1, V2, ..., Vk such that uv is only an edge of G if u and v belong to different partite sets. If, in
Hahm, Jarang; Lee, Hyekyoung; Park, Hyojin; Kang, Eunjoo; Kim, Yu Kyeong; Chung, Chun Kee; Kang, Hyejin; Lee, Dong Soo
2017-01-01
To explain gating of memory encoding, magnetoencephalography (MEG) was analyzed over a multi-regional network of negative correlations between alpha band power during cue (cue-alpha) and gamma band power during item presentation (item-gamma) in the Remember (R) and No-remember (NR) conditions. Persistent homology with graph filtration on the alpha-gamma correlation disclosed topological invariants that explain memory gating. Instruction compliance (R-hits minus NR-hits) was significantly related to negative coupling between the left superior occipital (cue-alpha) and the left dorsolateral superior frontal gyri (item-gamma) on a permutation test, where the coupling was stronger in R than NR. In good memory performers (R-hits minus false alarm), the coupling was stronger in R than NR between the right posterior cingulate (cue-alpha) and the left fusiform gyri (item-gamma). Gating of memory encoding was dictated by inter-regional negative alpha-gamma coupling. Our graph filtration over the MEG network revealed that this inter-regional, time-delayed cross-frequency connectivity serves gating of memory encoding. PMID:28169281
Westchester Community College. 1993 Fact Book.
ERIC Educational Resources Information Center
Lee, Marcia M.; And Others
Tables and graphs drawn from research reports prepared in 1993 by the Office of Institutional Research and Planning at Westchester Community College (WCC) in New York are presented. The sections of the report provide tables, charts, and graphs showing recent facts and statistics that affect: (1) administrative personnel, including a list of…
MCNP5 CALCULATIONS REPLICATING ARH-600 NITRATE DATA
DOE Office of Scientific and Technical Information (OSTI.GOV)
FINFROCK SH
This report serves to extend the previous document: 'MCNP Calculations Replicating ARH-600 Data' by replicating the nitrate curves found in ARH-600. This report includes the MCNP models used, the calculated critical dimension for each analyzed parameter set, and the resulting data libraries for use with the CritView code. As with the ARH-600 data, this report is not meant to replace the analysis of the fissile systems by qualified criticality personnel. The MCNP data is presented without accounting for the statistical uncertainty (although this is typically less than 0.001) or bias and, as such, the application of a reasonable safety margin is required. The data that follows pertains to the uranyl nitrate and plutonium nitrate spheres, infinite cylinders, and infinite slabs of varying isotopic composition, reflector thickness, and molarity. Each of the cases was modeled in MCNP (version 5.1.40), using the ENDF/B-VI cross section set. Given a molarity, isotopic composition, and reflector thickness, the fissile concentration and diameter (or thicknesses in the case of the slab geometries) were varied. The diameter for which k-effective equals 1.00 for a given concentration could then be calculated and graphed. These graphs are included in this report. The pages that follow describe the regions modeled, formulas for calculating the various parameters, a list of cross-sections used in the calculations, a description of the automation routine and data, and finally the data output. The data of most interest are the critical dimensions of the various systems analyzed. This is presented graphically, and in table format, in Appendix B. Appendix C provides a text listing of the same data in a format that is compatible with the CritView code. Appendices D and E provide listings of example Template files and MCNP input files (these are discussed further in Section 4). Appendix F is a complete listing of all of the output data (i.e., all of the analyzed dimensions and the resulting k-effective values).
Prediction of energy expenditure and physical activity in preschoolers.
Butte, Nancy F; Wong, William W; Lee, Jong Soo; Adolph, Anne L; Puyau, Maurice R; Zakeri, Issa F
2014-06-01
Accurate, nonintrusive, and feasible methods are needed to predict energy expenditure (EE) and physical activity (PA) levels in preschoolers. Herein, we validated cross-sectional time series (CSTS) and multivariate adaptive regression splines (MARS) models based on accelerometry and heart rate (HR) for the prediction of EE using room calorimetry and doubly labeled water (DLW) and established accelerometry cut points for PA levels. Fifty preschoolers, mean ± SD age of 4.5 ± 0.8 yr, participated in room calorimetry for minute-by-minute measurements of EE, accelerometer counts (AC) (Actiheart and ActiGraph GT3X+), and HR (Actiheart). Free-living 105 children, ages 4.6 ± 0.9 yr, completed the 7-d DLW procedure while wearing the devices. AC cut points for PA levels were established using smoothing splines and receiver operating characteristic curves. On the basis of calorimetry, mean percent errors for EE were -2.9% ± 10.8% and -1.1% ± 7.4% for CSTS models and -1.9% ± 9.6% and 1.3% ± 8.1% for MARS models using the Actiheart and ActiGraph+HR devices, respectively. On the basis of DLW, mean percent errors were -0.5% ± 9.7% and 4.1% ± 8.5% for CSTS models and 3.2% ± 10.1% and 7.5% ± 10.0% for MARS models using the Actiheart and ActiGraph+HR devices, respectively. Applying activity EE thresholds, final accelerometer cut points were determined: 41, 449, and 1297 cpm for Actiheart x-axis; 820, 3908, and 6112 cpm for ActiGraph vector magnitude; and 240, 2120, and 4450 cpm for ActiGraph x-axis for sedentary/light, light/moderate, and moderate/vigorous PA (MVPA), respectively. On the basis of confusion matrices, correctly classified rates were 81%-83% for sedentary PA, 58%-64% for light PA, and 62%-73% for MVPA. The lack of bias and acceptable limits of agreement affirms the validity of the CSTS and MARS models for the prediction of EE in preschool-aged children. Accelerometer cut points are satisfactory for the classification of sedentary, light, and moderate/vigorous levels of PA in preschoolers.
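Applying the reported ActiGraph x-axis cut points (240, 2120, and 4450 cpm) to epoch data is straightforward; the sketch below is an illustration, and the handling of values falling exactly on a cut point is my assumption.

```python
def classify_pa_level(cpm):
    """Classify an ActiGraph x-axis count (counts per minute) into a
    physical-activity level using the cut points reported above
    (240, 2120 and 4450 cpm)."""
    if cpm < 240:
        return "sedentary"
    elif cpm < 2120:
        return "light"
    elif cpm < 4450:
        return "moderate"
    return "vigorous"

minute_counts = [50, 500, 3000, 5200]          # illustrative epoch data
print([classify_pa_level(c) for c in minute_counts])
# ['sedentary', 'light', 'moderate', 'vigorous']
```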
Analysis Tools for Interconnected Boolean Networks With Biological Applications.
Chaves, Madalena; Tournier, Laurent
2018-01-01
Boolean networks with asynchronous updates are a class of logical models particularly well adapted to describe the dynamics of biological networks with uncertain measures. The state space of these models can be described by an asynchronous state transition graph, which represents all the possible exits from every single state, and gives a global image of all the possible trajectories of the system. In addition, the asynchronous state transition graph can be associated with an absorbing Markov chain, further providing a semi-quantitative framework where it becomes possible to compute probabilities for the different trajectories. For large networks, however, such direct analyses become computationally untractable, given the exponential dimension of the graph. Exploiting the general modularity of biological systems, we have introduced the novel concept of asymptotic graph, computed as an interconnection of several asynchronous transition graphs and recovering all asymptotic behaviors of a large interconnected system from the behavior of its smaller modules. From a modeling point of view, the interconnection of networks is very useful to address, for instance, the interplay between known biological modules and to test different hypotheses on the nature of their mutual regulatory links. This paper develops two new features of this general methodology: a quantitative dimension is added to the asymptotic graph, through the computation of relative probabilities for each final attractor, and a companion cross-graph is introduced to complement the method from a theoretical point of view.
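The asynchronous state transition graph itself is easy to construct for small examples; the sketch below builds it for a toy two-gene mutual-inhibition network (my illustration of the general definition, not the authors' tool).

```python
import networkx as nx
from itertools import product

def asynchronous_stg(update_functions):
    """Asynchronous state transition graph of a Boolean network:
    from each state, every single-variable update that changes the
    state yields one outgoing edge."""
    n = len(update_functions)
    G = nx.DiGraph()
    for state in product((0, 1), repeat=n):
        G.add_node(state)
        for i, f in enumerate(update_functions):
            new_value = f(state)
            if new_value != state[i]:
                successor = state[:i] + (new_value,) + state[i + 1:]
                G.add_edge(state, successor)
    return G

# Toy 2-gene mutual-inhibition switch: x1' = not x2, x2' = not x1.
stg = asynchronous_stg([lambda s: 1 - s[1], lambda s: 1 - s[0]])
print(list(stg.edges()))
# the two attractors (1, 0) and (0, 1) have no outgoing edges
```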
An approximation method for improving dynamic network model fitting.
Carnegie, Nicole Bohme; Krivitsky, Pavel N; Hunter, David R; Goodreau, Steven M
There has been a great deal of interest recently in the modeling and simulation of dynamic networks, i.e., networks that change over time. One promising model is the separable temporal exponential-family random graph model (ERGM) of Krivitsky and Handcock, which treats the formation and dissolution of ties in parallel at each time step as independent ERGMs. However, the computational cost of fitting these models can be substantial, particularly for large, sparse networks. Fitting cross-sectional models for observations of a network at a single point in time, while still a non-negligible computational burden, is much easier. This paper examines model fitting when the available data consist of independent measures of cross-sectional network structure and the duration of relationships under the assumption of stationarity. We introduce a simple approximation to the dynamic parameters for sparse networks with relationships of moderate or long duration and show that the approximation method works best in precisely those cases where parameter estimation is most likely to fail: networks with very little change at each time step. We consider a variety of cases: Bernoulli formation and dissolution of ties, independent-tie formation and Bernoulli dissolution, independent-tie formation and dissolution, and dependent-tie formation models.
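A hedged sketch of the flavor of approximation described above: if relationships have mean duration D and dissolve as a Bernoulli process, the dissolution (persistence) coefficient and an approximate formation coefficient can be written as below; consult the paper for the exact statement and its conditions.

```latex
% Bernoulli dissolution retaining each tie with probability 1 - 1/D:
\theta^{-} \;=\; \operatorname{logit}\!\left(1 - \tfrac{1}{D}\right) \;=\; \log(D - 1)
% Formation coefficients approximated from the cross-sectional fit:
\theta^{+} \;\approx\; \theta^{\text{cross-sectional}} - \theta^{-}
```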
Directed acyclic graphs (DAGs): an aid to assess confounding in dental research.
Merchant, Anwar T; Pitiphat, Waranuch
2002-12-01
Confounding, a special type of bias, occurs when an extraneous factor is associated with the exposure and independently affects the outcome. In order to get an unbiased estimate of the exposure-outcome relationship, we need to identify potential confounders, collect information on them, design appropriate studies, and adjust for confounding in data analysis. However, it is not always clear which variables to collect information on and adjust for in the analyses. Inappropriate adjustment for confounding can even introduce bias where none existed. Directed acyclic graphs (DAGs) provide a method to select potential confounders and minimize bias in the design and analysis of epidemiological studies. DAGs have been used extensively in expert systems and robotics. Robins (1987) introduced the application of DAGs in epidemiology to overcome shortcomings of traditional methods to control for confounding, especially as they related to unmeasured confounding. DAGs provide a quick and visual way to assess confounding without making parametric assumptions. We introduce DAGs, starting with definitions and rules for basic manipulation, with more emphasis on applications than on theory. We then demonstrate their application in the control of confounding through examples of observational and cross-sectional epidemiological studies.
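A minimal sketch with hypothetical variable names, using a simplified heuristic (a variable that is an ancestor of both exposure and outcome and not a descendant of the exposure) rather than the full back-door criterion:

    import networkx as nx

    # Hypothetical dental-research DAG: smoking affects both flossing (exposure)
    # and periodontitis (outcome); plaque lies on the causal pathway.
    dag = nx.DiGraph([
        ("smoking", "flossing"),
        ("smoking", "periodontitis"),
        ("flossing", "plaque"),
        ("plaque", "periodontitis"),
    ])
    assert nx.is_directed_acyclic_graph(dag)

    exposure, outcome = "flossing", "periodontitis"
    candidates = (
        nx.ancestors(dag, exposure)
        & nx.ancestors(dag, outcome)
    ) - nx.descendants(dag, exposure)
    print(candidates)   # {'smoking'} -> adjust for it; 'plaque' is a mediator, not a confounder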
Thermal Energy Storage in Phase Change Material.
1982-03-01
Graphs of the experimental results follow; they are grouped in three categories: tube cross flow, packed bed, and tube parallel flow. A. Tube Cross... Riordan, Michael, "Thermal Storage: A Basic Guide to the State of the Art", Solar Age, April 1978, p. 10. 5. Telkes, Maria, "Thermal Energy Storage in
Little Shrimp, Big Results: A Model of an Integrative, Cross-Curricular Activity
ERIC Educational Resources Information Center
Ackerson, Nicole; Piser, Carol; Walka, Keith
2010-01-01
This integrative, cross-curricular lab engages middle school biology students in an exercise involving ecology, arthropod biology, and mathematics. Students research the anatomy and behavioral patterns of a species of brine shrimp, compare the anatomy of adult and juvenile brine shrimp, and graph and interpret results. In this article, the authors…
A Machine Learning Concept for DTN Routing
NASA Technical Reports Server (NTRS)
Dudukovich, Rachel; Hylton, Alan; Papachristou, Christos
2017-01-01
This paper discusses the concept and architecture of a machine learning-based router for delay tolerant space networks. The techniques of reinforcement learning and Bayesian learning are used to supplement the routing decisions of the popular Contact Graph Routing algorithm. An introduction to the concepts of Contact Graph Routing, Q-routing and Naive Bayes classification is given. The development of an architecture for a cross-layer feedback framework for DTN (Delay-Tolerant Networking) protocols is discussed. Finally, initial simulation setup and results are given.
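A small sketch of a tabular Q-routing-style update that could supplement contact-graph route selection; the function names, parameters, and single-step example are assumptions for illustration, not the paper's architecture.

    import random
    from collections import defaultdict

    # Q[node][(destination, next_hop)] ~ estimated delivery delay via that neighbor.
    Q = defaultdict(lambda: defaultdict(float))
    ALPHA = 0.1   # learning rate

    def choose_next_hop(node, destination, neighbors, epsilon=0.1):
        """Epsilon-greedy choice over estimated delivery delays."""
        if random.random() < epsilon:
            return random.choice(neighbors)
        return min(neighbors, key=lambda n: Q[node][(destination, n)])

    def update(node, destination, next_hop, observed_delay, best_downstream_estimate):
        """Standard Q-routing update toward observed delay plus downstream estimate."""
        old = Q[node][(destination, next_hop)]
        target = observed_delay + best_downstream_estimate
        Q[node][(destination, next_hop)] = old + ALPHA * (target - old)

    # Hypothetical single step: node A forwarded to B toward destination D.
    update("A", "D", "B", observed_delay=4.0, best_downstream_estimate=6.0)
    print(choose_next_hop("A", "D", ["B", "C"]))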
Edge compression techniques for visualization of dense directed graphs.
Dwyer, Tim; Henry Riche, Nathalie; Marriott, Kim; Mears, Christopher
2013-12-01
We explore the effectiveness of visualizing dense directed graphs by replacing individual edges with edges connected to 'modules', or groups of nodes, such that the new edges imply aggregate connectivity. We only consider techniques that offer a lossless compression: that is, where the entire graph can still be read from the compressed version. The techniques considered are: a simple grouping of nodes with identical neighbor sets; Modular Decomposition, which permits internal structure in modules and allows them to be nested; and Power Graph Analysis, which further allows edges to cross module boundaries. These techniques all share the same goal of compressing the set of edges that need to be rendered to fully convey connectivity, but each successive relaxation of the module definition permits fewer edges to be drawn in the rendered graph. Each successive technique also, we hypothesize, requires a higher degree of mental effort to interpret. We test this hypothetical trade-off with two studies involving human participants. For Power Graph Analysis we propose a novel optimal technique based on constraint programming. This enables us to explore the parameter space for the technique more precisely than could be achieved with a heuristic. Although applicable to many domains, we are motivated by, and discuss in particular, the application to software dependency analysis.
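As a sketch of the simplest technique considered, grouping nodes with identical neighbor sets on a small hypothetical directed graph (the compression is lossless because members of a group share exactly the same in- and out-neighbors):

    from collections import defaultdict

    # Directed edges of a small example graph.
    edges = [("a", "x"), ("b", "x"), ("a", "y"), ("b", "y"), ("c", "x")]

    out_nbrs, in_nbrs = defaultdict(frozenset), defaultdict(frozenset)
    nodes = set()
    for u, v in edges:
        nodes |= {u, v}
        out_nbrs[u] = out_nbrs[u] | {v}
        in_nbrs[v] = in_nbrs[v] | {u}

    # Group nodes whose (in-neighbor set, out-neighbor set) signatures are identical.
    groups = defaultdict(list)
    for n in nodes:
        groups[(in_nbrs[n], out_nbrs[n])].append(n)

    for (ins, outs), members in groups.items():
        if len(members) > 1:
            print("module", sorted(members), "in:", sorted(ins), "out:", sorted(outs))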
40 CFR 141.602 - System specific studies.
Code of Federal Regulations, 2014 CFR
2014-07-01
... the storage facility with the highest residence time in each pressure zone, and a time series graph of... (a)(2)(ii) of this section, and a 24-hour time series graph of residence time for each subpart V...-compliance results generated during the time period beginning with the first reported result and ending with...
40 CFR 141.602 - System specific studies.
Code of Federal Regulations, 2010 CFR
2010-07-01
... the storage facility with the highest residence time in each pressure zone, and a time series graph of... (a)(2)(ii) of this section, and a 24-hour time series graph of residence time for each subpart V... compliance and non-compliance results generated during the time period beginning with the first reported...
40 CFR 141.602 - System specific studies.
Code of Federal Regulations, 2012 CFR
2012-07-01
... the storage facility with the highest residence time in each pressure zone, and a time series graph of... (a)(2)(ii) of this section, and a 24-hour time series graph of residence time for each subpart V... compliance and non-compliance results generated during the time period beginning with the first reported...
40 CFR 141.602 - System specific studies.
Code of Federal Regulations, 2013 CFR
2013-07-01
... the storage facility with the highest residence time in each pressure zone, and a time series graph of... (a)(2)(ii) of this section, and a 24-hour time series graph of residence time for each subpart V... compliance and non-compliance results generated during the time period beginning with the first reported...
40 CFR 141.602 - System specific studies.
Code of Federal Regulations, 2011 CFR
2011-07-01
... the storage facility with the highest residence time in each pressure zone, and a time series graph of... (a)(2)(ii) of this section, and a 24-hour time series graph of residence time for each subpart V... compliance and non-compliance results generated during the time period beginning with the first reported...
Intelligent Data Visualization for Cross-Checking Spacecraft System Diagnosis
NASA Technical Reports Server (NTRS)
Ong, James C.; Remolina, Emilio; Breeden, David; Stroozas, Brett A.; Mohammed, John L.
2012-01-01
Any reasoning system is fallible, so crew members and flight controllers must be able to cross-check automated diagnoses of spacecraft or habitat problems by considering alternate diagnoses and analyzing related evidence. Cross-checking improves diagnostic accuracy because people can apply information processing heuristics, pattern recognition techniques, and reasoning methods that the automated diagnostic system may not possess. Over time, cross-checking also enables crew members to become comfortable with how the diagnostic reasoning system performs, so the system can earn the crew's trust. We developed intelligent data visualization software that helps users cross-check automated diagnoses of system faults more effectively. The user interface displays scrollable arrays of timelines and time-series graphs, which are tightly integrated with an interactive, color-coded system schematic to show important spatial-temporal data patterns. Signal processing and rule-based diagnostic reasoning automatically identify alternate hypotheses and data patterns that support or rebut the original and alternate diagnoses. A color-coded matrix display summarizes the supporting or rebutting evidence for each diagnosis, and a drill-down capability enables crew members to quickly view graphs and timelines of the underlying data. This system demonstrates that modest amounts of diagnostic reasoning, combined with interactive, information-dense data visualizations, can accelerate system diagnosis and cross-checking.
NASA Astrophysics Data System (ADS)
Weinheimer, Oliver; Wielpütz, Mark O.; Konietzke, Philip; Heussel, Claus P.; Kauczor, Hans-Ulrich; Brochhausen, Christoph; Hollemann, David; Savage, Dasha; Galbán, Craig J.; Robinson, Terry E.
2017-02-01
Cystic Fibrosis (CF) results in severe bronchiectasis in nearly all cases. Bronchiectasis is a disease where parts of the airways are permanently dilated. The development and the progression of bronchiectasis are not evenly distributed across the lungs; rather, individual functional units are affected differently. We developed a fully automated method for the precise calculation of lobe-based airway taper indices. To calculate taper indices, some preparatory algorithms are needed. The airway tree is segmented, skeletonized and transformed to a rooted acyclic graph. This graph is used to label the airways. Then a modified version of the previously validated integral based method (IBM) for airway geometry determination is utilized. The rooted graph, the airway lumen and wall information are then used to calculate the airway taper indices. Using a computer-generated phantom simulating 10 cross sections of airways, we present results showing a high accuracy of the modified IBM. The new taper index calculation method was applied to 144 volumetric inspiratory low-dose MDCT scans. The scans were acquired from 36 children with mild CF at 4 time-points (baseline, 3 months, 1 year, 2 years). We found a moderate correlation with the visual lobar Brody bronchiectasis scores by three raters (r² = 0.36, p < .0001). The taper index has the potential to be a precise imaging biomarker, but further improvements are needed. In combination with other imaging biomarkers, taper index calculation can be an important tool for monitoring the progression and the individual treatment of patients with bronchiectasis.
Registration of 3D spectral OCT volumes combining ICP with a graph-based approach
NASA Astrophysics Data System (ADS)
Niemeijer, Meindert; Lee, Kyungmoo; Garvin, Mona K.; Abràmoff, Michael D.; Sonka, Milan
2012-02-01
The introduction of spectral Optical Coherence Tomography (OCT) scanners has enabled acquisition of high resolution, 3D cross-sectional volumetric images of the retina. 3D-OCT is used to detect and manage eye diseases such as glaucoma and age-related macular degeneration. To follow up patients over time, image registration is a vital tool to enable more precise, quantitative comparison of disease states. In this work we present a 3D registration method based on a two-step approach. In the first step we register both scans in the XY domain using an Iterative Closest Point (ICP) based algorithm. This algorithm is applied to vessel segmentations obtained from the projection image of each scan. The distance minimized in the ICP algorithm includes measurements of the vessel orientation and vessel width to allow for a more robust match. In the second step, a graph-based method is applied to find the optimal translation along the depth axis of the individual A-scans in the volume to match both scans. The cost image used to construct the graph is based on the mean squared error (MSE) between matching A-scans in both images at different translations. We have applied this method to the registration of Optic Nerve Head (ONH) centered 3D-OCT scans of the same patient. First, 10 3D-OCT scans of 5 eyes with glaucoma imaged in vivo were registered for a qualitative evaluation of the algorithm performance. Then, 17 OCT data set pairs of 17 eyes with known deformation were used for quantitative assessment of the method's robustness.
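A minimal sketch of the cost described for the second step, assuming simple integer depth shifts and discarding wrapped-around samples at the ends; the synthetic A-scans are illustrative.

    import numpy as np

    def mse_cost(a_scan_fixed, a_scan_moving, shifts):
        """MSE between two 1-D A-scans for each candidate depth translation."""
        costs = []
        for s in shifts:
            shifted = np.roll(a_scan_moving, s)
            # Ignore wrapped-around samples at the ends.
            if s > 0:
                valid = slice(s, None)
            elif s < 0:
                valid = slice(None, s)
            else:
                valid = slice(None)
            costs.append(np.mean((a_scan_fixed[valid] - shifted[valid]) ** 2))
        return np.array(costs)

    rng = np.random.default_rng(0)
    fixed = rng.random(256)
    moving = np.roll(fixed, 5) + 0.01 * rng.random(256)   # synthetic +5 pixel depth offset
    shifts = range(-10, 11)
    best = list(shifts)[int(np.argmin(mse_cost(fixed, moving, shifts)))]
    print("estimated depth translation:", best)   # expected to be -5, undoing the synthetic offset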
Palacín, Arantxa; Gómez-Casado, Cristina; Rivas, Luis A.; Aguirre, Jacobo; Tordesillas, Leticia; Bartra, Joan; Blanco, Carlos; Carrillo, Teresa; Cuesta-Herranz, Javier; de Frutos, Consolación; Álvarez-Eire, Genoveva García; Fernández, Francisco J.; Gamboa, Pedro; Muñoz, Rosa; Sánchez-Monge, Rosa; Sirvent, Sofía; Torres, María J.; Varela-Losada, Susana; Rodríguez, Rosalía; Parro, Victor; Blanca, Miguel; Salcedo, Gabriel; Díaz-Perales, Araceli
2012-01-01
The study of cross-reactivity in allergy is key to both understanding the allergic response of many patients and providing them with a rational treatment. In the present study, protein microarrays and a co-sensitization graph approach were used in conjunction with an allergen microarray immunoassay. This enabled us to include a wide range of proteins and a large number of patients, and to study sensitization profiles among members of the LTP family. Fourteen LTPs from the most frequent plant food-induced allergies in the geographical area studied were printed into a microarray specifically designed for this research. 212 patients with fruit allergy and 117 food-tolerant pollen allergic subjects were recruited from seven regions of Spain with different pollen profiles, and their sera were tested with allergen microarray. This approach has proven itself to be a good tool to study cross-reactivity between members of the LTP family, and could become a useful strategy to analyze other families of allergens. PMID:23272072
Learning of Multimodal Representations With Random Walks on the Click Graph.
Wu, Fei; Lu, Xinyan; Song, Jun; Yan, Shuicheng; Zhang, Zhongfei Mark; Rui, Yong; Zhuang, Yueting
2016-02-01
In multimedia information retrieval, most classic approaches tend to represent different modalities of media in the same feature space. With the click data collected from the users' searching behavior, existing approaches take either one-to-one paired data (text-image pairs) or ranking examples (text-query-image and/or image-query-text ranking lists) as training examples, which do not make full use of the click data, particularly the implicit connections among the data objects. In this paper, we treat the click data as a large click graph, in which vertices are images/text queries and edges indicate the clicks between an image and a query. We consider learning a multimodal representation from the perspective of encoding the explicit/implicit relevance relationship between the vertices in the click graph. By minimizing both the truncated random walk loss as well as the distance between the learned representation of vertices and their corresponding deep neural network output, the proposed model which is named multimodal random walk neural network (MRW-NN) can be applied to not only learn robust representation of the existing multimodal data in the click graph, but also deal with the unseen queries and images to support cross-modal retrieval. We evaluate the latent representation learned by MRW-NN on a public large-scale click log data set Clickture and further show that MRW-NN achieves much better cross-modal retrieval performance on the unseen queries/images than the other state-of-the-art methods.
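A toy sketch of generating short (truncated) random walks on a bipartite click graph, whose walk sequences could then feed a representation learner; the graph contents and walk parameters are illustrative, not the MRW-NN implementation.

    import random

    # Bipartite click graph: adjacency from queries to images and back.
    click_graph = {
        "q:red shoes": ["img:101", "img:102"],
        "q:sneakers":  ["img:102", "img:103"],
        "img:101":     ["q:red shoes"],
        "img:102":     ["q:red shoes", "q:sneakers"],
        "img:103":     ["q:sneakers"],
    }

    def truncated_walks(graph, walk_length=4, walks_per_vertex=2, seed=0):
        """Short random walks started from every vertex of the click graph."""
        rng = random.Random(seed)
        walks = []
        for start in graph:
            for _ in range(walks_per_vertex):
                walk = [start]
                while len(walk) < walk_length:
                    walk.append(rng.choice(graph[walk[-1]]))
                walks.append(walk)
        return walks

    for w in truncated_walks(click_graph):
        print(" -> ".join(w))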
Boddy, Lynne M; Noonan, Robert J; Kim, Youngwon; Rowlands, Alex V; Welk, Greg J; Knowles, Zoe R; Fairclough, Stuart J
2018-03-28
To examine the comparability of children's free-living sedentary time (ST) derived from raw acceleration thresholds for wrist mounted GENEActiv accelerometer data, with ST estimated using the waist mounted ActiGraph 100 count·min⁻¹ threshold. Secondary data analysis. 108 10-11-year-old children (n=43 boys) from Liverpool, UK wore one ActiGraph GT3X+ and one GENEActiv accelerometer on their right hip and left wrist, respectively for seven days. Signal vector magnitude (SVM; mg) was calculated using the ENMO approach for GENEActiv data. ST was estimated from hip-worn ActiGraph data, applying the widely used 100 count·min⁻¹ threshold. ROC analysis using 10-fold hold-out cross-validation was conducted to establish a wrist-worn GENEActiv threshold comparable to the hip ActiGraph 100 count·min⁻¹ threshold. GENEActiv data were also classified using three empirical wrist thresholds and equivalence testing was completed. Analysis indicated that a GENEActiv SVM value of 51 mg demonstrated fair to moderate agreement (Kappa: 0.32-0.41) with the 100 count·min⁻¹ threshold. However, the generated and empirical thresholds for GENEActiv devices were not significantly equivalent to ActiGraph 100 count·min⁻¹. GENEActiv data classified using the 35.6 mg threshold intended for ActiGraph devices generated significantly equivalent ST estimates as the ActiGraph 100 count·min⁻¹. The newly generated and empirical GENEActiv wrist thresholds do not provide equivalent estimates of ST to the ActiGraph 100 count·min⁻¹ approach. More investigation is required to assess the validity of applying ActiGraph cutpoints to GENEActiv data. Future studies are needed to examine the backward compatibility of ST data and to produce a robust method of classifying SVM-derived ST. Copyright © 2018 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.
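A minimal sketch of the ENMO-style signal vector magnitude (in mg) with a threshold applied per epoch; the sampling rate, epoch aggregation, and synthetic signal are assumptions, and the 51 mg value is the wrist threshold generated in the study above.

    import numpy as np

    def enmo_mg(acc_xyz):
        """Euclidean norm of the raw tri-axial signal minus 1 g, floored at zero, in mg."""
        vm = np.sqrt((acc_xyz ** 2).sum(axis=1))      # vector magnitude in g
        return np.maximum(vm - 1.0, 0.0) * 1000.0     # ENMO in milli-g

    def epoch_means(enmo, samples_per_epoch):
        n = len(enmo) // samples_per_epoch * samples_per_epoch
        return enmo[:n].reshape(-1, samples_per_epoch).mean(axis=1)

    rng = np.random.default_rng(1)
    raw = rng.normal([0.0, 0.0, 1.0], 0.02, size=(6000, 3))        # ~still wrist, 100 Hz for 60 s
    per_epoch = epoch_means(enmo_mg(raw), samples_per_epoch=100)   # 1-s epochs (assumed)
    sedentary = per_epoch < 51.0                                   # 51 mg wrist threshold from the study
    print(f"{sedentary.mean():.0%} of epochs classified as sedentary")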
Integrating Semantic Information in Metadata Descriptions for a Geoscience-wide Resource Inventory.
NASA Astrophysics Data System (ADS)
Zaslavsky, I.; Richard, S. M.; Gupta, A.; Valentine, D.; Whitenack, T.; Ozyurt, I. B.; Grethe, J. S.; Schachne, A.
2016-12-01
Integrating semantic information into legacy metadata catalogs is a challenging issue and so far has been mostly done on a limited scale. We present experience of CINERGI (Community Inventory of Earthcube Resources for Geoscience Interoperability), an NSF Earthcube Building Block project, in creating a large cross-disciplinary catalog of geoscience information resources to enable cross-domain discovery. The project developed a pipeline for automatically augmenting resource metadata, in particular generating keywords that describe metadata documents harvested from multiple geoscience information repositories or contributed by geoscientists through various channels including surveys and domain resource inventories. The pipeline examines available metadata descriptions using text parsing, vocabulary management and semantic annotation and graph navigation services of GeoSciGraph. GeoSciGraph, in turn, relies on a large cross-domain ontology of geoscience terms, which bridges several independently developed ontologies or taxonomies including SWEET, ENVO, YAGO, GeoSciML, GCMD, SWO, and CHEBI. The ontology content enables automatic extraction of keywords reflecting science domains, equipment used, geospatial features, measured properties, methods, processes, etc. We specifically focus on issues of cross-domain geoscience ontology creation, resolving several types of semantic conflicts among component ontologies or vocabularies, and constructing and managing facets for improved data discovery and navigation. The ontology and keyword generation rules are iteratively improved as pipeline results are presented to data managers for selective manual curation via a CINERGI Annotator user interface. We present lessons learned from applying CINERGI metadata augmentation pipeline to a number of federal agency and academic data registries, in the context of several use cases that require data discovery and integration across multiple earth science data catalogs of varying quality and completeness. The inventory is accessible at http://cinergi.sdsc.edu, and the CINERGI project web page is http://earthcube.org/group/cinergi
BootGraph: probabilistic fiber tractography using bootstrap algorithms and graph theory.
Vorburger, Robert S; Reischauer, Carolin; Boesiger, Peter
2013-02-01
Bootstrap methods have recently been introduced to diffusion-weighted magnetic resonance imaging to estimate the measurement uncertainty of ensuing diffusion parameters directly from the acquired data without the necessity to assume a noise model. These methods have been previously combined with deterministic streamline tractography algorithms to allow for the assessment of connection probabilities in the human brain. Thereby, the local noise induced disturbance in the diffusion data is accumulated additively due to the incremental progression of streamline tractography algorithms. Graph based approaches have been proposed to overcome this drawback of streamline techniques. For this reason, the bootstrap method is in the present work incorporated into a graph setup to derive a new probabilistic fiber tractography method, called BootGraph. The acquired data set is thereby converted into a weighted, undirected graph by defining a vertex in each voxel and edges between adjacent vertices. By means of the cone of uncertainty, which is derived using the wild bootstrap, a weight is thereafter assigned to each edge. Two path finding algorithms are subsequently applied to derive connection probabilities. While the first algorithm is based on the shortest path approach, the second algorithm takes all existing paths between two vertices into consideration. Tracking results are compared to an established algorithm based on the bootstrap method in combination with streamline fiber tractography and to another graph based algorithm. The BootGraph shows a very good performance in crossing situations with respect to false negatives and permits incorporating additional constraints, such as a curvature threshold. By inheriting the advantages of the bootstrap method and graph theory, the BootGraph method provides a computationally efficient and flexible probabilistic tractography setup to compute connection probability maps and virtual fiber pathways without the drawbacks of streamline tractography algorithms or the assumption of a noise distribution. Moreover, the BootGraph can be applied to common DTI data sets without further modifications and shows a high repeatability. Thus, it is very well suited for longitudinal studies and meta-studies based on DTI. Copyright © 2012 Elsevier Inc. All rights reserved.
An Ellipse Morphs to a Cosine Graph!
ERIC Educational Resources Information Center
King, L .R.
2013-01-01
We produce a continuum of curves all of the same length, beginning with an ellipse and ending with a cosine graph. The curves in the continuum are made by cutting and unrolling circular cones whose section is the ellipse; the initial cone is degenerate (it is the plane of the ellipse); the final cone is a circular cylinder. The curves of the…
ERIC Educational Resources Information Center
Thomas, Ryan Vail
2016-01-01
The goal of this study is to explore and characterize the effects of using a dynamic graphing utility (DGU) on conceptual understanding and attitudes toward mathematics, measured by the responses of college algebra students to an attitude survey and concepts assessment. Two sections of college algebra taught by the primary researcher are included…
Moving beyond the Bar Plot and the Line Graph to Create Informative and Attractive Graphics
ERIC Educational Resources Information Center
Larson-Hall, Jenifer
2017-01-01
Graphics are often mistaken for a mere frill in the methodological arsenal of data analysis when in fact they can be one of the simplest and at the same time most powerful methods of communicating statistical information (Tufte, 2001). The first section of the article argues for the statistical necessity of graphs, echoing and amplifying similar…
World Eagle, The Monthly Social Studies Resource: Data, Maps, Graphs. 1990-1991.
ERIC Educational Resources Information Center
World Eagle, 1991
1991-01-01
This document consists of the 10 issues of "World Eagle" issued during the 1990-1991 school year. World Eagle is a monthly social studies resource in which demographic and geographic information is presented in the forms of maps, graphs, charts, and text. Each issue of World Eagle has a section that focuses on a particular topic, along with other…
Kim, Miji
2015-02-01
The purpose of this study was to examine the association between objective measures of sleep quality and obesity in older community-dwelling people. This cross-sectional study included 189 community-dwelling adults aged ≥ 80 yr (83.4 ± 2.5 yr [age range, 80-95 yr]). Participants wore an accelerometer (ActiGraph GT3X+) on their non-dominant wrist 24 hr per day for 7 consecutive nights. Sleep parameters measured included total sleep time, sleep efficiency, and wake after sleep onset (WASO) during the night. Associations between sleep parameters and obesity were investigated by using multivariate logistic regression analysis. In multivariate models, those with sleep efficiency lower than 85% had a 2.85-fold increased odds of obesity, compared with those with sleep efficiency of 85% or higher. Similarly, those with WASO of ≥ 60 min (compared with < 60 min) had a 3.13-fold increased odds of obesity. However, there were no significant associations between total sleep time or self-reported napping duration and obesity. We found that poor sleep quality was an independent risk factor for obesity in community-dwelling Japanese adults aged ≥ 80 yr, even after controlling for potential confounding factors, including daily physical activity.
Loprinzi, Paul D; Edwards, Meghan
2015-09-01
Emerging work suggests an inverse association between physical activity and erectile dysfunction (ED). The majority of this cross-sectional research comes from convenience samples and all studies on this topic have employed self-report physical activity methodology. Therefore, the purpose of this brief-report, confirmatory research study was to examine the association between objectively measured physical activity and ED in a national sample of Americans. Data from the 2003-2004 National Health and Nutrition Examination Survey were used. Six hundred ninety-two adults between the ages of 50 and 85 years (representing 33.2 million adults) constituted the analytic sample. Participants wore an ActiGraph 7164 accelerometer (ActiGraph, Pensacola, FL, USA) for up to 7 days with ED assessed via self-report. The main outcome measure used was ED assessed via self-report. After adjustments, for every 30 min/day increase in moderate-to-vigorous physical activity, participants had a 43% reduced odds of having ED (adjusted odds ratio = 0.57; 95% confidence interval: 0.40-0.81; P = 0.004). This confirmatory study employing an objective measure of physical activity in a national sample suggests an inverse association between physical activity and ED. © 2015 International Society for Sexual Medicine.
NASA Astrophysics Data System (ADS)
Ayu Cyntya Dewi, Dyah; Shaufiah; Asror, Ibnu
2018-03-01
SMS (Short Message Service) remains one of the main communication services, even as phones now support a wide range of applications. Along with the development of other communication media, some countries have lowered SMS rates to keep the interest of mobile users. This has resulted in an increase in spam SMS, which is used by several parties, including for advertisement. Given the multilingual nature of documents found in SMS messages, on the Web, and elsewhere, effective multilingual or cross-lingual processing techniques are becoming increasingly important. The steps performed in this research are as follows: the messages are first preprocessed and then represented as a graph model, which is then classified using the GKNN method. The maximum accuracy obtained is 98.86, with training data and testing data both in Indonesian, K = 10, and a threshold of 0.001.
Cross-layer shared protection strategy towards data plane in software defined optical networks
NASA Astrophysics Data System (ADS)
Xiong, Yu; Li, Zhiqiang; Zhou, Bin; Dong, Xiancun
2018-04-01
In order to ensure reliable data transmission on the data plane and minimize resource consumption, a novel protection strategy towards the data plane is proposed for software defined optical networks (SDON). Firstly, we establish an SDON architecture with a hierarchical data plane, which divides the data plane into four layers to obtain fine-grained bandwidth resources. Then, we design cross-layer routing and resource allocation based on this network architecture. By jointly considering the bandwidth resources on all layers, the SDN controller can allocate bandwidth resources to working and backup paths in an economical manner. Next, we construct auxiliary graphs and transform the shared protection problem into a graph vertex coloring problem, so that the resource consumption on backup paths can be reduced further. The simulation results demonstrate that the proposed protection strategy achieves lower protection overhead and a higher resource utilization ratio.
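A small sketch of the vertex-coloring view of shared protection on a hypothetical conflict graph: backup paths that cannot share resources are joined by an edge, and a greedy coloring groups compatible backups onto shared resources. This illustrates the general idea, not the paper's algorithm.

    import networkx as nx

    # Auxiliary conflict graph: vertices are backup paths; an edge means the two
    # backups cannot share spectrum (e.g., their working paths share a risk).
    conflicts = nx.Graph()
    conflicts.add_edges_from([
        ("bkp1", "bkp2"),
        ("bkp2", "bkp3"),
        ("bkp1", "bkp4"),
    ])
    conflicts.add_node("bkp5")   # conflicts with nobody; can join any color class

    coloring = nx.coloring.greedy_color(conflicts, strategy="largest_first")
    print(coloring)              # maps each backup path to a color (shared resource index)
    print("resources needed:", len(set(coloring.values())))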
BFL: a node and edge betweenness based fast layout algorithm for large scale networks
Hashimoto, Tatsunori B; Nagasaki, Masao; Kojima, Kaname; Miyano, Satoru
2009-01-01
Background Network visualization would serve as a useful first step for analysis. However, current graph layout algorithms for biological pathways are insensitive to biologically important information, e.g. subcellular localization, biological node and graph attributes, and/or are not available for large scale networks, e.g. more than 10000 elements. Results To overcome these problems, we propose the use of a biologically important graph metric, betweenness, a measure of network flow. This metric is highly correlated with many biological phenomena such as lethality and clusters. We devise a new fast parallel algorithm calculating betweenness to minimize the preprocessing cost. Using this metric, we also invent a node and edge betweenness based fast layout algorithm (BFL). BFL places the high-betweenness nodes to optimal positions and allows the low-betweenness nodes to reach suboptimal positions. Furthermore, BFL reduces the runtime by combining a sequential insertion algorithm with betweenness. For a graph with n nodes, this approach reduces the expected runtime of the algorithm to O(n²) when considering edge crossings, and to O(n log n) when considering only density and edge lengths. Conclusion Our BFL algorithm is compared against fast graph layout algorithms and approaches requiring intensive optimizations. For gene networks, we show that our algorithm is faster than all layout algorithms tested while providing readability on par with intensive optimization algorithms. We achieve a 1.4 second runtime for a graph with 4000 nodes and 12000 edges on a standard desktop computer. PMID:19146673
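A sketch of the ordering idea behind BFL (not the parallel algorithm itself): compute node betweenness and insert high-betweenness nodes first so they receive the better positions; the example graph is arbitrary.

    import networkx as nx

    g = nx.barbell_graph(5, 2)          # two dense clusters joined by a short path
    bc = nx.betweenness_centrality(g)

    # Insertion order: highest-betweenness nodes are placed first (toward optimal
    # positions); low-betweenness nodes are inserted later into remaining positions.
    insertion_order = sorted(g.nodes, key=bc.get, reverse=True)
    print(insertion_order[:4], "...")   # the bridge/path nodes come first in a barbell graph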
The effect of lab based instruction on ACT science scores
NASA Astrophysics Data System (ADS)
Hamilton, Michelle
Standardized tests, although unpopular, are required for a multitude of reasons. One of these tests is the ACT. The ACT is a college readiness test that many high school juniors take to gain college admittance. Students throughout the United States are unprepared for this assessment. The average high school junior is three points behind twenty-four, the ACT-recommended score, for the science section. The science section focuses on reading text and interpreting graphs, charts, tables and diagrams, with an emphasis on experimental design and relationships among variables. For students to become better at interpreting and understanding scientific graphics, they must have vast experience developing their own graphics. The purpose of this study was to provide students the opportunity to generate their own graphics to master interpretation of them on the ACT. According to a t-test, the results show that students who are continually exposed to creating graphs are able to understand and locate information from graphs at a significantly faster rate.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dr. George L. Mesina; Steven P. Miller
The XMGR5 graphing package [1] for drawing RELAP5 [2] plots is being re-written in Java [3]. Java is a robust programming language that is available at no cost for most computer platforms from Sun Microsystems, Inc. XMGR5 is an extension of an XY plotting tool called ACE/gr extended to plot data from several US Nuclear Regulatory Commission (NRC) applications. It is also the most popular graphing package worldwide for making RELAP5 plots. In Section 1, a short review of XMGR5 is given, followed by a brief overview of Java. In Section 2, shortcomings of both tkXMGR [4] and XMGR5 are discussed and the value of converting to Java is given. Details of the conversion to Java are given in Section 3. The progress to date, some conclusions and future work are given in Section 4. Some screen shots of the Java version are shown.
Directed differential connectivity graph of interictal epileptiform discharges
Amini, Ladan; Jutten, Christian; Achard, Sophie; David, Olivier; Soltanian-Zadeh, Hamid; Hossein-Zadeh, Gh. Ali; Kahane, Philippe; Minotti, Lorella; Vercueil, Laurent
2011-01-01
In this paper, we study temporal couplings between interictal events of spatially remote regions in order to localize the leading epileptic regions from intracerebral electroencephalogram (iEEG). We aim to assess whether quantitative epileptic graph analysis during the interictal period may be helpful to predict the seizure onset zone of ictal iEEG. Using wavelet transform, cross-correlation coefficient, and multiple hypothesis test, we propose a differential connectivity graph (DCG) to represent the connections that change significantly between epileptic and non-epileptic states as defined by the interictal events. Post-processing steps based on mutual information and multi-objective optimization are proposed to localize the leading epileptic regions through the DCG. The suggested approach is applied to iEEG recordings of five patients suffering from focal epilepsy. Quantitative comparisons of the proposed epileptic regions with ictal onset zones detected by visual inspection and using electrically stimulated seizures reveal good performance of the present method. PMID:21156385
Publications - GMC 391 | Alaska Division of Geological & Geophysical
DGGS GMC 391 Publication Details. Title: Core descriptions, photographs and thin section photomicrographs from the Humble Oil DDH, 2010 (DVD). Keywords: Core Drilling; Thin Section.
Seafood Manual for School Food Service Personnel.
ERIC Educational Resources Information Center
Whitaker, Carol S.; Webb, Anita H.
Seafood information pertinent to the needs of school food service personnel is presented. Each of five sections contains information considered important by school food service managers and supervisors as indicated in a national survey (1977). Provided in section one are a narrative section, graph, and chart on seafood nutritive value. The next…
Traffic Safety for Special Children
ERIC Educational Resources Information Center
Wilson, Val; MacKenzie, R. A.
1974-01-01
In a 6 weeks' unit on traffic education using flannel graphs, filmstrips and models, 12 special class students (IQ 55-82), aged 7 to 11 years, learned six basic skills including crossing a road, obeying traffic lights and walking on country roads. (CL)
Classroom Proven Motivational Mathematics Games, Monograph No. 1.
ERIC Educational Resources Information Center
Michigan Council of Teachers of Mathematics.
This collection includes 50 mathematical games and puzzles for classroom use at all grade levels. Also included is a wide variety of activities with cubes, flash cards, graphs, dots, number patterns, geometric shapes, cross-number puzzles, and magic squares. (MM)
ERIC Educational Resources Information Center
Morse, Dana F.
2007-01-01
This study took place at Skaneateles High School in Skaneateles, New York in a grade 10 Integrated Math AB course with 52 students in 3 sections using the TI-84 Plus family graphing calculators and the TI-Navigator classroom learning system with a projector and interactive whiteboard. New York State is phasing in a new curriculum that integrates…
Zang, Pengxiao; Gao, Simon S; Hwang, Thomas S; Flaxel, Christina J; Wilson, David J; Morrison, John C; Huang, David; Li, Dengwang; Jia, Yali
2017-03-01
To improve optic disc boundary detection and peripapillary retinal layer segmentation, we propose an automated approach for structural and angiographic optical coherence tomography. The algorithm was performed on radial cross-sectional B-scans. The disc boundary was detected by searching for the position of Bruch's membrane opening, and retinal layer boundaries were detected using a dynamic programming-based graph search algorithm on each B-scan without the disc region. A comparison of the disc boundary using our method with that determined by manual delineation showed good accuracy, with an average Dice similarity coefficient ≥0.90 in healthy eyes and eyes with diabetic retinopathy and glaucoma. The layer segmentation accuracy in the same cases was on average less than one pixel (3.13 μm).
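A compact sketch of a dynamic-programming graph search for a single boundary in a 2-D cost image; the smoothness constraint, cost definition, and synthetic input are assumptions and not the authors' exact algorithm.

    import numpy as np

    def dp_boundary(cost, max_jump=2):
        """Minimal-cost left-to-right path through a (rows x cols) cost image."""
        rows, cols = cost.shape
        acc = cost.copy()
        back = np.zeros((rows, cols), dtype=int)
        for c in range(1, cols):
            for r in range(rows):
                lo, hi = max(0, r - max_jump), min(rows, r + max_jump + 1)
                prev = acc[lo:hi, c - 1]
                back[r, c] = lo + int(np.argmin(prev))
                acc[r, c] += prev.min()
        # Trace back the optimal row index per column.
        boundary = np.empty(cols, dtype=int)
        boundary[-1] = int(np.argmin(acc[:, -1]))
        for c in range(cols - 1, 0, -1):
            boundary[c - 1] = back[boundary[c], c]
        return boundary

    # Synthetic cost image: a low-cost wavy band on a higher-cost background.
    rows, cols = 60, 80
    r0 = (30 + 5 * np.sin(np.linspace(0, 3, cols))).astype(int)
    cost = np.ones((rows, cols))
    cost[r0, np.arange(cols)] = 0.0
    print(dp_boundary(cost)[:10])   # should follow the synthetic band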
Zang, Pengxiao; Gao, Simon S.; Hwang, Thomas S.; Flaxel, Christina J.; Wilson, David J.; Morrison, John C.; Huang, David; Li, Dengwang; Jia, Yali
2017-01-01
To improve optic disc boundary detection and peripapillary retinal layer segmentation, we propose an automated approach for structural and angiographic optical coherence tomography. The algorithm was performed on radial cross-sectional B-scans. The disc boundary was detected by searching for the position of Bruch’s membrane opening, and retinal layer boundaries were detected using a dynamic programming-based graph search algorithm on each B-scan without the disc region. A comparison of the disc boundary using our method with that determined by manual delineation showed good accuracy, with an average Dice similarity coefficient ≥0.90 in healthy eyes and eyes with diabetic retinopathy and glaucoma. The layer segmentation accuracy in the same cases was on average less than one pixel (3.13 μm). PMID:28663830
NASA Technical Reports Server (NTRS)
Hargraves, W. R.; Delulio, E. B.; Justus, C. G.
1977-01-01
The Global Reference Atmospheric Model is used along with the revised perturbation statistics to evaluate and computer graph various atmospheric statistics along a space shuttle reference mission and abort trajectory. The trajectory plots are height vs. ground range, with height from ground level to 155 km and ground range along the reentry trajectory. Cross sectional plots, height vs. latitude or longitude, are also generated for 80 deg longitude, with heights from 30 km to 90 km and latitude from -90 deg to +90 deg, and for 45 deg latitude, with heights from 30 km to 90 km and longitudes from 180 deg E to 180 deg W. The variables plotted are monthly average pressure, density, temperature, wind components, and wind speed and standard deviations and 99th inter-percentile range for each of these variables.
Detecting and analyzing research communities in longitudinal scientific networks.
Leone Sciabolazza, Valerio; Vacca, Raffaele; Kennelly Okraku, Therese; McCarty, Christopher
2017-01-01
A growing body of evidence shows that collaborative teams and communities tend to produce the highest-impact scientific work. This paper proposes a new method to (1) Identify collaborative communities in longitudinal scientific networks, and (2) Evaluate the impact of specific research institutes, services or policies on the interdisciplinary collaboration between these communities. First, we apply community-detection algorithms to cross-sectional scientific collaboration networks and analyze different types of co-membership in the resulting subgroups over time. This analysis summarizes large amounts of longitudinal network data to extract sets of research communities whose members have consistently collaborated or shared collaborators over time. Second, we construct networks of cross-community interactions and estimate Exponential Random Graph Models to predict the formation of interdisciplinary collaborations between different communities. The method is applied to longitudinal data on publication and grant collaborations at the University of Florida. Results show that similar institutional affiliation, spatial proximity, transitivity effects, and use of the same research services predict higher degree of interdisciplinary collaboration between research communities. Our application also illustrates how the identification of research communities in longitudinal data and the analysis of cross-community network formation can be used to measure the growth of interdisciplinary team science at a research university, and to evaluate its association with research policies, services or institutes.
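A brief sketch of the first step, community detection on cross-sectional collaboration networks followed by co-membership comparison across years; the toy graphs and the choice of greedy modularity communities are illustrative, not the authors' pipeline.

    import networkx as nx
    from networkx.algorithms.community import greedy_modularity_communities

    # Hypothetical yearly co-authorship networks (edges = collaborations in that year).
    year1 = nx.Graph([("a", "b"), ("b", "c"), ("d", "e")])
    year2 = nx.Graph([("a", "c"), ("b", "c"), ("d", "e"), ("e", "f")])

    def membership(g):
        """Map each researcher to the index of their detected community."""
        return {node: i for i, comm in enumerate(greedy_modularity_communities(g))
                for node in comm}

    m1, m2 = membership(year1), membership(year2)

    # Pairs of researchers who share a community in both cross-sections.
    stable_pairs = [
        (u, v) for u in m1 for v in m1
        if u < v and m1[u] == m1[v]
        and u in m2 and v in m2 and m2[u] == m2[v]
    ]
    print(stable_pairs)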
Detecting and analyzing research communities in longitudinal scientific networks
Vacca, Raffaele; Kennelly Okraku, Therese; McCarty, Christopher
2017-01-01
A growing body of evidence shows that collaborative teams and communities tend to produce the highest-impact scientific work. This paper proposes a new method to (1) Identify collaborative communities in longitudinal scientific networks, and (2) Evaluate the impact of specific research institutes, services or policies on the interdisciplinary collaboration between these communities. First, we apply community-detection algorithms to cross-sectional scientific collaboration networks and analyze different types of co-membership in the resulting subgroups over time. This analysis summarizes large amounts of longitudinal network data to extract sets of research communities whose members have consistently collaborated or shared collaborators over time. Second, we construct networks of cross-community interactions and estimate Exponential Random Graph Models to predict the formation of interdisciplinary collaborations between different communities. The method is applied to longitudinal data on publication and grant collaborations at the University of Florida. Results show that similar institutional affiliation, spatial proximity, transitivity effects, and use of the same research services predict higher degree of interdisciplinary collaboration between research communities. Our application also illustrates how the identification of research communities in longitudinal data and the analysis of cross-community network formation can be used to measure the growth of interdisciplinary team science at a research university, and to evaluate its association with research policies, services or institutes. PMID:28797047
NASA Astrophysics Data System (ADS)
Boucharin, Alexis; Oguz, Ipek; Vachet, Clement; Shi, Yundi; Sanchez, Mar; Styner, Martin
2011-03-01
The use of regional connectivity measurements derived from diffusion imaging datasets has become of considerable interest in the neuroimaging community in order to better understand cortical and subcortical white matter connectivity. Current connectivity assessment methods are based on streamline fiber tractography, usually applied in a Monte-Carlo fashion. In this work we present a novel, graph-based method that performs a fully deterministic, efficient and stable connectivity computation. The method handles crossing fibers and deals well with multiple seed regions. The computation is based on a multi-directional graph propagation method applied to sampled orientation distribution functions (ODF), which can be computed directly from the original diffusion imaging data. We show early results of our method on synthetic and real datasets. The results illustrate the potential of our method towards subject-specific connectivity measurements that are performed in an efficient, stable and reproducible manner. Such individual connectivity measurements would be well suited for application in population studies of neuropathology, such as Autism, Huntington's Disease, Multiple Sclerosis or leukodystrophies. The proposed method is generic and could easily be applied to non-diffusion data as long as local directional data can be derived.
The use of atlas registration and graph cuts for prostate segmentation in magnetic resonance images
DOE Office of Scientific and Technical Information (OSTI.GOV)
Korsager, Anne Sofie, E-mail: asko@hst.aau.dk; Østergaard, Lasse Riis; Fortunati, Valerio
2015-04-15
Purpose: An automatic method for 3D prostate segmentation in magnetic resonance (MR) images is presented for planning image-guided radiotherapy treatment of prostate cancer. Methods: A spatial prior based on intersubject atlas registration is combined with organ-specific intensity information in a graph cut segmentation framework. The segmentation is tested on 67 axial T2-weighted MR images in a leave-one-out cross validation experiment and compared with both manual reference segmentations and with multiatlas-based segmentations using majority voting atlas fusion. The impact of atlas selection is investigated in both the traditional atlas-based segmentation and the new graph cut method that combines atlas and intensity information in order to improve the segmentation accuracy. Best results were achieved using the method that combines intensity information, shape information, and atlas selection in the graph cut framework. Results: A mean Dice similarity coefficient (DSC) of 0.88 and a mean surface distance (MSD) of 1.45 mm with respect to the manual delineation were achieved. Conclusions: This approaches the interobserver DSC of 0.90 and interobserver MSD of 1.15 mm and is comparable to other studies performing prostate segmentation in MR.
A Multilevel Gamma-Clustering Layout Algorithm for Visualization of Biological Networks
Hruz, Tomas; Lucas, Christoph; Laule, Oliver; Zimmermann, Philip
2013-01-01
Visualization of large complex networks has become an indispensable part of systems biology, where organisms need to be considered as one complex system. The visualization of the corresponding network is challenging due to the size and density of edges. In many cases, the use of standard visualization algorithms can lead to high running times and poorly readable visualizations due to many edge crossings. We suggest an approach that analyzes the structure of the graph first and then generates a new graph which contains specific semantic symbols for regular substructures like dense clusters. We propose a multilevel gamma-clustering layout visualization algorithm (MLGA) which proceeds in three subsequent steps: (i) a multilevel γ-clustering is used to identify the structure of the underlying network, (ii) the network is transformed to a tree, and (iii) finally, the resulting tree which shows the network structure is drawn using a variation of a force-directed algorithm. The algorithm has a potential to visualize very large networks because it uses modern clustering heuristics which are optimized for large graphs. Moreover, most of the edges are removed from the visual representation which allows keeping the overview over complex graphs with dense subgraphs. PMID:23864855
SVEN: Informative Visual Representation of Complex Dynamic Structure
2014-12-23
nodes in the diagram can be chosen to minimize crossings, but this is the Traveling Salesman Problem, and even if an optimal solution was found, there ... visualization problem inherits the challenges of optimizing the aesthetic properties of the static views of the graphs, it also introduces a new problem of how to ... inevitable problem of having an overwhelming number of edge crossings for larger datasets is addressed by reducing the opacity of the lines drawn
Healthy Kids, Healthy Cuba: findings from a group model building process in the rural Southwest.
Keane, Patricia; Ortega, Alejandro; Linville, Jeanette
2015-01-01
Healthy Kids, Healthy Cuba (HKHCuba) is 1 of 49 community partnerships that participated in the national Healthy Kids, Healthy Communities program of the Robert Wood Johnson Foundation. One method of evaluation was to introduce systems thinking at the community level by identifying the essential parts of the HKHCuba system and how it influences policy and environmental changes to promote healthy eating and active living as well as to prevent childhood obesity in this unique, triethnic, rural community in New Mexico. In this cross-sectional design, 12 HKHCuba partners participated in a group model building (GMB) session to develop behavior over time graphs and a causal loop diagram. Twenty-seven influences identified in the behavior over time graphs emerged as feedback loops and 5 subsystems emerged within the causal loop diagram. In addition to specific strategy-related influences (eg, access to healthy food, participation in community gardens), sense of cultural pride, sense of community, and social engagement, particularly among youth, were highly salient topics. The GMB process provided participants with the opportunity to explore the connections across their specific areas of work and make connections between policy and environmental influences on healthy eating and active living behaviors. The GMB processes and systems thinking approaches were new to the majority of participants, received positively, and perhaps should have been introduced earlier in the project period.
Defining and Measuring Transnational Social Structures
ERIC Educational Resources Information Center
Molina, José Luis; Petermann, Sören; Herz, Andreas
2015-01-01
Transnational social fields and transnational social spaces are often used interchangeably to describe and analyze emergent structures of cross-border formations. In this article, we suggest measuring two key aspects of these social structures: embeddedness and span of migrants' personal networks. While clustered graphs allow assessing…
Automatic segmentation of colon glands using object-graphs.
Gunduz-Demir, Cigdem; Kandemir, Melih; Tosun, Akif Burak; Sokmensuer, Cenk
2010-02-01
Gland segmentation is an important step to automate the analysis of biopsies that contain glandular structures. However, this remains a challenging problem as the variation in staining, fixation, and sectioning procedures lead to a considerable amount of artifacts and variances in tissue sections, which may result in huge variances in gland appearances. In this work, we report a new approach for gland segmentation. This approach decomposes the tissue image into a set of primitive objects and segments glands making use of the organizational properties of these objects, which are quantified with the definition of object-graphs. As opposed to the previous literature, the proposed approach employs the object-based information for the gland segmentation problem, instead of using the pixel-based information alone. Working with the images of colon tissues, our experiments demonstrate that the proposed object-graph approach yields high segmentation accuracies for the training and test sets and significantly improves the segmentation performance of its pixel-based counterparts. The experiments also show that the object-based structure of the proposed approach provides more tolerance to artifacts and variances in tissues.
2016-01-01
Background Certain hand activities cause deformation and displacement of the median nerve at the carpal tunnel due to the gliding motion of tendons surrounding it. As smartphone usage escalates, this raises public concern about whether hand activities during smartphone use can lead to median nerve problems. Objective The aims of this study were to 1) develop kinematic graphs and 2) investigate the associated deformation and rotational information of the median nerve in the carpal tunnel during hand activities. Methods Dominant wrists of 30 young adults were examined with ultrasonography by placing a transducer transversely on their wrist crease. Ultrasound video clips were recorded while the subject performed 1) thumb opposition with the wrist in neutral position, 2) thumb opposition with the wrist in ulnar deviation and 3) pinch grip with the wrist in neutral position. Six still images separated by 0.2-second intervals were then captured from the ultrasound video for the determination of 1) cross-sectional area (CSA), 2) flattening ratio (FR), 3) rotational displacement (RD) and 4) translational displacement (TD) of the median nerve in the carpal tunnel, and this collected information on deformation, rotation and displacement of the median nerve was compared between 1) two successive time points during a single hand activity and 2) different hand motions at the same time point. Finally, kinematic graphs were constructed to demonstrate the mobility of the median nerve during different hand activities. Results Performing different hand activities during this study led to a gradual reduction in the CSA of the median nerve, with thumb opposition together with the wrist in ulnar deviation causing the greatest extent of deformation of the median nerve. Thumb opposition with the wrist in ulnar deviation also led to the largest extent of TD when compared to the other two hand activities of this study. Kinematic graphs showed that the motion pathways of the median nerve during different hand activities were complex. Conclusion We observed that the median nerve in the carpal tunnel was rotated, deformed and displaced during hand activities that people may perform when using a smartphone, suggesting an increased risk of carpal tunnel syndrome (CTS). In addition, the kinematic graphs of the median nerve developed in the present study provide new clues for further studies on the pathophysiology of CTS, and alert smartphone users to establish proper postural habits when using handheld electronic devices. PMID:27367447
a Super Voxel-Based Riemannian Graph for Multi Scale Segmentation of LIDAR Point Clouds
NASA Astrophysics Data System (ADS)
Li, Minglei
2018-04-01
Automatically segmenting LiDAR points into respective independent partitions has become a topic of great importance in photogrammetry, remote sensing and computer vision. In this paper, we cast the problem of point cloud segmentation as a graph optimization problem by constructing a Riemannian graph. The scale space of the observed scene is explored by an octree-based over-segmentation with different depths. The over-segmentation produces many super voxels which restrict the structure of the scene and will be used as nodes of the graph. The Kruskal coordinates are used to compute edge weights that are proportional to the geodesic distance between nodes. Then we compute the edge-weight matrix in which the elements reflect the sectional curvatures associated with the geodesic paths between super voxel nodes on the scene surface. The final segmentation results are generated by clustering similar super voxels and cutting off the weak edges in the graph. The performance of this method was evaluated on LiDAR point clouds for both indoor and outdoor scenes. Additionally, extensive comparisons to state of the art techniques show that our algorithm outperforms on many metrics.
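A compact sketch of the final step described, cutting weak edges and clustering what remains, using a hypothetical super-voxel graph with precomputed affinities; the weighting by geodesic distance and sectional curvature is abstracted into the given numbers.

    import networkx as nx

    # Hypothetical Riemannian graph over super voxels: higher weight = stronger affinity.
    g = nx.Graph()
    g.add_weighted_edges_from([
        (0, 1, 0.9), (1, 2, 0.8), (2, 0, 0.7),     # one surface patch
        (3, 4, 0.85), (4, 5, 0.9),                 # another patch
        (2, 3, 0.1),                               # weak link between patches
    ])

    THRESHOLD = 0.3
    strong = nx.Graph([(u, v) for u, v, w in g.edges(data="weight") if w >= THRESHOLD])
    strong.add_nodes_from(g)        # keep isolated super voxels as their own segments

    segments = list(nx.connected_components(strong))
    print(segments)                 # e.g. [{0, 1, 2}, {3, 4, 5}]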
Gratton, Caterina; Sun, Haoxin; Petersen, Steven E
2018-03-01
Executive control functions are associated with frontal, parietal, cingulate, and insular brain regions that interact through distributed large-scale networks. Here, we discuss how fMRI functional connectivity can shed light on the organization of control networks and how they interact with other parts of the brain. In the first section of our review, we present convergent evidence from fMRI functional connectivity, activation, and lesion studies that there are multiple dissociable control networks in the brain with distinct functional properties. In the second section, we discuss how graph theoretical concepts can help illuminate the mechanisms by which control networks interact with other brain regions to carry out goal-directed functions, focusing on the role of specialized hub regions for mediating cross-network interactions. Again, we use a combination of functional connectivity, lesion, and task activation studies to bolster this claim. We conclude that a large-scale network perspective provides important neurobiological constraints on the neural underpinnings of executive control, which will guide future basic and translational research into executive function and its disruption in disease. © 2017 Society for Psychophysiological Research.
Cesarean Section Rate Analysis in University Hospital Tuzla - According to Robson's Classification.
Fatusic, Jasenko; Hudic, Igor; Fatusic, Zlatan; Zildzic-Moralic, Aida; Zivkovic, Milorad
2016-06-01
For the last decades, there has been public concern about increasing Cesarean Section (CS) rates, and it is an issue of international public health concern. According to the World Health Organisation (WHO), there is no justification for more than 10-15% of births to be by CS. The WHO proposes the Robson ten-group classification as a global standard for assessing, monitoring and comparing cesarean section rates. The aim of this study was to investigate the Cesarean section rate at University Hospital Tuzla, Bosnia and Herzegovina. A cross sectional study was conducted for a one-year period, 2015. Statistical analysis and graph-table presentation were performed using Excel 2010 and Microsoft Office programs. Out of 3,672 births, a total of 936 births were performed by CS. The percentage of CS relative to the total number of births was 25.47%. According to the Robson classification, the largest was group 5, with a relative contribution of 29.80%. In second and third place were groups 1 and 2, with relative contributions of 26.06% and 15.78%, respectively. Groups 1, 2 and 5 together accounted for a relative contribution of 71.65%. All other groups together had a relative contribution of 28.35%. The Robson 10-group classification provides an easy way of collecting information about the CS rate. It is important that efforts to reduce the overall CS rate should focus on reducing the primary CS.
Orbital evolution studies of planet-crossing asteroids
NASA Astrophysics Data System (ADS)
Hahn, Gerhard; Lagerkvist, Claes-Ingvar
The orbits of 26 planet-crossing Aten-Apollo-Amor asteroids are predicted on the basis of numerical integrations covering 33,000 or 100,000 yrs; the values reported supplement the preliminary findings of Hahn and Lagerkvist (1987). A solar-system dynamics model accounting for the effects of all planets from Venus to Neptune is employed, along with the 15th-order integration algorithm RADAU (Everhart, 1985). The results are presented in extensive tables and graphs and discussed in detail.
The Application of a Statistical Analysis Software Package to Explosive Testing
1993-12-01
deviation not corrected for test interval; M refers to equation 2, s to equation 3, and G to section 2.1. Appendix I: Program Structured Diagrams; Appendix II: Bruceton Reference Graphs; Appendix III: Input and Output Data File Format; Appendix IV: ... taken directly from Graph II, which has been digitised and incorporated into the program. If M falls below 0.3, the curve that is closest to diff (eq. 3a) is…
Giving USA 1997: The Annual Report on Philanthropy for the Year 1996.
ERIC Educational Resources Information Center
Kaplan, Ann E., Ed.
This report presents a comprehensive review of private philanthropy in the United States during 1996. After a preliminary section, the first section presents data on giving, using text, graphs, and charts. Sections cover: overall 1996 contributions; changes in giving by source and use; total giving (1966-1996); inflation-adjusted giving in 5-year…
From brain topography to brain topology: relevance of graph theory to functional neuroscience.
Minati, Ludovico; Varotto, Giulia; D'Incerti, Ludovico; Panzica, Ferruccio; Chan, Dennis
2013-07-10
Although several brain regions show significant specialization, higher functions such as cross-modal information integration, abstract reasoning and conscious awareness are viewed as emerging from interactions across distributed functional networks. Analytical approaches capable of capturing the properties of such networks can therefore enhance our ability to make inferences from functional MRI, electroencephalography and magnetoencephalography data. Graph theory is a branch of mathematics that focuses on the formal modelling of networks and offers a wide range of theoretical tools to quantify specific features of network architecture (topology) that can provide information complementing the anatomical localization of areas responding to given stimuli or tasks (topography). Explicit modelling of the architecture of axonal connections and interactions among areas can furthermore reveal peculiar topological properties that are conserved across diverse biological networks, and highly sensitive to disease states. The field is evolving rapidly, partly fuelled by computational developments that enable the study of connectivity at fine anatomical detail and the simultaneous interactions among multiple regions. Recent publications in this area have shown that graph-based modelling can enhance our ability to draw causal inferences from functional MRI experiments, and support the early detection of disconnection and the modelling of pathology spread in neurodegenerative disease, particularly Alzheimer's disease. Furthermore, neurophysiological studies have shown that network topology has a profound link to epileptogenesis and that connectivity indices derived from graph models aid in modelling the onset and spread of seizures. Graph-based analyses may therefore significantly help understand the bases of a range of neurological conditions. This review is designed to provide an overview of graph-based analyses of brain connectivity and their relevance to disease aimed principally at general neuroscientists and clinicians.
A graph grammar approach to artificial life.
Kniemeyer, Ole; Buck-Sorlin, Gerhard H; Kurth, Winfried
2004-01-01
We present the high-level language of relational growth grammars (RGGs) as a formalism designed for the specification of ALife models. RGGs can be seen as an extension of the well-known parametric Lindenmayer systems and contain rule-based, procedural, and object-oriented features. They are defined as rewriting systems operating on graphs with the edges coming from a set of user-defined relations, whereas the nodes can be associated with objects. We demonstrate their ability to represent genes, regulatory networks of metabolites, and morphologically structured organisms, as well as developmental aspects of these entities, in a common formal framework. Mutation, crossing over, selection, and the dynamics of a network of gene regulation can all be represented with simple graph rewriting rules. This is demonstrated in some detail on the classical example of Dawkins' biomorphs and the ABC model of flower morphogenesis: other applications are briefly sketched. An interactive program was implemented, enabling the execution of the formalism and the visualization of the results.
Unapparent Information Revelation: Text Mining for Counterterrorism
NASA Astrophysics Data System (ADS)
Srihari, Rohini K.
Unapparent information revelation (UIR) is a special case of text mining that focuses on detecting possible links between concepts across multiple text documents by generating an evidence trail explaining the connection. A traditional search involving, for example, two or more person names will attempt to find documents mentioning both these individuals. This research focuses on a different interpretation of such a query: what is the best evidence trail across documents that explains a connection between these individuals? For example, all may be good golfers. A generalization of this task involves query terms representing general concepts (e.g. indictment, foreign policy). Previous approaches to this problem have focused on graph mining involving hyperlinked documents, and link analysis exploiting named entities. A new robust framework is presented, based on (i) generating concept chain graphs, a hybrid content representation, (ii) performing graph matching to select candidate subgraphs, and (iii) subsequently using graphical models to validate hypotheses using ranked evidence trails. We adapt the DUC data set for cross-document summarization to evaluate evidence trails generated by this approach.
Assessment of tautomer distribution using the condensed reaction graph approach
NASA Astrophysics Data System (ADS)
Gimadiev, T. R.; Madzhidov, T. I.; Nugmanov, R. I.; Baskin, I. I.; Antipin, I. S.; Varnek, A.
2018-03-01
We report the first direct QSPR modeling of equilibrium constants of tautomeric transformations (logK_T) in different solvents and at different temperatures, which does not require intermediate assessment of acidity (basicity) constants for all tautomeric forms. The key step of the modeling consists in merging the two tautomers into one single molecular graph (the "condensed reaction graph"), which enables the computation of molecular descriptors characterizing the entire equilibrium. The support vector regression method was used to build the models. The training set consisted of 785 transformations belonging to 11 types of tautomeric reactions with equilibrium constants measured in different solvents and at different temperatures. The models obtained perform well both in cross-validation (Q2 = 0.81, RMSE = 0.7 logK_T units) and on two external test sets. Benchmarking studies demonstrate that our models outperform results obtained with DFT B3LYP/6-311++G(d,p) and the ChemAxon Tautomerizer, applicable only in water at room temperature.
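A minimal sketch of the modeling step described here (graph-derived descriptors fed to support vector regression, evaluated by cross-validated Q2 and RMSE), assuming scikit-learn; the descriptor matrix and logK_T values below are random placeholders rather than the actual 785-transformation data set.

# Illustrative sketch only: descriptors -> SVR -> cross-validated Q2/RMSE.
# The condensed-reaction-graph descriptors are assumed precomputed and are
# replaced here by a synthetic placeholder matrix.
import numpy as np
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)
X = rng.normal(size=(785, 50))                                   # placeholder descriptors
y = X[:, 0] * 1.5 - X[:, 1] + rng.normal(scale=0.3, size=785)    # placeholder logK_T

model = make_pipeline(StandardScaler(), SVR(C=10.0, epsilon=0.1))
y_cv = cross_val_predict(model, X, y, cv=5)

q2 = 1.0 - np.sum((y - y_cv) ** 2) / np.sum((y - y.mean()) ** 2)
rmse = np.sqrt(np.mean((y - y_cv) ** 2))
print(f"Q2 = {q2:.2f}, RMSE = {rmse:.2f} logK_T units")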
Active and passive spatial learning in human navigation: acquisition of graph knowledge.
Chrastil, Elizabeth R; Warren, William H
2015-07-01
It is known that active exploration of a new environment leads to better spatial learning than does passive visual exposure. We ask whether specific components of active learning differentially contribute to particular forms of spatial knowledge-the exploration-specific learning hypothesis. Previously, we found that idiothetic information during walking is the primary active contributor to metric survey knowledge (Chrastil & Warren, 2013). In this study, we test the contributions of 3 components to topological graph and route knowledge: visual information, idiothetic information, and cognitive decision making. Four groups of participants learned the locations of 8 objects in a virtual hedge maze by (a) walking or (b) watching a video, crossed with (1) either making decisions about their path or (2) being guided through the maze. Route and graph knowledge were assessed by walking in the maze corridors from a starting object to the remembered location of a test object, with frequent detours. Decision making during exploration significantly contributed to subsequent route finding in the walking condition, whereas idiothetic information did not. Participants took novel routes and the metrically shortest routes on the majority of both direct and barrier trials, indicating that labeled graph knowledge-not merely route knowledge-was acquired. We conclude that, consistent with the exploration-specific learning hypothesis, decision making is the primary component of active learning for the acquisition of topological graph knowledge, whereas idiothetic information is the primary component for metric survey knowledge. (c) 2015 APA, all rights reserved.
The effect of choir formation on the acoustical attributes of the singing voice
NASA Astrophysics Data System (ADS)
Atkinson, Debra Sue
Research shows that many things can influence choral tone and choral blend. Some of these are vowel uniformity, vibrato, choral formation, strategic placement of singers, and spacing between singers. This study sought to determine the effect that changes in choral formation and spacing between singers would have on four randomly selected voices of an ensemble as revealed through long-term average spectra (LTAS) of the individual singers. All members of the ensemble were given the opportunity to express their preferences for each of the choral formations and the four randomly selected choristers were asked specific questions regarding the differences between choral singing and solo singing. The results indicated that experienced singers preferred singing in a mixed-spread choral formation. However, the graphs of the choral excerpts as compared to the solo recordings revealed that the choral graphs for the soprano and bass were very similar to the graphs of their solos, but the graphs of the tenor and the alto were different from their solo graphs. It is obvious from the results of this study that the four selected singers did sing with slightly different techniques in the choral formations than they did while singing their solos. The members of this ensemble were accustomed to singing in many different formations. Therefore, it was easy for them to consciously think about how they sang in each of the four formations (mixed-close, mixed-spread, sectional-close, and sectional-spread) and answer the questionnaire accordingly. This would not be as easy for a group that never changed choral formations. Therefore, the results of this study cannot be generalized to choirs who only sing in sectional formation. As researchers learn more about choral acoustics and the effects of choral singing on the voice, choral conductors will be able to make better decisions about the methods used to achieve their desired choral blend. It is up to the choral conductors to glean the knowledge from the research that is taking place and use it for the betterment of choral music.
The entropic boundary law in BF theory
NASA Astrophysics Data System (ADS)
Livine, Etera R.; Terno, Daniel R.
2009-01-01
We compute the entropy of a closed bounded region of space for pure 3d Riemannian gravity formulated as a topological BF theory for the gauge group SU(2) and show its holographic behavior. More precisely, we consider a fixed graph embedded in space and study the flat connection spin network state without and with particle-like topological defects. We regularize and compute exactly the entanglement for a bipartite splitting of the graph and show it scales at leading order with the number of vertices on the boundary (or equivalently with the number of loops crossing the boundary). More generally these results apply to BF theory with any compact gauge group in any space-time dimension.
NASA Technical Reports Server (NTRS)
Lance, D. G.; Nettles, A. T.
1991-01-01
Low velocity instrumented impact testing was utilized to examine the effects of an outer lamina of ultra-high molecular weight polyethylene (Spectra) on the damage tolerance of carbon epoxy composites. Four types of 16-ply quasi-isotropic panels (0, +45, 90, -45) were tested. Some panels contained no Spectra, while others had a lamina of Spectra bonded to the top (impacted side), bottom, or both sides of the composite plates. The specimens were impacted with energies up to 8.5 J. Force time plots and maximum force versus impact energy graphs were generated for comparison purposes. Specimens were also subjected to cross-sectional analysis and compression after impact tests. The results show that while the Spectra improved the maximum load that the panels could withstand before fiber breakage, the Spectra seemingly reduced the residual strength of the composites.
NASA Technical Reports Server (NTRS)
Norikane, L.; Freeman, A.; Way, J.; Okonek, S.; Casey, R.
1992-01-01
Recent updates to a geographical information system (GIS) called VICAR (Video Image Communication and Retrieval)/IBIS are described. The system is designed to handle data from many different formats (vector, raster, tabular) and many different sources (models, radar images, ground truth surveys, optical images). All the data are referenced to a single georeference plane, and average or typical values for parameters defined within a polygonal region are stored in a tabular file, called an info file. The info file format allows tracking of data in time, maintenance of links between component data sets and the georeference image, conversion of pixel values to `actual' values (e.g., radar cross-section, luminance, temperature), graph plotting, data manipulation, generation of training vectors for classification algorithms, and comparison between actual measurements and model predictions (with ground truth data as input).
The AME2016 atomic mass evaluation (II). Tables, graphs and references
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Meng; Audi, G.; Kondev, F. G.
This paper is the second part of the new evaluation of atomic masses, Ame2016. Using least-squares adjustments to all evaluated and accepted experimental data, described in Part I, we derive tables with numerical values and graphs to replace those given in Ame2012. The first table lists the recommended atomic mass values and their uncertainties. It is followed by a table of the influences of data on primary nuclides, a table of various reaction and decay energies, and finally, a series of graphs of separation and decay energies. The last section of this paper lists all references of the input data used in the Ame2016 and the Nubase2016 evaluations (first paper in this issue). Amdc: http://amdc.impcas.ac.cn/
JSXGraph--Dynamic Mathematics with JavaScript
ERIC Educational Resources Information Center
Gerhauser, Michael; Valentin, Bianca; Wassermann, Alfred
2010-01-01
Since Java applets seem to be on the retreat in web application, other approaches for displaying interactive mathematics in the web browser are needed. One such alternative could be our open-source project JSXGraph. It is a cross-browser library for displaying interactive geometry, function plotting, graphs, and data visualization in a web…
The Primary Theme Club. Home & Family.
ERIC Educational Resources Information Center
Instructor, 1996
1996-01-01
This cross-curricular primary unit helps teachers and students get to know one another. Students collect information about their families, then create bulletin boards, class albums, graphs, and art projects. One activity is for students to invite their families into the classroom to share their projects and feast on traditional family foods. (SM)
Public Education in New Zealand.
ERIC Educational Resources Information Center
Ministry of Education, Wellington (New Zealand).
Intended to stimulate public discussion on the aims and policies of New Zealand education, this background paper has three major sections. The first section discusses the role of education in relation to equal opportunity, democracy, cultural difference, national development, and personal development. In part two, graphs, tables, and text give a…
Dowd, Kieran P.; Harrington, Deirdre M.; Donnelly, Alan E.
2012-01-01
Background The activPAL has been identified as an accurate and reliable measure of sedentary behaviour. However, only limited information is available on the accuracy of the activPAL activity count function as a measure of physical activity, while no unit calibration of the activPAL has been completed to date. This study aimed to investigate the criterion validity of the activPAL, examine the concurrent validity of the activPAL, and perform and validate a value calibration of the activPAL in an adolescent female population. The performance of the activPAL in estimating posture was also compared with sedentary thresholds used with the ActiGraph accelerometer. Methodologies Thirty adolescent females (15 developmental; 15 cross-validation) aged 15–18 years performed 5 activities while wearing the activPAL, ActiGraph GT3X, and the Cosmed K4B2. A random coefficient statistics model examined the relationship between metabolic equivalent (MET) values and activPAL counts. Receiver operating characteristic analysis was used to determine activity thresholds and for cross-validation. The random coefficient statistics model showed a concordance correlation coefficient of 0.93 (standard error of the estimate = 1.13). An optimal moderate threshold of 2997 was determined using mixed regression, while an optimal vigorous threshold of 8229 was determined using receiver operating statistics. The activPAL count function demonstrated very high concurrent validity (r = 0.96, p<0.01) with the ActiGraph count function. Levels of agreement for sitting, standing, and stepping between direct observation and the activPAL and ActiGraph were 100%, 98.1%, 99.2% and 100%, 0%, 100%, respectively. Conclusions These findings suggest that the activPAL is a valid, objective measurement tool that can be used for both the measurement of physical activity and sedentary behaviours in an adolescent female population. PMID:23094069
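A schematic, assuming scikit-learn, of how a receiver operating characteristic (ROC) threshold such as the vigorous cut-point reported above could be selected via Youden's J statistic; the counts and MET values below are synthetic and the resulting threshold is illustrative only.

# Synthetic example of ROC-based threshold selection for activity counts.
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(1)
counts = rng.uniform(0, 12000, size=500)                        # counts per epoch
mets = 1.5 + counts / 2000 + rng.normal(scale=0.5, size=500)    # fake MET values
is_vigorous = (mets >= 6.0).astype(int)                         # >= 6 METs = vigorous

fpr, tpr, thresholds = roc_curve(is_vigorous, counts)
best = np.argmax(tpr - fpr)                                     # Youden's J statistic
print(f"optimal vigorous threshold ~ {thresholds[best]:.0f} counts")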
Mortelliti, Caroline L; Mortelliti, Anthony J
2016-08-01
To elucidate the relatively large incremental percent change (IPC) in cross sectional area (CSA) in currently available small endotracheal tubes (ETTs), and to make recommendations for a smaller incremental change in CSA in these smaller ETTs, in order to minimize iatrogenic airway injury. The CSAs of a commercially available line of ETTs were calculated, and the IPC of the CSA between consecutive ETT sizes was calculated and graphed. The average IPC in CSA of the large ETTs was applied to calculate an identical IPC in CSA for a theoretical, smaller ETT series, and the dimensions of a new theoretical series of proposed small ETTs were defined. The IPC of CSA in the larger (5.0-8.0 mm inner diameter (ID)) ETTs was 17.07%, and the IPC of CSA in the smaller ETTs (2.0-4.0 mm ID) was remarkably larger (38.08%). Applying the relatively smaller IPC of CSA from the larger ETTs to a theoretical sequence of small ETTs, starting with the 2.5 mm ID ETT, suggests that intermediate sizes of small ETTs (ID 2.745 mm, 3.254 mm, and 3.859 mm) should exist. We recommend that manufacturers produce additional small ETT size options at the intuitive intermediate sizes of 2.75 mm, 3.25 mm, and 3.75 mm ID in order to improve airway management for infants and small children. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
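The arithmetic behind the reported IPC values can be reproduced directly; the short Python sketch below computes CSA from the inner diameter and the percent change between consecutive 0.5 mm ID steps (the exact averaging used to obtain the quoted 17.07% and 38.08% figures is not spelled out in the abstract, so the printed numbers are only indicative).

# CSA and incremental percent change (IPC) between consecutive ETT sizes.
import math

def csa(inner_diameter_mm):
    return math.pi * (inner_diameter_mm / 2.0) ** 2

small_ids = [2.0, 2.5, 3.0, 3.5, 4.0]   # mm ID, 0.5 mm steps
for a, b in zip(small_ids, small_ids[1:]):
    ipc = 100.0 * (csa(b) - csa(a)) / csa(a)
    print(f"{a:.1f} -> {b:.1f} mm ID: IPC = {ipc:.2f}%")
# The per-step IPC for these small tubes is roughly 30-56%, far larger than for
# the 5.0-8.0 mm range (roughly 15-20% per step), which is the disparity
# motivating the proposed intermediate sizes.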
Design of reinforced areas of concrete column using quadratic polynomials
NASA Astrophysics Data System (ADS)
Arif Gunadi, Tjiang; Parung, Herman; Rachman Djamaluddin, Abd; Arwin Amiruddin, A.
2017-11-01
The design of reinforced concrete columns is mostly carried out with a simple planning method that uses a column interaction diagram. However, the application of this method is limited because it is valid only for certain compressive strengths of the concrete and yield strengths of the reinforcement. Thus, a more generally applicable method is still needed. An alternative is to use quadratic polynomials as the basis of an approximate design of reinforced concrete columns, where the ratio of the neutral-axis depth to the effective height of a cross section (ξ), when related across the same cross-section with different reinforcement ratios, is assumed to follow a quadratic polynomial. This is identical to the basic principle used in Simpson's rule for numerical integration with quadratic polynomials and has a sufficient level of accuracy. The approach uses both the normal-force equilibrium and the moment equilibrium. The abscissa of the intersection of the two curves is the ratio mentioned above, since it fulfills both equilibria. The application of this method is relatively more complicated than the existing method, but it is provided with tables and graphs (N vs ξN) and (M vs ξM) so that its use can be simplified. These tables are distinguished only by the compressive strength of the concrete, so in application they can be combined with the various yield strengths of reinforcement available in the market. The method can be implemented in programming languages such as Fortran.
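A schematic of the intersection idea, assuming that for a set of trial neutral-axis ratios one can evaluate the reinforcement ratio required by the normal-force equilibrium and by the moment equilibrium separately; the sample points below are invented, and NumPy's polynomial fitting stands in for the paper's tabulated curves.

# Fit one quadratic through points from the N-equilibrium and one from the
# M-equilibrium, then take the abscissa where the two curves meet.
import numpy as np

xi = np.array([0.2, 0.4, 0.6])                 # trial neutral-axis ratios
rho_from_N = np.array([0.010, 0.016, 0.026])   # reinforcement ratio satisfying N (invented)
rho_from_M = np.array([0.030, 0.020, 0.014])   # reinforcement ratio satisfying M (invented)

pN = np.polyfit(xi, rho_from_N, 2)             # quadratic coefficients
pM = np.polyfit(xi, rho_from_M, 2)

roots = np.roots(pN - pM)                      # where the two quadratics intersect
xi_design = [r.real for r in roots if abs(r.imag) < 1e-9 and 0.0 < r.real < 1.0]
print("design neutral-axis ratio candidates:", xi_design)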
Evaluation of punching shear strength of flat slabs supported on rectangular columns
NASA Astrophysics Data System (ADS)
Filatov, Valery
2018-03-01
The article presents the methodology and results of an analytical study of the influence of structural parameters on the punching force at the joint between columns and a flat reinforced concrete slab. This design solution is typical for monolithic reinforced concrete girderless frames, which are widely used in the construction of high-rise buildings. As the results of earlier studies show, the punching shear strength of slabs at rectangular columns can be lower than at square columns with a similar length of the control perimeter. The influence of two structural parameters on the punching strength of the slab is investigated: the ratio of the column cross-section side to the effective depth of the slab, C/d, and the ratio of the sides of the rectangular column, Cmax/Cmin. Based on the results of the study, graphs of the reduction of the control perimeter as a function of these structural parameters are presented for columns of square and rectangular cross-section. A comparison of the results obtained by the proposed approach with the MC2010 simplified method shows that the proposed approach gives a more conservative estimate of the influence of the structural parameters. A significant influence of the considered structural parameters on the punching shear strength of reinforced concrete slabs is confirmed by the results of experimental studies. The results of the study confirm the need to take the considered structural parameters into account when calculating the punching shear strength of flat reinforced concrete slabs, and to further develop code design methods.
Maternal education level and low birth weight: a meta-analysis.
Silvestrin, Sonia; Silva, Clécio Homrich da; Hirakata, Vânia Naomi; Goldani, André A S; Silveira, Patrícia P; Goldani, Marcelo Z
2013-01-01
To assess the association between maternal education level and birth weight, considering the circumstances in which the excess use of technology in healthcare, as well as the scarcity of these resources, may result in similar outcomes. A meta-analysis of cohort and cross-sectional studies was performed; the studies were selected by systematic review of the MEDLINE database using the following keywords: socioeconomic factors, infant, low birth weight, cohort studies, cross-sectional studies. The summary measures of effect were obtained with a random-effects model, and the results were presented in forest plot graphs. Publication bias was assessed by Egger's test, and the Newcastle-Ottawa scale was used to assess study quality. The initial search found 729 articles. Of these, 594 were excluded after reading the title and abstract; 21, after consensus meetings among the three reviewers; 102, after reading the full text; and three for not having the proper outcome. Of the nine final articles, 88.8% had quality ≥ six stars (Newcastle-Ottawa Scale), indicating good study quality. The heterogeneity of the articles was considered moderate. High maternal education showed a 33% protective effect against low birth weight, whereas a medium level of education showed no significant protection when compared with low maternal education. The hypothesis of similarity between the extreme degrees of social distribution, expressed by maternal education level in relation to the proportion of low birth weight, was not confirmed. Copyright © 2013 Sociedade Brasileira de Pediatria. Published by Elsevier Editora Ltda. All rights reserved.
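A compact sketch of a DerSimonian-Laird random-effects pooling step of the kind summarized in forest plots; the per-study effect sizes and standard errors below are placeholders, not the nine studies analysed in this meta-analysis.

# DerSimonian-Laird random-effects pooling on synthetic log odds ratios.
import numpy as np

log_or = np.array([-0.45, -0.30, -0.55, -0.20, -0.40])   # per-study log odds ratios (invented)
se     = np.array([0.20, 0.15, 0.25, 0.18, 0.22])         # their standard errors (invented)

w_fixed = 1.0 / se**2
mean_fixed = np.sum(w_fixed * log_or) / np.sum(w_fixed)
q = np.sum(w_fixed * (log_or - mean_fixed) ** 2)           # Cochran's Q
df = len(log_or) - 1
c = np.sum(w_fixed) - np.sum(w_fixed**2) / np.sum(w_fixed)
tau2 = max(0.0, (q - df) / c)                              # between-study variance

w_rand = 1.0 / (se**2 + tau2)
pooled = np.sum(w_rand * log_or) / np.sum(w_rand)
se_pooled = np.sqrt(1.0 / np.sum(w_rand))
print(f"pooled OR = {np.exp(pooled):.2f} "
      f"(95% CI {np.exp(pooled - 1.96*se_pooled):.2f}-{np.exp(pooled + 1.96*se_pooled):.2f})")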
Digitization of a geologic map for the Quebec-Maine-Gulf of Maine global geoscience transect
Wright, Bruce E.; Stewart, David B.
1990-01-01
The Bedrock Geologic Map of Maine was digitized and combined with digital geologic data for Quebec and the Gulf of Maine for the Quebec-Maine-Gulf of Maine Geologic Transect Project. This map is being combined with digital geophysical data to produce three-dimensional depictions of the subsurface geology and to produce cross sections of the Earth's crust. It is an essential component of a transect that stretches from the craton near Quebec City, Quebec, to the Atlantic Ocean Basin south of Georges Bank. The transect is part of the Global Geosciences Transect Project of the International Lithosphere Program. The Digital Line Graph format is used for storage of the digitized data. A coding scheme similar to that used for base category planimetric data was developed to assign numeric codes to the digitized geologic data. These codes were used to assign attributes to polygon and line features to describe rock type, age, name, tectonic setting of original deposition, mineralogy, and composition of igneous plutonic rocks, as well as faults and other linear features. The digital geologic data can be readily edited, rescaled, and reprojected. The attribute codes allow generalization and selective retrieval of the geologic features. The codes allow assignment of map colors based on age, lithology, or other attribute. The Digital Line Graph format is a general transfer format that is supported by many software vendors and is easily transferred between systems.
NASA Astrophysics Data System (ADS)
Rysavy, Steven; Flores, Arturo; Enciso, Reyes; Okada, Kazunori
2008-03-01
This paper presents an experimental study for assessing the applicability of general-purpose 3D segmentation algorithms for analyzing dental periapical lesions in cone-beam computed tomography (CBCT) scans. In the field of Endodontics, clinical studies have been unable to determine if a periapical granuloma can heal with non-surgical methods. Addressing this issue, Simon et al. recently proposed a diagnostic technique which non-invasively classifies target lesions using CBCT. Manual segmentation exploited in their study, however, is too time consuming and unreliable for real world adoption. On the other hand, many technically advanced algorithms have been proposed to address segmentation problems in various biomedical and non-biomedical contexts, but they have not yet been applied to the field of dentistry. Presented in this paper is a novel application of such segmentation algorithms to the clinically-significant dental problem. This study evaluates three state-of-the-art graph-based algorithms: a normalized cut algorithm based on a generalized eigen-value problem, a graph cut algorithm implementing energy minimization techniques, and a random walks algorithm derived from discrete electrical potential theory. In this paper, we extend the original 2D formulation of the above algorithms to segment 3D images directly and apply the resulting algorithms to the dental CBCT images. We experimentally evaluate quality of the segmentation results for 3D CBCT images, as well as their 2D cross sections. The benefits and pitfalls of each algorithm are highlighted.
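As a concrete illustration of one of the three evaluated algorithm families, the snippet below applies the random-walks segmenter as implemented in scikit-image to a tiny synthetic 3D volume; this is a stand-in for the CBCT data rather than the authors' pipeline, and the seed placement and beta value are arbitrary.

# Random-walks segmentation of a synthetic 3D volume with one bright "lesion".
import numpy as np
from skimage.segmentation import random_walker

rng = np.random.default_rng(0)
volume = rng.normal(0.2, 0.05, size=(40, 40, 40))
volume[15:25, 15:25, 15:25] += 0.6          # bright lesion cube

labels = np.zeros_like(volume, dtype=np.uint8)
labels[20, 20, 20] = 1                      # seed inside the lesion
labels[2, 2, 2] = 2                         # seed in the background

segmentation = random_walker(volume, labels, beta=130, mode="cg")
print("lesion voxels:", int((segmentation == 1).sum()))  # roughly 10*10*10 expected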
Kreitz, Silke; de Celis Alonso, Benito; Uder, Michael; Hess, Andreas
2018-01-01
Resting state (RS) connectivity has been increasingly studied in healthy and diseased brains in humans and animals. This paper presents a new method to analyze RS data from fMRI that combines multiple seed correlation analysis with graph-theory (MSRA). We characterize and evaluate this new method in relation to two other graph-theoretical methods and ICA. The graph-theoretical methods calculate cross-correlations of regional average time-courses, one using seed regions of the same size (SRCC) and the other using whole brain structure regions (RCCA). We evaluated the reproducibility, power, and capacity of these methods to characterize short-term RS modulation to unilateral physiological whisker stimulation in rats. Graph-theoretical networks found with the MSRA approach were highly reproducible, and their communities showed large overlaps with ICA components. Additionally, MSRA was the only one of all tested methods that had the power to detect significant RS modulations induced by whisker stimulation that are controlled by family-wise error rate (FWE). Compared to the reduced resting state network connectivity during task performance, these modulations implied decreased connectivity strength in the bilateral sensorimotor and entorhinal cortex. Additionally, the contralateral ventromedial thalamus (part of the barrel field related lemniscal pathway) and the hypothalamus showed reduced connectivity. Enhanced connectivity was observed in the amygdala, especially the contralateral basolateral amygdala (involved in emotional learning processes). In conclusion, MSRA is a powerful analytical approach that can reliably detect tiny modulations of RS connectivity. It shows a great promise as a method for studying RS dynamics in healthy and pathological conditions. PMID:29875622
Monarch Butterflies: Spirits of Loved Ones
ERIC Educational Resources Information Center
Crumpecker, Cheryl
2011-01-01
The study of the beautiful monarch butterfly lends itself to a vast array of subject matter, and offers the opportunity to meet a large and varied number of standards and objectives for many grade levels. Art projects featuring monarchs may include many cross-curricular units such as math (symmetry and number graphing), science (adaptation and…
ERIC Educational Resources Information Center
Emanouilidis, Emanuel
2008-01-01
Latin squares were first introduced and studied by the famous mathematician Leonhard Euler in the 1700s. Through the years, Latin squares have been used in areas such as statistics, graph theory, coding theory, the generation of random numbers as well as in the design and analysis of experiments. Recently, with the international popularity of…
Facts and Figures: Past, Present, and Future.
ERIC Educational Resources Information Center
Moraine Valley Community College., Palos Hills, IL. Office of Institutional Research.
Historical data is provided on students, programs, and costs at Moraine Valley Community College (MVCC), in Palos Hills, Illinois, in the form of tables and graphs divided into five sections. The first section provides census data and demographic information for the MVCC district, including a map of the district, population trends from 1980 to…
NASA Astrophysics Data System (ADS)
Szabó, György; Fáth, Gábor
2007-07-01
Game theory is one of the key paradigms behind many scientific disciplines from biology to behavioral sciences to economics. In its evolutionary form and especially when the interacting agents are linked in a specific social network the underlying solution concepts and methods are very similar to those applied in non-equilibrium statistical physics. This review gives a tutorial-type overview of the field for physicists. The first four sections introduce the necessary background in classical and evolutionary game theory from the basic definitions to the most important results. The fifth section surveys the topological complications implied by non-mean-field-type social network structures in general. The next three sections discuss in detail the dynamic behavior of three prominent classes of models: the Prisoner's Dilemma, the Rock-Scissors-Paper game, and Competing Associations. The major theme of the review is in what sense and how the graph structure of interactions can modify and enrich the picture of long term behavioral patterns emerging in evolutionary games.
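A toy version of the class of models reviewed here: the Prisoner's Dilemma played on a periodic lattice graph with synchronous imitate-the-best-neighbour updating, in the style of spatial evolutionary games. The payoff values and lattice size are illustrative, and NetworkX is assumed for the graph structure.

# Spatial Prisoner's Dilemma with imitate-the-best-neighbour dynamics.
import random
import networkx as nx

T, R, P, S = 1.4, 1.0, 0.0, 0.0            # temptation, reward, punishment, sucker
payoff = {("C", "C"): R, ("C", "D"): S, ("D", "C"): T, ("D", "D"): P}

g = nx.grid_2d_graph(20, 20, periodic=True)
random.seed(0)
strategy = {n: random.choice("CD") for n in g}

for _ in range(50):
    # Each agent plays the game with all neighbours and sums the payoffs.
    score = {n: sum(payoff[(strategy[n], strategy[m])] for m in g[n]) for n in g}
    # Synchronous update: copy the strategy of the best-scoring neighbour (or keep own).
    new = {}
    for n in g:
        best = max(list(g[n]) + [n], key=lambda m: score[m])
        new[n] = strategy[best]
    strategy = new

print("fraction of cooperators:",
      sum(s == "C" for s in strategy.values()) / len(strategy))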
Automatic Assignment of Methyl-NMR Spectra of Supramolecular Machines Using Graph Theory.
Pritišanac, Iva; Degiacomi, Matteo T; Alderson, T Reid; Carneiro, Marta G; Ab, Eiso; Siegal, Gregg; Baldwin, Andrew J
2017-07-19
Methyl groups are powerful probes for the analysis of structure, dynamics and function of supramolecular assemblies, using both solution- and solid-state NMR. Widespread application of the methodology has been limited due to the challenges associated with assigning spectral resonances to specific locations within a biomolecule. Here, we present Methyl Assignment by Graph Matching (MAGMA), for the automatic assignment of methyl resonances. A graph matching protocol examines all possibilities for each resonance in order to determine an exact assignment that includes a complete description of any ambiguity. MAGMA gives 100% accuracy in confident assignments when tested against both synthetic data, and 9 cross-validated examples using both solution- and solid-state NMR data. We show that this remarkable accuracy enables a user to distinguish between alternative protein structures. In a drug discovery application on HSP90, we show the method can rapidly and efficiently distinguish between possible ligand binding modes. By providing an exact and robust solution to methyl resonance assignment, MAGMA can facilitate significantly accelerated studies of supramolecular machines using methyl-based NMR spectroscopy.
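A schematic of the underlying graph-matching idea (not the MAGMA code itself): the NOE-connectivity graph of unassigned methyl resonances is matched against a methyl-proximity graph derived from the structure, and enumerating all subgraph isomorphisms exposes which assignments are unique and which remain ambiguous. Residue and peak names are invented; NetworkX's VF2 matcher is assumed.

# Enumerate subgraph isomorphisms between a structure-derived proximity graph
# and an NOE cross-peak graph of unassigned resonances.
import networkx as nx
from networkx.algorithms import isomorphism

# Methyl-methyl proximity graph from the structure (residue labels are invented).
structure = nx.Graph([("L10", "V23"), ("V23", "I47"), ("I47", "L88"), ("L10", "I47")])

# NOE cross-peak graph between observed but unassigned resonances.
data = nx.Graph([("peak_a", "peak_b"), ("peak_b", "peak_c")])

gm = isomorphism.GraphMatcher(structure, data)
candidates = {}
for mapping in gm.subgraph_isomorphisms_iter():      # structure node -> data node
    for res, peak in mapping.items():
        candidates.setdefault(peak, set()).add(res)

for peak, residues in sorted(candidates.items()):
    tag = "unique" if len(residues) == 1 else "ambiguous"
    print(peak, "->", sorted(residues), f"({tag})")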
Calculating massive 3-loop graphs for operator matrix elements by the method of hyperlogarithms
NASA Astrophysics Data System (ADS)
Ablinger, Jakob; Blümlein, Johannes; Raab, Clemens; Schneider, Carsten; Wißbrock, Fabian
2014-08-01
We calculate convergent 3-loop Feynman diagrams containing a single massive loop equipped with twist τ=2 local operator insertions corresponding to spin N. They contribute to the massive operator matrix elements in QCD describing the massive Wilson coefficients for deep-inelastic scattering at large virtualities. Diagrams of this kind can be computed using an extended version of the method of hyperlogarithms, originally designed for massless Feynman diagrams without operators. The method is applied to Benz- and V-type graphs, belonging to the genuine 3-loop topologies. In the case of the V-type graphs with five massive propagators, new types of nested sums and iterated integrals emerge. The sums are given in terms of finite binomially and inverse-binomially weighted generalized cyclotomic sums, while the 1-dimensionally iterated integrals are based on a set of ∼30 square-root valued letters. We also derive the asymptotic representations of the nested sums and present the solution for N ∈ ℂ. Integrals with a power-like divergence in N-space, ∝ a^N with a ∈ ℝ, a > 1, for large values of N emerge. They still possess a representation in x-space, which is given in terms of root-valued iterated integrals in the present case. The method of hyperlogarithms is also used to calculate higher moments for crossed box graphs with different operator insertions.
A vision-based approach for tramway rail extraction
NASA Astrophysics Data System (ADS)
Zwemer, Matthijs H.; van de Wouw, Dennis W. J. M.; Jaspers, Egbert; Zinger, Sveta; de With, Peter H. N.
2015-03-01
The growing traffic density in cities fuels the desire for collision assessment systems on public transportation. For this application, video analysis is broadly accepted as a cornerstone. For trams, the localization of tramway tracks is an essential ingredient of such a system, in order to estimate a safety margin for crossing traffic participants. Tramway-track detection is a challenging task due to the urban environment with clutter, sharp curves and occlusions of the track. In this paper, we present a novel and generic system to detect the tramway track in advance of the tram position. The system incorporates an inverse perspective mapping and a-priori geometry knowledge of the rails to find possible track segments. The contribution of this paper involves the creation of a new track reconstruction algorithm which is based on graph theory. To this end, we define track segments as vertices in a graph, in which edges represent feasible connections. This graph is then converted to a max-cost arborescence graph, and the best path is selected according to its location and additional temporal information based on a maximum a-posteriori estimate. The proposed system clearly outperforms a railway-track detector. Furthermore, the system performance is validated on 3,600 manually annotated frames. The obtained results are promising, where straight tracks are found in more than 90% of the images and complete curves are still detected in 35% of the cases.
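A minimal sketch of the arborescence step, assuming NetworkX: candidate track segments become vertices of a directed graph, feasible connections become weighted edges, and the best track hypothesis is read off a maximum spanning arborescence. Segment names and weights are invented, and the temporal and a-posteriori scoring described in the paper is omitted.

# Maximum spanning arborescence over candidate track segments.
import networkx as nx

g = nx.DiGraph()
g.add_edge("root", "seg1", weight=0.9)   # "root" = segment nearest the tram
g.add_edge("root", "seg2", weight=0.4)
g.add_edge("seg1", "seg3", weight=0.8)
g.add_edge("seg2", "seg3", weight=0.3)
g.add_edge("seg1", "seg4", weight=0.7)

arborescence = nx.maximum_spanning_arborescence(g)
print(sorted(arborescence.edges(data="weight")))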
Topology polymorphism graph for lung tumor segmentation in PET-CT images.
Cui, Hui; Wang, Xiuying; Zhou, Jianlong; Eberl, Stefan; Yin, Yong; Feng, Dagan; Fulham, Michael
2015-06-21
Accurate lung tumor segmentation is problematic when the tumor boundary or edge, which reflects the advancing edge of the tumor, is difficult to discern on chest CT or PET. We propose a 'topo-poly' graph model to improve identification of the tumor extent. Our model incorporates an intensity graph and a topology graph. The intensity graph provides the joint PET-CT foreground similarity to differentiate the tumor from surrounding tissues. The topology graph is defined on the basis of contour tree to reflect the inclusion and exclusion relationship of regions. By taking into account different topology relations, the edges in our model exhibit topological polymorphism. These polymorphic edges in turn affect the energy cost when crossing different topology regions under a random walk framework, and hence contribute to appropriate tumor delineation. We validated our method on 40 patients with non-small cell lung cancer where the tumors were manually delineated by a clinical expert. The studies were separated into an 'isolated' group (n = 20) where the lung tumor was located in the lung parenchyma and away from associated structures / tissues in the thorax and a 'complex' group (n = 20) where the tumor abutted / involved a variety of adjacent structures and had heterogeneous FDG uptake. The methods were validated using Dice's similarity coefficient (DSC) to measure the spatial volume overlap and Hausdorff distance (HD) to compare shape similarity calculated as the maximum surface distance between the segmentation results and the manual delineations. Our method achieved an average DSC of 0.881 ± 0.046 and HD of 5.311 ± 3.022 mm for the isolated cases and DSC of 0.870 ± 0.038 and HD of 9.370 ± 3.169 mm for the complex cases. Student's t-test showed that our model outperformed the other methods (p-values <0.05).
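The two validation metrics used above can be computed as follows on synthetic binary masks: Dice's similarity coefficient from the voxel overlap and a symmetric Hausdorff distance from the voxel coordinates (SciPy's directed Hausdorff in both directions). In the paper the HD is taken between surfaces, so this voxel-based version is only an approximation.

# Dice similarity coefficient and symmetric Hausdorff distance on synthetic masks.
import numpy as np
from scipy.spatial.distance import directed_hausdorff

auto = np.zeros((60, 60, 60), dtype=bool)
manual = np.zeros_like(auto)
auto[20:40, 20:40, 20:40] = True
manual[22:42, 20:40, 20:40] = True        # slightly shifted "manual" delineation

dsc = 2.0 * np.logical_and(auto, manual).sum() / (auto.sum() + manual.sum())

pts_a = np.argwhere(auto).astype(float)
pts_m = np.argwhere(manual).astype(float)
hd = max(directed_hausdorff(pts_a, pts_m)[0], directed_hausdorff(pts_m, pts_a)[0])

print(f"DSC = {dsc:.3f}, Hausdorff distance = {hd:.1f} voxels")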
NASA Astrophysics Data System (ADS)
Aleksanyan, Grayr; Shcherbakov, Ivan; Kucher, Artem; Sulyz, Andrew
2018-04-01
Continuous monitoring of the patient's breathing by the method of multi-angle electrical impedance tomography makes it possible to obtain images of the conductivity change in the chest cavity during monitoring. Direct analysis of these images is difficult due to the large amount of information and the low resolution of the images obtained by multi-angle electrical impedance tomography. This work presents a method for obtaining a graph of the respiratory activity of the lungs based on the results of continuous lung monitoring using the multi-angle electrical impedance tomography method. The method makes it possible to obtain a graph of the respiratory activity of the left and right lungs separately, as well as a summary graph, to which it is possible to apply methods used for processing spirography results.
"How Long Is a Piece of String?"
ERIC Educational Resources Information Center
Aitchison, Kate
2001-01-01
Provides a lesson plan designed to form three one-hour sessions with mixed ability groups of 11-12 year olds. The activity involves students estimating road distances by counting the number of map grid lines crossed in going from A to B. The activity is designed to introduce students to graphing calculators. Screenshots from the calculator are…
Intrathoracic airway wall detection using graph search and scanner PSF information
NASA Astrophysics Data System (ADS)
Reinhardt, Joseph M.; Park, Wonkyu; Hoffman, Eric A.; Sonka, Milan
1997-05-01
Measurements of the in vivo bronchial tree can be used to assess regional airway physiology. High-resolution CT (HRCT) provides detailed images of the lungs and has been used to evaluate bronchial airway geometry. Such measurements have been used to assess diseases affecting the airways, such as asthma and cystic fibrosis, to measure airway response to external stimuli, and to evaluate the mechanics of airway collapse in sleep apnea. To routinely use CT imaging in a clinical setting to evaluate the in vivo airway tree, there is a need for an objective, automatic technique for identifying the airway tree in the CT images and measuring airway geometry parameters. Manual or semi-automatic segmentation and measurement of the airway tree from a 3D data set may require several man-hours of work, and the manual approaches suffer from inter-observer and intra-observer variability. This paper describes a method for automatic airway tree analysis that combines accurate airway wall location estimation with a technique for optimal airway border smoothing. A fuzzy logic, rule-based system is used to identify the branches of the 3D airway tree in thin-slice HRCT images. Raycasting is combined with a model-based parameter estimation technique to identify the approximate inner and outer airway wall borders in 2D cross-sections through the image data set. Finally, a 2D graph search is used to optimize the estimated airway wall locations and obtain accurate airway borders. We demonstrate this technique using CT images of a plexiglass tube phantom.
The photon PDF from high-mass Drell–Yan data at the LHC
Giuli, F.
2017-06-15
Achieving the highest precision for theoretical predictions at the LHC requires the calculation of hard-scattering cross sections that include perturbative QCD corrections up to (N)NNLO and electroweak (EW) corrections up to NLO. Parton distribution functions (PDFs) need to be provided with matching accuracy, which in the case of QED effects involves introducing the photon parton distribution of the proton, xγ(x,Q2). In this work a determination of the photon PDF from fits to recent ATLAS measurements of high-mass Drell–Yan dilepton production at √s = 8 TeV is presented. This analysis is based on the xFitter framework, and has required improvements both in the APFEL program, to account for NLO QED effects, and in the aMCfast interface to account for the photon-initiated contributions in the EW calculations within MadGraph5_aMC@NLO. The results are compared with other recent QED fits and determinations of the photon PDF, and consistent results are found.
Khanna, Aditya S; Goodreau, Steven M; Gorbach, Pamina M; Daar, Eric; Little, Susan J
2014-08-01
Our objective here is to demonstrate the population-level effects of individual-level post-diagnosis behavior change (PDBC) in Southern Californian men who have sex with men (MSM), recently diagnosed with HIV. While PDBC has been empirically documented, the population-level effects of such behavior change are largely unknown. To examine these effects, we develop network models derived from the exponential random graph model family. We parameterize our models using behavioral data from the Southern California Acute Infection and Early Disease Research Program, and biological data from a number of published sources. Our models incorporate vital demographic processes, biology, treatment and behavior. We find that without PDBC, HIV prevalence among MSM would be significantly higher at any reasonable frequency of testing. We also demonstrate that higher levels of HIV risk behavior among HIV-positive men relative to HIV-negative men observed in some cross-sectional studies are consistent with individual-level PDBC.
The photon PDF from high-mass Drell-Yan data at the LHC.
Giuli, F
2017-01-01
Achieving the highest precision for theoretical predictions at the LHC requires the calculation of hard-scattering cross sections that include perturbative QCD corrections up to (N)NNLO and electroweak (EW) corrections up to NLO. Parton distribution functions (PDFs) need to be provided with matching accuracy, which in the case of QED effects involves introducing the photon parton distribution of the proton, xγ(x,Q2). In this work a determination of the photon PDF from fits to recent ATLAS measurements of high-mass Drell-Yan dilepton production at √s = 8 TeV is presented. This analysis is based on the xFitter framework, and has required improvements both in the APFEL program, to account for NLO QED effects, and in the aMCfast interface to account for the photon-initiated contributions in the EW calculations within MadGraph5_aMC@NLO. The results are compared with other recent QED fits and determinations of the photon PDF, and consistent results are found.
Loprinzi, Paul D; Fitzgerald, Elizabeth M; Cardinal, Bradley J
2012-03-01
To examine the association between objectively measured physical activity and depression symptoms among a nationally representative sample of pregnant women to provide a more accurate understanding of the relationship between physical activity and depression symptoms. We employed a cross-sectional study design. Data from the National Health and Nutrition Examination Survey 2005-2006 were used for this study. One-hundred and forty-one pregnant women wore an ActiGraph accelerometer for 7 days and completed the Patient Health Questionnaire-9 to assess depression status. More than 19% of the participants experienced some depression symptoms, and compared to their counterparts not having depression symptoms, they were less physically active. An inverse association was found between physical activity and depression symptoms among pregnant women. When feasible, nurses are encouraged to help facilitate physical activity among pregnant women, assuming an uncomplicated pregnancy. © 2012 AWHONN, the Association of Women's Health, Obstetric and Neonatal Nurses.
Earth resources instrumentation for the Space Station Polar Platform
NASA Technical Reports Server (NTRS)
Donohoe, Martin J.; Vane, Deborah
1986-01-01
The spacecraft and payloads of the Space Station Polar Platform program are described in a brief overview. Present plans call for one platform in a descending morning-equator-crossing orbit at 824 km and two or three platforms in ascending afternoon-crossing orbits at 542-824 km. The components of the NASA Earth Observing System (EOS) and NOAA payloads are listed in tables and briefly characterized, and data-distribution requirements and the mission development schedule are discussed. A drawing of the platform, a graph showing the spectral coverage of the EOS instruments, and a glossary of acronyms are provided.
A Flow-Channel Analysis for the Mars Hopper
DOE Office of Scientific and Technical Information (OSTI.GOV)
W. Spencer Cooley
The Mars Hopper is an exploratory vehicle designed to fly on Mars using carbon dioxide from the Martian atmosphere as a rocket propellant. The propellant gases are thermally heated while traversing a radioisotope thermal rocket (RTR) engine's core. This core is comprised of a radioisotope surrounded by a heat-capacitive material interspersed with tubes for the propellant to travel through. These tubes, or flow channels, can be manufactured in various cross-sectional shapes such as a special four-point star or the traditional circle. Analytical heat transfer and computational fluid dynamics (CFD) analyses were performed using flow channels with either a circle or a star cross-sectional shape. The nominal total inlet pressure was specified at 2,805,000 Pa, and the outlet pressure was set to 2,785,000 Pa. The CO2 inlet temperature was 300 K, and the channel wall was 1200 K. The steady-state CFD simulations computed the smooth-walled star shape's outlet temperature to be 959 K on the finest mesh. The smooth-walled circle's outlet temperature was 902 K. A circle with a surface roughness specification of 0.01 mm gave 946 K, and one at 0.1 mm yielded 989 K. The effects of a slightly varied inlet pressure were also examined. The analytical calculations were based on the mass flow rates computed in the CFD simulations and provided significantly higher outlet-temperature results while displaying the same comparison trends. Research relating to the flow-channel heat transfer studies was also done. Mathematical methods to geometrically match the cross-sectional areas of the circle and star, along with a square and an equilateral triangle, were derived. A Wolfram Mathematica 8 module was programmed to analyze CFD results using Richardson extrapolation and calculate the grid convergence index (GCI). A Mathematica notebook, also composed, computes and graphs the bulk mean temperature along a flow channel's length while the user dynamically provides the input variables, allowing their effects on the temperature to be more easily observed.
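A rough analytical sketch of the bulk-mean-temperature calculation mentioned above, using the standard constant-wall-temperature relation T_b(x) = T_w − (T_w − T_in)·exp(−hPx/(ṁ·c_p)); every numerical value below (heat-transfer coefficient, perimeter, mass flow rate, specific heat) is an assumption for illustration, not an input taken from the report.

# Bulk mean temperature along a heated channel with a constant wall temperature.
import numpy as np

T_w, T_in = 1200.0, 300.0      # wall and inlet temperatures, K
h = 900.0                      # assumed convective heat-transfer coefficient, W/(m^2 K)
P = 0.012                      # assumed channel wetted perimeter, m
m_dot = 2.0e-3                 # assumed CO2 mass flow rate in one channel, kg/s
c_p = 1100.0                   # assumed CO2 specific heat, J/(kg K)

x = np.linspace(0.0, 0.5, 6)   # positions along the channel, m
T_b = T_w - (T_w - T_in) * np.exp(-h * P * x / (m_dot * c_p))
for xi, Ti in zip(x, T_b):
    print(f"x = {xi:.2f} m : T_bulk = {Ti:.0f} K")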
NASA Astrophysics Data System (ADS)
Yin, Y.; Sonka, M.
2010-03-01
A novel method is presented for definition of search lines in a variety of surface segmentation approaches. The method is inspired by properties of electric field direction lines and is applicable to general-purpose n-D shape-based image segmentation tasks. Its utility is demonstrated in graph construction and optimal segmentation of multiple mutually interacting objects. The properties of the electric field-based graph construction guarantee that inter-object graph connecting lines are non-intersecting and inherently cover the entire object-interaction space. When applied to inter-object cross-surface mapping, our approach generates one-to-one and all-to-all vertex correspondence pairs between the regions of mutual interaction. We demonstrate the benefits of the electric field approach in several examples ranging from relatively simple single-surface segmentation to complex multiobject multi-surface segmentation of femur-tibia cartilage. The performance of our approach is demonstrated in 60 MR images from the Osteoarthritis Initiative (OAI), in which our approach achieved a very good performance as judged by surface positioning errors (average of 0.29 and 0.59 mm for signed and unsigned cartilage positioning errors, respectively).
NASA Astrophysics Data System (ADS)
Yamaguchi, Atsuko; Ohashi, Takeyoshi; Kawasaki, Takahiro; Inoue, Osamu; Kawada, Hiroki
2013-04-01
A new method for calculating critical dimensions (CDs) at the top and bottom of three-dimensional (3D) pattern profiles from a critical-dimension scanning electron microscope (CD-SEM) image, called the "T-sigma method", is proposed and evaluated. Without preparing a library or database in advance, T-sigma can estimate a feature of a pattern sidewall. Furthermore, it supplies the optimum edge definition (i.e., the threshold level for determining edge position from a CD-SEM signal) for detecting the top and bottom of the pattern. This method consists of three steps. First, two components of line-edge roughness (LER), the noise-induced bias (i.e., LER bias) and the unbiased component (i.e., bias-free LER), are calculated with a set threshold level. Second, these components are calculated with various threshold values, and the threshold dependence of these two components, the "T-sigma graph", is obtained. Finally, the optimum threshold values for top and bottom edge detection are given by analysis of the T-sigma graph. T-sigma was applied to CD-SEM images of three kinds of resist-pattern samples. In addition, reference metrology was performed with an atomic force microscope (AFM) and a scanning transmission electron microscope (STEM). The sensitivity of CD measured by T-sigma to the reference CD was higher than or equal to that measured with the conventional edge definition. Regarding absolute measurement accuracy, T-sigma showed better results than the conventional definition. Furthermore, T-sigma graphs were calculated from CD-SEM images of two kinds of resist samples and compared with the corresponding STEM observation results. Both the bias-free LER and the LER bias increased as the detected edge point moved from the bottom to the top of the pattern in the case where the pattern had a straight sidewall and a round top. On the other hand, they were almost constant in the case where the pattern had a re-entrant profile. T-sigma should therefore be able to reveal a re-entrant feature. From these results, it is found that the T-sigma method can provide rough cross-sectional pattern features and achieve quick, easy and accurate measurements of top and bottom CDs.
NASA Technical Reports Server (NTRS)
Bostian, C. W.; Holt, S. B., Jr.; Kauffman, S. R.; Manus, E. A.; Marshall, R. E.; Stuzman, W. L.; Wiley, P. H.
1977-01-01
The considered investigation made use of the Communications Technology Satellite (CTS) downlink and the beacons carried by the Comstar satellites. The general behavior of rain attenuation and depolarization is illustrated with the aid of data from a storm which took place on July 15, 1976. The effect of the rain on the copolarized signal is indicated in a graph. Another graph shows the behavior of the cross-polarized signal component. Phase effects are also considered together with statistical curves for attenuation. The considered data from CTS indicate that, at least during summer convective storms, attenuation at 11.7 GHz is much more severe than anticipated. Attenuation may be a more serious impediment to dual polarized satellite links at this frequency than is depolarization.
3DScapeCS: application of three dimensional, parallel, dynamic network visualization in Cytoscape
2013-01-01
Background The exponential growth of gigantic biological data from various sources, such as protein-protein interaction (PPI), genome sequence scaffolding, mass spectrometry (MS) molecular networking and metabolic flux, demands an efficient way to visualize and interpret data beyond conventional two-dimensional visualization tools. Results We developed a 3D Cytoscape Client/Server (3DScapeCS) plugin, which adopts Cytoscape for interpreting different types of data and UbiGraph for three-dimensional visualization. The extra dimension is useful for accommodating, visualizing, and distinguishing large-scale networks with multiple crossed connections, as shown in five case studies. Conclusions Evaluation on several experimental datasets using 3DScapeCS and its special features, including multilevel graph layout, time-course data animation, and parallel visualization, has proven its usefulness in visualizing complex data and helping to draw insightful conclusions. PMID:24225050
Recalibration of the GRLWEAP LRFD resistance factor for Oregon DOT.
DOT National Transportation Integrated Search
2011-02-01
The Bridge Section of the Oregon Department of Transportation (ODOT) is responsible for the design of all bridge structures and routinely uses GRLWEAP for controlling pile driving stresses and establishing capacity from the bearing graph. The LRFD re...
A Fuzzy Approach of Study to Improve the Status of Middle Class Family
NASA Astrophysics Data System (ADS)
Ramkumar, C.; Chandrasekaran, A. D.; Siva, E. P.
2018-04-01
In this chapter, we use the notion of fuzzy cognitive maps (FCMs) and their properties, introduced by Bart Kosko in 1986. This method is simple and effective, as it analyzes the data using connection matrices and directed graphs. The paper has three sections: the first introduces Super Fuzzy Cognitive Maps, the second applies super fuzzy cognitive maps to this problem, and the third gives the conclusions based on our study.
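A minimal sketch of the FCM machinery referred to here: a signed connection matrix over hypothetical concepts, iterated through a threshold function until the state repeats. The concepts, weights, and bivalent activation are illustrative assumptions, not the paper's actual model.

```python
import numpy as np

# Hypothetical connection matrix among four concepts C1..C4 (entries in [-1, 1]).
W = np.array([[0.0,  0.6, 0.0, -0.4],
              [0.0,  0.0, 0.7,  0.0],
              [0.5,  0.0, 0.0,  0.3],
              [0.0, -0.2, 0.0,  0.0]])

def activate(v):
    return (v > 0).astype(float)          # simple bivalent thresholding used in basic FCMs

state = np.array([1.0, 0.0, 0.0, 0.0])    # switch on concept C1 and study the effect
history = []
while state.tolist() not in history:      # stop at a fixed point or when a state repeats
    history.append(state.tolist())
    state = activate(state @ W)
    state[0] = 1.0                        # keep the concept under study clamped on
```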
Finding minimum-quotient cuts in planar graphs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Park, J.K.; Phillips, C.A.
Given a graph G = (V, E) where each vertex v ∈ V is assigned a weight w(v) and each edge e ∈ E is assigned a cost c(e), the quotient of a cut partitioning the vertices of V into sets S and S̄ is c(S, S̄)/min{w(S), w(S̄)}, where c(S, S̄) is the sum of the costs of the edges crossing the cut and w(S) and w(S̄) are the sums of the weights of the vertices in S and S̄, respectively. The problem of finding a cut whose quotient is minimum for a graph has in recent years attracted considerable attention, due in large part to the work of Rao and of Leighton and Rao. They have shown that an algorithm (exact or approximation) for the minimum-quotient-cut problem can be used to obtain an approximation algorithm for the more famous minimum b-balanced-cut problem, which requires finding a cut (S, S̄) minimizing c(S, S̄) subject to the constraint bW ≤ w(S) ≤ (1 − b)W, where W is the total vertex weight and b is some fixed balance in the range 0 < b ≤ 1/2. Unfortunately, the minimum-quotient-cut problem is strongly NP-hard for general graphs, and the best polynomial-time approximation algorithm known for the general problem guarantees only a cut whose quotient is at most O(lg n) times optimal, where n is the size of the graph. However, for planar graphs, the minimum-quotient-cut problem appears more tractable, as Rao has developed several efficient approximation algorithms for the planar version of the problem capable of finding a cut whose quotient is at most some constant times optimal. In this paper, we improve Rao's algorithms, both in terms of accuracy and speed. As our first result, we present two pseudopolynomial-time exact algorithms for the planar minimum-quotient-cut problem. As Rao's most accurate approximation algorithm for the problem -- also a pseudopolynomial-time algorithm -- guarantees only a 1.5-times-optimal cut, our algorithms represent a significant advance.
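For readers unfamiliar with the quotient objective, here is a small sketch of evaluating it for a given partition (not an algorithm for finding the minimum-quotient cut). Node weights and edge costs default to 1 when absent; the grid graph is only a stand-in for a planar instance.

```python
import networkx as nx

def cut_quotient(G, S):
    """Quotient c(S, S_bar) / min(w(S), w(S_bar)) for a vertex subset S."""
    S = set(S)
    S_bar = set(G.nodes()) - S
    cut_cost = sum(data.get("cost", 1.0)
                   for u, v, data in G.edges(data=True)
                   if (u in S) != (v in S))
    w_S = sum(G.nodes[v].get("weight", 1.0) for v in S)
    w_S_bar = sum(G.nodes[v].get("weight", 1.0) for v in S_bar)
    return cut_cost / min(w_S, w_S_bar)

G = nx.grid_2d_graph(4, 4)                                   # toy planar graph
print(cut_quotient(G, [n for n in G.nodes() if n[0] < 2]))   # left half vs. right half
```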
An effective field theory for forward scattering and factorization violation
Rothstein, Ira Z.; Stewart, Iain W.
2016-08-03
Starting with QCD, we derive an effective field theory description for forward scattering and factorization violation as part of the soft-collinear effective field theory (SCET) for high energy scattering. These phenomena are mediated by long distance Glauber gluon exchanges, which are static in time, localized in the longitudinal distance, and act as a kernel for forward scattering where |t| << s. In hard scattering, Glauber gluons can induce corrections which invalidate factorization. With SCET, Glauber exchange graphs can be calculated explicitly, and are distinct from graphs involving soft, collinear, or ultrasoft gluons. We derive a complete basis of operators which describe the leading power effects of Glauber exchange. Key ingredients include regulating light-cone rapidity singularities and subtractions which prevent double counting. Our results include a novel all orders gauge invariant pure glue soft operator which appears between two collinear rapidity sectors. The 1-gluon Feynman rule for the soft operator coincides with the Lipatov vertex, but it also contributes to emissions with ≥ 2 soft gluons. Our Glauber operator basis is derived using tree level and one-loop matching calculations from full QCD to both SCET II and SCET I. The one-loop amplitude's rapidity renormalization involves mixing of color octet operators and yields gluon Reggeization at the amplitude level. The rapidity renormalization group equation for the leading soft and collinear functions in the forward scattering cross section are each given by the BFKL equation. Various properties of Glauber gluon exchange in the context of both forward scattering and hard scattering factorization are described. For example, we derive an explicit rule for when eikonalization is valid, and provide a direct connection to the picture of multiple Wilson lines crossing a shockwave. In hard scattering operators Glauber subtractions for soft and collinear loop diagrams ensure that we are not sensitive to the directions for soft and collinear Wilson lines. Conversely, certain Glauber interactions can be absorbed into these soft and collinear Wilson lines by taking them to be in specific directions. Finally, we also discuss criteria for factorization violation.
DIGITAL LINE GRAPHS - USGS 1:24,000
USGS DLGs are digital representations of program-quadrangle format and sectional maps. All DLG data distributed by the United States Geological Survey (USGS) are DLG-Level 3 (DLG-3), which means the data contain a full range of attribute codes, have full topological structuring, ...
DIGITAL LINE GRAPHS - USGS 1:100,000
USGS DLGs are digital representations of program-quadrangle format and sectional maps. All DLG data distributed by the United States Geological Survey (USGS) are DLG-Level 3 (DLG-3), which means the data contain a full range of attribute codes, have full topological structuring, ...
Olayan, Rawan S; Ashoor, Haitham; Bajic, Vladimir B
2018-04-01
Computationally finding drug-target interactions (DTIs) is a convenient strategy to identify new DTIs at low cost with reasonable accuracy. However, current DTI prediction methods suffer from high false-positive prediction rates. We developed DDR, a novel method that improves DTI prediction accuracy. DDR is based on a heterogeneous graph that contains known DTIs together with multiple similarities between drugs and multiple similarities between target proteins. DDR applies a non-linear similarity fusion method to combine the different similarities. Before fusion, DDR performs a pre-processing step in which a subset of similarities is selected in a heuristic process to obtain an optimized combination of similarities. Then, DDR applies a random forest model using different graph-based features extracted from the DTI heterogeneous graph. Using 5 repeats of 10-fold cross-validation, three testing setups, and the weighted average of area under the precision-recall curve (AUPR) scores, we show that DDR significantly reduces the AUPR score error relative to the next best state-of-the-art method for predicting DTIs by 34% when the drugs are new, by 23% when the targets are new, and by 34% when the drugs and the targets are known but not all DTIs between them are known. Using independent sources of evidence, we verify as correct 22 out of the top 25 DDR novel predictions. This suggests that DDR can be used as an efficient method to identify correct DTIs. The data and code are provided at https://bitbucket.org/RSO24/ddr/. vladimir.bajic@kaust.edu.sa. Supplementary data are available at Bioinformatics online.
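A heavily simplified sketch of the general pipeline shape described here (combine drug and target similarities, derive graph-based features for drug-target pairs, train a random forest). The toy matrices, the two path features, and the lack of masking are assumptions for illustration; this does not reproduce DDR's fusion or feature set.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n_drugs, n_targets = 20, 15

# Toy similarity matrices (DDR fuses several sources; here just random symmetric stand-ins).
S_drug = rng.uniform(0, 1, (n_drugs, n_drugs)); S_drug = (S_drug + S_drug.T) / 2
S_target = rng.uniform(0, 1, (n_targets, n_targets)); S_target = (S_target + S_target.T) / 2
A = (rng.uniform(0, 1, (n_drugs, n_targets)) > 0.9).astype(float)   # known DTIs

# Two simple graph-based features per (drug, target) pair:
# drug -> similar drug -> target and drug -> target -> similar target path scores.
path_ddt = S_drug @ A
path_dtt = A @ S_target
X = np.column_stack([path_ddt.ravel(), path_dtt.ravel()])
y = A.ravel()

# A real evaluation would mask test pairs before computing features,
# as in the paper's cross-validation setups; this only shows the pipeline shape.
model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
dti_scores = model.predict_proba(X)[:, 1].reshape(n_drugs, n_targets)
```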
Simultaneous segmentation of the bone and cartilage surfaces of a knee joint in 3D
NASA Astrophysics Data System (ADS)
Yin, Y.; Zhang, X.; Anderson, D. D.; Brown, T. D.; Hofwegen, C. Van; Sonka, M.
2009-02-01
We present a novel framework for the simultaneous segmentation of multiple interacting surfaces belonging to multiple mutually interacting objects. The method is a non-trivial extension of our previously reported optimal multi-surface segmentation. Considering the example application of knee-cartilage segmentation, the framework consists of the following main steps: 1) Shape model construction: building a mean shape for each bone of the joint (femur, tibia, patella) from interactively segmented volumetric datasets, and using the resulting mean-shape model to identify cartilage, non-cartilage, and transition areas on the mean-shape bone model surfaces. 2) Presegmentation: employing an iterative optimal surface detection method to achieve approximate segmentation of individual bone surfaces. 3) Cross-object surface mapping: detecting inter-bone equidistant separating sheets to help identify corresponding vertex pairs for all interacting surfaces. 4) Multi-object, multi-surface graph construction and final segmentation: constructing a single multi-bone, multi-surface graph so that two surfaces (bone and cartilage) with zero and non-zero intervening distances can be detected for each bone of the joint, according to whether cartilage is locally absent or present on the bone. To define inter-object relationships, corresponding vertex pairs identified using the separating sheets were interlinked in the graph. The graph optimization algorithm acted on the entire multi-object, multi-surface graph to yield a globally optimal solution. The segmentation framework was tested on 16 MR-DESS knee-joint datasets from the Osteoarthritis Initiative database. The average signed surface positioning error for the 6 detected surfaces ranged from 0.00 to 0.12 mm. When independently initialized, the signed reproducibility error of bone and cartilage segmentation ranged from 0.00 to 0.26 mm. The results showed that this framework provides robust, accurate, and reproducible segmentation of the knee joint bone and cartilage surfaces of the femur, tibia, and patella. As a general segmentation tool, the developed framework can be applied to a broad range of multi-object segmentation problems.
North America Today: A Reproducible Atlas. 1995 Revised Edition.
ERIC Educational Resources Information Center
1995
This book contains illustrative maps, tables and graphs depicting North America's: size; population; resources; commodities; trade; languages; religions; cities; environment; food and agriculture; schooling; jobs; energy; industry, demographic statistics; women; aspects of government; and territorial disputes. Sections of the book include: (1)…
National Drug Control Strategy, 2002.
ERIC Educational Resources Information Center
Office of National Drug Control Policy, Washington, DC.
This federal document offers a comprehensive approach to reduce demand for illegal drugs and decrease their availability. Supported by statistical tables and graphs, the summary is divided into three sections. "Stopping Use Before It Starts: Education and Community Action" highlights the importance of prevention programs and the…
Analysis of Community Detection Algorithms for Large Scale Cyber Networks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mane, Prachita; Shanbhag, Sunanda; Kamath, Tanmayee
The aim of this project is to use existing community detection algorithms on an IP network dataset to create supernodes within the network. This study compares the performance of different algorithms on the network in terms of running time. The paper begins with an introduction to the concepts of clustering and community detection, followed by the research question that the team aimed to address. The paper then describes the graph metrics that were considered in order to shortlist algorithms, followed by a brief explanation of each algorithm with respect to the graph metric on which it is based. The next section describes the methodology used by the team to run the algorithms and determine which algorithm is most efficient with respect to running time. Finally, the last section includes the results obtained by the team and a conclusion based on those results, as well as future work.
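A minimal sketch of the supernode idea with networkx: detect communities with a standard modularity-based algorithm and collapse each community into a single node. The karate-club graph stands in for the IP network dataset, and the choice of algorithm is an assumption.

```python
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

G = nx.karate_club_graph()                       # stand-in for an IP communication graph
communities = list(greedy_modularity_communities(G))
membership = {v: i for i, c in enumerate(communities) for v in c}

# Collapse each community into a supernode; parallel edges aggregate into a 'count' weight.
super_graph = nx.Graph()
super_graph.add_nodes_from(range(len(communities)))
for u, v in G.edges():
    cu, cv = membership[u], membership[v]
    if cu != cv:
        count = super_graph.get_edge_data(cu, cv, default={}).get("count", 0)
        super_graph.add_edge(cu, cv, count=count + 1)
```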
Prieto, Luis P; Sharma, Kshitij; Kidzinski, Łukasz; Rodríguez-Triana, María Jesús; Dillenbourg, Pierre
2018-04-01
The pedagogical modelling of everyday classroom practice is an interesting kind of evidence, both for educational research and teachers' own professional development. This paper explores the usage of wearable sensors and machine learning techniques to automatically extract orchestration graphs (teaching activities and their social plane over time), on a dataset of 12 classroom sessions enacted by two different teachers in different classroom settings. The dataset included mobile eye-tracking as well as audiovisual and accelerometry data from sensors worn by the teacher. We evaluated both time-independent and time-aware models, achieving median F1 scores of about 0.7-0.8 on leave-one-session-out k-fold cross-validation. Although these results show the feasibility of this approach, they also highlight the need for larger datasets, recorded in a wider variety of classroom settings, to provide automated tagging of classroom practice that can be used in everyday practice across multiple teachers.
Flexibility and rigidity of cross-linked Straight Fibrils under axial motion constraints.
Nagy Kem, Gyula
2016-09-01
Straight fibrils are stiff rod-like filaments that play a significant role in cellular processes such as structural stability and intracellular transport. Introducing a 3D mechanical model for the motion of braced cylindrical fibrils under an axial motion constraint, we provide a mechanism and a graph-theoretical model for fibril structures and characterize the flexibility and rigidity of this bar-and-joint spatial framework. The connectedness and the circuits of the bracing graph characterize the flexibility of these structures. In this paper, we focus on the kinematical properties of hierarchical levels of fibrils and evaluate the number of bracing elements required for rigidity and its computational complexity. The presented model is a good characterization of the frameworks of bio-fibrils such as microtubules and cellulose, which inspired this work. Copyright © 2016 Elsevier Ltd. All rights reserved.
THE BRAIN (The Harvard Experimental Basic Reckoning and Instructional Network)
1968-10-01
shown to be flexible enough to be serviceable, at least initially, for most users. As the user grows in experience and his programs in...and firmly established of the two. The advance of science has been marked by a progressive and rapidly accelerating separation of observable...impossible for them to distinguish incorrect reasoning or calculation from errors in graphing. " The bridge crossed, the instrument grows more
Scalable Adaptive Architectures for Maritime Operations Center Command and Control
2011-05-06
the project to investigate the possibility of using earlier work on the validation and verification of rule bases in addressing the dynamically ...support the organization. To address the dynamically changing rules of engagement of a maritime force as it crosses different geographical areas, GMU... dynamic analysis, makes use of an Occurrence Graph that corresponds to the dynamics (or execution) of the Petri Net, to capture properties
Origin of hyperbolicity in brain-to-brain coordination networks
NASA Astrophysics Data System (ADS)
Tadić, Bosiljka; Andjelković, Miroslav; Šuvakov, Milovan
2018-02-01
Hyperbolicity or negative curvature of complex networks is the intrinsic geometric proximity of nodes in the graph metric space, which implies an improved network function. Here, we investigate hidden combinatorial geometries in brain-to-brain coordination networks arising through social communications. The networks originate from correlations among EEG signals previously recorded during spoken communications comprising 14 individuals and 24 speaker-listener pairs. We find that the corresponding networks are delta-hyperbolic with delta_max = 1 and graph diameter D = 3 in each brain, while the emergent hyperbolicity in the two-brain networks satisfies delta_max/(D/2) < 1 and can be attributed to the topology of the subgraph formed around the cross-brain linking channels. We identify these subgraphs in each studied two-brain network and decompose their structure into simple geometric descriptors (triangles, tetrahedra and cliques of higher orders) that contribute to hyperbolicity. Considering topologies that exceed two separate brain networks as a measure of coordination synergy between the brains, we identify different neuronal correlation patterns ranging from weak coordination to super-brain structure. These topology features are in qualitative agreement with the listeners' self-reported ratings of their own experience and the quality of the speaker, suggesting that studies of the cross-brain connector networks can reveal new insight into the neural mechanisms underlying human social behavior.
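The four-point Gromov delta referred to here can be computed by brute force on small graphs; a sketch follows (the enumeration is combinatorial, so it is only suitable for networks of tens of nodes like those studied here). The Petersen graph is just a placeholder.

```python
import itertools
import networkx as nx

def gromov_delta(G):
    """Worst-case four-point delta over all vertex quadruples (brute force)."""
    dist = dict(nx.all_pairs_shortest_path_length(G))
    delta = 0.0
    for x, y, u, v in itertools.combinations(G.nodes(), 4):
        sums = sorted([dist[x][y] + dist[u][v],
                       dist[x][u] + dist[y][v],
                       dist[x][v] + dist[y][u]])
        delta = max(delta, (sums[2] - sums[1]) / 2.0)   # (largest - second largest) / 2
    return delta

G = nx.petersen_graph()                                 # placeholder graph
print(gromov_delta(G), nx.diameter(G))                  # compare delta_max with D
```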
An efficient algorithm for planar drawing of RNA structures with pseudoknots of any type.
Byun, Yanga; Han, Kyungsook
2016-06-01
An RNA pseudoknot is a tertiary structural element in which bases of a loop pair with complementary bases outside the loop. A drawing of an RNA secondary structure is a tree, but a drawing of an RNA pseudoknot is a graph that has an inner cycle within the pseudoknot and possibly outer cycles formed between the pseudoknot and other structural elements. Visualizing a large-scale RNA structure with pseudoknots as a planar drawing is challenging because a planar drawing of an RNA structure requires both the pseudoknots and the entire structure enclosing the pseudoknots to be embedded into a plane without overlapping or crossing. This paper presents an efficient heuristic algorithm for visualizing a pseudoknotted RNA structure as a planar drawing. The algorithm consists of several parts: finding crossing stems and page-mapping the stems, laying out stem-loops and pseudoknots, and detecting and resolving overlaps between structural elements. Unlike previous algorithms, our algorithm generates a planar drawing for a large RNA structure with pseudoknots of any type and provides a bracket view of the structure. It generates a compact and aesthetic structure graph for a large pseudoknotted RNA structure in O([Formula: see text]) time, where n is the number of stems of the RNA structure.
Bone architecture and strength in the growing skeleton: the role of sedentary time.
Gabel, Leigh; McKay, Heather A; Nettlefold, Lindsay; Race, Douglas; Macdonald, Heather M
2015-02-01
Today's youths spend close to 60% of their waking hours in sedentary activities; however, we know little about the potentially deleterious effects of sedentary time on bone health during this key period of growth and development. Thus, our objective was to determine whether sedentary time is associated with bone architecture, mineral density, and strength in children, adolescents, and young adults. We used high-resolution peripheral quantitative computed tomography (Scanco Medical) to measure bone architecture (trabecular and cortical microstructure and bone macrostructure) and cortical and total bone mineral density (BMD) at the distal tibia (8% site) in 154 males and 174 females (9-20 yr) who were participants in the University of British Columbia Healthy Bones III study. We applied finite element analysis to high-resolution peripheral quantitative computed tomography scans to estimate bone strength. We assessed self-reported screen time in all participants using a questionnaire and sedentary time (volume and patterns) in a subsample of participants with valid accelerometry data (89 males and 117 females; ActiGraph GT1M). We fit sex-specific univariate and multivariable regression models, controlling for muscle cross-sectional area, limb length, maturity, ethnicity, dietary calcium, and physical activity. We did not observe an independent effect of screen time on bone architecture, BMD, or strength in either sex (P > 0.05). Likewise, when adjusted for muscle cross-sectional area, limb length, maturity, ethnicity, dietary calcium, and physical activity, accelerometry-derived volume of sedentary time and breaks in bouts of sedentary time were not determinants of bone architecture, BMD, or strength in either sex (P > 0.05). Further study is warranted to determine whether the lack of association between sedentary time and bone architecture, BMD, and strength at the distal tibia is also present at other skeletal sites.
Henry, Teague; Gesell, Sabina B.; Ip, Edward H.
2016-01-01
Background Social networks influence children's and adolescents' physical activity. The focus of this paper is to examine differences in the effects of physical activity on friendship selection, with an eye to the implications for physical activity interventions for young children. Network interventions to increase physical activity are warranted but have not been conducted. Prior to implementing a network intervention in the field, it is important to understand potential heterogeneities in the effects that activity level has on network structure. In this study, the associations between activity level and cross-sectional network structure, and between activity level and change in network structure, are assessed. Methods We studied a real-world friendship network among 81 children (average age 7.96 years) who lived in low-SES neighborhoods, attended public schools, and attended one of two structured aftercare programs, one of which had existed previously while the other was new. We used the exponential random graph model (ERGM) and its longitudinal extension to evaluate the association between activity level and various demographic factors in having, forming, and dissolving friendships. Due to heterogeneity between the friendship networks within the aftercare programs, separate analyses were conducted for each network. Results There was heterogeneity in the effect of physical activity on both cross-sectional network structure and the formation and dissolution processes, both across time and between networks. Conclusions Network analysis could be used to assess the unique structure and dynamics of a social network before an intervention is implemented, so as to optimize the effects of the network intervention for increasing childhood physical activity. Additionally, if peer selection processes are changing within a network, a static network intervention strategy for childhood physical activity could become inefficient as the network evolves. PMID:27867518
Cohen, Kristen E; Morgan, Philip J; Plotnikoff, Ronald C; Callister, Robin; Lubans, David R
2014-04-08
Although previous studies have demonstrated that children with high levels of fundamental movement skill competency are more active throughout the day, little is known regarding children's fundamental movement skill competency and their physical activity during key time periods of the school day (i.e., lunchtime, recess and after-school). The purpose of this study was to examine the associations between fundamental movement skill competency and objectively measured moderate-to-vigorous physical activity (MVPA) throughout the school day among children attending primary schools in low-income communities. Eight primary schools from low-income communities and 460 children (8.5 ± 0.6 years, 54% girls) were involved in the study. Children's fundamental movement skill competency (TGMD-2; 6 locomotor and 6 object-control skills), objectively measured physical activity (ActiGraph GT3X and GT3X + accelerometers), height, weight and demographics were assessed. Multilevel linear mixed models were used to assess the cross-sectional associations between fundamental movement skills and MVPA. After adjusting for age, sex, BMI and socio-economic status, locomotor skill competency was positively associated with total (P=0.002, r=0.15) and after-school (P=0.014, r=0.13) MVPA. Object-control skill competency was positively associated with total (P<0.001, r=0.20), lunchtime (P=0.03, r=0.10), recess (P=0.006, r=0.11) and after-school (P=0.022, r=0.13) MVPA. Object-control skill competency appears to be a better predictor of children's MVPA during school-based physical activity opportunities than locomotor skill competency. Improving fundamental movement skill competency, particularly object-control skills, may contribute to increased levels of children's MVPA throughout the day. Australian New Zealand Clinical Trials Registry No: ACTRN12611001080910.
Cova, Ilaria; Pomati, Simone; Maggiore, Laura; Forcella, Marica; Cucumo, Valentina; Ghiretti, Roberta; Grande, Giulia; Muzio, Fulvio; Mariani, Claudio
2017-01-01
Analysis of nutritional status and body composition in Alzheimer's disease (AD) and Mild Cognitive Impairment (MCI). A cross-sectional study was performed in a University-Hospital setting, recruiting 59 patients with AD, 34 subjects with MCI and 58 elderly healthy controls (HC). Nutritional status was assessed by anthropometric parameters (body mass index; calf, upper arm and waist circumferences) and the Mini Nutritional Assessment (MNA), and body composition by bioelectrical impedance vector analysis (BIVA). Variables were analyzed by analysis of variance, and subjects were grouped by cognitive status and gender. Sociodemographic variables did not differ among the three groups (AD, MCI and HC), except for females' age, which was therefore used as a covariate in a general linear multivariate model. MNA score was significantly lower in AD patients than in HC; MCI subjects achieved intermediate scores. AD patients (both sexes) had significantly (p < 0.05) higher height-normalized impedance values and lower phase angles (body cell mass) compared with HC; a higher ratio of impedance to height was found in men with MCI with respect to HC. With the BIVA method, MCI subjects showed a significant displacement toward the right side of the RXc graph, indicating lower soft tissues (Hotelling's T2 test: men = 10.6; women = 7.9; p < 0.05), just like AD patients (Hotelling's T2 test: men = 18.2; women = 16.9; p < 0.001). Bioelectrical parameters differed significantly in MCI and AD compared with HC; MCI showed an intermediate pattern between AD and HC. Longitudinal studies are required to investigate whether BIVA could reflect early AD-related changes in body composition in subjects with MCI.
Shoham, David A; Harris, Jenine K; Mundt, Marlon; McGaghie, William
2016-09-01
Healthcare teams consist of individuals communicating with one another during patient care delivery. Coordination of multiple specialties is critical for patients with complex health conditions, and requires interprofessional and intraprofessional communication. We examined a communication network of 71 health professionals in four professional roles: physician, nurse, health management, and support personnel (dietitian, pharmacist, or social worker), or other health professionals (including physical, respiratory, and occupational therapists, and medical students) working in a burn unit. Data for this cross-sectional study were collected by surveying members of a healthcare team. Ties were defined by asking team members whom they discussed patient care matters with on the shift. We built an exponential random graph model to determine: (1) does professional role influence the likelihood of a tie; (2) are ties more likely between team members from different professions compared to between team members from the same profession; and (3) which professions are more likely to form interprofessional ties. Health management and support personnel ties were 94% interprofessional while ties among nurses were 60% interprofessional. Nurses and other health professionals were significantly less likely than physicians to form ties. Nurses were 1.64 times more likely to communicate with nurses than non-nurses (OR = 1.64, 95% CI: 1.01-2.66); there was no significant role homophily for physicians, other health professionals, or health management and support personnel. Understanding communication networks in healthcare teams is an early step in understanding how teams work together to provide care; future work should evaluate the types and quality of interactions between members of interprofessional healthcare teams.
Flocks, James
2006-01-01
Scientific knowledge from the past century is commonly represented by two-dimensional figures and graphs, as presented in manuscripts and maps. Using today's computer technology, this information can be extracted and projected into three- and four-dimensional perspectives. Computer models can be applied to datasets to provide additional insight into complex spatial and temporal systems. This process can be demonstrated by applying digitizing and modeling techniques to valuable information within widely used publications. The seminal paper by D. Frazier, published in 1967, identified 16 separate delta lobes formed by the Mississippi River during the past 6,000 yrs. The paper includes stratigraphic descriptions through geologic cross-sections, and provides distribution and chronologies of the delta lobes. The data from Frazier's publication are extensively referenced in the literature. Additional information can be extracted from the data through computer modeling. Digitizing and geo-rectifying Frazier's geologic cross-sections produce a three-dimensional perspective of the delta lobes. Adding the chronological data included in the report provides the fourth-dimension of the delta cycles, which can be visualized through computer-generated animation. Supplemental information can be added to the model, such as post-abandonment subsidence of the delta-lobe surface. Analyzing the regional, net surface-elevation balance between delta progradations and land subsidence is computationally intensive. By visualizing this process during the past 4,500 yrs through multi-dimensional animation, the importance of sediment compaction in influencing both the shape and direction of subsequent delta progradations becomes apparent. Visualization enhances a classic dataset, and can be further refined using additional data, as well as provide a guide for identifying future areas of study.
Physical activity among children: objective measurements using Fitbit One® and ActiGraph.
Hamari, Lotta; Kullberg, Tiina; Ruohonen, Jukka; Heinonen, Olli J; Díaz-Rodríguez, Natalia; Lilius, Johan; Pakarinen, Anni; Myllymäki, Annukka; Leppänen, Ville; Salanterä, Sanna
2017-04-20
Self-quantification of health parameters is becoming more popular; thus, the validity of the devices requires assessment. The aim of this study was to evaluate the validity of Fitbit One step counts (Fitbit Inc., San Francisco, CA, USA) against ActiGraph wActisleep-BT step counts (ActiGraph, LLC, Pensacola, FL, USA) for measuring habitual physical activity among children. The study used a cross-sectional experimental design in which participants carried two waist-worn activity monitors for five consecutive days. The participants were chosen with purposive sampling from three fourth-grade classes (9-10 year olds) in two comprehensive schools. Altogether, there were 34 participants in the study, of whom eight were excluded from the analysis due to erroneous data. Primary outcome measures for step counts came from the Fitbit One and the ActiGraph wActisleep-BT; supporting outcome measures were based on activity diaries and initial information sheets. Classical Bland-Altman plots were used for reporting the results. The average per-participant daily difference between the step counts from the two devices was 1937 steps, with a range of [116, 5052]. Fitbit One gave higher step counts for all but the least active participant. According to a Bland-Altman plot, the hourly step counts had a relatively large mean bias across participants (161 steps). The differences were partially explained by activity intensity: higher intensity meant higher differences, and light intensity meant lower differences. Fitbit One step counts are comparable to ActiGraph step counts in a sample of 9-10-year-old children engaged in habitual physical activity at sedentary and light physical activity intensities. However, in moderate-to-vigorous physical activity, Fitbit One gives higher step counts compared to ActiGraph.
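For reference, the Bland-Altman quantities used here (mean bias and 95% limits of agreement between two devices) can be computed in a few lines; the step counts below are made up for illustration.

```python
import numpy as np

def bland_altman(a, b):
    """Mean bias and 95% limits of agreement between paired measurements."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    diff = a - b
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical hourly step counts for one child from the two monitors.
fitbit_steps = [820, 400, 1150, 90, 640, 300]
actigraph_steps = [700, 350, 1000, 80, 560, 240]
bias, limits = bland_altman(fitbit_steps, actigraph_steps)
print(f"mean bias {bias:.0f} steps, limits of agreement {limits[0]:.0f} to {limits[1]:.0f}")
```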
Quinn, Ashlinn K; Ae-Ngibise, Kenneth Ayuurebobi; Jack, Darby W; Boamah, Ellen Abrafi; Enuameh, Yeetey; Mujtaba, Mohammed Nuhu; Chillrud, Steven N; Wylie, Blair J; Owusu-Agyei, Seth; Kinney, Patrick L; Asante, Kwaku Poku
2016-03-01
The Ghana Randomized Air Pollution and Health Study (GRAPHS) is a community-level randomized-controlled trial of cookstove interventions for pregnant women and their newborns in rural Ghana. Given that household air pollution from biomass burning may be implicated in adverse cardiovascular outcomes, we sought to determine whether exposure to carbon monoxide (CO) from woodsmoke was associated with blood pressure (BP) among 817 adult women. Multivariate linear regression models were used to evaluate the association between CO exposure, determined with 72 hour personal monitoring at study enrollment, and BP, also measured at study enrollment. At the time of these assessments, women were in the first or second trimester of pregnancy. A significant positive association was found between CO exposure and diastolic blood pressure (DBP): on average, each 1 ppm increase in exposure to CO was associated with 0.43 mmHg higher DBP [0.01, 0.86]. A non-significant positive trend was also observed for systolic blood pressure (SBP). This study is one of very few to have examined the relationship between household air pollution and blood pressure among pregnant women, who are at particular risk for hypertensive complications. The results of this cross-sectional study suggest that household air pollution from wood-burning fires is associated with higher blood pressure, particularly DBP, in pregnant women at early to mid-gestation. The clinical implications of the observed association toward the eventual development of chronic hypertension and/or hypertensive complications of pregnancy remain uncertain, as few of the women were overtly hypertensive at this point in their pregnancies. Copyright © 2015 Elsevier GmbH. All rights reserved.
Cooke, Alexandra B; Daskalopoulou, Stella S; Dasgupta, Kaberi
2018-04-01
Accelerometer placement at the wrist is convenient and increasingly adopted despite less accurate physical activity (PA) measurement than with waist placement. Capitalizing on a study that started with wrist placement and shifted to waist placement, we compared associations between PA measures derived from different accelerometer locations with a responsive arterial health indicator, carotid-femoral pulse wave velocity (cfPWV). Cross-sectional study. We previously demonstrated an inverse association between waist-worn pedometer-assessed step counts (Yamax SW-200, 7 days) and cfPWV (-0.20m/s, 95% CI -0.28, -0.12 per 1000 step/day increment) in 366 adults. Participants concurrently wore accelerometers (ActiGraph GT3X+), most at the waist but the first 46 at the wrist. We matched this subgroup with participants from the 'waist accelerometer' group (sex, age, and pedometer-assessed steps/day) and assessed associations with cfPWV (applanation tonometry, Sphygmocor) separately in each subgroup through linear regression models. Compared to the waist group, wrist group participants had higher step counts (mean difference 3980 steps/day; 95% CI 2517, 5443), energy expenditure (967kcal/day, 95% CI 755, 1179), and moderate-to-vigorous-PA (138min; 95% CI 114, 162). Accelerometer-assessed step counts (waist) suggested an association with cfPWV (-0.28m/s, 95% CI -0.58, 0.01); but no relationship was apparent with wrist-assessed steps (0.02m/s, 95% CI -0.24, 0.27). Waist but not wrist ActiGraph PA measures signal associations between PA and cfPWV. We urge researchers to consider the importance of wear location choice on relationships with health indicators. Copyright © 2017 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.
HGIMDA: Heterogeneous graph inference for miRNA-disease association prediction.
Chen, Xing; Yan, Chenggang Clarence; Zhang, Xu; You, Zhu-Hong; Huang, Yu-An; Yan, Gui-Ying
2016-10-04
Recently, microRNAs (miRNAs) have drawn more and more attention because accumulating experimental studies have indicated that miRNAs could play critical roles in multiple biological processes as well as in the development and progression of human complex diseases. Using the huge number of known heterogeneous biological datasets to predict potential associations between miRNAs and diseases is an important topic in the fields of biology, medicine, and bioinformatics. In this study, considering the limitations of previous computational methods, we developed the computational model of Heterogeneous Graph Inference for MiRNA-Disease Association prediction (HGIMDA) to uncover potential miRNA-disease associations by integrating miRNA functional similarity, disease semantic similarity, Gaussian interaction profile kernel similarity, and experimentally verified miRNA-disease associations into a heterogeneous graph. HGIMDA obtained AUCs of 0.8781 and 0.8077 based on global and local leave-one-out cross validation, respectively. Furthermore, HGIMDA was applied to three important human cancers for performance evaluation. As a result, 90% (Colon Neoplasms), 88% (Esophageal Neoplasms) and 88% (Kidney Neoplasms) of the top 50 predicted miRNAs are confirmed by recent experimental reports. Furthermore, HGIMDA can be effectively applied to new diseases and new miRNAs without any known associations, which overcomes an important limitation of many previous computational models.
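A toy sketch of the general idea of inference on a heterogeneous graph: propagate known associations through miRNA-miRNA and disease-disease similarities until the scores stabilize. The update rule, the normalization, and the matrices below are illustrative assumptions and are not the exact HGIMDA iteration.

```python
import numpy as np

S_mirna = np.array([[1.0, 0.8],
                    [0.8, 1.0]])                       # toy miRNA functional similarity
S_disease = np.array([[1.0, 0.3, 0.1],
                      [0.3, 1.0, 0.6],
                      [0.1, 0.6, 1.0]])                # toy disease semantic similarity
A = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])                        # known miRNA-disease associations

def propagate(A, S_m, S_d, alpha=0.4, iters=100):
    Sm = S_m / S_m.sum(axis=1, keepdims=True)          # row-normalize to keep scores bounded
    Sd = S_d / S_d.sum(axis=1, keepdims=True)
    F = A.copy()
    for _ in range(iters):
        F = alpha * Sm @ F @ Sd.T + (1.0 - alpha) * A  # spread scores, retain known evidence
    return F

scores = propagate(A, S_mirna, S_disease)              # high entries = candidate associations
```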
Immune networks: multitasking capabilities near saturation
NASA Astrophysics Data System (ADS)
Agliari, E.; Annibale, A.; Barra, A.; Coolen, A. C. C.; Tantari, D.
2013-10-01
Pattern-diluted associative networks were recently introduced as models for the immune system, with nodes representing T-lymphocytes and stored patterns representing signalling protocols between T- and B-lymphocytes. It was shown earlier that in the regime of extreme pattern dilution, a system with N_T T-lymphocytes can manage a number N_B = O(N_T^δ) of B-lymphocytes simultaneously, with δ < 1. Here we study this model in the extensive load regime N_B = αN_T, with a high degree of pattern dilution, in agreement with immunological findings. We use graph theory and statistical mechanical analysis based on replica methods to show that in the finite-connectivity regime, where each T-lymphocyte interacts with a finite number of B-lymphocytes as N_T → ∞, the T-lymphocytes can coordinate effective immune responses to an extensive number of distinct antigen invasions in parallel. As α increases, the system eventually undergoes a second order transition to a phase with clonal cross-talk interference, where the system's performance degrades gracefully. Mathematically, the model is equivalent to a spin system on a finitely connected graph with many short loops, so one would expect the available analytical methods, which all assume locally tree-like graphs, to fail. Yet it turns out to be solvable. Our results are supported by numerical simulations.
16 CFR 1025.33 - Production of documents and things.
Code of Federal Regulations, 2010 CFR
2010-01-01
... Section 1025.33 Commercial Practices CONSUMER PRODUCT SAFETY COMMISSION GENERAL RULES OF PRACTICE FOR... (including writings, drawings, graphs, charts, photographs, phono-records, and any other data compilation..., custody, or control of the party upon whom the request is served, or (2) To permit entry upon designated...
Inside Rural Pennsylvania: A Statistical Profile.
ERIC Educational Resources Information Center
Center for Rural Pennsylvania, Harrisburg.
Graphs, data tables, maps, and written descriptions give a statistical overview of rural Pennsylvania. A section on rural demographics covers population changes, racial and ethnic makeup, age cohorts, and families and income. Pennsylvania's rural population, the nation's largest, has increased more than its urban population since 1950, with the…
47 CFR 73.312 - Topographic data.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 47 Telecommunication 4 2012-10-01 2012-10-01 false Topographic data. 73.312 Section 73.312... Broadcast Stations § 73.312 Topographic data. (a) In the preparation of the profile graphs previously... question, the next best topographic information should be used. Topographic data may sometimes be obtained...
47 CFR 73.312 - Topographic data.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 47 Telecommunication 4 2013-10-01 2013-10-01 false Topographic data. 73.312 Section 73.312... Broadcast Stations § 73.312 Topographic data. (a) In the preparation of the profile graphs previously... question, the next best topographic information should be used. Topographic data may sometimes be obtained...
47 CFR 73.312 - Topographic data.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 47 Telecommunication 4 2014-10-01 2014-10-01 false Topographic data. 73.312 Section 73.312... Broadcast Stations § 73.312 Topographic data. (a) In the preparation of the profile graphs previously... question, the next best topographic information should be used. Topographic data may sometimes be obtained...
47 CFR 73.312 - Topographic data.
Code of Federal Regulations, 2011 CFR
2011-10-01
... Broadcast Stations § 73.312 Topographic data. (a) In the preparation of the profile graphs previously... 47 Telecommunication 4 2011-10-01 2011-10-01 false Topographic data. 73.312 Section 73.312... question, the next best topographic information should be used. Topographic data may sometimes be obtained...
47 CFR 73.312 - Topographic data.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 47 Telecommunication 4 2010-10-01 2010-10-01 false Topographic data. 73.312 Section 73.312... Broadcast Stations § 73.312 Topographic data. (a) In the preparation of the profile graphs previously... question, the next best topographic information should be used. Topographic data may sometimes be obtained...
Couple Graph Based Label Propagation Method for Hyperspectral Remote Sensing Data Classification
NASA Astrophysics Data System (ADS)
Wang, X. P.; Hu, Y.; Chen, J.
2018-04-01
Graph-based semi-supervised classification methods are widely used for hyperspectral image classification. We present a couple-graph-based label propagation method, which contains both an adjacency graph and a similar graph. We propose constructing the similar graph using the similar probability, which utilizes the label similarity among examples. The adjacency graph is utilized by a common manifold learning method, which effectively improves the classification accuracy of hyperspectral data. The experiments indicate that the couple graph Laplacian, which unites both the adjacency graph and the similar graph, produces superior classification results compared to other manifold-learning-based graph Laplacians and sparse-representation-based graph Laplacians in the label propagation framework.
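As context for the label propagation framework mentioned here, a standard single-graph propagation sketch (normalized-affinity spreading with clamped labels). It uses only an ordinary affinity graph, not the paper's couple graph, and the small affinity matrix is made up.

```python
import numpy as np

def propagate_labels(W, Y, labeled_mask, alpha=0.9, iters=200):
    """Spread one-hot labels Y over an affinity graph W, clamping labeled examples."""
    d = W.sum(axis=1)
    S = W / (np.sqrt(np.outer(d, d)) + 1e-12)       # symmetrically normalized affinity
    F = Y.astype(float).copy()
    for _ in range(iters):
        F = alpha * S @ F + (1.0 - alpha) * Y
        F[labeled_mask] = Y[labeled_mask]           # keep known labels fixed
    return F.argmax(axis=1)

# Five examples, two classes, only examples 0 and 4 labeled.
W = np.array([[0, 1, 1, 0, 0],
              [1, 0, 1, 0, 0],
              [1, 1, 0, 1, 0],
              [0, 0, 1, 0, 1],
              [0, 0, 0, 1, 0]], dtype=float)
Y = np.zeros((5, 2)); Y[0, 0] = 1.0; Y[4, 1] = 1.0
print(propagate_labels(W, Y, np.array([True, False, False, False, True])))
```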
Multi-Centrality Graph Spectral Decompositions and Their Application to Cyber Intrusion Detection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Pin-Yu; Choudhury, Sutanay; Hero, Alfred
Many modern datasets can be represented as graphs and hence spectral decompositions such as graph principal component analysis (PCA) can be useful. Distinct from previous graph decomposition approaches based on subspace projection of a single topological feature, e.g., the centered graph adjacency matrix (graph Laplacian), we propose spectral decomposition approaches to graph PCA and graph dictionary learning that integrate multiple features, including graph walk statistics, centrality measures and graph distances to reference nodes. In this paper we propose a new PCA method for single graph analysis, called multi-centrality graph PCA (MC-GPCA), and a new dictionary learning method for ensembles of graphs, called multi-centrality graph dictionary learning (MC-GDL), both based on spectral decomposition of multi-centrality matrices. As an application to cyber intrusion detection, MC-GPCA can be an effective indicator of anomalous connectivity patterns and MC-GDL can provide a discriminative basis for attack classification.
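A rough sketch of one way to read "multi-centrality graph PCA": stack several node centrality measures into a feature matrix and take its principal components, flagging nodes with extreme scores. The specific centralities and the anomaly criterion are assumptions and do not reproduce MC-GPCA.

```python
import numpy as np
import networkx as nx

G = nx.erdos_renyi_graph(60, 0.08, seed=1)          # stand-in for a connectivity graph

nodes = list(G.nodes())
centralities = [nx.degree_centrality(G), nx.betweenness_centrality(G),
                nx.closeness_centrality(G), nx.pagerank(G)]
X = np.column_stack([[c[v] for v in nodes] for c in centralities])
X = (X - X.mean(axis=0)) / (X.std(axis=0) + 1e-12)  # standardize each centrality column

U, s, Vt = np.linalg.svd(X, full_matrices=False)    # PCA via SVD of the centered matrix
scores = U[:, :2] * s[:2]                           # node scores on the first two components
outliers = np.argsort(np.linalg.norm(scores, axis=1))[-3:]   # most atypical nodes
```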
Graphs, matrices, and the GraphBLAS: Seven good reasons
Kepner, Jeremy; Bader, David; Buluç, Aydın; ...
2015-01-01
The analysis of graphs has become increasingly important to a wide range of applications. Graph analysis presents a number of unique challenges in the areas of (1) software complexity, (2) data complexity, (3) security, (4) mathematical complexity, (5) theoretical analysis, (6) serial performance, and (7) parallel performance. Implementing graph algorithms using matrix-based approaches provides a number of promising solutions to these challenges. The GraphBLAS standard (istcbigdata.org/GraphBlas) is being developed to bring the potential of matrix-based graph algorithms to the broadest possible audience. The GraphBLAS mathematically defines a core set of matrix-based graph operations that can be used to implement a wide class of graph algorithms in a wide range of programming environments. This paper provides an introduction to the GraphBLAS and describes how the GraphBLAS can be used to address many of the challenges associated with analysis of graphs.
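The matrix view of graph algorithms that the GraphBLAS formalizes can be illustrated with a plain SciPy sketch: breadth-first search as repeated sparse matrix-vector products. This uses ordinary SciPy, not the GraphBLAS API, and the small adjacency matrix is made up.

```python
import numpy as np
import scipy.sparse as sp

# Adjacency matrix of a small directed graph: A[i, j] = 1 means an edge i -> j.
A = sp.csr_matrix(np.array([[0, 1, 0, 0, 1],
                            [0, 0, 1, 0, 0],
                            [0, 0, 0, 1, 0],
                            [0, 0, 0, 0, 0],
                            [0, 0, 0, 1, 0]], dtype=np.int8))

n = A.shape[0]
frontier = np.zeros(n, dtype=bool); frontier[0] = True   # start BFS at vertex 0
visited = frontier.copy()
level = {0: 0}
depth = 0
while frontier.any():
    depth += 1
    frontier = np.asarray(A.T @ frontier).ravel() > 0    # advance one step along out-edges
    frontier &= ~visited                                 # drop already-visited vertices
    for v in np.nonzero(frontier)[0]:
        level[int(v)] = depth
    visited |= frontier
print(level)    # {0: 0, 1: 1, 4: 1, 2: 2, 3: 2}
```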
Adjusting protein graphs based on graph entropy.
Peng, Sheng-Lung; Tsay, Yu-Wei
2014-01-01
Measuring protein structural similarity attempts to establish a relationship of equivalence between polymer structures based on their conformations. In several recent studies, researchers have explored protein-graph remodeling instead of looking for a minimum superimposition of pairwise proteins. When graphs are used to represent structured objects, the problem of measuring object similarity becomes one of computing the similarity between graphs. Graph theory provides an alternative perspective as well as efficiency. Once a protein graph has been created, its structural stability must be verified; therefore, a criterion is needed to determine whether a protein graph can be used for structural comparison. In this paper, we propose a measurement for protein graph remodeling based on graph entropy. We extend the concept of graph entropy to determine whether a graph is suitable for representing a protein. The experimental results suggest that, when applied, graph entropy helps in conformational protein-graph modeling. Furthermore, it indirectly contributes to protein structural comparison if a protein graph is solid.
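As a concrete (if simplified) illustration of graph entropy as a structural criterion, the sketch below computes the Shannon entropy of a graph's degree distribution. The paper's specific entropy measure for protein graphs may differ, and the random geometric graph is only a stand-in for a protein contact graph.

```python
import math
from collections import Counter
import networkx as nx

def degree_entropy(G):
    """Shannon entropy (bits) of the degree distribution of G."""
    degrees = [d for _, d in G.degree()]
    counts = Counter(degrees)
    n = float(len(degrees))
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

protein_graph = nx.random_geometric_graph(50, 0.25, seed=2)  # stand-in for a protein graph
print(f"degree entropy: {degree_entropy(protein_graph):.3f} bits")
```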
Lingley, Alexander J; Bowdridge, Joshua C; Farivar, Reza; Duffy, Kevin R
2018-04-30
A single histological marker applied to a slice of tissue often reveals myriad cytoarchitectonic characteristics that can obscure differences between the neuron populations targeted for study. Isolation and measurement of a single feature from the tissue is possible through a variety of approaches; however, visualizing the data numerically or through graphs alone can preclude identifying important features and effects that are not obvious from direct observation of the tissue. We demonstrate an efficient, effective, and robust approach to quantify and visualize cytoarchitectural features in histologically prepared brain sections, and show that this approach is able to reveal small differences between populations of neurons that might otherwise have gone undiscovered. We used stereological methods to record the cross-sectional soma area and in situ position of neurons within sections of the cat, monkey, and human visual system. The two-dimensional coordinate of every measured cell was used to produce a scatter plot that recapitulated the natural spatial distribution of cells, and each point in the plot was color-coded according to its respective soma area. The final graphic display was a multi-dimensional map of neuron soma size that revealed subtle differences across neuron aggregations, permitted delineation of regional boundaries, and identified small differences between populations of neurons modified by a period of sensory deprivation. This approach to collecting and displaying cytoarchitectonic data is simple and efficient, and provides a means of investigating small differences between neuron populations. Copyright © 2018. Published by Elsevier B.V.
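The color-coded soma map described here is straightforward to reproduce with matplotlib once per-cell coordinates and soma areas are in hand; the coordinates and areas below are synthetic placeholders.

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(3)
x = rng.uniform(0, 1000, 500)                       # hypothetical cell positions (micrometres)
y = rng.uniform(0, 800, 500)
soma_area = 80 + 40 * np.sin(x / 200) + rng.normal(0, 10, 500)   # hypothetical soma areas

fig, ax = plt.subplots(figsize=(6, 4))
points = ax.scatter(x, y, c=soma_area, s=12, cmap="viridis")
fig.colorbar(points, ax=ax, label="soma cross-sectional area (um^2)")
ax.set_xlabel("medio-lateral position (um)")
ax.set_ylabel("dorso-ventral position (um)")
plt.show()
```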
Characterizing Containment and Related Classes of Graphs,
1985-01-01
Hegarty, Peter; Lemieux, Anthony F; McQueen, Grant
2010-03-01
Graphs seem to connote facts more than words or tables do. Consequently, they seem unlikely places to spot implicit sexism at work. Yet, in 6 studies (N = 741), women and men constructed (Study 1) and recalled (Study 2) gender difference graphs with men's data first, and graphed powerful groups (Study 3) and individuals (Study 4) ahead of weaker ones. Participants who interpreted graph order as evidence of author "bias" inferred that the author graphed his or her own gender group first (Study 5). Women's, but not men's, preferences to graph men first were mitigated when participants graphed a difference between themselves and an opposite-sex friend prior to graphing gender differences (Study 6). Graph production and comprehension are affected by beliefs and suppositions about the groups represented in graphs to a greater degree than cognitive models of graph comprehension or realist models of scientific thinking have yet acknowledged.
Science and Technology Pocket Data Book.
ERIC Educational Resources Information Center
National Science Foundation, Washington, DC. Div. of Science Resources Studies.
This pocket guide contains a collection of graphed data, available in 1994, on science and technology funding patterns within the United States, public attitudes toward science and technology, and international trends in science and technology. Sections contain: (1) national research and development (R&D) funding patterns; (2) academic R&D…
The National Drug Control Strategy, 1997.
ERIC Educational Resources Information Center
Office of National Drug Control Policy, Washington, DC.
This federal document offers a comprehensive approach to reduce demand for illegal drugs and decrease their availability. Supported by statistical tables and graphs, the summary is divided into six sections. "The Purpose and Nature of the Strategy" outlines a 10-year plan for drug interdiction and reduction and identifies the elements of…
1986 Agricultural Chartbook. Agriculture Handbook No. 663.
ERIC Educational Resources Information Center
Department of Agriculture, Washington, DC.
This book contains 310 charts, tables, and graphs containing statistical information about agriculture-related commodities and services, primarily in the United States, in 1986. The book is organized in seven sections that cover the following topics: (1) the farm (farm income, farm population, farm workers, food and fiber system, agriculture and…
Arizona Conserve Water Educators Guide
ERIC Educational Resources Information Center
Project WET Foundation, 2007
2007-01-01
This award-winning, 350-page, full-color book provides a thorough study of Arizona water resources from a water conservation perspective. Its background section contains maps, graphs, diagrams and photos that facilitate the teaching of 15 interactive, multi-disciplinary lessons to K-12 students. In addition, 10 Arizona case studies are highlighted…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Symons, Christopher T; Arel, Itamar
2011-01-01
Budgeted learning under constraints on both the amount of labeled information and the availability of features at test time pertains to a large number of real world problems. Ideas from multi-view learning, semi-supervised learning, and even active learning have applicability, but a common framework whose assumptions fit these problem spaces is non-trivial to construct. We leverage ideas from these fields based on graph regularizers to construct a robust framework for learning from labeled and unlabeled samples in multiple views that are non-independent and include features that are inaccessible at the time the model would need to be applied. We describe examples of applications that fit this scenario, and we provide experimental results to demonstrate the effectiveness of knowledge carryover from training-only views. As learning algorithms are applied to more complex applications, relevant information can be found in a wider variety of forms, and the relationships between these information sources are often quite complex. The assumptions that underlie most learning algorithms do not readily or realistically permit the incorporation of many of the data sources that are available, despite an implicit understanding that useful information exists in these sources. When multiple information sources are available, they are often partially redundant, highly interdependent, and contain noise as well as other information that is irrelevant to the problem under study. In this paper, we are focused on a framework whose assumptions match this reality, as well as the reality that labeled information is usually sparse. Most significantly, we are interested in a framework that can also leverage information in scenarios where many features that would be useful for learning a model are not available when the resulting model will be applied. As with constraints on labels, there are many practical limitations on the acquisition of potentially useful features. A key difference in the case of feature acquisition is that the same constraints often don't pertain to the training samples. This difference provides an opportunity to allow features that are impractical in an applied setting to nevertheless add value during the model-building process. Unfortunately, there are few machine learning frameworks built on assumptions that allow effective utilization of features that are only available at training time. In this paper we formulate a knowledge carryover framework for the budgeted learning scenario with constraints on features and labels. The approach is based on multi-view and semi-supervised learning methods that use graph-encoded regularization. Our main contributions are the following: (1) we propose and provide justification for a methodology for ensuring that changes in the graph regularizer using alternate views are performed in a manner that is target-concept specific, allowing value to be obtained from noisy views; and (2) we demonstrate how this general set-up can be used to effectively improve models by leveraging features unavailable at test time. The rest of the paper is structured as follows. In Section 2, we outline real-world problems to motivate the approach and describe relevant prior work. Section 3 describes the graph construction process and the learning methodologies that are employed. Section 4 provides preliminary discussion regarding theoretical motivation for the method. In Section 5, effectiveness of the approach is demonstrated in a series of experiments employing modified versions of two well-known semi-supervised learning algorithms. Section 6 concludes the paper.
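The framework above builds on graph-encoded regularization; a minimal, generic sketch of that idea (not the authors' actual algorithm) is Laplacian-regularized label propagation, where a similarity graph over labeled and unlabeled samples smooths the predicted labels. The similarity matrix, labels, and regularization weight below are illustrative assumptions.

```python
import numpy as np

def laplacian_regularized_labels(W, y, labeled_mask, lam=1.0):
    """Minimal graph-regularized label estimate.

    Solves  min_f  sum over labeled i of (f_i - y_i)^2 + lam * f^T L f,
    where L = D - W is the combinatorial graph Laplacian.
    W: (n, n) symmetric similarity matrix; y: (n,) labels (ignored where unlabeled);
    labeled_mask: (n,) boolean array marking labeled samples.
    """
    L = np.diag(W.sum(axis=1)) - W
    M = np.diag(labeled_mask.astype(float)) + lam * L
    b = np.where(labeled_mask, y, 0.0)
    return np.linalg.solve(M, b)

# Tiny example: a 4-node chain graph with two labeled endpoints.
W = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
y = np.array([1.0, 0.0, 0.0, -1.0])
labeled = np.array([True, False, False, True])
print(laplacian_regularized_labels(W, y, labeled, lam=0.5))
```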
Olson, Scott A.; Weber, Matthew A.
1996-01-01
Scour depths and rock rip-rap sizes were computed using the general guidelines described in Hydraulic Engineering Circular 18 (Richardson and others, 1993). Scour depths were calculated assuming an infinite depth of erosive material and a homogeneous particle-size distribution. The scour analysis results are presented in tables 1 and 2 and a graph of the scour depths is presented in figure 8.
Planar Embedding of Planar Graphs,
1983-02-01
Stanford University and supported by a Chaim Weizmann postdoctoral fellowship and DARPA contract MDAOO3-C-0102. Current address: Institute of ... rectilinear embeddings (both with and without cross-overs), using the bounding box area cost. He proved that a tree of vertices with maximum degree 4 can ... be laid out without crossovers in an area that is linear in the number of edges (or vertices). He also showed how to get such an embedding for any
Ayotte, Joseph D.
1996-01-01
Scour depths and rock rip-rap sizes were computed using the general guidelines described in Hydraulic Engineering Circular 18 (Richardson and others, 1993). Scour depths were calculated assuming an infinite depth of erosive material and a homogeneous particle-size distribution. The scour analysis results are presented in tables 1 and 2 and a graph of the scour depths is presented in figure 8.
Boehmler, Erick M.
1996-01-01
Scour depths and rock rip-rap sizes were computed using the general guidelines described in Hydraulic Engineering Circular 18 (Richardson and others, 1993). Scour depths were calculated assuming an infinite depth of erosive material and a homogeneous particle-size distribution. The scour analysis results are presented in tables 1 and 2 and a graph of the scour depths is presented in figure 8.
Boehmler, Erick M.
1996-01-01
Scour depths and rock rip-rap sizes were computed using the general guidelines described in Hydraulic Engineering Circular 18 (Richardson and others, 1993). Scour depths were calculated assuming an infinite depth of erosive material and a homogeneous particle-size distribution. The scour analysis results are presented in tables 1 and 2 and a graph of the scour depths is presented in figure 8.
Olson, Scott A.
1996-01-01
Scour depths and rock rip-rap sizes were computed using the general guidelines described in Hydraulic Engineering Circular 18 (Richardson and others, 1993). Scour depths were calculated assuming an infinite depth of erosive material and a homogeneous particle-size distribution. The scour analysis results are presented in tables 1 and 2 and a graph of the scour depths is presented in figure 8.
Ayotte, Joseph D.
1996-01-01
Scour depths and rock rip-rap sizes were computed using the general guidelines described in Hydraulic Engineering Circular 18 (Richardson and others, 1993). Scour depths were calculated assuming an infinite depth of erosive material and a homogeneous particle-size distribution. The scour analysis results are presented in tables 1 and 2 and a graph of the scour depths is presented in figure 8.
Ayotte, Joseph D.
1996-01-01
Scour depths and rock rip-rap sizes were computed using the general guidelines described in Hydraulic Engineering Circular 18 (Richardson and others, 1993). Scour depths were calculated assuming an infinite depth of erosive material and a homogeneous particle-size distribution. The scour analysis results are presented in tables 1 and 2 and a graph of the scour depths is presented in figure 8.
ERIC Educational Resources Information Center
Yoder, Sharon K.
This book discusses four kinds of graphs that are taught in mathematics at the middle school level: pictographs, bar graphs, line graphs, and circle graphs. The chapters on each of these types of graphs contain information such as starting, scaling, drawing, labeling, and finishing the graphs using "LogoWriter." The final chapter of the…
Automatic segmentation of the choroid in enhanced depth imaging optical coherence tomography images.
Tian, Jing; Marziliano, Pina; Baskaran, Mani; Tun, Tin Aung; Aung, Tin
2013-03-01
Enhanced Depth Imaging (EDI) optical coherence tomography (OCT) provides high-definition cross-sectional images of the choroid in vivo, and hence is used in many clinical studies. However, the quantification of the choroid depends on the manual labeling of two boundaries, Bruch's membrane and the choroidal-scleral interface. This labeling process is tedious and subject to inter-observer differences; hence, automatic segmentation of the choroid layer is highly desirable. In this paper, we present a fast and accurate algorithm that can segment the choroid automatically. Bruch's membrane is detected by searching for the pixel with the largest gradient value above the retinal pigment epithelium (RPE), and the choroidal-scleral interface is delineated by finding the shortest path of the graph formed by valley pixels using Dijkstra's algorithm. The experiments comparing automatic segmentation results with the manual labelings are conducted on 45 EDI-OCT images and the average Dice's Coefficient is 90.5%, which shows good consistency of the algorithm with the manual labelings.
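The boundary delineation above is a shortest-path search on a graph of valley pixels; as a minimal, generic sketch (not the authors' exact graph construction or cost function), the snippet below runs Dijkstra's algorithm over a per-pixel cost image to find a left-to-right path, the way layer boundaries are often traced in OCT segmentation.

```python
import heapq
import numpy as np

def shortest_path_across(cost):
    """Dijkstra's algorithm over a 2-D cost image.

    Finds a minimum-cost 8-connected path from any pixel in the first column
    to any pixel in the last column and returns it as (row, col) tuples.
    The per-pixel cost is assumed to be low along the boundary of interest.
    """
    rows, cols = cost.shape
    dist = np.full((rows, cols), np.inf)
    prev = {}
    heap = []
    for r in range(rows):                      # virtual source: all first-column pixels
        dist[r, 0] = cost[r, 0]
        heapq.heappush(heap, (dist[r, 0], (r, 0)))
    while heap:
        d, (r, c) = heapq.heappop(heap)
        if d > dist[r, c]:
            continue
        if c == cols - 1:                      # first last-column pixel popped is optimal
            path = [(r, c)]
            while path[-1] in prev:
                path.append(prev[path[-1]])
            return path[::-1]
        for dr in (-1, 0, 1):
            for dc in (0, 1):                  # move right, or up/down within the column
                if dr == 0 and dc == 0:
                    continue
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols and d + cost[nr, nc] < dist[nr, nc]:
                    dist[nr, nc] = d + cost[nr, nc]
                    prev[(nr, nc)] = (r, c)
                    heapq.heappush(heap, (dist[nr, nc], (nr, nc)))
    return []

cost = np.array([[5, 5, 5, 5],
                 [1, 2, 1, 1],
                 [4, 1, 4, 4]], dtype=float)
print(shortest_path_across(cost))
```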
NLO predictions for the production of a spin-two particle at the LHC
Das, Goutam; Degrande, Céline; Hirschi, Valentin; ...
2017-05-08
We obtain predictions accurate at the next-to-leading order in QCD for the production of a generic spin-two particle in the most relevant channels at the LHC: production in association with coloured particles (inclusive, one jet, two jets and $t\bar t$), with vector bosons ($Z, W^\pm, \gamma$) and with the Higgs boson. Here, we present total and differential cross sections as well as branching ratios as a function of the mass and the collision energy, also considering the case of non-universal couplings to standard model particles. We find that the next-to-leading order corrections give rise to sizeable $K$ factors for many channels, in some cases exposing the unitarity-violating behaviour of non-universal couplings scenarios, and in general greatly reduce the theoretical uncertainties. Our predictions are publicly available in the MadGraph5_aMC@NLO framework and can, therefore, be directly used in experimental simulations of spin-two particle production for arbitrary values of the mass and couplings.
NLO predictions for the production of a spin-two particle at the LHC
DOE Office of Scientific and Technical Information (OSTI.GOV)
Das, Goutam; Degrande, Céline; Hirschi, Valentin
We obtain predictions accurate at the next-to-leading order in QCD for the production of a generic spin-two particle in the most relevant channels at the LHC: production in association with coloured particles (inclusive, one jet, two jets and $t\bar t$), with vector bosons ($Z, W^\pm, \gamma$) and with the Higgs boson. Here, we present total and differential cross sections as well as branching ratios as a function of the mass and the collision energy, also considering the case of non-universal couplings to standard model particles. We find that the next-to-leading order corrections give rise to sizeable $K$ factors for many channels, in some cases exposing the unitarity-violating behaviour of non-universal couplings scenarios, and in general greatly reduce the theoretical uncertainties. Our predictions are publicly available in the MadGraph5_aMC@NLO framework and can, therefore, be directly used in experimental simulations of spin-two particle production for arbitrary values of the mass and couplings.
Inventory Uncertainty Quantification using TENDL Covariance Data in Fispact-II
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eastwood, J.W.; Morgan, J.G.; Sublet, J.-Ch., E-mail: jean-christophe.sublet@ccfe.ac.uk
2015-01-15
The new inventory code Fispact-II provides predictions of inventory, radiological quantities and their uncertainties using nuclear data covariance information. Central to the method is a novel fast pathways search algorithm using directed graphs. The pathways output provides (1) an aid to identifying important reactions, (2) fast estimates of uncertainties, (3) reduced models that retain important nuclides and reactions for use in the code's Monte Carlo sensitivity analysis module. Described are the methods that are being implemented for improving uncertainty predictions, quantification and propagation using the covariance data that the recent nuclear data libraries contain. In the TENDL library, above the upper energy of the resolved resonance range, a Monte Carlo method in which the covariance data come from uncertainties of the nuclear model calculations is used. The nuclear data files are read directly by FISPACT-II without any further intermediate processing. Variance and covariance data are processed and used by FISPACT-II to compute uncertainties in collapsed cross sections, and these are in turn used to predict uncertainties in inventories and all derived radiological data.
Study for requirement of advanced long life small modular fast reactor
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tak, Taewoo, E-mail: ttwispy@unist.ac.kr; Choe, Jiwon, E-mail: chi91023@unist.ac.kr; Jeong, Yongjin, E-mail: yjjeong09@unist.ac.kr
2016-01-22
To develop an advanced long-life SMR core concept, the feasibility of the long-life breed-and-burn core concept has been assessed and a preliminary selection of reactor design requirements, such as fuel form and coolant material, has been performed. With the simplified cigar-type geometry of an 8 m-tall CANDLE reactor concept, the strengths of the breed-and-burn strategy have been demonstrated. There is a saturation region in the graph of the multiplication factors, which means that steady breeding is proceeding along the axial direction. The propagation behavior of the CANDLE core can also be confirmed through the evolution of the axial power profile. The coolant material is expected to have a low melting point, density, viscosity and absorption cross section and a high boiling point, specific heat, and thermal conductivity. In this respect, sodium is a preferable material for the coolant of this nuclear power plant system. The metallic fuel has a harder spectrum compared to the oxide and carbide fuel, which is favorable for increasing the breeding and extending the cycle length.
Nondestructive analysis of automotive paints with spectral domain optical coherence tomography.
Dong, Yue; Lawman, Samuel; Zheng, Yalin; Williams, Dominic; Zhang, Jinke; Shen, Yao-Chun
2016-05-01
We have demonstrated for the first time, to our knowledge, the use of optical coherence tomography (OCT) as an analytical tool for nondestructively characterizing the individual paint layer thickness of multiple layered automotive paints. A graph-based segmentation method was used for automatic analysis of the thickness distribution for the top layers of solid color paints. The thicknesses measured with OCT were in good agreement with the optical microscope and ultrasonic techniques that are the current standard in the automobile industry. Because of its high axial resolution (5.5 μm), the OCT technique was shown to be able to resolve the thickness of individual paint layers down to 11 μm. With its high lateral resolution (12.4 μm), the OCT system was also able to measure the cross-sectional area of the aluminum flakes in a metallic automotive paint. The range of values measured was 300-1850 μm2. In summary, the proposed OCT is a noncontact, high-resolution technique that has the potential for inclusion as part of the quality assurance process in automobile coating.
Study for requirement of advanced long life small modular fast reactor
NASA Astrophysics Data System (ADS)
Tak, Taewoo; Choe, Jiwon; Jeong, Yongjin; Lee, Deokjung; Kim, T. K.
2016-01-01
To develop an advanced long-life SMR core concept, the feasibility of the long-life breed-and-burn core concept has been assessed and a preliminary selection of reactor design requirements, such as fuel form and coolant material, has been performed. With the simplified cigar-type geometry of an 8 m-tall CANDLE reactor concept, the strengths of the breed-and-burn strategy have been demonstrated. There is a saturation region in the graph of the multiplication factors, which means that steady breeding is proceeding along the axial direction. The propagation behavior of the CANDLE core can also be confirmed through the evolution of the axial power profile. The coolant material is expected to have a low melting point, density, viscosity and absorption cross section and a high boiling point, specific heat, and thermal conductivity. In this respect, sodium is a preferable material for the coolant of this nuclear power plant system. The metallic fuel has a harder spectrum compared to the oxide and carbide fuel, which is favorable for increasing the breeding and extending the cycle length.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Fangyan; Zhang, Song; Chung Wong, Pak
Effectively visualizing large graphs and capturing the statistical properties are two challenging tasks. To aid in these two tasks, many sampling approaches for graph simplification have been proposed, falling into three categories: node sampling, edge sampling, and traversal-based sampling. It is still unknown which approach is the best. We evaluate commonly used graph sampling methods through a combined visual and statistical comparison of graphs sampled at various rates. We conduct our evaluation on three graph models: random graphs, small-world graphs, and scale-free graphs. Initial results indicate that the effectiveness of a sampling method is dependent on the graph model, the size of the graph, and the desired statistical property. This benchmark study can be used as a guideline in choosing the appropriate method for a particular graph sampling task, and the results presented can be incorporated into graph visualization and analysis tools.
Stavrakas, Vassilis; Melas, Ioannis N; Sakellaropoulos, Theodore; Alexopoulos, Leonidas G
2015-01-01
Modeling of signal transduction pathways is instrumental for understanding cell function. Researchers have been tackling the modeling of signaling pathways in order to accurately represent the signaling events inside cells' biochemical microenvironment in a way that is meaningful to scientists in the biological field. In this article, we propose a method to interrogate such pathways in order to produce cell-specific signaling models. We integrate available prior knowledge of protein connectivity, in the form of a Prior Knowledge Network (PKN), with phosphoproteomic data to construct predictive models of the protein connectivity of the interrogated cell type. Several computational methodologies focusing on pathways' logic modeling using optimization formulations or machine learning algorithms have been published on this front over the past few years. Here, we introduce a light and fast approach that uses a breadth-first traversal of the graph to identify the shortest pathways and score proteins in the PKN, fitting the dependencies extracted from the experimental design. The pathways are then combined through a heuristic formulation to produce a final topology handling inconsistencies between the PKN and the experimental scenarios. Our results show that the algorithm we developed is efficient and accurate for the construction of medium and large scale signaling networks. We demonstrate the applicability of the proposed approach by interrogating a manually curated interaction graph model of EGF/TNFA stimulation against made-up experimental data. To avoid the possibility of erroneous predictions, we performed a cross-validation analysis. Finally, we validate that the introduced approach generates predictive topologies, comparable to the ILP formulation. Overall, an efficient approach based on graph theory is presented herein to interrogate protein-protein interaction networks and to provide meaningful biological insights.
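The core primitive described above is a breadth-first search for shortest pathways in the Prior Knowledge Network; a minimal sketch of such a BFS shortest-path routine over a directed interaction graph (edge signs and the authors' protein-scoring heuristics are omitted, and the example edges are hypothetical) might look like the following.

```python
from collections import deque

def shortest_pathway(pkn, source, target):
    """Breadth-first search for a shortest directed path in a Prior Knowledge Network.

    `pkn` maps each protein to the list of proteins it signals to.
    Returns the list of nodes on one shortest path, or None if unreachable.
    """
    parents = {source: None}
    queue = deque([source])
    while queue:
        node = queue.popleft()
        if node == target:
            path = []
            while node is not None:
                path.append(node)
                node = parents[node]
            return path[::-1]
        for nxt in pkn.get(node, []):
            if nxt not in parents:
                parents[nxt] = node
                queue.append(nxt)
    return None

# Toy EGF/TNFA-style network (hypothetical edges, for illustration only).
pkn = {"EGF": ["EGFR"], "EGFR": ["RAS", "PI3K"], "RAS": ["ERK"],
       "PI3K": ["AKT"], "TNFA": ["TNFR"], "TNFR": ["NFKB"]}
print(shortest_pathway(pkn, "EGF", "ERK"))   # ['EGF', 'EGFR', 'RAS', 'ERK']
```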
Solving the scalability issue in quantum-based refinement: Q|R#1.
Zheng, Min; Moriarty, Nigel W; Xu, Yanting; Reimers, Jeffrey R; Afonine, Pavel V; Waller, Mark P
2017-12-01
Accurately refining biomacromolecules using a quantum-chemical method is challenging because the cost of a quantum-chemical calculation scales approximately as n^m, where n is the number of atoms and m (≥3) is based on the quantum method of choice. This fundamental problem means that quantum-chemical calculations become intractable when the size of the system requires more computational resources than are available. In the development of the software package called Q|R, this issue is referred to as Q|R#1. A divide-and-conquer approach has been developed that fragments the atomic model into small manageable pieces in order to solve Q|R#1. Firstly, the atomic model of a crystal structure is analyzed to detect noncovalent interactions between residues, and the results of the analysis are represented as an interaction graph. Secondly, a graph-clustering algorithm is used to partition the interaction graph into a set of clusters in such a way as to minimize disruption to the noncovalent interaction network. Thirdly, the environment surrounding each individual cluster is analyzed and any residue that is interacting with a particular cluster is assigned to the buffer region of that particular cluster. A fragment is defined as a cluster plus its buffer region. The gradients for all atoms from each of the fragments are computed, and only the gradients from each cluster are combined to create the total gradients. A quantum-based refinement is carried out using the total gradients as chemical restraints. In order to validate this interaction graph-based fragmentation approach in Q|R, the entire atomic model of an amyloid cross-β spine crystal structure (PDB entry 2oNA) was refined.
Mathematical foundations of the GraphBLAS
Kepner, Jeremy; Aaltonen, Peter; Bader, David; ...
2016-12-01
The GraphBLAS standard (GraphBlas.org) is being developed to bring the potential of matrix-based graph algorithms to the broadest possible audience. Mathematically, the GraphBLAS defines a core set of matrix-based graph operations that can be used to implement a wide class of graph algorithms in a wide range of programming environments. This study provides an introduction to the mathematics of the GraphBLAS. Graphs represent connections between vertices with edges. Matrices can represent a wide range of graphs using adjacency matrices or incidence matrices. Adjacency matrices are often easier to analyze while incidence matrices are often better for representing data. Fortunately, the two are easily connected by matrix multiplication. A key feature of matrix mathematics is that a very small number of matrix operations can be used to manipulate a very wide range of graphs. This composability of a small number of operations is the foundation of the GraphBLAS. A standard such as the GraphBLAS can only be effective if it has low performance overhead. Finally, performance measurements of prototype GraphBLAS implementations indicate that the overhead is low.
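As an illustration of the matrix view the GraphBLAS formalizes, the sketch below (plain NumPy, not the GraphBLAS API) expands a breadth-first frontier by multiplying the adjacency matrix with a frontier vector; in the GraphBLAS this would be a sparse matrix-vector product over an appropriate semiring.

```python
import numpy as np

def bfs_levels(A, source):
    """Breadth-first levels via repeated adjacency-matrix/vector products.

    A is a dense 0/1 adjacency matrix with A[i, j] == 1 for an edge j -> i
    (columns are sources); returns an array of BFS levels, -1 if unreachable.
    """
    n = A.shape[0]
    levels = np.full(n, -1, dtype=int)
    frontier = np.zeros(n, dtype=bool)
    frontier[source] = True
    level = 0
    while frontier.any():
        levels[frontier] = level
        reached = (A @ frontier.astype(int)) > 0   # neighbours of the current frontier
        frontier = reached & (levels < 0)          # keep only unvisited vertices
        level += 1
    return levels

# 4-vertex directed cycle 0 -> 1 -> 2 -> 3 -> 0, stored column-as-source.
A = np.zeros((4, 4), dtype=int)
for u, v in [(0, 1), (1, 2), (2, 3), (3, 0)]:
    A[v, u] = 1
print(bfs_levels(A, 0))   # [0 1 2 3]
```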
GRMDA: Graph Regression for MiRNA-Disease Association Prediction
Chen, Xing; Yang, Jing-Ru; Guan, Na-Na; Li, Jian-Qiang
2018-01-01
Nowadays, as more and more associations between microRNAs (miRNAs) and diseases have been discovered, miRNA has gradually become a hot topic in the biological field. Because of the high consumption of time and money in carrying out biological experiments, a computational method that can help scientists choose the most likely associations between miRNAs and diseases for further experimental studies is desperately needed. In this study, we proposed a method of Graph Regression for MiRNA-Disease Association prediction (GRMDA) which combines known miRNA-disease associations, miRNA functional similarity, disease semantic similarity, and Gaussian interaction profile kernel similarity. We used Gaussian interaction profile kernel similarity to supplement the shortage of miRNA functional similarity and disease semantic similarity. Furthermore, the graph regression was synchronously performed in three latent spaces, including association space, miRNA similarity space, and disease similarity space, by using two matrix factorization approaches called Singular Value Decomposition and Partial Least-Squares to extract important related attributes and filter the noise. In the leave-one-out cross validation and five-fold cross validation, GRMDA obtained AUCs of 0.8272 and 0.8080 ± 0.0024, respectively. Thus, its performance is better than some previous models. In the case study of Lymphoma using the recorded miRNA-disease associations in the HMDD V2.0 database, 88% of the top 50 predicted miRNAs were verified by experimental literature. In order to test the performance of GRMDA on new diseases with no known related miRNAs, we took Breast Neoplasms as an example by regarding all the known related miRNAs as unknown ones. We found that 100% of the top 50 predicted miRNAs were verified. Moreover, 84% of the top 50 predicted miRNAs in the case study for Esophageal Neoplasms based on HMDD V1.0 were verified to have known associations. In conclusion, GRMDA is an effective and practical method for miRNA-disease association prediction. PMID:29515453
GRMDA: Graph Regression for MiRNA-Disease Association Prediction.
Chen, Xing; Yang, Jing-Ru; Guan, Na-Na; Li, Jian-Qiang
2018-01-01
Nowadays, as more and more associations between microRNAs (miRNAs) and diseases have been discovered, miRNA has gradually become a hot topic in the biological field. Because of the high consumption of time and money in carrying out biological experiments, a computational method that can help scientists choose the most likely associations between miRNAs and diseases for further experimental studies is desperately needed. In this study, we proposed a method of Graph Regression for MiRNA-Disease Association prediction (GRMDA) which combines known miRNA-disease associations, miRNA functional similarity, disease semantic similarity, and Gaussian interaction profile kernel similarity. We used Gaussian interaction profile kernel similarity to supplement the shortage of miRNA functional similarity and disease semantic similarity. Furthermore, the graph regression was synchronously performed in three latent spaces, including association space, miRNA similarity space, and disease similarity space, by using two matrix factorization approaches called Singular Value Decomposition and Partial Least-Squares to extract important related attributes and filter the noise. In the leave-one-out cross validation and five-fold cross validation, GRMDA obtained AUCs of 0.8272 and 0.8080 ± 0.0024, respectively. Thus, its performance is better than some previous models. In the case study of Lymphoma using the recorded miRNA-disease associations in the HMDD V2.0 database, 88% of the top 50 predicted miRNAs were verified by experimental literature. In order to test the performance of GRMDA on new diseases with no known related miRNAs, we took Breast Neoplasms as an example by regarding all the known related miRNAs as unknown ones. We found that 100% of the top 50 predicted miRNAs were verified. Moreover, 84% of the top 50 predicted miRNAs in the case study for Esophageal Neoplasms based on HMDD V1.0 were verified to have known associations. In conclusion, GRMDA is an effective and practical method for miRNA-disease association prediction.
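GRMDA couples graph regression with matrix factorization; as a generic sketch of that second ingredient only (not the GRMDA algorithm), the snippet below takes a truncated SVD of a small association matrix to obtain a low-rank latent space and read off scores for unobserved miRNA-disease pairs. The matrix values and rank are illustrative.

```python
import numpy as np

# Toy miRNA-disease association matrix (rows: miRNAs, cols: diseases); 1 = known association.
A = np.array([[1, 0, 1, 0],
              [1, 1, 0, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 1]], dtype=float)

# Truncated SVD: keep the top-k latent factors to denoise the association space.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 2
A_hat = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# Scores for unobserved pairs can be read off the low-rank reconstruction.
print(np.round(A_hat, 2))
```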
1990-01-09
data structures can easily be presented to the user interface. An emphasis of the Graph Browser was the realization of graph views and graph animation ... animation of the graph. Animation of the graph includes changing node shapes, changing node and arc colors, changing node and arc text, and making ... many graphs tend to be tree-like. Animation of a graph is a useful feature. One of the primary goals of GMB was to support animated graphs. For animation
ERIC Educational Resources Information Center
Phage, Itumeleng B.; Lemmer, Miriam; Hitge, Mariette
2017-01-01
Students' graph comprehension may be affected by the background of the students who are the readers or interpreters of the graph, their knowledge of the context in which the graph is set, and the inferential processes required by the graph operation. This research study investigated these aspects of graph comprehension for 152 first year…
NASA Astrophysics Data System (ADS)
Xiong, B.; Oude Elberink, S.; Vosselman, G.
2014-07-01
In the task of 3D building model reconstruction from point clouds we face the problem of recovering a roof topology graph in the presence of noise, small roof faces and low point densities. Errors in roof topology graphs will seriously affect the final modelling results. The aim of this research is to automatically correct these errors. We define the graph correction as a graph-to-graph problem, similar to the spelling correction problem (also called the string-to-string problem). The graph correction is more complex than string correction, as the graphs are 2D while strings are only 1D. We design a strategy based on a dictionary of graph edit operations to automatically identify and correct the errors in the input graph. For each type of error the graph edit dictionary stores a representative erroneous subgraph as well as the corrected version. As an erroneous roof topology graph may contain several errors, a heuristic search is applied to find the optimum sequence of graph edits to correct the errors one by one. The graph edit dictionary can be expanded to include entries needed to cope with errors that were previously not encountered. Experiments show that the dictionary with only fifteen entries already properly corrects one quarter of erroneous graphs in about 4500 buildings, and even half of the erroneous graphs in one test area, achieving as high as a 95% acceptance rate of the reconstructed models.
Walsh, Adam D; Crawford, David; Cameron, Adrian J; Campbell, Karen J; Hesketh, Kylie D
2017-07-05
Early childhood (under five years of age) is a critical developmental period when children's physical activity behaviours are shaped and when physical activity patterns begin to emerge. Physical activity levels track from early childhood through to adolescence with low levels of physical activity associated with poorer health. The aims of this study were to examine cross-sectional and longitudinal associations between the physical activity levels of fathers and their children at the ages of 20 months, 3.5 and 5 years, and to investigate whether these associations differed based on paternal body mass index (BMI) and education. The Melbourne Infant Feeding Activity and Nutrition Trial (InFANT) Program was a cluster randomized-controlled trial delivered to pre-existing first-time parent groups. Physical activity levels of fathers and their first-born children were assessed using the Active Australia Survey and ActiGraph accelerometers respectively. Cross-sectional associations between father and child physical activity behaviours were assessed at each time point. Longitudinal associations between father and child physical activity were also investigated from child age 20 months to both 3.5 and 5 years. Additional stratified analyses were conducted based on paternal BMI and paternal education as a proxy for socioeconomic position (SEP). Data from the control and interventions groups were pooled and all analyses adjusted for intervention status, clustering by first-time parent group and accelerometer wear time. Physical activity levels of fathers and their children at child age 20 months were not associated cross-sectionally or longitudinally at child age 3.5 and 5 years. Positive associations were observed between light physical activity of healthy weight fathers and children at age 3.5 years. Inverse associations were observed for moderate/vigorous physical activity between fathers and children at age 5 years, including between overweight/obese fathers and their children at this age in stratified analyses. There were no clear associations between the physical activity of fathers and children. Future research should include the use of more robust measures of physical activity among fathers to allow in-depth assessment of their physical activity behaviours. Investigation of well-defined correlates of physical activity in young children is warranted to confirm these findings and further progress research in this field.
Comparison and Enumeration of Chemical Graphs
Akutsu, Tatsuya; Nagamochi, Hiroshi
2013-01-01
Chemical compounds are usually represented as graph structured data in computers. In this review article, we overview several graph classes relevant to chemical compounds and the computational complexities of several fundamental problems for these graph classes. In particular, we consider the following problems: determining whether two chemical graphs are identical, determining whether one input chemical graph is a part of the other input chemical graph, finding a maximum common part of two input graphs, finding a reaction atom mapping, enumerating possible chemical graphs, and enumerating stereoisomers. We also discuss the relationship between the fifth problem and kernel functions for chemical compounds. PMID:24688697
Mean square cordial labelling related to some acyclic graphs and its rough approximations
NASA Astrophysics Data System (ADS)
Dhanalakshmi, S.; Parvathi, N.
2018-04-01
In this paper we show that the path Pn, the comb graph Pn⊙K1, the n-centipede graph, the centipede graph (n,2), and the star Sn admit mean square cordial labelings. We also prove that the induced subgraph obtained by the upper approximation of any subgraph H of the above acyclic graphs admits a mean square cordial labeling.
Relating zeta functions of discrete and quantum graphs
NASA Astrophysics Data System (ADS)
Harrison, Jonathan; Weyand, Tracy
2018-02-01
We write the spectral zeta function of the Laplace operator on an equilateral metric graph in terms of the spectral zeta function of the normalized Laplace operator on the corresponding discrete graph. To do this, we apply a relation between the spectrum of the Laplacian on a discrete graph and that of the Laplacian on an equilateral metric graph. As a by-product, we determine how the multiplicity of eigenvalues of the quantum graph, that are also in the spectrum of the graph with Dirichlet conditions at the vertices, depends on the graph geometry. Finally we apply the result to calculate the vacuum energy and spectral determinant of a complete bipartite graph and compare our results with those for a star graph, a graph in which all vertices are connected to a central vertex by a single edge.
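For orientation, a hedged sketch of the objects involved: the normalized Laplacian of the discrete graph, and the relation commonly attributed to von Below that connects its spectrum to that of an equilateral metric graph with Kirchhoff (natural) vertex conditions. The paper's precise statement may differ in normalization and in how Dirichlet-type eigenvalues are handled.

```latex
% Normalized Laplacian of the discrete graph (D = degree matrix, A = adjacency matrix):
\Delta \;=\; I - D^{-1/2} A D^{-1/2}.
% For an equilateral metric graph with common edge length \ell and Kirchhoff vertex
% conditions, eigenvalues \lambda = k^2 with k\ell \notin \pi\mathbb{Z} are commonly
% related (assumed standard form of the relation) to eigenvalues \mu of \Delta via
\mu \;=\; 1 - \cos(k\ell).
```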
Xuan, Junyu; Lu, Jie; Zhang, Guangquan; Luo, Xiangfeng
2015-12-01
Graph mining has been a popular research area because of its numerous application scenarios. Many unstructured and structured data can be represented as graphs, such as documents, chemical molecular structures, and images. However, an issue with current research on graphs is that existing methods cannot adequately discover the topics hidden in graph-structured data, which could benefit both unsupervised and supervised learning on graphs. Although topic models have proved to be very successful in discovering latent topics, standard topic models cannot be directly applied to graph-structured data due to the "bag-of-words" assumption. In this paper, an innovative graph topic model (GTM) is proposed to address this issue, which uses Bernoulli distributions to model the edges between nodes in a graph. It can, therefore, make the edges in a graph contribute to latent topic discovery and further improve the accuracy of the supervised and unsupervised learning of graphs. The experimental results on two different types of graph datasets show that the proposed GTM outperforms latent Dirichlet allocation on classification by using the unveiled topics of these two models to represent graphs.
ERIC Educational Resources Information Center
Lazarsfeld, Paul F., Ed.
Part two of a seven-section, final report on the Multi-Disciplinary Graduate Program in Educational Research, this document contains discussions of quantification and reason analysis. Quantification is presented as a language consisting of sentences (graphs and tables), words, (classificatory instruments), and grammar (rules for constructing and…
Scratch Your Brain Where It Itches: Math Games, Tricks and Quick Activities, Book D-1 Algebra.
ERIC Educational Resources Information Center
Brumbaugh, Doug
This resource book for algebra contains games, tricks, and quick activities for the classroom. Categories of activities include puzzlers, patterns, manipulatives, measurement, graphing, and a section that contains reproducible statement and value cards. Twenty one puzzle problems, four pattern activities, and 11 quick activities that engage…
Derive Workshop Matrix Algebra and Linear Algebra.
ERIC Educational Resources Information Center
Townsley Kulich, Lisa; Victor, Barbara
This document presents the course content for a workshop that integrates the use of the computer algebra system Derive with topics in matrix and linear algebra. The first section is a guide to using Derive that provides information on how to write algebraic expressions, make graphs, save files, edit, define functions, differentiate expressions,…
Value Added: The Economic Impact of Public Universities.
ERIC Educational Resources Information Center
National Association of State Universities and Land Grant Colleges, Washington, DC.
This monograph reports the results of a survey of the economic impact on state and local economies of the 194 member institutions of the National Association of State Universities and Land-Grant Colleges. Analysis of responses (from 111 institutions) is reported in text and graphs. An introductory section notes that the recent emphasis on cutting…
Statistical Report of Kentucky Public Libraries, Fiscal Year 1997-1998.
ERIC Educational Resources Information Center
Bank, Jay, Comp.
This report contains statistical information on Kentucky public libraries for fiscal year 1997-1998 taken from the Annual Report of Public Libraries. The report is separated into seven sections: summary of library statistics for the most recent year (1998) and comparisons with the three prior years; graphs showing statistical trends in library…
Genome U-Plot: a whole genome visualization.
Gaitatzes, Athanasios; Johnson, Sarah H; Smadbeck, James B; Vasmatzis, George
2018-05-15
The ability to produce and analyze whole genome sequencing (WGS) data from samples with structural variations (SV) generated the need to visualize such abnormalities in simplified plots. Conventional two-dimensional representations of WGS data frequently use either circular or linear layouts. There are several diverse advantages regarding both these representations, but their major disadvantage is that they do not use the two-dimensional space very efficiently. We propose a layout, termed the Genome U-Plot, which spreads the chromosomes on a two-dimensional surface and essentially quadruples the spatial resolution. We present the Genome U-Plot for producing clear and intuitive graphs that allows researchers to generate novel insights and hypotheses by visualizing SVs such as deletions, amplifications, and chromoanagenesis events. The main features of the Genome U-Plot are its layered layout, its high spatial resolution and its improved aesthetic qualities. We compare conventional visualization schemas with the Genome U-Plot using visualization metrics such as number of line crossings and crossing angle resolution measures. Based on our metrics, we improve the readability of the resulting graph by at least 2-fold, making apparent important features and making it easy to identify important genomic changes. A whole genome visualization tool with high spatial resolution and improved aesthetic qualities. An implementation and documentation of the Genome U-Plot is publicly available at https://github.com/gaitat/GenomeUPlot. vasmatzis.george@mayo.edu. Supplementary data are available at Bioinformatics online.
Preserving Differential Privacy in Degree-Correlation based Graph Generation
Wang, Yue; Wu, Xintao
2014-01-01
Enabling accurate analysis of social network data while preserving differential privacy has been challenging since graph features such as cluster coefficient often have high sensitivity, which is different from traditional aggregate functions (e.g., count and sum) on tabular data. In this paper, we study the problem of enforcing edge differential privacy in graph generation. The idea is to enforce differential privacy on graph model parameters learned from the original network and then generate the graphs for releasing using the graph model with the private parameters. In particular, we develop a differential privacy preserving graph generator based on the dK-graph generation model. We first derive from the original graph various parameters (i.e., degree correlations) used in the dK-graph model, then enforce edge differential privacy on the learned parameters, and finally use the dK-graph model with the perturbed parameters to generate graphs. For the 2K-graph model, we enforce the edge differential privacy by calibrating noise based on the smooth sensitivity, rather than the global sensitivity. By doing this, we achieve the strict differential privacy guarantee with smaller magnitude noise. We conduct experiments on four real networks and compare the performance of our private dK-graph models with the stochastic Kronecker graph generation model in terms of utility and privacy tradeoff. Empirical evaluations show the developed private dK-graph generation models significantly outperform the approach based on the stochastic Kronecker generation model. PMID:24723987
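As a minimal sketch of one ingredient described above (not the full dK-graph pipeline), the snippet below perturbs a graph's degree histogram with Laplace noise calibrated to a sensitivity-over-epsilon scale before it would be handed to a generator. The sensitivity constant used here is an illustrative placeholder, not the smooth-sensitivity calculation from the paper.

```python
import numpy as np

def noisy_degree_histogram(degrees, epsilon, sensitivity=2.0, rng=None):
    """Add Laplace noise to a degree histogram (a dK-1 style statistic).

    `degrees` is a list of node degrees. The noise scale is sensitivity / epsilon;
    the sensitivity value is a placeholder, not the smooth sensitivity derived in
    the paper. Negative noisy counts are clipped to zero.
    """
    rng = np.random.default_rng(rng)
    hist = np.bincount(degrees)
    noise = rng.laplace(loc=0.0, scale=sensitivity / epsilon, size=hist.shape)
    return np.clip(hist + noise, 0.0, None)

degrees = [1, 2, 2, 3, 3, 3, 4]
print(noisy_degree_histogram(degrees, epsilon=1.0, rng=42))
```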
Myths in funding ocean research at the National Science Foundation
NASA Astrophysics Data System (ADS)
Duce, Robert A.; Benoit-Bird, Kelly J.; Ortiz, Joseph; Woodgate, Rebecca A.; Bontempi, Paula; Delaney, Margaret; Gaines, Steven D.; Harper, Scott; Jones, Brandon; White, Lisa D.
2012-12-01
Every 3 years the U.S. National Science Foundation (NSF), through its Advisory Committee on Geosciences, forms a Committee of Visitors (COV) to review different aspects of the Directorate for Geosciences (GEO). This year a COV was formed to review the Biological Oceanography (BO), Chemical Oceanography (CO), and Physical Oceanography (PO) programs in the Ocean Section; the Marine Geology and Geophysics (MGG) and Integrated Ocean Drilling Program (IODP) science programs in the Marine Geosciences Section; and the Ocean Education and Ocean Technology and Interdisciplinary Coordination (OTIC) programs in the Integrative Programs Section of the Ocean Sciences Division (OCE). The 2012 COV assessed the proposal review process for fiscal year (FY) 2009-2011, when 3843 proposal actions were considered, resulting in 1141 awards. To do this, COV evaluated the documents associated with 206 projects that were randomly selected from the following categories: low-rated proposals that were funded, high-rated proposals that were funded, low-rated proposals that were declined, high-rated proposals that were declined, some in the middle (53 awarded, 106 declined), and all (47) proposals submitted to the Rapid Response Research (RAPID) funding mechanism. NSF provided additional data as requested by the COV in the form of graphs and tables. The full COV report, including graphs and tables, is available at http://www.nsf.gov/geo/acgeo_cov.jsp.
A general method for computing Tutte polynomials of self-similar graphs
NASA Astrophysics Data System (ADS)
Gong, Helin; Jin, Xian'an
2017-10-01
Self-similar graphs have been widely studied in both combinatorics and statistical physics. Motivated by the construction of the well-known 3-dimensional Sierpiński gasket graphs, in this paper we introduce a family of recursively constructed self-similar graphs whose inner duals have the self-similar property. By combining the duality property of the Tutte polynomial with a subgraph-decomposition trick, we show that the Tutte polynomial of this family of graphs can be computed in an iterative way, and in particular we derive an exact formula for the number of their spanning trees. Furthermore, we show our method is a general one that is easily extended to compute Tutte polynomials for other families of self-similar graphs such as Farey graphs, 2-dimensional Sierpiński gasket graphs, Hanoi graphs, modified Koch graphs, Apollonian graphs, the pseudofractal scale-free web, the fractal scale-free network, etc.
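For reference, the two standard facts this kind of computation leans on are the deletion-contraction recurrence for the Tutte polynomial and its behaviour under planar duality, stated here in generic form; the paper's iterative scheme for self-similar graphs is built on top of these and is not reproduced here.

```latex
T(G; x, y) =
\begin{cases}
1 & \text{if } E(G) = \emptyset,\\
x\, T(G/e; x, y) & \text{if } e \text{ is a bridge},\\
y\, T(G - e; x, y) & \text{if } e \text{ is a loop},\\
T(G - e; x, y) + T(G/e; x, y) & \text{otherwise,}
\end{cases}
\qquad
T_{G}(x, y) = T_{G^{*}}(y, x),
\qquad
\tau(G) = T(G; 1, 1)\ \text{(number of spanning trees).}
```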
Bipartite separability and nonlocal quantum operations on graphs
NASA Astrophysics Data System (ADS)
Dutta, Supriyo; Adhikari, Bibhas; Banerjee, Subhashish; Srikanth, R.
2016-07-01
In this paper we consider the separability problem for bipartite quantum states arising from graphs. Earlier it was proved that the degree criterion is the graph-theoretic counterpart of the familiar positive partial transpose criterion for separability, although there are entangled states with positive partial transpose for which the degree criterion fails. Here we introduce the concept of partially symmetric graphs and degree symmetric graphs by using the well-known concept of partial transposition of a graph and degree criteria, respectively. Thus, we provide classes of bipartite separable states of dimension m ×n arising from partially symmetric graphs. We identify partially asymmetric graphs that lack the property of partial symmetry. We develop a combinatorial procedure to create a partially asymmetric graph from a given partially symmetric graph. We show that this combinatorial operation can act as an entanglement generator for mixed states arising from partially symmetric graphs.
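For orientation, the construction usually used in this literature to obtain a quantum state from a graph (a hedged sketch; the paper may adopt a different normalization or a weighted variant) divides the combinatorial Laplacian by its trace, after which the partial transpose of the state corresponds to a "partially transposed" graph studied via degree criteria.

```latex
% G a simple graph with adjacency matrix A(G), degree matrix D(G), and |E(G)| edges:
\rho(G) \;=\; \frac{L(G)}{\operatorname{tr} L(G)} \;=\; \frac{D(G) - A(G)}{2\,|E(G)|},
% with bipartite structure fixed by a tensor decomposition
% \mathbb{C}^{mn} \cong \mathbb{C}^{m} \otimes \mathbb{C}^{n}.
```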
On the local edge antimagicness of m-splitting graphs
NASA Astrophysics Data System (ADS)
Albirri, E. R.; Dafik; Slamin; Agustin, I. H.; Alfarisi, R.
2018-04-01
Let G be a connected and simple graph. A split graph is a graph derived by adding a new vertex v′ for every vertex v such that v′ is adjacent to v in the graph G. An m-splitting graph, denoted mSpl(G), is a graph that has m such v′-vertices for every vertex v. A local edge antimagic coloring of a graph G = (V, E) is a bijection f: V(G) → {1, 2, 3, ..., |V(G)|} such that any two adjacent edges e1 and e2 receive different weights, w(e1) ≠ w(e2), where the weight of an edge e = uv is w(e) = f(u) + f(v).
Survey of Approaches to Generate Realistic Synthetic Graphs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lim, Seung-Hwan; Lee, Sangkeun; Powers, Sarah S
A graph is a flexible data structure that can represent relationships between entities. As with other data analysis tasks, the use of realistic graphs is critical to obtaining valid research results. Unfortunately, using the actual ("real-world") graphs for research and new algorithm development is difficult due to the presence of sensitive information in the data or due to the scale of data. This results in practitioners developing algorithms and systems that employ synthetic graphs instead of real-world graphs. Generating realistic synthetic graphs that provide reliable statistical confidence to algorithmic analysis and system evaluation involves addressing technical hurdles in a broad set of areas. This report surveys the state of the art in approaches to generate realistic graphs that are derived from fitted graph models on real-world graphs.
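As one concrete instance of the kind of model-based generator such surveys cover (purely illustrative, not the report's recommended method), the sketch below grows a scale-free-style synthetic graph by preferential attachment.

```python
import random

def preferential_attachment_graph(n, m, seed=None):
    """Grow a simple scale-free-style graph: each new vertex attaches to m
    distinct existing vertices chosen with probability proportional to degree."""
    rng = random.Random(seed)
    edges = [(0, 1)]            # start from a single edge
    targets = [0, 1]            # vertex list weighted by degree (one entry per edge endpoint)
    for v in range(2, n):
        chosen = set()
        while len(chosen) < min(m, v):
            chosen.add(rng.choice(targets))
        for u in chosen:
            edges.append((u, v))
            targets.extend([u, v])
    return edges

print(len(preferential_attachment_graph(100, 2, seed=1)))   # 197 edges for n=100, m=2
```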
Self-organizing maps for learning the edit costs in graph matching.
Neuhaus, Michel; Bunke, Horst
2005-06-01
Although graph matching and graph edit distance computation have become areas of intensive research recently, the automatic inference of the cost of edit operations has remained an open problem. In the present paper, we address the issue of learning graph edit distance cost functions for numerically labeled graphs from a corpus of sample graphs. We propose a system of self-organizing maps (SOMs) that represent the distance measuring spaces of node and edge labels. Our learning process is based on the concept of self-organization. It adapts the edit costs in such a way that the similarity of graphs from the same class is increased, whereas the similarity of graphs from different classes decreases. The learning procedure is demonstrated on two different applications involving line drawing graphs and graphs representing diatoms, respectively.
Apparatuses and Methods for Producing Runtime Architectures of Computer Program Modules
NASA Technical Reports Server (NTRS)
Abi-Antoun, Marwan Elia (Inventor); Aldrich, Jonathan Erik (Inventor)
2013-01-01
Apparatuses and methods for producing run-time architectures of computer program modules. One embodiment includes creating an abstract graph from the computer program module and from containment information corresponding to the computer program module, wherein the abstract graph has nodes including types and objects, and wherein the abstract graph relates an object to a type, and wherein for a specific object the abstract graph relates the specific object to a type containing the specific object; and creating a runtime graph from the abstract graph, wherein the runtime graph is a representation of the true runtime object graph, wherein the runtime graph represents containment information such that, for a specific object, the runtime graph relates the specific object to another object that contains the specific object.
G-Hash: Towards Fast Kernel-based Similarity Search in Large Graph Databases.
Wang, Xiaohong; Smalter, Aaron; Huan, Jun; Lushington, Gerald H
2009-01-01
Structured data, including sets, sequences, trees and graphs, pose significant challenges to fundamental aspects of data management such as efficient storage, indexing, and similarity search. With the fast accumulation of graph databases, similarity search in graph databases has emerged as an important research topic. Graph similarity search has applications in a wide range of domains including cheminformatics, bioinformatics, sensor network management, social network management, and XML documents, among others. Most of the current graph indexing methods focus on subgraph query processing, i.e. determining the set of database graphs that contain the query graph, and hence do not directly support similarity search. In data mining and machine learning, various graph kernel functions have been designed to capture the intrinsic similarity of graphs. Though successful in constructing accurate predictive and classification models for supervised learning, graph kernel functions have (i) high computational complexity and (ii) non-trivial difficulty to be indexed in a graph database. Our objective is to bridge graph kernel functions and similarity search in graph databases by proposing (i) a novel kernel-based similarity measurement and (ii) an efficient indexing structure for graph data management. Our method of similarity measurement builds upon local features extracted from each node and their neighboring nodes in graphs. A hash table is utilized to support efficient storage and fast search of the extracted local features. Using the hash table, a graph kernel function is defined to capture the intrinsic similarity of graphs and for fast similarity query processing. We have implemented our method, which we have named G-hash, and have demonstrated its utility on large chemical graph databases. Our results show that the G-hash method achieves state-of-the-art performance for k-nearest neighbor (k-NN) classification. Most importantly, the new similarity measurement and the index structure are scalable to large databases with smaller indexing size, faster indexing construction time, and faster query processing time as compared to state-of-the-art indexing methods such as C-tree, gIndex, and GraphGrep.
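A minimal sketch of the general idea of hashing node-local features for fast graph similarity (the actual G-hash features and kernel are richer than this): each node is summarized by its label, degree, and the multiset of neighbour labels, and two graphs are compared by the overlap of their hashed feature multisets.

```python
from collections import Counter

def local_features(labels, adj):
    """Hashable per-node features: (own label, degree, sorted neighbour labels)."""
    feats = []
    for v, nbrs in adj.items():
        feats.append((labels[v], len(nbrs), tuple(sorted(labels[u] for u in nbrs))))
    return Counter(feats)        # the Counter plays the role of the hash table

def similarity(f1, f2):
    """Histogram-intersection style similarity between two feature multisets."""
    overlap = sum((f1 & f2).values())
    return overlap / max(sum(f1.values()), sum(f2.values()))

# Two toy molecular graphs given as label dicts and adjacency lists.
labels_a = {0: "C", 1: "C", 2: "O"}
adj_a = {0: [1], 1: [0, 2], 2: [1]}
labels_b = {0: "C", 1: "O", 2: "C"}
adj_b = {0: [2], 2: [0, 1], 1: [2]}
fa, fb = local_features(labels_a, adj_a), local_features(labels_b, adj_b)
print(similarity(fa, fb))        # 1.0: identical local structure
```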
NASA Technical Reports Server (NTRS)
Burleigh, Scott C.
2011-01-01
Contact Graph Routing (CGR) is a dynamic routing system that computes routes through a time-varying topology of scheduled communication contacts in a network based on the DTN (Delay-Tolerant Networking) architecture. It is designed to enable dynamic selection of data transmission routes in a space network based on DTN. This dynamic responsiveness in route computation should be significantly more effective and less expensive than static routing, increasing total data return while at the same time reducing mission operations cost and risk. The basic strategy of CGR is to take advantage of the fact that, since flight mission communication operations are planned in detail, the communication routes between any pair of bundle agents in a population of nodes that have all been informed of one another's plans can be inferred from those plans rather than discovered via dialogue (which is impractical over long one-way-light-time space links). Messages that convey this planning information are used to construct contact graphs (time-varying models of network connectivity) from which CGR automatically computes efficient routes for bundles. Automatic route selection increases the flexibility and resilience of the space network, simplifying cross-support and reducing mission management costs. Note that there are no routing tables in Contact Graph Routing. The best route for a bundle destined for a given node may routinely be different from the best route for a different bundle destined for the same node, depending on bundle priority, bundle expiration time, and changes in the current lengths of transmission queues for neighboring nodes; routes must be computed individually for each bundle, from the Bundle Protocol agent's current network connectivity model for the bundle's destination node (the contact graph). Clearly this places a premium on optimizing the implementation of the route computation algorithm. The scalability of CGR to very large networks remains a research topic. The information carried by CGR contact plan messages is useful not only for dynamic route computation, but also for the implementation of rate control, congestion forecasting, transmission episode initiation and termination, timeout interval computation, and retransmission timer suspension and resumption.
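A heavily simplified sketch of contact-graph route computation (not the flight implementation, which also accounts for priority, queue backlog, transmission time, and expiration): given a contact plan of scheduled transmission windows, a Dijkstra-style search finds the route with the earliest possible arrival time at the destination. The example contact plan is hypothetical.

```python
import heapq

def earliest_arrival_route(contacts, source, dest, start_time):
    """Earliest-arrival search over a contact plan.

    `contacts` is a list of (from_node, to_node, t_start, t_end, owlt) tuples,
    where owlt is the one-way light time of the link. Returns (arrival_time, route)
    or (None, None) if no route exists. Per-bundle transmission time is ignored.
    """
    best = {source: start_time}
    heap = [(start_time, source, [source])]
    while heap:
        t, node, route = heapq.heappop(heap)
        if node == dest:
            return t, route
        for frm, to, t0, t1, owlt in contacts:
            if frm != node or t > t1:
                continue                      # contact already over
            arrival = max(t, t0) + owlt       # wait for the window, then propagate
            if arrival < best.get(to, float("inf")):
                best[to] = arrival
                heapq.heappush(heap, (arrival, to, route + [to]))
    return None, None

plan = [("A", "B", 10, 20, 1), ("B", "C", 15, 30, 2), ("A", "C", 40, 50, 1)]
print(earliest_arrival_route(plan, "A", "C", start_time=0))   # (17, ['A', 'B', 'C'])
```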
GraphReduce: Processing Large-Scale Graphs on Accelerator-Based Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sengupta, Dipanjan; Song, Shuaiwen; Agarwal, Kapil
2015-11-15
Recent work on real-world graph analytics has sought to leverage the massive amount of parallelism offered by GPU devices, but challenges remain due to the inherent irregularity of graph algorithms and limitations in GPU-resident memory for storing large graphs. We present GraphReduce, a highly efficient and scalable GPU-based framework that operates on graphs that exceed the device's internal memory capacity. GraphReduce adopts a combination of edge- and vertex-centric implementations of the Gather-Apply-Scatter programming model and operates on multiple asynchronous GPU streams to fully exploit the high degrees of parallelism in GPUs with efficient graph data movement between the host and device.
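For readers unfamiliar with the Gather-Apply-Scatter pattern mentioned above, the following CPU-only toy sketch instantiates it as PageRank; it is not the GraphReduce framework itself, and the function name and parameters are illustrative assumptions.

# Hedged sketch of Gather-Apply-Scatter: gather contributions along edges,
# apply a per-vertex update, repeat.
def gas_pagerank(edges, num_vertices, iters=20, d=0.85):
    rank = [1.0 / num_vertices] * num_vertices
    out_deg = [0] * num_vertices
    for src, _ in edges:
        out_deg[src] += 1
    for _ in range(iters):
        acc = [0.0] * num_vertices
        for src, dst in edges:                 # gather/scatter phase over edges
            acc[dst] += rank[src] / out_deg[src]
        rank = [(1 - d) / num_vertices + d * a for a in acc]   # apply phase per vertex
    return rank

print(gas_pagerank([(0, 1), (1, 2), (2, 0), (0, 2)], 3))

GPU frameworks such as the one described above partition the edge list so that each partition fits in device memory and stream partitions through the same three phases.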
Photoeffect cross sections of some rare-earth elements at 145.4 keV
NASA Astrophysics Data System (ADS)
Umesh, T. K.; Ranganathaiah, C.; Sanjeevaiah, B.
1985-08-01
Total attenuation cross sections in the elements La, Ce, Pr, Nd, Sm, Gd, Dy, Ho, and Er were derived from the measured total cross sections of their simple oxide compounds, by employing the mixture rule at 145.4-keV photon energy. The compound cross sections have been measured by performing transmission experiments in a good geometry setup. From the derived total cross sections of elements, photoeffect cross sections have been obtained by subtracting the theoretical scattering cross sections. A good agreement is observed between the present data of photoeffect cross sections and Scofield's theoretical data.
Volatility Behaviors of Financial Time Series by Percolation System on Sierpinski Carpet Lattice
NASA Astrophysics Data System (ADS)
Pei, Anqi; Wang, Jun
2015-01-01
The financial time series is simulated and investigated by a percolation system on the Sierpinski carpet lattice, where percolation is usually employed to describe the behavior of connected clusters in a random graph, and the Sierpinski carpet lattice is a graph corresponding to the fractal Sierpinski carpet. To study the fluctuation behavior of returns for the financial model and the Shanghai Composite Index, we establish a daily volatility measure, the multifractal volatility (MFV) measure, to obtain MFV series, which have long-range cross-correlations with the squared daily return series. The autoregressive fractionally integrated moving average (ARFIMA) model is used to analyze the MFV series, on which it performs better than on other volatility series. A comparative study of the multifractality and volatility of the data shows that the simulation data of the proposed model exhibit behaviors very similar to those of the real stock index, which indicates a certain rationality of the model for market applications.
Dynamic multicast routing scheme in WDM optical network
NASA Astrophysics Data System (ADS)
Zhu, Yonghua; Dong, Zhiling; Yao, Hong; Yang, Jianyong; Liu, Yibin
2007-11-01
During the information era, the Internet and World Wide Web services have developed rapidly, so ever-wider bandwidth is required at ever-lower cost and service demands have become diversified. Data, images, videos and other special transmission demands present both a challenge and an opportunity to service providers. At the same time, electrical equipment has approached its limits, so optical communication based on wavelength division multiplexing (WDM) and optical cross-connects (OXCs) shows great potential for building optical networks, thanks to its unique technical advantages and multi-wavelength characteristics. In this paper, we propose a multi-layered graph model with inter-layer paths to solve the multicast routing and wavelength assignment (RWA) problem in a single step, by employing an efficient graph-theoretic formulation. We also propose an efficient dynamic multicast algorithm, the Distributed Message Copying Multicast (DMCM) mechanism. A multicast tree with minimum hops can be constructed dynamically according to the proposed scheme.
Multitriangulations, pseudotriangulations and some problems of realization of polytopes
NASA Astrophysics Data System (ADS)
Pilaud, Vincent
2010-09-01
This thesis explores two specific topics of discrete geometry, multitriangulations and polytopal realizations of products, whose connection is the problem of finding polytopal realizations of a given combinatorial structure. A k-triangulation is a maximal set of chords of the convex n-gon such that no k+1 of them mutually cross. We propose a combinatorial and geometric study of multitriangulations based on their stars, which play the same role as the triangles of triangulations. This study leads us to interpret multitriangulations, by duality, as pseudoline arrangements with contact points covering a given support. Finally, we exploit these results to discuss some open problems on multitriangulations, in particular the question of the polytopal realization of their flip graphs. In the second part, we study the polytopality of Cartesian products. We investigate the existence of polytopal realizations of Cartesian products of graphs, and we study the minimal dimension of a polytope whose k-skeleton is that of a product of simplices.
Spectra of Adjacency Matrices in Networks of Extreme Introverts and Extroverts
NASA Astrophysics Data System (ADS)
Bassler, Kevin E.; Ezzatabadipour, Mohammadmehdi; Zia, R. K. P.
Many interesting properties were discovered in recent studies of preferred degree networks, suitable for describing the social behavior of individuals who tend to prefer a certain number of contacts. In an extreme version (coined the XIE model), introverts always cut links while extroverts always add them. While the intra-group links are static, the cross-links are dynamic and lead to an ensemble of bipartite graphs, with extraordinary correlations between the elements n_ij of the incidence matrix. In the steady state, this system can be regarded as one in thermal equilibrium with long-ranged interactions between the n_ij's, and it displays an extreme Thouless effect. Here, we report simulation studies of a different aspect of these networks, namely, the spectra associated with the ensemble of adjacency matrices {a_ij}. As a baseline, we first consider the spectra associated with a simple random (Erdős-Rényi) ensemble of bipartite graphs, where simulation results can be understood analytically. Work supported by the NSF through Grant DMR-1507371.
Comparing Internet Probing Methodologies Through an Analysis of Large Dynamic Graphs
2014-06-01
comparable Internet topologies in less time. We compare these by modeling the union of traceroute outputs as graphs, and study the graphs using standard graph-theoretical measurements, including vertex and edge count and average vertex degree.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sukumar, Sreenivas R.; Hong, Seokyong; Lee, Sangkeun
2016-06-01
GraphBench is a benchmark suite for graph pattern mining and graph analysis systems. The benchmark suite is a significant addition to conducting apples-to-apples comparisons of graph analysis software (databases, in-memory tools, triple stores, etc.).
Asymptote Misconception on Graphing Functions: Does Graphing Software Resolve It?
ERIC Educational Resources Information Center
Öçal, Mehmet Fatih
2017-01-01
Graphing function is an important issue in mathematics education due to its use in various areas of mathematics and its potential roles for students to enhance learning mathematics. The use of some graphing software assists students' learning during graphing functions. However, the display of graphs of functions that students sketched by hand may…
Generalized graph states based on Hadamard matrices
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cui, Shawn X.; Yu, Nengkun; Department of Mathematics and Statistics, University of Guelph, Guelph, Ontario N1G 2W1
2015-07-15
Graph states are widely used in quantum information theory, including entanglement theory, quantum error correction, and one-way quantum computing. Graph states have a nice structure related to a certain graph, which is given by either a stabilizer group or an encoding circuit, both of which can be obtained directly from the graph. To generalize graph states, whose stabilizer groups are abelian subgroups of the Pauli group, one approach is to study non-abelian stabilizers. In this work, we propose to generalize graph states based on the encoding circuit, which is completely determined by the graph and a Hadamard matrix. We study the entanglement structures of these generalized graph states and show that they are all maximally mixed locally. We also explore the relationship between the equivalence of Hadamard matrices and local equivalence of the corresponding generalized graph states. This leads to a natural generalization of the Pauli (X, Z) pairs, which characterizes the local symmetries of these generalized graph states. Our approach is also naturally generalized to construct graph quantum codes which are beyond stabilizer codes.
Graph processing platforms at scale: practices and experiences
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lim, Seung-Hwan; Lee, Sangkeun; Brown, Tyler C
2015-01-01
Graph analysis unveils hidden associations of data in many phenomena and artifacts, such as road networks, social networks, genomic information, and scientific collaboration. Unfortunately, the wide diversity in the characteristics of graphs and graph operations makes it challenging to find the right combination of tools and implementation of algorithms to discover desired knowledge from the target data set. This study presents an extensive empirical study of three representative graph processing platforms: Pegasus, GraphX, and Urika. Each system represents a combination of options in data model, processing paradigm, and infrastructure. We benchmarked each platform using three popular graph operations, degree distribution, connected components, and PageRank, over a variety of real-world graphs. Our experiments show that each graph processing platform has different strengths, depending on the type of graph operation. While Urika performs best on non-iterative operations like degree distribution, GraphX outperforms the others on iterative operations like connected components and PageRank. In addition, we discuss challenges in optimizing the performance of each platform over large-scale real-world graphs.
A fast algorithm for vertex-frequency representations of signals on graphs
Jestrović, Iva; Coyle, James L.; Sejdić, Ervin
2016-01-01
The windowed Fourier transform (short time Fourier transform) and the S-transform are widely used signal processing tools for extracting frequency information from non-stationary signals. Previously, the windowed Fourier transform had been adopted for signals on graphs and has been shown to be very useful for extracting vertex-frequency information from graphs. However, high computational complexity makes these algorithms impractical. We sought to develop a fast windowed graph Fourier transform and a fast graph S-transform requiring significantly shorter computation time. The proposed schemes have been tested with synthetic test graph signals and real graph signals derived from electroencephalography recordings made during swallowing. The results showed that the proposed schemes provide significantly lower computation time in comparison with the standard windowed graph Fourier transform and the standard graph S-transform. Also, the results showed that noise does not affect the results of the fast windowed graph Fourier transform or the fast graph S-transform. Finally, we showed that graphs can be reconstructed from the vertex-frequency representations obtained with the proposed algorithms. PMID:28479645
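As background, the following minimal Python sketch shows the plain (unwindowed) graph Fourier transform that such methods build on, assuming numpy is available; the windowed and S-transform variants additionally localize this analysis with vertex-domain window functions. The function name and the toy path graph are illustrative assumptions, not the authors' code.

# Hedged sketch: graph Fourier transform via the Laplacian eigendecomposition.
import numpy as np

def graph_fourier_transform(adjacency, signal):
    degree = np.diag(adjacency.sum(axis=1))
    laplacian = degree - adjacency
    eigvals, eigvecs = np.linalg.eigh(laplacian)   # Laplacian eigenbasis (graph "frequencies")
    return eigvals, eigvecs.T @ signal             # spectral coefficients of the graph signal

A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)   # 3-node path graph
freqs, coeffs = graph_fourier_transform(A, np.array([1.0, 0.0, -1.0]))
print(freqs, coeffs)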
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grossman, Max; Pritchard Jr., Howard Porter; Budimlic, Zoran
2016-12-22
Graph500 [14] is an effort to offer a standardized benchmark across large-scale distributed platforms which captures the behavior of common communication-bound graph algorithms. Graph500 differs from other large-scale benchmarking efforts (such as HPL [6] or HPGMG [7]) primarily in the irregularity of its computation and data access patterns. The core computational kernel of Graph500 is a breadth-first search (BFS) implemented on an undirected graph. The output of Graph500 is a spanning tree of the input graph, usually represented by a predecessor mapping for every node in the graph. The Graph500 benchmark defines several pre-defined input sizes for implementers to test against. This report summarizes an investigation into implementing the Graph500 benchmark on OpenSHMEM, and focuses on first building a strong and practical understanding of the strengths and limitations of past work before proposing and developing novel extensions.
Graphing trillions of triangles.
Burkhardt, Paul
2017-07-01
The increasing size of Big Data is often heralded but how data are transformed and represented is also profoundly important to knowledge discovery, and this is exemplified in Big Graph analytics. Much attention has been placed on the scale of the input graph but the product of a graph algorithm can be many times larger than the input. This is true for many graph problems, such as listing all triangles in a graph. Enabling scalable graph exploration for Big Graphs requires new approaches to algorithms, architectures, and visual analytics. A brief tutorial is given to aid the argument for thoughtful representation of data in the context of graph analysis. Then a new algebraic method to reduce the arithmetic operations in counting and listing triangles in graphs is introduced. Additionally, a scalable triangle listing algorithm in the MapReduce model will be presented followed by a description of the experiments with that algorithm that led to the current largest and fastest triangle listing benchmarks to date. Finally, a method for identifying triangles in new visual graph exploration technologies is proposed.
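To make the "output can exceed the input" point concrete, the following toy Python lister enumerates every triangle by intersecting neighbor sets; it is a basic neighbor-intersection sketch, not the paper's algebraic or MapReduce method, and the graph encoding is an illustrative assumption.

# Hedged sketch: list each triangle once as an ordered triple (u < v < w).
def list_triangles(adj):
    """adj: dict node -> set of neighbours (undirected)."""
    triangles = []
    for u in adj:
        for v in adj[u]:
            if v <= u:
                continue
            for w in adj[u] & adj[v]:      # common neighbours close the triangle
                if w > v:
                    triangles.append((u, v, w))
    return triangles

g = {0: {1, 2}, 1: {0, 2, 3}, 2: {0, 1, 3}, 3: {1, 2}}
print(list_triangles(g))   # [(0, 1, 2), (1, 2, 3)]

Even this small example hints at the scaling issue: a dense graph on n vertices has O(n^3) triangles, so the listing itself, not just the input, dominates storage and I/O at scale.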
Multiple graph regularized protein domain ranking.
Wang, Jim Jing-Yan; Bensmail, Halima; Gao, Xin
2012-11-19
Protein domain ranking is a fundamental task in structural biology. Most protein domain ranking methods rely on the pairwise comparison of protein domains while neglecting the global manifold structure of the protein domain database. Recently, graph regularized ranking that exploits the global structure of the graph defined by the pairwise similarities has been proposed. However, the existing graph regularized ranking methods are very sensitive to the choice of the graph model and parameters, and this remains a difficult problem for most of the protein domain ranking methods. To tackle this problem, we have developed the Multiple Graph regularized Ranking algorithm, MultiG-Rank. Instead of using a single graph to regularize the ranking scores, MultiG-Rank approximates the intrinsic manifold of protein domain distribution by combining multiple initial graphs for the regularization. Graph weights are learned with ranking scores jointly and automatically, by alternately minimizing an objective function in an iterative algorithm. Experimental results on a subset of the ASTRAL SCOP protein domain database demonstrate that MultiG-Rank achieves a better ranking performance than single graph regularized ranking methods and pairwise similarity based ranking methods. The problem of graph model and parameter selection in graph regularized protein domain ranking can be solved effectively by combining multiple graphs. This aspect of generalization introduces a new frontier in applying multiple graphs to solving protein domain ranking applications.
Evolutionary graph theory: breaking the symmetry between interaction and replacement
Ohtsuki, Hisashi; Pacheco, Jorge M.; Nowak, Martin A.
2008-01-01
We study evolutionary dynamics in a population whose structure is given by two graphs: the interaction graph determines who plays with whom in an evolutionary game; the replacement graph specifies the geometry of evolutionary competition and updating. First, we calculate the fixation probabilities of frequency dependent selection between two strategies or phenotypes. We consider three different update mechanisms: birth-death, death-birth and imitation. Then, as a particular example, we explore the evolution of cooperation. Suppose the interaction graph is a regular graph of degree h, the replacement graph is a regular graph of degree g and the overlap between the two graphs is a regular graph of degree l. We show that cooperation is favored by natural selection if b/c > hg/l. Here, b and c denote the benefit and cost of the altruistic act. This result holds for death-birth updating, weak selection and large population size. Note that the optimum population structure for cooperators is given by maximum overlap between the interaction and the replacement graph (g = h = l), which means that the two graphs are identical. We also prove that a modified replicator equation can describe how the expected values of the frequencies of an arbitrary number of strategies change on replacement and interaction graphs: the two graphs induce a transformation of the payoff matrix. PMID:17350049
Multiple graph regularized protein domain ranking
2012-01-01
Background Protein domain ranking is a fundamental task in structural biology. Most protein domain ranking methods rely on the pairwise comparison of protein domains while neglecting the global manifold structure of the protein domain database. Recently, graph regularized ranking that exploits the global structure of the graph defined by the pairwise similarities has been proposed. However, the existing graph regularized ranking methods are very sensitive to the choice of the graph model and parameters, and this remains a difficult problem for most of the protein domain ranking methods. Results To tackle this problem, we have developed the Multiple Graph regularized Ranking algorithm, MultiG-Rank. Instead of using a single graph to regularize the ranking scores, MultiG-Rank approximates the intrinsic manifold of protein domain distribution by combining multiple initial graphs for the regularization. Graph weights are learned with ranking scores jointly and automatically, by alternately minimizing an objective function in an iterative algorithm. Experimental results on a subset of the ASTRAL SCOP protein domain database demonstrate that MultiG-Rank achieves a better ranking performance than single graph regularized ranking methods and pairwise similarity based ranking methods. Conclusion The problem of graph model and parameter selection in graph regularized protein domain ranking can be solved effectively by combining multiple graphs. This aspect of generalization introduces a new frontier in applying multiple graphs to solving protein domain ranking applications. PMID:23157331
Xi-cam: a versatile interface for data visualization and analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pandolfi, Ronald J.; Allan, Daniel B.; Arenholz, Elke
Xi-cam is an extensible platform for data management, analysis and visualization. Xi-cam aims to provide a flexible and extensible approach to synchrotron data treatment as a solution to rising demands for high-volume/high-throughput processing pipelines. The core of Xi-cam is an extensible plugin-based graphical user interface platform which provides users with an interactive interface to processing algorithms. Plugins are available for SAXS/WAXS/GISAXS/GIWAXS, tomography and NEXAFS data. With Xi-cam's 'advanced' mode, data processing steps are designed as a graph-based workflow, which can be executed live, locally or remotely. Remote execution utilizes high-performance computing or de-localized resources, allowing for the effective reduction of high-throughput data. Xi-cam's plugin-based architecture targets cross-facility and cross-technique collaborative development, in support of multi-modal analysis. Xi-cam is open-source and cross-platform, and available for download on GitHub.
Xi-cam: a versatile interface for data visualization and analysis
Pandolfi, Ronald J.; Allan, Daniel B.; Arenholz, Elke; ...
2018-05-31
Xi-cam is an extensible platform for data management, analysis and visualization. Xi-cam aims to provide a flexible and extensible approach to synchrotron data treatment as a solution to rising demands for high-volume/high-throughput processing pipelines. The core of Xi-cam is an extensible plugin-based graphical user interface platform which provides users with an interactive interface to processing algorithms. Plugins are available for SAXS/WAXS/GISAXS/GIWAXS, tomography and NEXAFS data. With Xi-cam's 'advanced' mode, data processing steps are designed as a graph-based workflow, which can be executed live, locally or remotely. Remote execution utilizes high-performance computing or de-localized resources, allowing for the effective reduction of high-throughput data. Xi-cam's plugin-based architecture targets cross-facility and cross-technique collaborative development, in support of multi-modal analysis. Xi-cam is open-source and cross-platform, and available for download on GitHub.
Electron-Impact Ionization Cross Section Database
National Institute of Standards and Technology Data Gateway
SRD 107 Electron-Impact Ionization Cross Section Database (Web, free access) This is a database primarily of total ionization cross sections of molecules by electron impact. The database also includes cross sections for a small number of atoms and energy distributions of ejected electrons for H, He, and H2. The cross sections were calculated using the Binary-Encounter-Bethe (BEB) model, which combines the Mott cross section with the high-incident energy behavior of the Bethe cross section. Selected experimental data are included.
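For orientation, the per-orbital BEB cross section is commonly written in the literature (Kim and Rudd) in the form below; this is recalled here only as background and is not quoted from the database record itself, so readers should consult the original BEB references for the authoritative expression.

\sigma_{\mathrm{BEB}}(t) \;=\; \frac{S}{t+u+1}\left[\frac{\ln t}{2}\left(1-\frac{1}{t^{2}}\right) + 1 - \frac{1}{t} - \frac{\ln t}{t+1}\right],
\qquad t=\frac{T}{B},\quad u=\frac{U}{B},\quad S=4\pi a_{0}^{2}\,N\left(\frac{R}{B}\right)^{2},

where T is the incident electron energy, B the orbital binding energy, U the orbital kinetic energy, N the orbital occupation number, a_0 the Bohr radius, and R the Rydberg energy; the total ionization cross section is the sum over occupied orbitals.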
Alternative Fuels Data Center: Maps and Data
Graphs and downloadable data on AFV acquisitions by fleet type (1992-2014), S&FP AFV acquisitions by fuel type (1992-2015), AFV transactions (1997-2014), and the BioFuels Atlas; last updated August 2016.
GraphReduce: Large-Scale Graph Analytics on Accelerator-Based HPC Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sengupta, Dipanjan; Agarwal, Kapil; Song, Shuaiwen
2015-09-30
Recent work on real-world graph analytics has sought to leverage the massive amount of parallelism offered by GPU devices, but challenges remain due to the inherent irregularity of graph algorithms and limitations in GPU-resident memory for storing large graphs. We present GraphReduce, a highly efficient and scalable GPU-based framework that operates on graphs that exceed the device's internal memory capacity. GraphReduce adopts a combination of both edge- and vertex-centric implementations of the Gather-Apply-Scatter programming model and operates on multiple asynchronous GPU streams to fully exploit the high degrees of parallelism in GPUs with efficient graph data movement between the host and the device.
SING: Subgraph search In Non-homogeneous Graphs
2010-01-01
Background Finding the subgraphs of a graph database that are isomorphic to a given query graph has practical applications in several fields, from cheminformatics to image understanding. Since subgraph isomorphism is a computationally hard problem, indexing techniques have been intensively exploited to speed up the process. Such systems filter out those graphs which cannot contain the query, and apply a subgraph isomorphism algorithm to each residual candidate graph. The applicability of such systems is limited to databases of small graphs, because their filtering power degrades on large graphs. Results In this paper, SING (Subgraph search In Non-homogeneous Graphs), a novel indexing system able to cope with large graphs, is presented. The method uses the notion of feature, which can be a small subgraph, subtree or path. Each graph in the database is annotated with the set of all its features. The key point is to make use of feature locality information. This idea is used to both improve the filtering performance and speed up the subgraph isomorphism task. Conclusions Extensive tests on chemical compounds, biological networks and synthetic graphs show that the proposed system outperforms the most popular systems in query time over databases of medium and large graphs. Other specific tests show that the proposed system is effective for single large graphs. PMID:20170516
GrouseFlocks: steerable exploration of graph hierarchy space.
Archambault, Daniel; Munzner, Tamara; Auber, David
2008-01-01
Several previous systems allow users to interactively explore a large input graph through cuts of a superimposed hierarchy. This hierarchy is often created using clustering algorithms or topological features present in the graph. However, many graphs have domain-specific attributes associated with the nodes and edges, which could be used to create many possible hierarchies providing unique views of the input graph. GrouseFlocks is a system for the exploration of this graph hierarchy space. By allowing users to see several different possible hierarchies on the same graph, the system helps users investigate graph hierarchy space instead of a single fixed hierarchy. GrouseFlocks provides a simple set of operations so that users can create and modify their graph hierarchies based on selections. These selections can be made manually or based on patterns in the attribute data provided with the graph. It provides feedback to the user within seconds, allowing interactive exploration of this space.
Spectral partitioning in equitable graphs.
Barucca, Paolo
2017-06-01
Graph partitioning problems emerge in a wide variety of complex systems, ranging from biology to finance, but can be rigorously analyzed and solved only for a few graph ensembles. Here, an ensemble of equitable graphs, i.e., random graphs with a block-regular structure, is studied, for which analytical results can be obtained. In particular, the spectral density of this ensemble is computed exactly for a modular and bipartite structure. Kesten-McKay's law for random regular graphs is found analytically to apply also for modular and bipartite structures when blocks are homogeneous. An exact solution to graph partitioning for two equal-sized communities is proposed and verified numerically, and a conjecture on the absence of an efficient recovery detectability transition in equitable graphs is suggested. A final discussion summarizes results and outlines their relevance for the solution of graph partitioning problems in other graph ensembles, in particular for the study of detectability thresholds and resolution limits in stochastic block models.
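The Kesten-McKay law referenced above can be checked numerically with the following hedged sketch, assuming the numpy and networkx packages are available; the degree d, graph size n and grid are illustrative parameters, not values from the paper.

# Hedged sketch: compare the empirical adjacency spectrum of a random d-regular
# graph with the analytic Kesten-McKay density.
import numpy as np
import networkx as nx

d, n = 3, 1000
G = nx.random_regular_graph(d, n, seed=0)
eigs = np.linalg.eigvalsh(nx.to_numpy_array(G))

def kesten_mckay(lam, d):
    support = np.abs(lam) <= 2 * np.sqrt(d - 1)
    rho = np.zeros_like(lam)
    rho[support] = d * np.sqrt(4 * (d - 1) - lam[support] ** 2) / (2 * np.pi * (d ** 2 - lam[support] ** 2))
    return rho

lam = np.linspace(-3, 3, 7)
print(kesten_mckay(lam, d))                                         # analytic density on a coarse grid
print(np.histogram(eigs, bins=7, range=(-3, 3), density=True)[0])   # empirical density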
Spectral partitioning in equitable graphs
NASA Astrophysics Data System (ADS)
Barucca, Paolo
2017-06-01
Graph partitioning problems emerge in a wide variety of complex systems, ranging from biology to finance, but can be rigorously analyzed and solved only for a few graph ensembles. Here, an ensemble of equitable graphs, i.e., random graphs with a block-regular structure, is studied, for which analytical results can be obtained. In particular, the spectral density of this ensemble is computed exactly for a modular and bipartite structure. Kesten-McKay's law for random regular graphs is found analytically to apply also for modular and bipartite structures when blocks are homogeneous. An exact solution to graph partitioning for two equal-sized communities is proposed and verified numerically, and a conjecture on the absence of an efficient recovery detectability transition in equitable graphs is suggested. A final discussion summarizes results and outlines their relevance for the solution of graph partitioning problems in other graph ensembles, in particular for the study of detectability thresholds and resolution limits in stochastic block models.
Berenbrock, Charles E.
2015-01-01
The effects of reducing the number of cross-sectional data points on steady-flow profiles were also determined. Thirty-five cross sections from the original steady-flow model of the Kootenai River were used. The two methods were tested for all cross sections, with each cross section's resolution reduced to 10, 20, and 30 data points; that is, six tests were completed for each of the thirty-five cross sections. Generally, differences from the original water-surface elevation were smaller as the number of data points in the reduced cross sections increased, but this was not always the case, especially in the braided reach. Differences were smaller for reduced cross sections developed by the genetic algorithm method than for those developed by the standard algorithm method.
Study of Chromatic parameters of Line, Total, Middle graphs and Graph operators of Bipartite graph
NASA Astrophysics Data System (ADS)
Nagarathinam, R.; Parvathi, N.
2018-04-01
Chromatic parameters have been explored on the basis of the graph coloring process, in which any two adjacent nodes receive different colors. The Grundy and b-colorings, however, use the maximum number of colors under certain restrictions. In this paper, the chromatic, b-chromatic and Grundy numbers of some graph operators of bipartite graphs have been investigated.
DOE Office of Scientific and Technical Information (OSTI.GOV)
2010-09-30
The Umbra gbs (Graph-Based Search) library provides implementations of graph-based search/planning algorithms that can be applied to legacy graph data structures. Unlike some other graph algorithm libraries, this one does not require your graph class to inherit from a specific base class. Implementations of Dijkstra's Algorithm and A-Star search are included and can be used with graphs that are lazily-constructed.
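To illustrate the design idea described above (graph search that works on legacy graph structures without inheriting from a library base class), here is a hedged Python sketch of Dijkstra's algorithm driven by a neighbor-expansion callback; it is not the Umbra gbs API, and the function names and the toy graph are illustrative assumptions.

# Hedged sketch: Dijkstra search over any legacy graph exposed through a callback,
# which may construct neighbors lazily.
import heapq

def dijkstra(start, goal, neighbors):
    """neighbors(node) -> iterable of (next_node, edge_cost)."""
    dist, prev = {start: 0.0}, {}
    heap = [(0.0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == goal:
            break
        if d > dist.get(node, float("inf")):
            continue
        for nxt, cost in neighbors(node):
            nd = d + cost
            if nd < dist.get(nxt, float("inf")):
                dist[nxt], prev[nxt] = nd, node
                heapq.heappush(heap, (nd, nxt))
    path = [goal]
    while path[-1] in prev:
        path.append(prev[path[-1]])
    return dist.get(goal), list(reversed(path))

legacy = {"a": [("b", 1.0), ("c", 4.0)], "b": [("c", 1.0)], "c": []}
print(dijkstra("a", "c", lambda n: legacy.get(n, [])))   # (2.0, ['a', 'b', 'c'])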
Information visualisation based on graph models
NASA Astrophysics Data System (ADS)
Kasyanov, V. N.; Kasyanova, E. V.
2013-05-01
Information visualisation is a key component of support tools for many applications in science and engineering. A graph is an abstract structure that is widely used to model information for its visualisation. In this paper, we consider a practical and general graph formalism called hierarchical graphs and present the Higres and Visual Graph systems aimed at supporting information visualisation on the basis of hierarchical graph models.
ERIC Educational Resources Information Center
van Eijck, Michiel; Goedhart, Martin J.; Ellermeijer, Ton
2011-01-01
Polysemy in graph-related practices is the phenomenon that a single graph can sustain different meanings assigned to it. Considerable research has been done on polysemy in graph-related practices in school science in which graphs are rather used as scientific tools. However, graphs in science textbooks are also used rather pedagogically to…
Lamplighter groups, de Bruijn graphs, spider-web graphs and their spectra
NASA Astrophysics Data System (ADS)
Grigorchuk, R.; Leemann, P.-H.; Nagnibeda, T.
2016-05-01
We study the infinite family of spider-web graphs S_{k,N,M}, k ≥ 2, N ≥ 0 and M ≥ 1, initiated in the 50s in the context of network theory. It was later shown in the physics literature that these graphs have remarkable percolation and spectral properties. We provide a mathematical explanation of these properties by putting the spider-web graphs in the context of group theory and algebraic graph theory. Namely, we realize them as tensor products of the well-known de Bruijn graphs B_{k,N} with cyclic graphs C_M and show that these graphs are described by the action of the lamplighter group L_k = Z/kZ ≀ Z on the infinite binary tree. Our main result is the identification of the infinite limit of S_{k,N,M}, as N, M → ∞, with the Cayley graph of the lamplighter group L_k which, in turn, is one of the famous Diestel-Leader graphs DL_{k,k}. As an application we compute the spectra of all spider-web graphs and show their convergence to the discrete spectral distribution associated with the Laplacian on the lamplighter group.
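The two building blocks named above (de Bruijn graphs and cycles) and their tensor product can be assembled in a few lines; the following Python sketch is one plausible reading of that construction for small parameters, with illustrative function names, and is not taken from the paper.

# Hedged sketch: k-ary de Bruijn graph on length-N words, a directed M-cycle,
# and their tensor (categorical) product.
from itertools import product

def de_bruijn_edges(k, N):
    words = [''.join(w) for w in product(map(str, range(k)), repeat=N)]
    return [(w, w[1:] + c) for w in words for c in map(str, range(k))]

def cycle_edges(M):
    return [(i, (i + 1) % M) for i in range(M)]

def tensor_product(edges1, edges2):
    # edge ((u, x) -> (v, y)) exists iff u -> v and x -> y
    return [((u, x), (v, y)) for u, v in edges1 for x, y in edges2]

print(len(tensor_product(de_bruijn_edges(2, 2), cycle_edges(3))))   # 8 * 3 = 24 edges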
SAS program for quantitative stratigraphic correlation by principal components
Hohn, M.E.
1985-01-01
A SAS program is presented which constructs a composite section of stratigraphic events through principal components analysis. The variables in the analysis are stratigraphic sections and the observational units are the range limits of taxa. The program standardizes the data in each section, extracts eigenvectors, estimates missing range limits, and computes the composite section from the scores of events on the first principal component. An option for several types of diagnostic plots is provided; these help one determine conservative range limits or identify unrealistic estimates of missing values. Inspection of the graphs and eigenvalues allows one to evaluate the goodness of fit between the composite and the measured data. The program is easily extended to the creation of a rank-order composite. © 1985.
Laser anemometry for hot flows
NASA Astrophysics Data System (ADS)
Kugler, P.; Langer, G.
1987-07-01
The fundamental principles, instrumentation, and practical operation of LDA and laser-transit-anemometry systems for measuring velocity profiles and the degree of turbulence in high-temperature flows are reviewed and illustrated with diagrams, drawings and graphs of typical data. Consideration is given to counter, tracker, spectrum-analyzer and correlation methods of LDA signal processing; multichannel analyzer and cross correlation methods for LTA data; LTA results for a small liquid fuel rocket motor; and experiments demonstrating the feasibility of an optoacoustic demodulation scheme for LDA signals from unsteady flows.
A chemical definition of the boundary of the Antarctic ozone hole
NASA Technical Reports Server (NTRS)
Proffitt, M. H.; Powell, J. A.; Tuck, A. F.; Fahey, D. W.; Kelly, K. K.; Krueger, A. J.; Schoeberl, M. R.; Gary, B. L.; Margitan, J. J.; Chan, K. R.
1989-01-01
A program designed to study the Antarctic ozone hole using ER-2 high-altitude and DC-8 aircraft was conducted out of Punta Arenas, Chile during August 17-September 22, 1987. Graphs are presented of ozone and chlorine monoxide when crossing the boundary of the chemically perturbed region on August 23 and on September 21. Interpretations of ClO, H2O, and N2O measurements are presented, indicating ongoing diabatic cooling and advective poleward transport across the boundary.
Fact Book 1981-82. State University System of Florida.
ERIC Educational Resources Information Center
Florida State Board of Regents, Tallahassee.
Data presented on the State University System (SUS) of Florida are presented in the form of tabular displays, charts, graphs, and a glossary. Preliminary sections list members of the State Board of Education and the Florida Board of Regents, provide a description of the State University System of Florida, and list measures used for reporting…
Students Exiting School, 1993-94. Programs for Exceptional Students Data Report, Series 97-12.
ERIC Educational Resources Information Center
Florida State Dept. of Education, Tallahassee. Div. of Public Schools.
This document presents narrative data, tables, and graphs on exceptional education students who exited school in Florida during the 1993-94 school year. The first section provides background information and describes the types of exit data collected from school districts and agency programs. Definitions of terms are also provided. The next three…
Sketching the General Quadratic Equation Using Dynamic Geometry Software
ERIC Educational Resources Information Center
Stols, G. H.
2005-01-01
This paper explores a geometrical way to sketch graphs of the general quadratic in two variables with Geometer's Sketchpad. To do this, a geometric procedure as described by De Temple is used, bearing in mind that this general quadratic equation (1) represents all the possible conics (conics sections), and the fact that five points (no three of…
Ohio Information Package: Community and Natural Resource Development. Bulletin 698, March 1989.
ERIC Educational Resources Information Center
Heimlich, Joe E., Comp.; And Others
This booklet consists almost entirely of demographic data on Ohio presented in the form of charts and graphs. The information, for the most part, focuses on the period from 1980 to 1987 and is categorized into five sections: Population, Households, Families and Health; Employment; Income and Taxes; and Miscellaneous Ohio Information. Much of the…
Biometric Subject Verification Based on Electrocardiographic Signals
NASA Technical Reports Server (NTRS)
Dusan, Sorin V. (Inventor); Jorgensen, Charles C. (Inventor)
2014-01-01
A method of authenticating or declining to authenticate an asserted identity of a candidate-person. In an enrollment phase, a reference PQRST heart action graph is provided or constructed from information obtained from a plurality of graphs that resemble each other for a known reference person, using a first graph comparison metric. In a verification phase, a candidate-person asserts his/her identity and presents a plurality of his/her heart cycle graphs. If a sufficient number of the candidate-person's measured graphs resemble each other, a representative composite graph is constructed from the candidate-person's graphs and is compared with a composite reference graph, for the person whose identity is asserted, using a second graph comparison metric. When the second metric value lies in a selected range, the candidate-person's assertion of identity is accepted.
EvoGraph: On-The-Fly Efficient Mining of Evolving Graphs on GPU
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sengupta, Dipanjan; Song, Shuaiwen
With the prevalence of the World Wide Web and social networks, there has been a growing interest in high performance analytics for constantly-evolving dynamic graphs. Modern GPUs provide a massive amount of parallelism for efficient graph processing, but challenges remain due to their lack of support for the near real-time streaming nature of dynamic graphs. Specifically, due to the current high volume and velocity of graph data combined with the complexity of user queries, traditional processing methods that first store the updates and then repeatedly run static graph analytics on a sequence of versions or snapshots are deemed undesirable and computationally infeasible on GPU. We present EvoGraph, a highly efficient and scalable GPU-based dynamic graph analytics framework.
Genome alignment with graph data structures: a comparison
2014-01-01
Background Recent advances in rapid, low-cost sequencing have opened up the opportunity to study complete genome sequences. The computational approach of multiple genome alignment allows investigation of evolutionarily related genomes in an integrated fashion, providing a basis for downstream analyses such as rearrangement studies and phylogenetic inference. Graphs have proven to be a powerful tool for coping with the complexity of genome-scale sequence alignments. The potential of graphs to intuitively represent all aspects of genome alignments led to the development of graph-based approaches for genome alignment. These approaches construct a graph from a set of local alignments, and derive a genome alignment through identification and removal of graph substructures that indicate errors in the alignment. Results We compare the structures of commonly used graphs in terms of their abilities to represent alignment information. We describe how the graphs can be transformed into each other, and identify and classify graph substructures common to one or more graphs. Based on previous approaches, we compile a list of modifications that remove these substructures. Conclusion We show that crucial pieces of alignment information, associated with inversions and duplications, are not visible in the structure of all graphs. If we neglect vertex or edge labels, the graphs differ in their information content. Still, many ideas are shared among all graph-based approaches. Based on these findings, we outline a conceptual framework for graph-based genome alignment that can assist in the development of future genome alignment tools. PMID:24712884
Efficient dynamic graph construction for inductive semi-supervised learning.
Dornaika, F; Dahbi, R; Bosaghzadeh, A; Ruichek, Y
2017-10-01
Most of graph construction techniques assume a transductive setting in which the whole data collection is available at construction time. Addressing graph construction for inductive setting, in which data are coming sequentially, has received much less attention. For inductive settings, constructing the graph from scratch can be very time consuming. This paper introduces a generic framework that is able to make any graph construction method incremental. This framework yields an efficient and dynamic graph construction method that adds new samples (labeled or unlabeled) to a previously constructed graph. As a case study, we use the recently proposed Two Phase Weighted Regularized Least Square (TPWRLS) graph construction method. The paper has two main contributions. First, we use the TPWRLS coding scheme to represent new sample(s) with respect to an existing database. The representative coefficients are then used to update the graph affinity matrix. The proposed method not only appends the new samples to the graph but also updates the whole graph structure by discovering which nodes are affected by the introduction of new samples and by updating their edge weights. The second contribution of the article is the application of the proposed framework to the problem of graph-based label propagation using multiple observations for vision-based recognition tasks. Experiments on several image databases show that, without any significant loss in the accuracy of the final classification, the proposed dynamic graph construction is more efficient than the batch graph construction. Copyright © 2017 Elsevier Ltd. All rights reserved.
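A generic illustration of incremental graph construction is given below; it is not the TPWRLS method described above but a simple k-nearest-neighbour update, assuming numpy is available, and the function names and Gaussian affinity are illustrative assumptions.

# Hedged sketch: when a new sample arrives, only its own k nearest neighbours gain
# edges, so the existing affinity matrix is extended rather than rebuilt from scratch.
import numpy as np

def add_sample(X, W, x_new, k=3, sigma=1.0):
    """X: (n, d) existing samples, W: (n, n) affinity matrix; returns enlarged X and W."""
    d2 = np.sum((X - x_new) ** 2, axis=1)
    nbrs = np.argsort(d2)[:k]
    w_new = np.zeros(len(X))
    w_new[nbrs] = np.exp(-d2[nbrs] / (2 * sigma ** 2))   # Gaussian affinities to k neighbours
    W_out = np.zeros((len(X) + 1, len(X) + 1))
    W_out[:-1, :-1] = W
    W_out[-1, :-1] = W_out[:-1, -1] = w_new
    return np.vstack([X, x_new]), W_out

X, W = np.random.rand(10, 2), np.zeros((10, 10))
X2, W2 = add_sample(X, W, np.array([0.5, 0.5]))
print(W2.shape)   # (11, 11)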
JavaGenes: Evolving Graphs with Crossover
NASA Technical Reports Server (NTRS)
Globus, Al; Atsatt, Sean; Lawton, John; Wipke, Todd
2000-01-01
Genetic algorithms usually use string or tree representations. We have developed a novel crossover operator for a directed and undirected graph representation, and used this operator to evolve molecules and circuits. Unlike strings or trees, a single point in the representation cannot divide every possible graph into two parts, because graphs may contain cycles. Thus, the crossover operator is non-trivial. A steady-state, tournament-selection genetic algorithm code (JavaGenes) was written to implement and test the graph crossover operator. All runs were executed by cycle-scavenging on networked workstations using the Condor batch processing system. The JavaGenes code has evolved pharmaceutical drug molecules and simple digital circuits. Results to date suggest that JavaGenes can evolve moderate-sized drug molecules and very small circuits in reasonable time. The algorithm has greater difficulty with somewhat larger circuits, suggesting that directed graphs (circuits) are more difficult to evolve than undirected graphs (molecules), although necessary differences in the crossover operator may also explain the results. In principle, JavaGenes should be able to evolve other graph-representable systems, such as transportation networks, metabolic pathways, and computer networks. However, large graphs evolve significantly slower than smaller graphs, presumably because the space of all graphs explodes combinatorially with graph size. Since the representation strongly affects genetic algorithm performance, adding graphs to the evolutionary programmer's bag of tricks should be beneficial. Also, since graph evolution operates directly on the phenotype, the genotype-phenotype translation step, common in genetic algorithm work, is eliminated.
Chemical Applications of Graph Theory: Part II. Isomer Enumeration.
ERIC Educational Resources Information Center
Hansen, Peter J.; Jurs, Peter C.
1988-01-01
Discusses the use of graph theory to aid in the depiction of organic molecular structures. Gives a historical perspective of graph theory and explains graph theory terminology with organic examples. Lists applications of graph theory to current research projects. (ML)
Graphing trillions of triangles
Burkhardt, Paul
2016-01-01
The increasing size of Big Data is often heralded but how data are transformed and represented is also profoundly important to knowledge discovery, and this is exemplified in Big Graph analytics. Much attention has been placed on the scale of the input graph but the product of a graph algorithm can be many times larger than the input. This is true for many graph problems, such as listing all triangles in a graph. Enabling scalable graph exploration for Big Graphs requires new approaches to algorithms, architectures, and visual analytics. A brief tutorial is given to aid the argument for thoughtful representation of data in the context of graph analysis. Then a new algebraic method to reduce the arithmetic operations in counting and listing triangles in graphs is introduced. Additionally, a scalable triangle listing algorithm in the MapReduce model will be presented followed by a description of the experiments with that algorithm that led to the current largest and fastest triangle listing benchmarks to date. Finally, a method for identifying triangles in new visual graph exploration technologies is proposed. PMID:28690426
Exploring Text and Icon Graph Interpretation in Students with Dyslexia: An Eye-tracking Study.
Kim, Sunjung; Wiseheart, Rebecca
2017-02-01
A growing body of research suggests that individuals with dyslexia struggle to use graphs efficiently. Given the persistence of orthographic processing deficits in dyslexia, this study tested whether graph interpretation deficits in dyslexia are directly related to difficulties processing the orthographic components of graphs (i.e. axes and legend labels). Participants were 80 college students with and without dyslexia. Response times and eye movements were recorded as students answered comprehension questions about simple data displayed in bar graphs. Axes and legends were labelled either with words (mixed-modality graphs) or icons (orthography-free graphs). Students also answered informationally equivalent questions presented in sentences (orthography-only condition). Response times were slower in the dyslexic group only for processing sentences. However, eye-tracking data revealed group differences for processing mixed-modality graphs, whereas no group differences were found for the orthography-free graphs. When processing bar graphs, students with dyslexia differ from their able reading peers only when graphs contain orthographic features. Implications for processing informational text are discussed. Copyright © 2017 John Wiley & Sons, Ltd.
Experiments on Antiprotons: Antiproton-Nucleon Cross Sections
DOE R&D Accomplishments Database
Chamberlain, Owen; Keller, Donald V.; Mermond, Ronald; Segre, Emilio; Steiner, Herbert M.; Ypsilantis, Tom
1957-07-22
In this paper, experiments are reported on the annihilation and scattering of antiprotons in H2O, D2O, and O2. From the measured data it is possible to obtain an antiproton-proton and an antiproton-deuteron cross section at 457 MeV (lab). Further analysis gives the p-p and p-n cross sections as 104 mb for the p-p reaction cross section and 113 mb for the p-n reaction cross section. The respective annihilation cross sections are 89 and 74 mb. The Glauber correction necessary in order to pass from the p-d to the p-n cross section by subtraction of the p-p cross section is unfortunately large and somewhat uncertain. The data are compared with the p-p and p-n cross sections and with other results on p-p collisions.
ERIC Educational Resources Information Center
Conway, Lorraine
This packet of student materials contains a variety of worksheet activities dealing with science graphs and science word games. These reproducible materials deal with: (1) bar graphs; (2) line graphs; (3) circle graphs; (4) pictographs; (5) histograms; (6) artgraphs; (7) designing your own graphs; (8) medical prefixes; (9) color prefixes; (10)…
Study of BenW (n = 1-12) clusters: An electron collision perspective
NASA Astrophysics Data System (ADS)
Modak, Paresh; Kaur, Jaspreet; Antony, Bobby
2017-08-01
This article explores electron scattering cross sections by Beryllium-Tungsten clusters (BenW). Beryllium and tungsten are important elements for plasma facing wall components, especially for the deuterium/tritium phase of ITER and in the recently installed JET. The present study focuses on different electron impact interactions in terms of elastic cross section (Qel), inelastic cross section (Qinel), ionization cross section (Qion), and momentum transfer cross section (Qmtcs) for the first twelve clusters belonging to the BenW family. It also predicts the evolution of the cross section with the size of the cluster. These cross sections are used as an input to model processes in plasma. The ionization cross section presented here is compared with the available reported data. This is the first comprehensive report on cross section data for all the above-mentioned scattering channels, to the best of our knowledge. Such broad analysis of cross section data gives vital insight into the study of local chemistry of electron interactions with BenW (n = 1-12) clusters in plasma.
Huang, Xiaoke; Zhao, Ye; Yang, Jing; Zhang, Chong; Ma, Chao; Ye, Xinyue
2016-01-01
We propose TrajGraph, a new visual analytics method, for studying urban mobility patterns by integrating graph modeling and visual analysis with taxi trajectory data. A special graph is created to store and manifest real traffic information recorded by taxi trajectories over city streets. It conveys urban transportation dynamics which can be discovered by applying graph analysis algorithms. To support interactive, multiscale visual analytics, a graph partitioning algorithm is applied to create region-level graphs which have smaller size than the original street-level graph. Graph centralities, including Pagerank and betweenness, are computed to characterize the time-varying importance of different urban regions. The centralities are visualized by three coordinated views including a node-link graph view, a map view and a temporal information view. Users can interactively examine the importance of streets to discover and assess city traffic patterns. We have implemented a fully working prototype of this approach and evaluated it using massive taxi trajectories of Shenzhen, China. TrajGraph's capability in revealing the importance of city streets was evaluated by comparing the calculated centralities with the subjective evaluations from a group of drivers in Shenzhen. Feedback from a domain expert was collected. The effectiveness of the visual interface was evaluated through a formal user study. We also present several examples and a case study to demonstrate the usefulness of TrajGraph in urban transportation analysis.
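The centrality step described above can be illustrated with a few lines of Python, assuming the networkx package is available; this is a toy street-like graph with made-up weights, not the TrajGraph system or its Shenzhen data.

# Hedged sketch: PageRank and betweenness centrality on a small weighted graph
# standing in for a region-level street graph.
import networkx as nx

street = nx.Graph()
street.add_weighted_edges_from([
    ("A", "B", 120), ("B", "C", 80), ("C", "D", 200), ("B", "D", 150), ("D", "E", 60),
])  # weights could be travel times aggregated from taxi trajectories

pagerank = nx.pagerank(street, weight="weight")
betweenness = nx.betweenness_centrality(street, weight="weight")
print(sorted(pagerank, key=pagerank.get, reverse=True))       # most "important" regions first
print(sorted(betweenness, key=betweenness.get, reverse=True))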
ERIC Educational Resources Information Center
Lawes, Jonathan F.
2013-01-01
Graphing polar curves typically involves a combination of three traditional techniques, all of which can be time-consuming and tedious. However, an alternative method--graphing the polar function on a rectangular plane--simplifies graphing, increases student understanding of the polar coordinate system, and reinforces graphing techniques learned…
Averaging cross section data so we can fit it
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, D.
2014-10-23
The 56Fe cross sections we are interested in have a lot of fluctuations. We would like to fit the average of the cross section with cross sections calculated within EMPIRE. EMPIRE, a Hauser-Feshbach-theory-based nuclear reaction code, requires cross sections to be smoothed using a Lorentzian profile. The plan is to fit EMPIRE to these cross sections in the fast region (say, above 500 keV).
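A hedged sketch of Lorentzian averaging of a pointwise cross section follows, assuming numpy is available; the toy cross section, energy grid and width gamma are illustrative assumptions, not the report's actual data or parameters.

# Hedged sketch: average sigma(E) with a Lorentzian weight of half-width gamma.
import numpy as np

def lorentzian_average(E, sigma, gamma):
    out = np.empty_like(sigma)
    for i, e0 in enumerate(E):
        w = gamma / ((E - e0) ** 2 + gamma ** 2)   # unnormalized Lorentzian weights centered at e0
        out[i] = np.trapz(w * sigma, E) / np.trapz(w, E)
    return out

E = np.linspace(0.5, 2.0, 400)                     # MeV
sigma = 1.0 + 0.3 * np.sin(60 * E)                 # toy fluctuating cross section
print(lorentzian_average(E, sigma, gamma=0.05)[:5])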
New cross sections for H on H2 collisional transitions
NASA Astrophysics Data System (ADS)
Zou, Qianxia
2011-12-01
The cross section for H on H2 collisions is important for astrophysics as well as our understanding of the simple chemical systems. This is the simplest atom-molecule cross section. With a new H3 potential surface by Mielke et al., we have modified the ABC code by Skouteris, Castillo and Manolopoulos to calculate new cross sections. These cross sections are compared to previous cross section calculations.
On the locating-chromatic number for graphs with two homogenous components
NASA Astrophysics Data System (ADS)
Welyyanti, Des; Baskoro, Edy Tri; Simajuntak, Rinovia; Uttunggadewa, Saladin
2017-10-01
The locating-chromatic number of a graph was introduced by Chartrand et al. in 2002. The concept of the locating-chromatic number is a marriage between graph coloring and the notion of graph partition dimension. This concept was defined only for connected graphs. In [8], we extended this concept to disconnected graphs. In this paper, we determine the locating-chromatic number of a graph with two components. In particular, we determine such values if the components are homogeneous and each component has locating-chromatic number 3.
Enabling Graph Mining in RDF Triplestores using SPARQL for Holistic In-situ Graph Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Sangkeun; Sukumar, Sreenivas R; Hong, Seokyong
Graph analysis is now considered a promising technique to discover useful knowledge in data from a new perspective. We envision two dimensions of graph analysis: OnLine Graph Analytic Processing (OLGAP) and Graph Mining (GM), which focus respectively on subgraph pattern matching and automatic knowledge discovery in graphs. Moreover, as these two dimensions aim to complementarily solve complex problems, holistic in-situ graph analysis which covers both OLGAP and GM in a single system is critical for minimizing the burdens of operating multiple graph systems and transferring intermediate result-sets between those systems. Nevertheless, most existing graph analysis systems are only capable of one dimension of graph analysis. In this work, we take an approach to enabling GM capabilities (e.g., PageRank, connected-component analysis, node eccentricity, etc.) in RDF triplestores, which were originally developed to store RDF datasets and provide OLGAP capability. More specifically, to achieve our goal, we implemented six representative graph mining algorithms using SPARQL. The approach allows a wide range of available RDF data sets to be directly applicable for holistic graph analysis within a single system. To validate our approach, we evaluate the performance of our implementations with nine real-world datasets and three different computing environments: a laptop computer, an Amazon EC2 instance, and a shared-memory Cray XMT2 URIKA-GD graph-processing appliance. The experimental results show that our implementation can provide promising and scalable performance for real-world graph analysis in all tested environments. The developed software is publicly available in an open-source project that we initiated.
Dowding, Dawn; Merrill, Jacqueline A; Onorato, Nicole; Barrón, Yolanda; Rosati, Robert J; Russell, David
2018-02-01
To explore home care nurses' numeracy and graph literacy and their relationship to comprehension of visualized data. A multifactorial experimental design using online survey software. Nurses were recruited from 2 Medicare-certified home health agencies. Numeracy and graph literacy were measured using validated scales. Nurses were randomized to 1 of 4 experimental conditions. Each condition displayed data for 1 of 4 quality indicators, in 1 of 4 different visualized formats (bar graph, line graph, spider graph, table). A mixed linear model measured the impact of numeracy, graph literacy, and display format on data understanding. In all, 195 nurses took part in the study. They were slightly more numerate and graph literate than the general population. Overall, nurses understood information presented in bar graphs most easily (88% correct), followed by tables (81% correct), line graphs (77% correct), and spider graphs (41% correct). Individuals with low numeracy and low graph literacy had poorer comprehension of information displayed across all formats. High graph literacy appeared to enhance comprehension of data regardless of numeracy capabilities. Clinical dashboards are increasingly used to provide information to clinicians in visualized format, under the assumption that visual display reduces cognitive workload. Results of this study suggest that nurses' comprehension of visualized information is influenced by their numeracy, graph literacy, and the display format of the data. Individual differences in numeracy and graph literacy skills need to be taken into account when designing dashboard technology. © The Author 2017. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com
Durand, Patrick; Labarre, Laurent; Meil, Alain; Divo, Jean-Louis; Vandenbrouck, Yves; Viari, Alain; Wojcik, Jérôme
2006-01-17
A large variety of biological data can be represented by graphs. These graphs can be constructed from heterogeneous data coming from genomic and post-genomic technologies, but there is still a need for tools for exploring and analysing such graphs. This paper describes GenoLink, a software platform for the graphical querying and exploration of graphs. GenoLink provides a generic framework for representing and querying data graphs. This framework provides a graph data structure, a graph query engine that allows sub-graphs to be retrieved from the entire data graph, and several graphical interfaces to express such queries and to further explore their results. A query consists of a graph pattern with constraints attached to the vertices and edges. A query result is the set of all sub-graphs of the entire data graph that are isomorphic to the pattern and satisfy the constraints. The graph data structure does not rely upon any particular data model but can dynamically accommodate any user-supplied data model. However, for genomic and post-genomic applications, we provide a default data model and several parsers for the most popular data sources. GenoLink does not require any programming skill since all operations on graphs and the analysis of the results can be carried out graphically through several dedicated graphical interfaces. GenoLink is a generic and interactive tool allowing biologists to graphically explore various sources of information. GenoLink is distributed either as a standalone application or as a component of the Genostar/Iogma platform. Both distributions are free for academic research and teaching purposes and can be requested at academy@genostar.com. A commercial licence can be obtained by for-profit companies at info@genostar.com. See also http://www.genostar.org.
Durand, Patrick; Labarre, Laurent; Meil, Alain; Divo, Jean-Louis; Vandenbrouck, Yves; Viari, Alain; Wojcik, Jérôme
2006-01-01
Background A large variety of biological data can be represented by graphs. These graphs can be constructed from heterogeneous data coming from genomic and post-genomic technologies, but there is still a need for tools for exploring and analysing such graphs. This paper describes GenoLink, a software platform for the graphical querying and exploration of graphs. Results GenoLink provides a generic framework for representing and querying data graphs. This framework provides a graph data structure, a graph query engine that allows sub-graphs to be retrieved from the entire data graph, and several graphical interfaces to express such queries and to further explore their results. A query consists of a graph pattern with constraints attached to the vertices and edges. A query result is the set of all sub-graphs of the entire data graph that are isomorphic to the pattern and satisfy the constraints. The graph data structure does not rely upon any particular data model but can dynamically accommodate any user-supplied data model. However, for genomic and post-genomic applications, we provide a default data model and several parsers for the most popular data sources. GenoLink does not require any programming skill since all operations on graphs and the analysis of the results can be carried out graphically through several dedicated graphical interfaces. Conclusion GenoLink is a generic and interactive tool allowing biologists to graphically explore various sources of information. GenoLink is distributed either as a standalone application or as a component of the Genostar/Iogma platform. Both distributions are free for academic research and teaching purposes and can be requested at academy@genostar.com. A commercial licence can be obtained by for-profit companies at info@genostar.com. See also http://www.genostar.org. PMID:16417636
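As a rough, hypothetical illustration of the kind of constrained sub-graph query described above (this is not GenoLink's API), the following sketch uses networkx's VF2 matcher to find all sub-graphs of a small typed data graph that are isomorphic to a pattern graph whose vertices carry type constraints; the node names and types are invented.

```python
# Sketch of graph pattern matching with vertex type constraints (illustrative only).
import networkx as nx
from networkx.algorithms import isomorphism

data = nx.Graph()
data.add_node("geneA", type="gene")
data.add_node("protA", type="protein")
data.add_node("protB", type="protein")
data.add_edges_from([("geneA", "protA"), ("protA", "protB")])

# Pattern: a gene connected to a protein.
pattern = nx.Graph()
pattern.add_node("g", type="gene")
pattern.add_node("p", type="protein")
pattern.add_edge("g", "p")

matcher = isomorphism.GraphMatcher(
    data, pattern, node_match=isomorphism.categorical_node_match("type", None)
)
for mapping in matcher.subgraph_isomorphisms_iter():
    print(mapping)   # e.g. {'geneA': 'g', 'protA': 'p'}
```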
Enabling Graph Mining in RDF Triplestores using SPARQL for Holistic In-situ Graph Analysis
Lee, Sangkeun; Sukumar, Sreenivas R; Hong, Seokyong; ...
2016-01-01
Graph analysis is now considered a promising technique for discovering useful knowledge in data from a new perspective. We envision two dimensions of graph analysis: OnLine Graph Analytic Processing (OLGAP) and Graph Mining (GM), which focus respectively on subgraph pattern matching and automatic knowledge discovery in graphs. Moreover, as these two dimensions aim to complementarily solve complex problems, holistic in-situ graph analysis that covers both OLGAP and GM in a single system is critical for minimizing the burdens of operating multiple graph systems and transferring intermediate result sets between those systems. Nevertheless, most existing graph analysis systems are only capable of one dimension of graph analysis. In this work, we take an approach to enabling GM capabilities (e.g., PageRank, connected-component analysis, node eccentricity, etc.) in RDF triplestores, which were originally developed to store RDF datasets and provide OLGAP capability. More specifically, to achieve our goal, we implemented six representative graph mining algorithms using SPARQL. The approach makes a wide range of available RDF datasets directly applicable to holistic graph analysis within a single system. To validate our approach, we evaluate the performance of our implementations with nine real-world datasets and three different computing environments: a laptop computer, an Amazon EC2 instance, and a shared-memory Cray XMT2 URIKA-GD graph-processing appliance. The experimental results show that our implementation can provide promising and scalable performance for real-world graph analysis in all tested environments. The developed software is publicly available in an open-source project that we initiated.
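The entry above describes running graph-mining algorithms directly over RDF data. As a rough illustration of the general idea (not the authors' SPARQL-only implementation), the sketch below pulls an edge list out of a tiny RDF graph with a SPARQL query and then runs PageRank on the extracted structure; it assumes the rdflib and networkx Python packages, and the ex:links predicate is an invented example.

```python
# OLGAP-style step: a SPARQL pattern query selects the edges of interest from RDF;
# GM-style step: mine the extracted graph (here with networkx instead of SPARQL).
import rdflib
import networkx as nx

TTL = """
@prefix ex: <http://example.org/> .
ex:a ex:links ex:b . ex:b ex:links ex:c . ex:c ex:links ex:a . ex:a ex:links ex:c .
"""

rdf = rdflib.Graph()
rdf.parse(data=TTL, format="turtle")

edges = rdf.query("SELECT ?s ?o WHERE { ?s <http://example.org/links> ?o }")

G = nx.DiGraph()
G.add_edges_from((str(s), str(o)) for s, o in edges)
print(nx.pagerank(G))
```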
Molecular graph convolutions: moving beyond fingerprints.
Kearnes, Steven; McCloskey, Kevin; Berndl, Marc; Pande, Vijay; Riley, Patrick
2016-08-01
Molecular "fingerprints" encoding structural information are the workhorse of cheminformatics and machine learning in drug discovery applications. However, fingerprint representations necessarily emphasize particular aspects of the molecular structure while ignoring others, rather than allowing the model to make data-driven decisions. We describe molecular graph convolutions, a machine learning architecture for learning from undirected graphs, specifically small molecules. Graph convolutions use a simple encoding of the molecular graph-atoms, bonds, distances, etc.-which allows the model to take greater advantage of information in the graph structure. Although graph convolutions do not outperform all fingerprint-based methods, they (along with other graph-based methods) represent a new paradigm in ligand-based virtual screening with exciting opportunities for future improvement.
Mutual proximity graphs for improved reachability in music recommendation.
Flexer, Arthur; Stevens, Jeff
2018-01-01
This paper is concerned with the impact of hubness, a general problem of machine learning in high-dimensional spaces, on a real-world music recommendation system based on visualisation of a k-nearest neighbour (knn) graph. Due to a problem of measuring distances in high dimensions, hub objects are recommended over and over again while anti-hubs are nonexistent in recommendation lists, resulting in poor reachability of the music catalogue. We present mutual proximity graphs, which are an alternative to knn and mutual knn graphs, and are able to avoid hub vertices having abnormally high connectivity. We show that mutual proximity graphs yield much better graph connectivity resulting in improved reachability compared to knn graphs, mutual knn graphs and mutual knn graphs enhanced with minimum spanning trees, while simultaneously reducing the negative effects of hubness.
Mutual proximity graphs for improved reachability in music recommendation
Flexer, Arthur; Stevens, Jeff
2018-01-01
This paper is concerned with the impact of hubness, a general problem of machine learning in high-dimensional spaces, on a real-world music recommendation system based on visualisation of a k-nearest neighbour (knn) graph. Due to a problem of measuring distances in high dimensions, hub objects are recommended over and over again while anti-hubs are nonexistent in recommendation lists, resulting in poor reachability of the music catalogue. We present mutual proximity graphs, which are an alternative to knn and mutual knn graphs, and are able to avoid hub vertices having abnormally high connectivity. We show that mutual proximity graphs yield much better graph connectivity resulting in improved reachability compared to knn graphs, mutual knn graphs and mutual knn graphs enhanced with minimum spanning trees, while simultaneously reducing the negative effects of hubness. PMID:29348779
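A minimal numerical sketch of the mutual-proximity idea follows (an empirical variant assuming independence of the two distance distributions, not the authors' full recommender pipeline): pairwise distances are rescaled so that two items are close only if each lies among the other's nearest items, and the strongest links then define the graph.

```python
# Mutual proximity: MP(x, y) ~ P(d(x, .) > d(x, y)) * P(d(y, .) > d(y, x)).
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 20))                        # 50 items in a high-dim space
D = np.linalg.norm(X[:, None] - X[None, :], axis=2)  # pairwise distances

# px[x, y] = fraction of items z with d(x, z) > d(x, y).
px = (D[:, None, :] > D[:, :, None]).mean(axis=2)
MP = px * px.T                                       # mutual proximity in [0, 1]
np.fill_diagonal(MP, 0.0)

k = 5
neighbours = np.argsort(-MP, axis=1)[:, :k]          # edges of the MP graph
print(neighbours[0])
```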
A computer program for analyzing channel geometry
Regan, R.S.; Schaffranek, R.W.
1985-01-01
The Channel Geometry Analysis Program (CGAP) provides the capability to process, analyze, and format cross-sectional data for input to flow/transport simulation models or other computational programs. CGAP allows for a variety of cross-sectional data input formats through use of variable format specification. The program accepts data from various computer media and provides for modification of machine-stored parameter values. CGAP has been devised to provide a rapid and efficient means of computing and analyzing the physical properties of an open-channel reach defined by a sequence of cross sections. CGAP 's 16 options provide a wide range of methods by which to analyze and depict a channel reach and its individual cross-sectional properties. The primary function of the program is to compute the area, width, wetted perimeter, and hydraulic radius of cross sections at successive increments of water surface elevation (stage) from data that consist of coordinate pairs of cross-channel distances and land surface or channel bottom elevations. Longitudinal rates-of-change of cross-sectional properties are also computed, as are the mean properties of a channel reach. Output products include tabular lists of cross-sectional area, channel width, wetted perimeter, hydraulic radius, average depth, and cross-sectional symmetry computed as functions of stage; plots of cross sections; plots of cross-sectional area and (or) channel width as functions of stage; tabular lists of cross-sectional area and channel width computed as functions of stage for subdivisions of a cross section; plots of cross sections in isometric projection; and plots of cross-sectional area at a fixed stage as a function of longitudinal distance along an open-channel reach. A Command Procedure Language program and Job Control Language procedure exist to facilitate program execution on the U.S. Geological Survey Prime and Amdahl computer systems respectively. (Lantz-PTT)
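The core computation CGAP performs can be sketched in a few lines. The following is an illustrative re-implementation, not the CGAP code itself: given (cross-channel distance, bed elevation) coordinate pairs and a stage, it returns area, top width, wetted perimeter, and hydraulic radius.

```python
# Cross-sectional properties at a given water-surface elevation (stage).
def section_properties(xs, zs, stage):
    area = width = perim = 0.0
    for (x1, z1), (x2, z2) in zip(zip(xs, zs), zip(xs[1:], zs[1:])):
        d1, d2 = stage - z1, stage - z2          # depths at the two ends
        if d1 <= 0 and d2 <= 0:                  # segment entirely dry
            continue
        if d1 < 0 or d2 < 0:                     # segment crosses the water line:
            t = d1 / (d1 - d2)                   # interpolate the crossing point
            xc = x1 + t * (x2 - x1)
            if d1 < 0:
                x1, z1, d1 = xc, stage, 0.0
            else:
                x2, z2, d2 = xc, stage, 0.0
        dx = x2 - x1
        area += 0.5 * (d1 + d2) * dx             # trapezoidal wet area
        width += dx                              # contribution to top width
        perim += (dx ** 2 + (z2 - z1) ** 2) ** 0.5   # wetted bed length
    return area, width, perim, (area / perim if perim else 0.0)

# Example: a simple trapezoidal channel, water surface at elevation 2.0.
xs = [0.0, 2.0, 6.0, 8.0]
zs = [3.0, 0.0, 0.0, 3.0]
print(section_properties(xs, zs, 2.0))
```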
Identifying the minor set cover of dense connected bipartite graphs via random matching edge sets
NASA Astrophysics Data System (ADS)
Hamilton, Kathleen E.; Humble, Travis S.
2017-04-01
Using quantum annealing to solve an optimization problem requires minor embedding a logic graph into a known hardware graph. In an effort to reduce the complexity of the minor embedding problem, we introduce the minor set cover (MSC) of a known graph G: a subset of graph minors which contain any remaining minor of the graph as a subgraph. Any graph that can be embedded into G will be embeddable into a member of the MSC. Focusing on embedding into the hardware graph of commercially available quantum annealers, we establish the MSC for a particular known virtual hardware, which is a complete bipartite graph. We show that the complete bipartite graph K_{N,N} has a MSC of N minors, from which K_{N+1} is identified as the largest clique minor of K_{N,N}. The case of determining the largest clique minor of hardware with faults is briefly discussed but remains an open question.
Identifying the minor set cover of dense connected bipartite graphs via random matching edge sets
Hamilton, Kathleen E.; Humble, Travis S.
2017-02-23
Using quantum annealing to solve an optimization problem requires minor embedding a logic graph into a known hardware graph. We introduce the minor set cover (MSC) of a known graph G: a subset of graph minors which contain any remaining minor of the graph as a subgraph, in an effort to reduce the complexity of the minor embedding problem. Any graph that can be embedded into G will be embeddable into a member of the MSC. Focusing on embedding into the hardware graph of commercially available quantum annealers, we establish the MSC for a particular known virtual hardware, which is a complete bipartite graph. Furthermore, we show that the complete bipartite graph K_{N,N} has a MSC of N minors, from which K_{N+1} is identified as the largest clique minor of K_{N,N}. The case of determining the largest clique minor of hardware with faults is briefly discussed but remains an open question.
Constructing compact and effective graphs for recommender systems via node and edge aggregations
Lee, Sangkeun; Kahng, Minsuk; Lee, Sang-goo
2014-12-10
Exploiting graphs for recommender systems has great potential to flexibly incorporate heterogeneous information for producing better recommendation results. As our baseline approach, we first introduce a naive graph-based recommendation method, which operates with a heterogeneous log-metadata graph constructed from user log and content metadata databases. Although the naive graph-based recommendation method is simple, it allows us to take advantage of heterogeneous information and shows promising flexibility and recommendation accuracy. However, it often leads to extensive processing time due to the sheer size of the graphs constructed from entire user log and content metadata databases. In this paper, we propose node and edge aggregation approaches to constructing compact and effective graphs called Factor-Item bipartite graphs by aggregating nodes and edges of a log-metadata graph. Furthermore, experimental results using real-world datasets indicate that our approach can significantly reduce the size of graphs exploited for recommender systems without sacrificing the recommendation quality.
graphkernels: R and Python packages for graph comparison
Ghisu, M Elisabetta; Llinares-López, Felipe; Borgwardt, Karsten
2018-01-01
Abstract Summary Measuring the similarity of graphs is a fundamental step in the analysis of graph-structured data, which is omnipresent in computational biology. Graph kernels have been proposed as a powerful and efficient approach to this problem of graph comparison. Here we provide graphkernels, the first R and Python graph kernel libraries including baseline kernels such as label histogram based kernels, classic graph kernels such as random walk based kernels, and the state-of-the-art Weisfeiler-Lehman graph kernel. The core of all graph kernels is implemented in C ++ for efficiency. Using the kernel matrices computed by the package, we can easily perform tasks such as classification, regression and clustering on graph-structured samples. Availability and implementation The R and Python packages including source code are available at https://CRAN.R-project.org/package=graphkernels and https://pypi.python.org/pypi/graphkernels. Contact mahito@nii.ac.jp or elisabetta.ghisu@bsse.ethz.ch Supplementary information Supplementary data are available online at Bioinformatics. PMID:29028902
Detecting labor using graph theory on connectivity matrices of uterine EMG.
Al-Omar, S; Diab, A; Nader, N; Khalil, M; Karlsson, B; Marque, C
2015-08-01
Premature labor is one of the most serious health problems in the developed world. One of the main reasons for this is that no good way exists to distinguish true labor from normal pregnancy contractions. The aim of this paper is to investigate if the application of graph theory techniques to multi-electrode uterine EMG signals can improve the discrimination between pregnancy contractions and labor. To test our methods we first applied them to synthetic graphs where we detected some differences in the parameters results and changes in the graph model from pregnancy-like graphs to labor-like graphs. Then, we applied the same methods to real signals. We obtained the best differentiation between pregnancy and labor through the same parameters. Major improvements in differentiating between pregnancy and labor were obtained using a low pass windowing preprocessing step. Results show that real graphs generally became more organized when moving from pregnancy, where the graph showed random characteristics, to labor where the graph became a more small-world like graph.
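As a generic illustration of the kind of analysis described above (not the paper's actual preprocessing or parameters), the sketch below turns a synthetic channel-by-channel correlation matrix into a graph and reads off standard graph parameters; the electrode count, threshold, and data are invented.

```python
# From a multi-electrode connectivity matrix to graph-theoretical parameters.
import numpy as np
import networkx as nx

rng = np.random.default_rng(1)
signals = rng.normal(size=(16, 2000))             # 16 electrodes, synthetic samples
C = np.abs(np.corrcoef(signals))                  # connectivity (correlation) matrix
np.fill_diagonal(C, 0.0)

G = nx.from_numpy_array(C > np.quantile(C, 0.8))  # keep the 20% strongest links
giant = G.subgraph(max(nx.connected_components(G), key=len))

print("clustering coefficient:", nx.average_clustering(giant))
print("characteristic path length:", nx.average_shortest_path_length(giant))
```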
Gnutzmann, Sven; Waltner, Daniel
2016-12-01
We consider exact and asymptotic solutions of the stationary cubic nonlinear Schrödinger equation on metric graphs. We focus on some basic example graphs. The asymptotic solutions are obtained using the canonical perturbation formalism developed in our earlier paper [S. Gnutzmann and D. Waltner, Phys. Rev. E 93, 032204 (2016), 10.1103/PhysRevE.93.032204]. For closed example graphs (interval, ring, star graph, tadpole graph), we calculate spectral curves and show how the description of spectra reduces to known characteristic functions of linear quantum graphs in the low-intensity limit. Analogously for open examples, we show how nonlinear scattering of stationary waves arises and how it reduces to known linear scattering amplitudes at low intensities. In the short-wavelength asymptotics we discuss how genuine nonlinear effects may be described using the leading order of canonical perturbation theory: bifurcation of spectral curves (and the corresponding solutions) in closed graphs and multistability in open graphs.
A Visual Analytics Paradigm Enabling Trillion-Edge Graph Exploration
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wong, Pak C.; Haglin, David J.; Gillen, David S.
We present a visual analytics paradigm and a system prototype for exploring web-scale graphs. A web-scale graph is described as a graph with ~one trillion edges and ~50 billion vertices. While there is an aggressive R&D effort in processing and exploring web-scale graphs among internet vendors such as Facebook and Google, visualizing a graph of that scale still remains an underexplored R&D area. The paper describes a nontraditional peek-and-filter strategy that facilitates the exploration of a graph database of unprecedented size for visualization and analytics. We demonstrate that our system prototype can 1) preprocess a graph with ~25 billion edges in less than two hours and 2) support database query and visualization on the processed graph database afterward. Based on our computational performance results, we argue that we most likely will achieve the one trillion edge mark (a computational performance improvement of 40 times) for graph visual analytics in the near future.
graphkernels: R and Python packages for graph comparison.
Sugiyama, Mahito; Ghisu, M Elisabetta; Llinares-López, Felipe; Borgwardt, Karsten
2018-02-01
Measuring the similarity of graphs is a fundamental step in the analysis of graph-structured data, which is omnipresent in computational biology. Graph kernels have been proposed as a powerful and efficient approach to this problem of graph comparison. Here we provide graphkernels, the first R and Python graph kernel libraries including baseline kernels such as label histogram based kernels, classic graph kernels such as random walk based kernels, and the state-of-the-art Weisfeiler-Lehman graph kernel. The core of all graph kernels is implemented in C ++ for efficiency. Using the kernel matrices computed by the package, we can easily perform tasks such as classification, regression and clustering on graph-structured samples. The R and Python packages including source code are available at https://CRAN.R-project.org/package=graphkernels and https://pypi.python.org/pypi/graphkernels. mahito@nii.ac.jp or elisabetta.ghisu@bsse.ethz.ch. Supplementary data are available online at Bioinformatics. © The Author(s) 2017. Published by Oxford University Press.
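To make the simplest kernel named above concrete without guessing the package's exact call signatures, here is a hand-rolled vertex label histogram kernel on networkx graphs; the example graphs and labels are invented.

```python
# K(g1, g2) = dot product of the two graphs' vertex-label histograms.
from collections import Counter
import networkx as nx

def label_histogram_kernel(g1, g2, label="label"):
    h1 = Counter(nx.get_node_attributes(g1, label).values())
    h2 = Counter(nx.get_node_attributes(g2, label).values())
    return sum(h1[l] * h2[l] for l in h1)

g1 = nx.path_graph(3)
nx.set_node_attributes(g1, {0: "C", 1: "O", 2: "C"}, "label")
g2 = nx.cycle_graph(4)
nx.set_node_attributes(g2, {0: "C", 1: "C", 2: "N", 3: "O"}, "label")

print(label_histogram_kernel(g1, g2))   # 2*2 (C) + 1*1 (O) = 5
```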
NASA Technical Reports Server (NTRS)
Klutz, Glenn
1989-01-01
A facility was established that uses collected data and feeds it into mathematical models that generate improved data arrays by correcting for various losses and baseline drift and converting to unity scaling. These developed data arrays have headers and other identifying information affixed and are subsequently stored in a Laser Materials and Characteristics data base which is accessible to various users. The two-part data base (absorption-emission spectra and tabulated data) is developed around twelve laser models. The tabulated section of the data base is divided into several parts: crystalline, optical, mechanical, and thermal properties; absorption and emission spectra information; chemical names and formulas; and miscellaneous. A menu-driven, language-free graphing program will reduce and/or remove the requirement that users become competent FORTRAN programmers and the concomitant requirement that they also spend several days to a few weeks becoming conversant with the GEOGRAF library and sequence of calls, and the continual refreshers of both. The work included becoming thoroughly conversant with, or at least very familiar with, GEOGRAF by GEOCOMP Corp. The development of the graphing program involved trial runs of the various callable library routines on dummy data in order to become familiar with actual implementation and sequencing. This was followed by trial runs with actual data base files and some additional data from current research that was not in the data base but currently needed graphs. After successful runs with dummy and real data using actual FORTRAN instructions, steps were undertaken to develop the menu-driven, language-free implementation of a program that would require the user only to know how to use microcomputers. The user would simply respond to items displayed on the video screen. To assist the user in arriving at the optimum values needed for a specific graph, a paper-and-pencil checklist was made available for use on the trial runs.
Continuous-time quantum walks on star graphs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Salimi, S.
2009-06-15
In this paper, we investigate continuous-time quantum walks on star graphs. It is shown that the quantum central limit theorem for a continuous-time quantum walk on the N-fold star power graph, which is invariant under the quantum component of the adjacency matrix, converges to the continuous-time quantum walk on the K_2 graph (the complete graph with two vertices), and the probability of observing the walk tends to the uniform distribution.
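A direct numerical sketch of a continuous-time quantum walk on a star graph is shown below (a generic simulation using the adjacency matrix as generator; it does not reproduce the paper's analytical treatment via the quantum central limit theorem). It assumes numpy, networkx, and scipy; the graph size and times are arbitrary.

```python
# CTQW: |psi(t)> = exp(-i A t) |psi(0)>, probabilities are |amplitude|^2.
import numpy as np
import networkx as nx
from scipy.linalg import expm

N = 6                                    # star graph: one hub + N leaves
A = nx.to_numpy_array(nx.star_graph(N))  # adjacency matrix (hub is node 0)

psi0 = np.zeros(N + 1, dtype=complex)
psi0[0] = 1.0                            # walker starts at the hub

for t in (0.5, 1.0, 5.0):
    psi_t = expm(-1j * A * t) @ psi0     # U(t) = exp(-iAt)
    prob = np.abs(psi_t) ** 2
    print(f"t={t}: P(hub)={prob[0]:.3f}, P(each leaf)={prob[1]:.3f}")
```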
Matching Extension in Regular Graphs
1989-01-01
Convex Graph Invariants
Chandrasekaran, Venkat; Parrilo, Pablo A.; Willsky, Alan S.
2010-12-02
In this paper we study convex graph invariants, which are graph invariants that are convex functions of the adjacency matrix of a graph. Some examples…
Application-Specific Graph Sampling for Frequent Subgraph Mining and Community Detection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Purohit, Sumit; Choudhury, Sutanay; Holder, Lawrence B.
Graph mining is an important data analysis methodology, but struggles as the input graph size increases. The scalability and usability challenges posed by such large graphs make it imperative to sample the input graph and reduce its size. The critical challenge in sampling is to identify the appropriate algorithm to ensure the resulting analysis does not suffer heavily from the data reduction. Predicting the expected performance degradation for a given graph and sampling algorithm is also useful. In this paper, we present different sampling approaches for graph mining applications such as Frequent Subgraph Mining (FSM) and Community Detection (CD). We explore graph metrics such as PageRank, Triangles, and Diversity to sample a graph and conclude that for heterogeneous graphs Triangles and Diversity perform better than degree-based metrics. We also present two new sampling variations for targeted graph mining applications. We present empirical results to show that knowledge of the target application, along with input graph properties, can be used to select the best sampling algorithm. We also conclude that performance degradation is an abrupt, rather than gradual, phenomenon as the sample size decreases. We present empirical results to show that the performance degradation follows a logistic function.
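A minimal sketch of metric-driven sampling in the spirit described above (not the paper's implementation): vertices are ranked by PageRank or triangle count and the induced subgraph on the top fraction is retained before mining. The generator graph and fractions are placeholders.

```python
# Keep the induced subgraph on the vertices that score highest under a chosen metric.
import networkx as nx

def sample_by_metric(G, fraction=0.3, metric="pagerank"):
    scores = nx.pagerank(G) if metric == "pagerank" else nx.triangles(G)
    keep = sorted(scores, key=scores.get, reverse=True)[: int(fraction * len(G))]
    return G.subgraph(keep).copy()

G = nx.barabasi_albert_graph(1000, 3, seed=42)   # stand-in for a large input graph
S = sample_by_metric(G, fraction=0.2, metric="triangles")
print(len(G), "->", len(S), "nodes;", G.number_of_edges(), "->", S.number_of_edges(), "edges")
```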
Graph characterization via Ihara coefficients.
Ren, Peng; Wilson, Richard C; Hancock, Edwin R
2011-02-01
The novel contributions of this paper are twofold. First, we demonstrate how to characterize unweighted graphs in a permutation-invariant manner using the polynomial coefficients from the Ihara zeta function, i.e., the Ihara coefficients. Second, we generalize the definition of the Ihara coefficients to edge-weighted graphs. For an unweighted graph, the Ihara zeta function is the reciprocal of a quasi characteristic polynomial of the adjacency matrix of the associated oriented line graph. Since the Ihara zeta function has poles that give rise to infinities, the most convenient numerically stable representation is to work with the coefficients of the quasi characteristic polynomial. Moreover, the polynomial coefficients are invariant to vertex order permutations and also convey information concerning the cycle structure of the graph. To generalize the representation to edge-weighted graphs, we make use of the reduced Bartholdi zeta function. We prove that the computation of the Ihara coefficients for unweighted graphs is a special case of our proposed method for unit edge weights. We also present a spectral analysis of the Ihara coefficients and indicate their advantages over other graph spectral methods. We apply the proposed graph characterization method to capturing graph-class structure and clustering graphs. Experimental results reveal that the Ihara coefficients are more effective than methods based on Laplacian spectra.
Kwon, Oh-Hyun; Crnovrsanin, Tarik; Ma, Kwan-Liu
2018-01-01
Using different methods for laying out a graph can lead to very different visual appearances, with which the viewer perceives different information. Selecting a "good" layout method is thus important for visualizing a graph. The selection can be highly subjective and dependent on the given task. A common approach to selecting a good layout is to use aesthetic criteria and visual inspection. However, fully calculating various layouts and their associated aesthetic metrics is computationally expensive. In this paper, we present a machine learning approach to large graph visualization based on computing the topological similarity of graphs using graph kernels. For a given graph, our approach can show what the graph would look like in different layouts and estimate their corresponding aesthetic metrics. An important contribution of our work is the development of a new framework to design graph kernels. Our experimental study shows that our estimation calculation is considerably faster than computing the actual layouts and their aesthetic metrics. Also, our graph kernels outperform the state-of-the-art ones in both time and accuracy. In addition, we conducted a user study to demonstrate that the topological similarity computed with our graph kernel matches perceptual similarity assessed by human users.
Knowledge Representation Issues in Semantic Graphs for Relationship Detection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barthelemy, M; Chow, E; Eliassi-Rad, T
2005-02-02
An important task for Homeland Security is the prediction of threat vulnerabilities, such as through the detection of relationships between seemingly disjoint entities. A structure used for this task is a "semantic graph", also known as a "relational data graph" or an "attributed relational graph". These graphs encode relationships as typed links between a pair of typed nodes. Indeed, semantic graphs are very similar to semantic networks used in AI. The node and link types are related through an ontology graph (also known as a schema). Furthermore, each node has a set of attributes associated with it (e.g., "age" may be an attribute of a node of type "person"). Unfortunately, the selection of types and attributes for both nodes and links depends on human expertise and is somewhat subjective and even arbitrary. This subjectiveness introduces biases into any algorithm that operates on semantic graphs. Here, we raise some knowledge representation issues for semantic graphs and provide some possible solutions using recently developed ideas in the field of complex networks. In particular, we use the concept of transitivity to evaluate the relevance of individual links in the semantic graph for detecting relationships. We also propose new statistical measures for semantic graphs and illustrate these semantic measures on graphs constructed from movies and terrorism data.
Simple graph models of information spread in finite populations
Voorhees, Burton; Ryder, Bergerud
2015-01-01
We consider several classes of simple graphs as potential models for information diffusion in a structured population. These include biased cycles, dual circular flows, partial bipartite graphs and what we call ‘single-link’ graphs. In addition to fixation probabilities, we study structure parameters for these graphs, including eigenvalues of the Laplacian, conductances, communicability and expected hitting times. In several cases, values of these parameters are related, most strongly so for partial bipartite graphs. A measure of directional bias in cycles and circular flows arises from the non-zero eigenvalues of the antisymmetric part of the Laplacian and another measure is found for cycles as the value of the transition probability for which hitting times going in either direction of the cycle are equal. A generalization of circular flow graphs is used to illustrate the possibility of tuning edge weights to match pre-specified values for graph parameters; in particular, we show that generalizations of circular flows can be tuned to have fixation probabilities equal to the Moran probability for a complete graph by tuning vertex temperature profiles. Finally, single-link graphs are introduced as an example of a graph involving a bottleneck in the connection between two components and these are compared to the partial bipartite graphs. PMID:26064661
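Two of the structure parameters mentioned above can be computed directly for a small biased cycle, as in the illustrative sketch below (this is not the paper's evolutionary model; the cycle length and bias are arbitrary): the Laplacian spectrum of the underlying cycle and the expected hitting times of the biased walk to a target node.

```python
# Laplacian eigenvalues of a cycle, and hitting times of a biased walk to node 0.
import numpy as np
import networkx as nx

n, p = 8, 0.7                       # cycle length; probability of stepping "clockwise"
P = np.zeros((n, n))
for i in range(n):
    P[i, (i + 1) % n] = p           # biased transition probabilities on the cycle
    P[i, (i - 1) % n] = 1 - p

# Laplacian spectrum of the underlying (unweighted) cycle.
print(np.round(nx.laplacian_spectrum(nx.cycle_graph(n)), 3))

# Expected hitting times to node 0: solve (I - Q) h = 1 on the remaining states.
Q = np.delete(np.delete(P, 0, axis=0), 0, axis=1)
h = np.linalg.solve(np.eye(n - 1) - Q, np.ones(n - 1))
print("hitting time from node 4 to node 0:", h[3])
```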
Multistrand superconductor cable
Borden, A.R.
1984-03-08
Improved multistrand Rutherford-type superconductor cable is produced by using strands which are preformed, prior to being wound into the cable, so that each strand has a variable cross section, with successive portions having a substantially round cross section, a transitional oval cross section, a rectangular cross section, a transitional oval cross section, a round cross section and so forth, in repetitive cycles along the length of the strand. The cable is wound and flattened so that the portions of rectangular cross section extend across the two flat sides of the cable at the strand angle. The portions of round cross section are bent at the edges of the flattened cable, so as to extend between the two flat sides. The rectangular portions of the strands slide easil
Elmetwaly, Shereef; Schlick, Tamar
2014-01-01
Graph representations have been widely used to analyze and design various economic, social, military, political, and biological networks. In systems biology, networks of cells and organs are useful for understanding disease and medical treatments and, in structural biology, structures of molecules can be described, including RNA structures. In our RNA-As-Graphs (RAG) framework, we represent RNA structures as tree graphs by translating unpaired regions into vertices and helices into edges. Here we explore the modularity of RNA structures by applying graph partitioning known in graph theory to divide an RNA graph into subgraphs. To our knowledge, this is the first application of graph partitioning to biology, and the results suggest a systematic approach for modular design in general. The graph partitioning algorithms utilize mathematical properties of the Laplacian eigenvector (µ2) corresponding to the second eigenvalue (λ2) associated with the topology matrix defining the graph: λ2 describes the overall topology, and the sum of µ2's components is zero. The three types of algorithms, termed median, sign, and gap cuts, divide a graph by determining nodes of cut by median, zero, and largest gap of µ2's components, respectively. We apply these algorithms to 45 graphs corresponding to all solved RNA structures up through 11 vertices (∼220 nucleotides). While we observe that the median cut divides a graph into two similar-sized subgraphs, the sign and gap cuts partition a graph into two topologically-distinct subgraphs. We find that the gap cut produces the best biologically-relevant partitioning for RNA because it divides RNAs at less stable connections while maintaining junctions intact. The iterative gap cuts suggest basic modules and assembly protocols to design large RNA structures. Our graph substructuring thus suggests a systematic approach to explore the modularity of biological networks. In our applications to RNA structures, subgraphs also suggest design strategies for novel RNA motifs. PMID:25188578
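The sign cut described above is easy to state concretely: compute the Laplacian eigenvector for the second-smallest eigenvalue and split vertices by the sign of its components. The sketch below does this for a small invented tree graph (not an actual RAG representation of an RNA).

```python
# Sign cut: partition by the sign of the Fiedler vector's components.
import numpy as np
import networkx as nx

G = nx.Graph([(0, 1), (1, 2), (2, 3), (3, 4), (3, 5), (5, 6)])  # small tree graph

L = nx.laplacian_matrix(G).toarray().astype(float)
eigvals, eigvecs = np.linalg.eigh(L)
mu2 = eigvecs[:, 1]                    # eigenvector for the second eigenvalue λ2
# (its components sum to zero, as noted above)

part_a = [v for v, c in zip(G.nodes, mu2) if c >= 0]
part_b = [v for v, c in zip(G.nodes, mu2) if c < 0]
print("λ2 =", round(eigvals[1], 4), "| subgraphs:", part_a, part_b)
```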
A Semantic Graph Query Language
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kaplan, I L
2006-10-16
Semantic graphs can be used to organize large amounts of information from a number of sources into one unified structure. A semantic query language provides a foundation for extracting information from the semantic graph. The graph query language described here provides a simple, powerful method for querying semantic graphs.
Constructing Dense Graphs with Unique Hamiltonian Cycles
ERIC Educational Resources Information Center
Lynch, Mark A. M.
2012-01-01
It is not difficult to construct dense graphs containing Hamiltonian cycles, but it is difficult to generate dense graphs that are guaranteed to contain a unique Hamiltonian cycle. This article presents an algorithm for generating arbitrarily large simple graphs containing "unique" Hamiltonian cycles. These graphs can be turned into dense graphs…
Kafieh, Raheleh; Rabbani, Hossein; Abramoff, Michael D.; Sonka, Milan
2013-01-01
Optical coherence tomography (OCT) is a powerful and noninvasive method for retinal imaging. In this paper, we introduce a fast segmentation method based on a new variant of spectral graph theory named diffusion maps. The research is performed on spectral domain (SD) OCT images depicting macular and optic nerve head appearance. The presented approach does not require edge-based image information in localizing most of boundaries and relies on regional image texture. Consequently, the proposed method demonstrates robustness in situations of low image contrast or poor layer-to-layer image gradients. Diffusion mapping applied to 2D and 3D OCT datasets is composed of two steps, one for partitioning the data into important and less important sections, and another one for localization of internal layers. In the first step, the pixels/voxels are grouped in rectangular/cubic sets to form a graph node. The weights of the graph are calculated based on geometric distances between pixels/voxels and differences of their mean intensity. The first diffusion map clusters the data into three parts, the second of which is the area of interest. The other two sections are eliminated from the remaining calculations. In the second step, the remaining area is subjected to another diffusion map assessment and the internal layers are localized based on their textural similarities. The proposed method was tested on 23 datasets from two patient groups (glaucoma and normals). The mean unsigned border positioning errors (mean ± SD) was 8.52 ± 3.13 and 7.56 ± 2.95 μm for the 2D and 3D methods, respectively. PMID:23837966
Dynamic graph of an oxy-fuel combustion system using autocatalytic set model
NASA Astrophysics Data System (ADS)
Harish, Noor Ainy; Bakar, Sumarni Abu
2017-08-01
The evaporation process is one of the main processes, besides combustion, in an oxy-combustion boiler system. An Autocatalytic Set (ACS) model has been successfully applied in developing a graphical representation of the chemical reactions that occur in the evaporation process in the system. Seventeen variables identified in the process are represented as nodes and the catalytic relationships are represented as edges in the graph. In addition, in this paper the graph dynamics of the ACS is further investigated. Using the Dynamic Autocatalytic Set Graph Algorithm (DAGA), the adjacency matrix of each graph and its relation to the Perron-Frobenius theorem are investigated. The dynamic graph obtained is further studied, and the connection of the graph to Type-1 fuzzy graphs is established.
A Weight-Adaptive Laplacian Embedding for Graph-Based Clustering.
Cheng, De; Nie, Feiping; Sun, Jiande; Gong, Yihong
2017-07-01
Graph-based clustering methods perform clustering on a fixed input data graph. Thus such clustering results are sensitive to the particular graph construction. If this initial construction is of low quality, the resulting clustering may also be of low quality. We address this drawback by allowing the data graph itself to be adaptively adjusted in the clustering procedure. In particular, our proposed weight adaptive Laplacian (WAL) method learns a new data similarity matrix that can adaptively adjust the initial graph according to the similarity weight in the input data graph. We develop three versions of these methods based on the L2-norm, fuzzy entropy regularizer, and another exponential-based weight strategy, that yield three new graph-based clustering objectives. We derive optimization algorithms to solve these objectives. Experimental results on synthetic data sets and real-world benchmark data sets exhibit the effectiveness of these new graph-based clustering methods.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hamilton, Kathleen E.; Humble, Travis S.
Using quantum annealing to solve an optimization problem requires minor embedding a logic graph into a known hardware graph. We introduce the minor set cover (MSC) of a known graph G: a subset of graph minors which contain any remaining minor of the graph as a subgraph, in an effort to reduce the complexity of the minor embedding problem. Any graph that can be embedded into G will be embeddable into a member of the MSC. Focusing on embedding into the hardware graph of commercially available quantum annealers, we establish the MSC for a particular known virtual hardware, which is a complete bipartite graph. Furthermore, we show that the complete bipartite graph K_{N,N} has a MSC of N minors, from which K_{N+1} is identified as the largest clique minor of K_{N,N}. The case of determining the largest clique minor of hardware with faults is briefly discussed but remains an open question.
Genus Ranges of 4-Regular Rigid Vertex Graphs
Buck, Dorothy; Dolzhenko, Egor; Jonoska, Nataša; Saito, Masahico; Valencia, Karin
2016-01-01
A rigid vertex of a graph is one that has a prescribed cyclic order of its incident edges. We study orientable genus ranges of 4-regular rigid vertex graphs. The (orientable) genus range is a set of genera values over all orientable surfaces into which a graph is embedded cellularly, and the embeddings of rigid vertex graphs are required to preserve the prescribed cyclic order of incident edges at every vertex. The genus ranges of 4-regular rigid vertex graphs are sets of consecutive integers, and we address two questions: which intervals of integers appear as genus ranges of such graphs, and what types of graphs realize a given genus range. For graphs with 2n vertices (n > 1), we prove that all intervals [a, b] for all a < b ≤ n, and singletons [h, h] for some h ≤ n, are realized as genus ranges. For graphs with 2n − 1 vertices (n ≥ 1), we prove that all intervals [a, b] for all a < b ≤ n except [0, n], and [h, h] for some h ≤ n, are realized as genus ranges. We also provide constructions of graphs that realize these ranges. PMID:27807395
Ringo: Interactive Graph Analytics on Big-Memory Machines
Perez, Yonathan; Sosič, Rok; Banerjee, Arijit; Puttagunta, Rohan; Raison, Martin; Shah, Pararth; Leskovec, Jure
2016-01-01
We present Ringo, a system for analysis of large graphs. Graphs provide a way to represent and analyze systems of interacting objects (people, proteins, webpages) with edges between the objects denoting interactions (friendships, physical interactions, links). Mining graphs provides valuable insights about individual objects as well as the relationships among them. In building Ringo, we take advantage of the fact that machines with large memory and many cores are widely available and also relatively affordable. This allows us to build an easy-to-use interactive high-performance graph analytics system. Graphs also need to be built from input data, which often resides in the form of relational tables. Thus, Ringo provides rich functionality for manipulating raw input data tables into various kinds of graphs. Furthermore, Ringo also provides over 200 graph analytics functions that can then be applied to constructed graphs. We show that a single big-memory machine provides a very attractive platform for performing analytics on all but the largest graphs as it offers excellent performance and ease of use as compared to alternative approaches. With Ringo, we also demonstrate how to integrate graph analytics with an iterative process of trial-and-error data exploration and rapid experimentation, common in data mining workloads. PMID:27081215
Computing Information Value from RDF Graph Properties
DOE Office of Scientific and Technical Information (OSTI.GOV)
al-Saffar, Sinan; Heileman, Gregory
2010-11-08
Information value has been implicitly utilized and mostly non-subjectively computed in information retrieval (IR) systems. We explicitly define and compute the value of an information piece as a function of two parameters: the first is the potential semantic impact the target information can subjectively have on its recipient's world-knowledge, and the second parameter is trust in the information source. We model these two parameters as properties of RDF graphs. Two graphs are constructed, a target graph representing the semantics of the target body of information and a context graph representing the context of the consumer of that information. We compute information value subjectively as a function of both potential change to the context graph (impact) and the overlap between the two graphs (trust). Graph change is computed as a graph edit distance measuring the dissimilarity between the context graph before and after the learning of the target graph. A particular application of this subjective information valuation is in the construction of a personalized ranking component in Web search engines. Based on our method, we construct a Web re-ranking system that personalizes the information experience for the information-consumer.
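As a toy rendering of the two ingredients described above, the sketch below measures impact as a graph edit distance between the context graph before and after composing in the target graph, and trust as edge overlap; it uses plain networkx graphs instead of RDF graphs, and the combination into a single value is an invented placeholder, not the paper's formula.

```python
# Impact via graph edit distance, trust via edge overlap (illustrative only).
import networkx as nx

context = nx.Graph([("alice", "bob"), ("bob", "carol")])
target = nx.Graph([("bob", "carol"), ("carol", "dave")])

merged = nx.compose(context, target)                 # context after learning the target
impact = nx.graph_edit_distance(context, merged)     # how much the context changes

e_ctx = {frozenset(e) for e in context.edges}
e_tgt = {frozenset(e) for e in target.edges}
overlap = len(e_ctx & e_tgt) / len(e_tgt)            # fraction of target already known

value = impact * overlap                             # hypothetical combination
print(f"impact={impact}, trust~{overlap:.2f}, value={value:.2f}")
```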
Ringo: Interactive Graph Analytics on Big-Memory Machines.
Perez, Yonathan; Sosič, Rok; Banerjee, Arijit; Puttagunta, Rohan; Raison, Martin; Shah, Pararth; Leskovec, Jure
2015-01-01
We present Ringo, a system for analysis of large graphs. Graphs provide a way to represent and analyze systems of interacting objects (people, proteins, webpages) with edges between the objects denoting interactions (friendships, physical interactions, links). Mining graphs provides valuable insights about individual objects as well as the relationships among them. In building Ringo, we take advantage of the fact that machines with large memory and many cores are widely available and also relatively affordable. This allows us to build an easy-to-use interactive high-performance graph analytics system. Graphs also need to be built from input data, which often resides in the form of relational tables. Thus, Ringo provides rich functionality for manipulating raw input data tables into various kinds of graphs. Furthermore, Ringo also provides over 200 graph analytics functions that can then be applied to constructed graphs. We show that a single big-memory machine provides a very attractive platform for performing analytics on all but the largest graphs as it offers excellent performance and ease of use as compared to alternative approaches. With Ringo, we also demonstrate how to integrate graph analytics with an iterative process of trial-and-error data exploration and rapid experimentation, common in data mining workloads.
Reflecting on Graphs: Attributes of Graph Choice and Construction Practices in Biology
Angra, Aakanksha; Gardner, Stephanie M.
2017-01-01
Undergraduate biology education reform aims to engage students in scientific practices such as experimental design, experimentation, and data analysis and communication. Graphs are ubiquitous in the biological sciences, and creating effective graphical representations involves quantitative and disciplinary concepts and skills. Past studies document student difficulties with graphing within the contexts of classroom or national assessments without evaluating student reasoning. Operating under the metarepresentational competence framework, we conducted think-aloud interviews to reveal differences in reasoning and graph quality between undergraduate biology students, graduate students, and professors in a pen-and-paper graphing task. All professors planned and thought about data before graph construction. When reflecting on their graphs, professors and graduate students focused on the function of graphs and experimental design, while most undergraduate students relied on intuition and data provided in the task. Most undergraduate students meticulously plotted all data with scaled axes, while professors and some graduate students transformed the data, aligned the graph with the research question, and reflected on statistics and sample size. Differences in reasoning and approaches taken in graph choice and construction corroborate and extend previous findings and provide rich targets for undergraduate and graduate instruction. PMID:28821538
Yu, Qingbao; Du, Yuhui; Chen, Jiayu; He, Hao; Sui, Jing; Pearlson, Godfrey; Calhoun, Vince D
2017-11-01
A key challenge in building a brain graph using fMRI data is how to define the nodes. Spatial brain components estimated by independent components analysis (ICA) and regions of interest (ROIs) determined by brain atlas are two popular methods to define nodes in brain graphs. It is difficult to evaluate which method is better in real fMRI data. Here we perform a simulation study and evaluate the accuracies of a few graph metrics in graphs with nodes of ICA components, ROIs, or modified ROIs in four simulation scenarios. Graph measures with ICA nodes are more accurate than graphs with ROI nodes in all cases. Graph measures with modified ROI nodes are modulated by artifacts. The correlations of graph metrics across subjects between graphs with ICA nodes and ground truth are higher than the correlations between graphs with ROI nodes and ground truth in scenarios with large overlapped spatial sources. Moreover, moving the location of ROIs would largely decrease the correlations in all scenarios. Evaluating graphs with different nodes is promising in simulated data rather than real data because different scenarios can be simulated and measures of different graphs can be compared with a known ground truth. Since ROIs defined using brain atlas may not correspond well to real functional boundaries, overall findings of this work suggest that it is more appropriate to define nodes using data-driven ICA than ROI approaches in real fMRI data. Copyright © 2017 Elsevier B.V. All rights reserved.
NASA Technical Reports Server (NTRS)
Dugan, J. V., Jr.; Canright, R. B., Jr.
1972-01-01
The numerical capture cross section is calculated from the capture ratio, defined as the fraction of trajectories reaching a prescribed minimum separation of 3 A. The calculated capture cross sections for a rotational temperature of 77 K suggest large reaction cross sections in 80 K experiments for the large dipole-moment target, methyl cyanide.
Kasai, Takehiro; Ishiguro, Naoki; Matsui, Yasumoto; Harada, Atsushi; Takemura, Marie; Yuki, Atsumu; Kato, Yuki; Otsuka, Rei; Ando, Fujiko; Shimokata, Hiroshi
2015-06-01
Sex- and age-related differences in mid-thigh composition and muscle quality remain unclear. The present study aimed to clarify these differences using computed tomography in middle-aged and elderly Japanese. A total of 2310 participants (age 40-89 years), who were randomly selected from the local residents, underwent computed tomography examination of the right mid-thigh. Thigh circumference and cross-sectional areas of the thigh, muscle, quadriceps, non-quadriceps, fat, and bone were measured. Knee extension strength and muscle quality index (knee extension strength/quadriceps cross-sectional area) were also assessed. Sex- and age-related differences in these indices were analyzed. The thigh cross-sectional area in men and women decreased by 0.6% and 0.5%/year, respectively, because of a decrease in muscle cross-sectional area (men 75.2%, women 40.6%), fat cross-sectional area (men 24.4%, women 59.6%) and bone cross-sectional area (men 0.5%, women -0.2%). Muscle cross-sectional area in men and women decreased by 0.6% and 0.4%/year, respectively, because of a decrease in quadriceps cross-sectional area (men 65.6%, women 81.6%) and non-quadriceps cross-sectional area (men 34.4%, women 18.4%). Muscle quality in men and women decreased by 0.4% and 0.3%/year, respectively. Thigh cross-sectional area decreased with age mainly because of a decrease in muscle cross-sectional area in men and fat cross-sectional area in women. The rate of decrease in muscle cross-sectional area was 1.5-fold higher in men than in women. Muscle cross-sectional area decreased with age mainly because of a decrease in quadriceps cross-sectional area, especially in women. Decrease in muscle quality with age was similar in both sexes. © 2014 Japan Geriatrics Society.
A distributed query execution engine of big attributed graphs.
Batarfi, Omar; Elshawi, Radwa; Fayoumi, Ayman; Barnawi, Ahmed; Sakr, Sherif
2016-01-01
A graph is a popular data model that has become pervasively used for modeling structural relationships between objects. In practice, in many real-world graphs, the graph vertices and edges need to be associated with descriptive attributes. Such type of graphs are referred to as attributed graphs. G-SPARQL has been proposed as an expressive language, with a centralized execution engine, for querying attributed graphs. G-SPARQL supports various types of graph querying operations including reachability, pattern matching and shortest path where any G-SPARQL query may include value-based predicates on the descriptive information (attributes) of the graph edges/vertices in addition to the structural predicates. In general, a main limitation of centralized systems is that their vertical scalability is always restricted by the physical limits of computer systems. This article describes the design, implementation in addition to the performance evaluation of DG-SPARQL, a distributed, hybrid and adaptive parallel execution engine of G-SPARQL queries. In this engine, the topology of the graph is distributed over the main memory of the underlying nodes while the graph data are maintained in a relational store which is replicated on the disk of each of the underlying nodes. DG-SPARQL evaluates parts of the query plan via SQL queries which are pushed to the underlying relational stores while other parts of the query plan, as necessary, are evaluated via indexless memory-based graph traversal algorithms. Our experimental evaluation shows the efficiency and the scalability of DG-SPARQL on querying massive attributed graph datasets in addition to its ability to outperform the performance of Apache Giraph, a popular distributed graph processing system, by orders of magnitudes.
Keller, Carmen; Junghans, Alex
2017-11-01
Individuals with low numeracy have difficulties with understanding complex graphs. Combining the information-processing approach to numeracy with graph comprehension and information-reduction theories, we examined whether high numerates' better comprehension might be explained by their closer attention to task-relevant graphical elements, from which they would expect numerical information to understand the graph. Furthermore, we investigated whether participants could be trained in improving their attention to task-relevant information and graph comprehension. In an eye-tracker experiment ( N = 110) involving a sample from the general population, we presented participants with 2 hypothetical scenarios (stomach cancer, leukemia) showing survival curves for 2 treatments. In the training condition, participants received written instructions on how to read the graph. In the control condition, participants received another text. We tracked participants' eye movements while they answered 9 knowledge questions. The sum constituted graph comprehension. We analyzed visual attention to task-relevant graphical elements by using relative fixation durations and relative fixation counts. The mediation analysis revealed a significant ( P < 0.05) indirect effect of numeracy on graph comprehension through visual attention to task-relevant information, which did not differ between the 2 conditions. Training had a significant main effect on visual attention ( P < 0.05) but not on graph comprehension ( P < 0.07). Individuals with high numeracy have better graph comprehension due to their greater attention to task-relevant graphical elements than individuals with low numeracy. With appropriate instructions, both groups can be trained to improve their graph-processing efficiency. Future research should examine (e.g., motivational) mediators between visual attention and graph comprehension to develop appropriate instructions that also result in higher graph comprehension.
Evaluation of Graph Pattern Matching Workloads in Graph Analysis Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hong, Seokyong; Lee, Sangkeun; Lim, Seung-Hwan
2016-01-01
Graph analysis has emerged as a powerful method for data scientists to represent, integrate, query, and explore heterogeneous data sources. As a result, graph data management and mining has become a popular area of research, and led to the development of a plethora of systems in recent years. Unfortunately, the number of emerging graph analysis systems and the wide range of applications, coupled with a lack of apples-to-apples comparisons, make it difficult to understand the trade-offs between different systems and the graph operations for which they are designed. A fair comparison of these systems is a challenging task for the following reasons: multiple data models, non-standardized serialization formats, various query interfaces to users, and diverse environments they operate in. To address these key challenges, in this paper we present a new benchmark suite by extending the Lehigh University Benchmark (LUBM) to cover the most common capabilities of various graph analysis systems. We provide the design process of the benchmark, which generalizes the workflow for data scientists to conduct the desired graph analysis on different graph analysis systems. Equipped with this extended benchmark suite, we present a performance comparison for nine subgraph pattern retrieval operations over six graph analysis systems, namely NetworkX, Neo4j, Jena, Titan, GraphX, and uRiKA. Through the proposed benchmark suite, this study reveals both quantitative and qualitative findings in (1) implications in loading data into each system; (2) challenges in describing graph patterns for each query interface; and (3) different sensitivity of each system to query selectivity. We envision that this study will pave the road for: (i) data scientists to select the suitable graph analysis systems, and (ii) data management system designers to advance graph analysis systems.
Differentials on graph complexes II: hairy graphs
NASA Astrophysics Data System (ADS)
Khoroshkin, Anton; Willwacher, Thomas; Živković, Marko
2017-10-01
We study the cohomology of the hairy graph complexes which compute the rational homotopy of embedding spaces, generalizing the Vassiliev invariants of knot theory. We provide spectral sequences converging to zero whose first pages contain the hairy graph cohomology. Our results yield a way to construct many nonzero hairy graph cohomology classes out of (known) non-hairy classes by studying the cancellations in those sequences. This provides a first glimpse at the tentative global structure of the hairy graph cohomology.
Protein-protein interaction inference based on semantic similarity of Gene Ontology terms.
Zhang, Shu-Bo; Tang, Qiang-Rong
2016-07-21
Identifying protein-protein interactions is important in molecular biology. Experimental methods for this task have their limitations, and computational approaches have attracted increasing attention from the biological community. The semantic similarity derived from Gene Ontology (GO) annotation has been regarded as one of the most powerful indicators of protein interaction. However, conventional methods based on GO similarity fail to take advantage of the specificity of GO terms in the ontology graph. We proposed a GO-based method to predict protein-protein interaction by integrating different kinds of similarity measures derived from the intrinsic structure of the GO graph. We extended five existing methods to derive semantic similarity measures from the descending part of two GO terms in the GO graph, and then adopted a feature-integration strategy that combines both the ascending and the descending similarity scores derived from the three sub-ontologies to construct various kinds of features characterizing each protein pair. Support vector machines (SVM) were employed as discriminative classifiers, and five-fold cross-validation experiments were conducted on both human and yeast protein-protein interaction datasets to evaluate the performance of the different kinds of integrated features. The experimental results show that the best performance is achieved by the feature that combines information from both the ascending and the descending parts of the three ontologies. Our method is appealing for effective prediction of protein-protein interaction.
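As a rough illustration of the feature-integration-plus-SVM pipeline described above, the sketch below concatenates several per-pair similarity scores and runs five-fold cross-validation with scikit-learn. The similarity values are random placeholders rather than real GO-derived scores, and the six-feature layout (ascending and descending similarity for each of the three sub-ontologies) is only an assumption about how such features might be arranged.

```python
# Hedged sketch of combining similarity features for protein pairs and
# evaluating an SVM with 5-fold cross-validation. All values are synthetic.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_pairs = 200
# 6 hypothetical features per pair: ascending + descending similarity
# for the BP, MF, and CC sub-ontologies (placeholder random numbers).
X = rng.random((n_pairs, 6))
y = rng.integers(0, 2, n_pairs)        # 1 = interacting pair, 0 = non-interacting

clf = SVC(kernel="rbf", C=1.0)
scores = cross_val_score(clf, X, y, cv=5)
print("5-fold CV accuracy:", scores.mean())
```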
Helping Students Make Sense of Graphs: An Experimental Trial of SmartGraphs Software
ERIC Educational Resources Information Center
Zucker, Andrew; Kay, Rachel; Staudt, Carolyn
2014-01-01
Graphs are commonly used in science, mathematics, and social sciences to convey important concepts; yet students at all ages demonstrate difficulties interpreting graphs. This paper reports on an experimental study of free, Web-based software called SmartGraphs that is specifically designed to help students overcome their misconceptions regarding…
Pan, Yongke; Niu, Wenjia
2017-01-01
Semisupervised Discriminant Analysis (SDA) is a semisupervised dimensionality reduction algorithm that can easily resolve the out-of-sample problem. Related works usually focus on the geometric relationships among data points, which are not obvious, to enhance the performance of SDA. In contrast to these related works, we study the regularized graph construction, which is important in graph-based semisupervised learning methods. In this paper, we propose a novel graph for Semisupervised Discriminant Analysis, called the combined low-rank and k-nearest-neighbor (LRKNN) graph. In our LRKNN graph, we map the data to the LR feature space and then adopt kNN to satisfy the algorithmic requirements of SDA. Since the low-rank representation can capture the global structure and the k-nearest-neighbor algorithm can maximally preserve the local geometrical structure of the data, the LRKNN graph can significantly improve the performance of SDA. Extensive experiments on several real-world databases show that the proposed LRKNN graph is an efficient graph constructor that can largely outperform other commonly used baselines. PMID:28316616
Computing Role Assignments of Proper Interval Graphs in Polynomial Time
NASA Astrophysics Data System (ADS)
Heggernes, Pinar; van't Hof, Pim; Paulusma, Daniël
A homomorphism from a graph G to a graph R is locally surjective if its restriction to the neighborhood of each vertex of G is surjective. Such a homomorphism is also called an R-role assignment of G. Role assignments have applications in distributed computing, social network theory, and topological graph theory. The Role Assignment problem has as input a pair of graphs (G,R) and asks whether G has an R-role assignment. This problem is NP-complete already on input pairs (G,R) where R is a path on three vertices. So far, the only known non-trivial tractable case consists of input pairs (G,R) where G is a tree. We present a polynomial time algorithm that solves Role Assignment on all input pairs (G,R) where G is a proper interval graph. Thus we identify the first graph class other than trees on which the problem is tractable. As a complementary result, we show that the problem is Graph Isomorphism-hard on chordal graphs, a superclass of proper interval graphs and trees.
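To make the definition of an R-role assignment concrete, here is a brute-force checker, exponential in |V(G)| and suitable only for tiny graphs; it is not the polynomial-time algorithm of the paper, merely an illustration of the locally surjective condition. Since the problem is NP-complete already when R is a path on three vertices, exhaustive search is the naive baseline.

```python
# Hedged sketch: exhaustive search for an R-role assignment of G.
# A map phi: V(G) -> V(R) is an R-role assignment iff for every vertex v,
# the image of its neighborhood equals the neighborhood of phi(v) in R.
import itertools
import networkx as nx

def is_role_assignment(G, R, phi):
    return all(
        {phi[u] for u in G.neighbors(v)} == set(R.neighbors(phi[v]))
        for v in G
    )

def has_role_assignment(G, R):
    for values in itertools.product(R.nodes, repeat=G.number_of_nodes()):
        phi = dict(zip(G.nodes, values))
        if is_role_assignment(G, R, phi):
            return True
    return False

G = nx.path_graph(4)   # a small proper interval graph
R = nx.path_graph(2)
print(has_role_assignment(G, R))   # True: alternate the two roles along the path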
A binary linear programming formulation of the graph edit distance.
Justice, Derek; Hero, Alfred
2006-08-01
A binary linear programming formulation of the graph edit distance for unweighted, undirected graphs with vertex attributes is derived and applied to a graph recognition problem. A general formulation for editing graphs is used to derive a graph edit distance that is proven to be a metric, provided the cost function for individual edit operations is a metric. Then, a binary linear program is developed for computing this graph edit distance, and polynomial time methods for determining upper and lower bounds on the solution of the binary program are derived by applying solution methods for standard linear programming and the assignment problem. A recognition problem of comparing a sample input graph to a database of known prototype graphs in the context of a chemical information system is presented as an application of the new method. The costs associated with various edit operations are chosen by using a minimum normalized variance criterion applied to pairwise distances between nearest neighbors in the database of prototypes. The new metric is shown to perform quite well in comparison to existing metrics when applied to a database of chemical graphs.
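For orientation, the snippet below computes a graph edit distance between two tiny attributed graphs using the generic (exponential-time) solver shipped with NetworkX. It illustrates the quantity being computed, but it does not reproduce the paper's binary linear programming formulation or its polynomial-time bounds, and the "atom" labels are invented for the chemical-graph flavor only.

```python
# Hedged sketch: edit distance between two small vertex-attributed graphs.
import networkx as nx

G1 = nx.Graph()
G1.add_nodes_from([(1, {"atom": "C"}), (2, {"atom": "O"}), (3, {"atom": "C"})])
G1.add_edges_from([(1, 2), (2, 3)])

G2 = nx.Graph()
G2.add_nodes_from([(1, {"atom": "C"}), (2, {"atom": "N"}), (3, {"atom": "C"})])
G2.add_edges_from([(1, 2), (1, 3)])

# Node substitutions are only "free" when the atom labels agree.
d = nx.graph_edit_distance(G1, G2, node_match=lambda a, b: a["atom"] == b["atom"])
print("edit distance:", d)
```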
NASA Technical Reports Server (NTRS)
Diana, L. M.; Chaplin, R. L.; Brooks, D. L.; Adams, J. T.; Reyna, L. K.
1990-01-01
An improved technique is presented for employing the 2.3 m spectrometer to measure total ionization cross sections, Q sub ion, for positrons incident on He. The new ionization cross sections agree with the values reported earlier. Estimates are also presented of total elastic scattering cross sections, Q sub el, obtained by subtracting from the total scattering cross sections, Q sub tot, reported in the literature the Q sub ion, the Q sub Ps (total positronium formation cross sections), and the total excitation cross sections, Q sub ex, published by another researcher. The Q sub ion and Q sub el measured with the 3 m high-resolution time-of-flight spectrometer for 54.9 eV positrons are in accord with the results from the 2.3 m spectrometer. The ionization cross sections are in fair agreement with theory, tending for the most part to be higher, especially at 76.3 and 88.5 eV. The elastic cross sections agree quite well with theory up to the vicinity of 50 eV, but at 60 eV and above the experimental elastic cross sections climb to and remain at about 0.30 pi a sub o squared while the theoretical values steadily decrease.
Graphing Calculators in the Secondary Mathematics Classroom. Monograph #21.
ERIC Educational Resources Information Center
Eckert, Paul; And Others
The objective of this presentation is to focus on the use of a hand-held graphics calculator. The specific machine referred to in this monograph is the Casio fx-7000G, chosen because of its low cost, its large viewing screen, its versatility, and its simple operation. Sections include: (1) "Basic Operations with the Casio fx-7000G"; (2) "Graphical…
Resolving the pulpwood canvass with inventory harvest information
Joseph M. McCollum; Tony G. Johnson
2012-01-01
The Resource Use section of the Forest Inventory and Analysis (FIA) Program has conducted a canvass of wood processing mills for timber product output (TPO) throughout the southern United States. Pulpmills in the South are canvassed on an annual basis, while all other mills (e.g., sawmills, veneer mills, etc.) are canvassed every two years. Attempts have been made to graph...
2016-11-09
the model does not become a full probabilistic attack graph analysis of the network, whose data requirements are currently unrealistic. The second...flow. – Untrustworthy persons may intentionally try to exfiltrate known sensitive data to external networks. People may also unintentionally leak...section will provide details on the components, procedures, data requirements, and parameters required to instantiate the network porosity model. These
Quantum walk on a chimera graph
NASA Astrophysics Data System (ADS)
Xu, Shu; Sun, Xiangxiang; Wu, Jizhou; Zhang, Wei-Wei; Arshed, Nigum; Sanders, Barry C.
2018-05-01
We analyse a continuous-time quantum walk on a chimera graph, which is a graph of choice for designing quantum annealers, and we discover beautiful quantum walk features such as localization that starkly distinguishes classical from quantum behaviour. Motivated by technological thrusts, we study continuous-time quantum walk on enhanced variants of the chimera graph and on diminished chimera graph with a random removal of vertices. We explain the quantum walk by constructing a generating set for a suitable subgroup of graph isomorphisms and corresponding symmetry operators that commute with the quantum walk Hamiltonian; the Hamiltonian and these symmetry operators provide a complete set of labels for the spectrum and the stationary states. Our quantum walk characterization of the chimera graph and its variants yields valuable insights into graphs used for designing quantum-annealers.
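A minimal sketch of a continuous-time quantum walk of the kind studied above: the state is evolved under U(t) = exp(-iAt), with the graph adjacency matrix playing the role of the Hamiltonian. A single complete bipartite K_{4,4} unit cell stands in for the full chimera graph, and the evolution time is arbitrary.

```python
# Hedged sketch: continuous-time quantum walk on one chimera unit cell.
import numpy as np
import networkx as nx
from scipy.linalg import expm

G = nx.complete_bipartite_graph(4, 4)        # one K_{4,4} chimera unit cell
A = nx.to_numpy_array(G)                     # adjacency matrix as Hamiltonian

psi0 = np.zeros(A.shape[0], dtype=complex)
psi0[0] = 1.0                                # walker starts localized on vertex 0

t = 2.0                                      # arbitrary evolution time
psi_t = expm(-1j * A * t) @ psi0
print("site occupation probabilities:", np.round(np.abs(psi_t) ** 2, 3))
```

Localization effects such as those discussed in the abstract show up as occupation probabilities that fail to spread uniformly over the vertices as t grows.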
Nano-fabricated plasmonic optical transformer
Choo, Hyuck; Cabrini, Stefano; Schuck, P. James; Liang, Xiaogan; Yablonovitch, Eli
2015-06-09
The present invention provides a plasmonic optical transformer to produce a highly focused optical beam spot, where the transformer includes a first metal layer, a dielectric layer formed on the first metal layer, and a second metal layer formed on the dielectric layer, where the first metal layer, the dielectric layer, and the second metal layer are patterned to a shape including a first section having a first cross section, a second section following the first section having a cross section tapering from the first section to a smaller cross section, and a third section following the second section having a cross section matching the tapered smaller cross section of the second section.
Ryder, Robert T.; Trippi, Michael H.; Swezey, Christopher S.; Crangle, Robert D.; Hope, Rebecca S.; Rowan, Elisabeth L.; Lentz, Erika E.
2012-01-01
Geologic cross section C-C' is the third in a series of cross sections constructed by the U.S. Geological Survey (USGS) to document and improve understanding of the geologic framework and petroleum systems of the Appalachian basin. Cross section C-C' provides a regional view of the structural and stratigraphic framework of the Appalachian basin from north-central Ohio to the Valley and Ridge province in south-central Pennsylvania, a distance of approximately 260 miles (mi). This cross section is a companion to cross sections E-E' and D-D' that are located about 50 to 125 mi and 25 to 50 mi, respectively, to the southwest. Cross section C-C' contains much information that is useful for evaluating energy resources in the Appalachian basin. Although specific petroleum systems are not identified on the cross section, many of their key elements (such as source rocks, reservoir rocks, seals, and traps) can be inferred from lithologic units, unconformities, and geologic structures shown on the cross section. Other aspects of petroleum systems (such as the timing of petroleum generation and preferred migration pathways) may be evaluated by burial history, thermal history, and fluid flow models based on what is shown on the cross section. Cross section C-C' also provides a general framework (stratigraphic units and general rock types) for the coal-bearing section, although the cross section lacks the detail to illustrate key elements of coal systems (such as paleoclimate, coal quality, and coal rank). In addition, cross section C-C' may be used as a reconnaissance tool to identify plausible geologic structures and strata for the subsurface storage of liquid waste or for the sequestration of carbon dioxide.
On Edge Exchangeable Random Graphs
NASA Astrophysics Data System (ADS)
Janson, Svante
2017-06-01
We study a recent model for edge exchangeable random graphs introduced by Crane and Dempsey; in particular we study asymptotic properties of the random simple graph obtained by merging multiple edges. We study a number of examples, and show that the model can produce dense, sparse and extremely sparse random graphs. One example yields a power-law degree distribution. We give some examples where the random graph is dense and converges a.s. in the sense of graph limit theory, but also an example where a.s. every graph limit is the limit of some subsequence. Another example is sparse and yields convergence to a non-integrable generalized graphon defined on (0,∞).
Diaconis, Persi; Holmes, Susan; Janson, Svante
2015-01-01
We work out a graph limit theory for dense interval graphs. The theory developed departs from the usual description of a graph limit as a symmetric function W (x, y) on the unit square, with x and y uniform on the interval (0, 1). Instead, we fix a W and change the underlying distribution of the coordinates x and y. We find choices such that our limits are continuous. Connections to random interval graphs are given, including some examples. We also show a continuity result for the chromatic number and clique number of interval graphs. Some results on uniqueness of the limit description are given for general graph limits. PMID:26405368
Backović, Mihailo; Krämer, Michael; Maltoni, Fabio; Martini, Antony; Mawatari, Kentarou; Pellen, Mathieu
Weakly interacting dark matter particles can be pair-produced at colliders and detected through signatures featuring missing energy in association with either QCD/EW radiation or heavy quarks. In order to constrain the mass and the couplings to standard model particles, accurate and precise predictions for production cross sections and distributions are of prime importance. In this work, we consider various simplified models with s-channel mediators. We implement such models in the FeynRules/MadGraph5_aMC@NLO framework, which makes it possible to include higher-order QCD corrections in realistic simulations and to study their effects systematically. As a first phenomenological application, we present predictions for dark matter production in association with jets and with a top-quark pair at the LHC, at next-to-leading-order accuracy in QCD, including matching/merging to parton showers. Our study shows that higher-order QCD corrections to dark matter production via s-channel mediators have a significant impact not only on total production rates, but also on the shapes of distributions. We also show that the inclusion of next-to-leading-order effects results in a sizeable reduction of the theoretical uncertainties.
Ivanciuc, Ovidiu
2013-06-01
Chemical and molecular graphs have fundamental applications in chemoinformatics, quantitative structure-property relationships (QSPR), quantitative structure-activity relationships (QSAR), virtual screening of chemical libraries, and computational drug design. Chemoinformatics applications of graphs include chemical structure representation and coding, database search and retrieval, and physicochemical property prediction. QSPR, QSAR and virtual screening are based on the structure-property principle, which states that the physicochemical and biological properties of chemical compounds can be predicted from their chemical structure. Such structure-property correlations are usually developed from topological indices and fingerprints computed from the molecular graph and from molecular descriptors computed from the three-dimensional chemical structure. We present here a selection of the most important graph descriptors and topological indices, including molecular matrices, graph spectra, spectral moments, graph polynomials, and vertex topological indices. These graph descriptors are used to define several topological indices based on molecular connectivity, graph distance, reciprocal distance, distance-degree, distance-valency, spectra, polynomials, and information theory concepts. The molecular descriptors and topological indices can be developed with a more general approach, based on molecular graph operators, which define a family of graph indices related by a common formula. Graph descriptors and topological indices for molecules containing heteroatoms and multiple bonds are computed with weighting schemes based on atomic properties, such as the atomic number, covalent radius, or electronegativity. The correlation in QSPR and QSAR models can be improved by optimizing some parameters in the formula of topological indices, as demonstrated for structural descriptors based on atomic connectivity and graph distance.
Multistrand superconductor cable
Borden, Albert R.
1985-01-01
Improved multistrand Rutherford-type superconductor cable is produced by using strands which are preformed, prior to being wound into the cable, so that each strand has a variable cross section, with successive portions having a substantially round cross section, a transitional oval cross section, a rectangular cross section, a transitional oval cross section, a round cross section and so forth, in repetitive cycles along the length of the strand. The cable is wound and flattened so that the portions of rectangular cross section extend across the two flat sides of the cable at the strand angle. The portions of round cross section are bent at the edges of the flattened cable, so as to extend between the two flat sides. The rectangular portions of the strands slide easily over one another, so as to facilitate flexing and bending of the cable, while also minimizing the possibility of causing damage to the strands by such flexing or bending. Moreover, the improved cable substantially maintains its compactness and cross-sectional shape when the cable is flexed or bent.
2014-01-01
Background Although previous studies have demonstrated that children with high levels of fundamental movement skill competency are more active throughout the day, little is known regarding children’s fundamental movement skill competency and their physical activity during key time periods of the school day (i.e., lunchtime, recess and after-school). The purpose of this study was to examine the associations between fundamental movement skill competency and objectively measured moderate-to-vigorous physical activity (MVPA) throughout the school day among children attending primary schools in low-income communities. Methods Eight primary schools from low-income communities and 460 children (8.5 ± 0.6 years, 54% girls) were involved in the study. Children’s fundamental movement skill competency (TGMD-2; 6 locomotor and 6 object-control skills), objectively measured physical activity (ActiGraph GT3X and GT3X + accelerometers), height, weight and demographics were assessed. Multilevel linear mixed models were used to assess the cross-sectional associations between fundamental movement skills and MVPA. Results After adjusting for age, sex, BMI and socio-economic status, locomotor skill competency was positively associated with total (P = 0.002, r = 0.15) and after-school (P = 0.014, r = 0.13) MVPA. Object-control skill competency was positively associated with total (P < 0.001, r = 0.20), lunchtime (P = 0.03, r = 0.10), recess (P = 0.006, r = 0.11) and after-school (P = 0.022, r = 0.13) MVPA. Conclusions Object-control skill competency appears to be a better predictor of children’s MVPA during school-based physical activity opportunities than locomotor skill competency. Improving fundamental movement skill competency, particularly object-control skills, may contribute to increased levels of children’s MVPA throughout the day. Trial registration Australian New Zealand Clinical Trials Registry No: ACTRN12611001080910. PMID:24708604
Ludwig, Vera M; Bayley, Adam; Cook, Derek G; Stahl, Daniel; Treasure, Janet L; Asthworth, Mark; Greenough, Anne; Winkley, Kirsty; Bornstein, Stefan R; Ismail, Khalida
2018-04-12
Depressive symptoms are common but rarely considered a risk factor for unhealthy lifestyles associated with cardiovascular disease (CVD). This study investigates whether depressive symptoms are associated with reduced physical activity (PA) in individuals at high risk of developing CVD. Secondary analysis of the cross-sectional baseline data from a randomised controlled trial of an intensive lifestyle intervention. 135 primary care practices in South London, UK. 1742 adults, 49-74 years, 86% male at high (≥20%) risk of developing CVD in the next 10 years as defined via QRISK2 score. The main explanatory variable was depressive symptoms measured via the Patient Health Questionnaire-9 (PHQ-9). The main outcome was daily step count measured with an accelerometer (ActiGraph GT3X) stratified by weekdays and weekend days. The median daily step count of the total sample was 6151 (IQR 3510) with significant differences (P<0.001) in mean daily step count between participants with low (PHQ-9 score: 0-4), mild (PHQ-9 score: 5-9) and moderate to severe depressive symptoms (PHQ-9 score: ≥10). Controlling for age, gender, ethnicity, education level, body mass index (BMI), smoking, consumption of alcohol, day of the week and season, individuals with mild depressive symptoms and those with moderate to severe depressive symptoms walked 13.3% (95% CI 18.8% to 7.9%) and 15.6% (95% CI 23.7% to 6.5%) less than non-depressed individuals, respectively. Furthermore, male gender, white ethnicity, higher education level, lower BMI, non-smoking, moderate alcohol intake, weekdays and summer season were independently associated with higher step count. People at high risk of CVD with depressive symptoms have lower levels of PA. ISRCTN84864870; Pre-results.
Spectral fluctuations of quantum graphs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pluhař, Z.; Weidenmüller, H. A.
We prove the Bohigas-Giannoni-Schmit conjecture in its most general form for completely connected simple graphs with incommensurate bond lengths. We show that for graphs that are classically mixing (i.e., graphs for which the spectrum of the classical Perron-Frobenius operator possesses a finite gap), the generating functions for all (P,Q) correlation functions for both closed and open graphs coincide (in the limit of infinite graph size) with the corresponding expressions of random-matrix theory, both for orthogonal and for unitary symmetry.
2-Extendability in Two Classes of Claw-Free Graphs
1992-01-01
A Visual Evaluation Study of Graph Sampling Techniques
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Fangyan; Zhang, Song; Wong, Pak C.
2017-01-29
We evaluate a dozen prevailing graph-sampling techniques with an ultimate goal to better visualize and understand big and complex graphs that exhibit different properties and structures. The evaluation uses eight benchmark datasets with four different graph types collected from Stanford Network Analysis Platform and NetworkX to give a comprehensive comparison of various types of graphs. The study provides a practical guideline for visualizing big graphs of different sizes and structures. The paper discusses results and important observations from the study.
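For context, two of the simplest sampling baselines such a study typically includes are uniform random node sampling (keeping the induced subgraph) and uniform random edge sampling; a small NetworkX sketch follows. The specific dozen techniques evaluated in the paper are not reproduced here.

```python
# Hedged sketch: two elementary graph-sampling baselines.
import random
import networkx as nx

def random_node_sample(G, k, seed=0):
    """Induced subgraph on k uniformly chosen nodes."""
    random.seed(seed)
    nodes = random.sample(list(G.nodes), k)
    return G.subgraph(nodes).copy()

def random_edge_sample(G, k, seed=0):
    """Graph built from k uniformly chosen edges."""
    random.seed(seed)
    edges = random.sample(list(G.edges), k)
    return nx.Graph(edges)

G = nx.barabasi_albert_graph(1000, 3, seed=42)   # synthetic scale-free graph
print(random_node_sample(G, 100))
print(random_edge_sample(G, 100))
```

Comparing degree distributions or clustering coefficients of such samples against the original graph is the quantitative side of the kind of evaluation the abstract describes; the visual side compares drawings of the samples.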
Wong, Pak C.; Mackey, Patrick S.; Perrine, Kenneth A.; Foote, Harlan P.; Thomas, James J.
2008-12-23
Methods for visualizing a graph by automatically drawing elements of the graph as labels are disclosed. In one embodiment, the method comprises receiving node information and edge information from an input device and/or communication interface, constructing a graph layout based at least in part on that information, wherein the edges are automatically drawn as labels, and displaying the graph on a display device according to the graph layout. In some embodiments, the nodes are automatically drawn as labels instead of, or in addition to, the label-edges.
NASA Astrophysics Data System (ADS)
Patel, Niravkumar D.; Mehta, Rahul; Ali, Nawab; Soulsby, Michael; Chowdhury, Parimal
2013-04-01
The aim of this study was to determine the composition of the leg bone tissue of rats that were exposed to simulated microgravity by hind-limb suspension (HLS) by the tail for one week. The leg bones were cross sectioned, cleaned of soft tissue, dried and sputter coated, and then placed horizontally on the stage of a scanning electron microscope (SEM) for analysis. Interaction of a 17.5 keV electron beam, incident from the vertical direction on the sample, generated images using two detectors. X-rays emitted from the sample during electron bombardment were measured with the energy dispersive spectroscopy (EDS) feature of the SEM using a liquid-nitrogen-cooled Si(Li) detector with a resolution of 144 eV at 5.9 keV (Mn Kα x-ray). Kα x-rays from carbon, oxygen, phosphorus and calcium formed the major peaks in the spectrum. Relative percentages of these elements were determined using software that could also correct for ZAF factors, namely Z (atomic number), A (x-ray absorption) and F (characteristic fluorescence). The x-rays from the control groups and from the experimental (HLS) groups were analyzed on well-defined parts (femur, tibia and knee) of the leg bone. The SEM analysis shows that there are definite changes in the hydroxyl or phosphate group of the main component of the bone structure, hydroxyapatite [Ca10(PO4)6(OH)2], due to hind-limb suspension. In a separate experiment, entire leg bones (both from HLS and control rats) were subjected to mechanical stress by means of a variable force. The stress vs. strain graph was fitted with linear and polynomial functions, and the parameters reflecting the mechanical strength of the bone under increasing stress were calculated. From the slope of the linear part of the graph, the Young's modulus for HLS bones was calculated and found to be 2.49 times smaller than that for control bones.
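The mechanical-testing step reduces to fitting the linear portion of the stress-strain curve and reading the Young's modulus off the slope; a minimal sketch follows, with made-up numbers rather than the study's rat-bone measurements.

```python
# Hedged sketch: Young's modulus as the slope of the linear stress-strain region.
import numpy as np

strain = np.array([0.000, 0.001, 0.002, 0.003, 0.004])   # dimensionless (synthetic)
stress = np.array([0.0, 18.0, 36.5, 54.0, 72.5])          # MPa (synthetic)

slope, intercept = np.polyfit(strain, stress, 1)           # linear fit of the elastic region
print(f"Young's modulus ~ {slope / 1000:.1f} GPa")
```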
Altered Whole-Brain and Network-Based Functional Connectivity in Parkinson's Disease.
de Schipper, Laura J; Hafkemeijer, Anne; van der Grond, Jeroen; Marinus, Johan; Henselmans, Johanna M L; van Hilten, Jacobus J
2018-01-01
Background: Functional imaging methods, such as resting-state functional magnetic resonance imaging, reflect changes in neural connectivity and may help to assess the widespread consequences of disease-specific network changes in Parkinson's disease. In this study we used a relatively new graph analysis approach in functional imaging: eigenvector centrality mapping. This model-free method, applied to all voxels in the brain, identifies prominent regions in the brain network hierarchy and detects localized differences between patient populations. In other neurological disorders, eigenvector centrality mapping has been linked to changes in functional connectivity in certain nodes of brain networks. Objectives: Examining changes in functional brain connectivity architecture on a whole brain and network level in patients with Parkinson's disease. Methods: Whole brain resting-state functional architecture was studied with a recently introduced graph analysis approach (eigenvector centrality mapping). Functional connectivity was further investigated in relation to eight known resting-state networks. Cross-sectional analyses included group comparison of functional connectivity measures of Parkinson's disease patients ( n = 107) with control subjects ( n = 58) and correlations with clinical data, including motor and cognitive impairment and a composite measure of predominantly non-dopaminergic symptoms. Results: Eigenvector centrality mapping revealed that frontoparietal regions were more prominent in the whole-brain network function in patients compared to control subjects, while frontal and occipital brain areas were less prominent in patients. Using standard resting-state networks, we found predominantly increased functional connectivity, namely within sensorimotor system and visual networks in patients. Regional group differences in functional connectivity of both techniques between patients and control subjects partly overlapped for highly connected posterior brain regions, in particular in the posterior cingulate cortex and precuneus. Clinico-functional imaging relations were not found. Conclusions: Changes on the level of functional brain connectivity architecture might provide a different perspective of pathological consequences of Parkinson's disease. The involvement of specific, highly connected (hub) brain regions may influence whole brain functional network architecture in Parkinson's disease.
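As a toy illustration of the graph idea behind eigenvector centrality mapping, the sketch below builds a thresholded correlation graph over a few regions and ranks nodes by eigenvector centrality. Real ECM operates voxel-wise on fMRI time series; the threshold, region count, and data here are arbitrary.

```python
# Hedged sketch: rank "regions" by eigenvector centrality of a correlation graph.
import numpy as np
import networkx as nx

rng = np.random.default_rng(0)
ts = rng.standard_normal((20, 100))            # 20 regions x 100 time points (synthetic)
corr = np.corrcoef(ts)                         # region-by-region correlation matrix

# Threshold correlations into an unweighted graph (diagonal excluded).
adj = ((np.abs(corr) > 0.1) & ~np.eye(20, dtype=bool)).astype(int)
G = nx.from_numpy_array(adj)

centrality = nx.eigenvector_centrality_numpy(G)
print(sorted(centrality, key=centrality.get, reverse=True)[:5])   # top-5 hub regions
```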
NASA Astrophysics Data System (ADS)
Yu, C. W.; Hodges, B. R.; Liu, F.
2017-12-01
Development of continental-scale river network models creates challenges where the massive amount of boundary condition data encounters the sensitivity of a dynamic numerical model. The topographic data sets used to define the river channel characteristics may include either corrupt data or complex configurations that cause instabilities in a numerical solution of the Saint-Venant equations. For local-scale river models (e.g., HEC-RAS), modelers typically rely on past experience to make ad hoc boundary condition adjustments that ensure a stable solution - the proof of the adjustment is merely the stability of the solution. To date, there do not exist any formal methodologies or automated procedures for a priori detecting/fixing boundary conditions that cause instabilities in a dynamic model. Formal methodologies for data screening and adjustment are a critical need for simulations with a large number of river reaches that draw their boundary condition data from a wide variety of sources. At the continental scale, we simply cannot assume that we will have access to river-channel cross-section data that has been adequately analyzed and processed. Herein, we argue that problematic boundary condition data for unsteady dynamic modeling can be identified through numerical modeling with the steady-state Saint-Venant equations. The fragility of numerical stability increases with the complexity of branching in the river network system, and instabilities (even in an unsteady solution) are typically triggered by the nonlinear advection term in the Saint-Venant equations. It follows that the behavior of the simpler steady-state equations (which retain the nonlinear term) can be used to screen the boundary condition data for problematic regions. In this research, we propose a graph-theory based method to isolate the location of corrupted boundary condition data in a continental-scale river network and demonstrate its utility with a network of O(10^4) elements. Acknowledgement: This research is supported by the National Science Foundation under grant number CCF-1331610.
Mitral Valve Chordae Tendineae: Topological and Geometrical Characterization.
Khalighi, Amir H; Drach, Andrew; Bloodworth, Charles H; Pierce, Eric L; Yoganathan, Ajit P; Gorman, Robert C; Gorman, Joseph H; Sacks, Michael S
2017-02-01
Mitral valve (MV) closure depends upon the proper function of each component of the valve apparatus, which includes the annulus, leaflets, and chordae tendineae (CT). Geometry plays a major role in MV mechanics and thus highly impacts the accuracy of computational models simulating MV function and repair. While the physiological geometry of the leaflets and annulus have been previously investigated, little effort has been made to quantitatively and objectively describe CT geometry. The CT constitute a fibrous tendon-like structure projecting from the papillary muscles (PMs) to the leaflets, thereby evenly distributing the loads placed on the MV during closure. Because CT play a major role in determining the shape and stress state of the MV as a whole, their geometry must be well characterized. In the present work, a novel and comprehensive investigation of MV CT geometry was performed to more fully quantify CT anatomy. In vitro micro-tomography 3D images of ovine MVs were acquired, segmented, then analyzed using a curve-skeleton transform. The resulting data was used to construct B-spline geometric representations of the CT structures, enriched with a continuous field of cross-sectional area (CSA) data. Next, Reeb graph models were developed to analyze overall topological patterns, along with dimensional attributes such as segment lengths, 3D orientations, and CSA. Reeb graph results revealed that the topology of ovine MV CT followed a full binary tree structure. Moreover, individual chords are mostly planar geometries that together form a 3D load-bearing support for the MV leaflets. We further demonstrated that, unlike flow-based branching patterns, while individual CT branches became thinner as they propagated further away from the PM heads towards the leaflets, the total CSA almost doubled. Overall, our findings indicate a certain level of regularity in structure, and suggest that population-based MV CT geometric models can be generated to improve current MV repair procedures.
Top-k similar graph matching using TraM in biological networks.
Amin, Mohammad Shafkat; Finley, Russell L; Jamil, Hasan M
2012-01-01
Many emerging database applications entail sophisticated graph-based query manipulation, predominantly evident in large-scale scientific applications. To access the information embedded in graphs, efficient graph matching tools and algorithms have become of prime importance. Although the prohibitively expensive time complexity associated with exact subgraph isomorphism techniques has limited its efficacy in the application domain, approximate yet efficient graph matching techniques have received much attention due to their pragmatic applicability. Since public domain databases are noisy and incomplete in nature, inexact graph matching techniques have proven to be more promising in terms of inferring knowledge from numerous structural data repositories. In this paper, we propose a novel technique called TraM for approximate graph matching that off-loads a significant amount of its processing on to the database making the approach viable for large graphs. Moreover, the vector space embedding of the graphs and efficient filtration of the search space enables computation of approximate graph similarity at a throw-away cost. We annotate nodes of the query graphs by means of their global topological properties and compare them with neighborhood biased segments of the datagraph for proper matches. We have conducted experiments on several real data sets, and have demonstrated the effectiveness and efficiency of the proposed method
NASA Astrophysics Data System (ADS)
Albirri, E. R.; Sugeng, K. A.; Aldila, D.
2018-04-01
Nowadays, as technology and human civilization have progressed, almost all cities in the world are connected and the various places of the world have become easier to visit; this is an impact of transportation technology and highway construction. Cities that are connected in this way can be represented by a graph. Graph clustering is one of the ways used to answer problems represented by graphs, and several graph clustering methods address such problems specifically. One of them is the Highly Connected Subgraphs (HCS) method. HCS identifies clusters based on the connectivity k(G) of a graph G: if k(G) > n/2, where n is the total number of vertices in G, the subgraph is called highly connected and is taken as a cluster. This research used a literature review and was completed with a program simulation. We modified the HCS algorithm to use weighted graphs; the modification is located in the Process Phase, which is used to cut the connected graph G into two subgraphs H and H̄. We also implemented the method as a program in Octave-401 and applied it to flight-route mapping data of one of the airlines in Indonesia.
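A compact sketch of the unweighted HCS recursion referred to above (a subgraph is a cluster if its edge connectivity exceeds n/2, otherwise split along a minimum edge cut and recurse); the weighted modification described in the abstract is not reproduced here.

```python
# Hedged sketch: unweighted Highly Connected Subgraphs (HCS) clustering.
import networkx as nx

def hcs(G):
    n = G.number_of_nodes()
    if n < 2:
        return [set(G.nodes)]
    if not nx.is_connected(G):
        return [c for comp in nx.connected_components(G)
                for c in hcs(G.subgraph(comp).copy())]
    if nx.edge_connectivity(G) > n / 2:
        return [set(G.nodes)]          # highly connected: report as a cluster
    cut = nx.minimum_edge_cut(G)       # split along a minimum edge cut
    H = G.copy()
    H.remove_edges_from(cut)
    return [c for comp in nx.connected_components(H)
            for c in hcs(G.subgraph(comp).copy())]

G = nx.barbell_graph(5, 0)             # two K5 cliques joined by a single edge
print(hcs(G))                          # -> the two K5 vertex sets
```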
A Ranking Approach on Large-Scale Graph With Multidimensional Heterogeneous Information.
Wei, Wei; Gao, Bin; Liu, Tie-Yan; Wang, Taifeng; Li, Guohui; Li, Hang
2016-04-01
Graph-based ranking has been extensively studied and frequently applied in many applications, such as webpage ranking. It aims at mining potentially valuable information from the raw graph-structured data. Recently, with the proliferation of rich heterogeneous information (e.g., node/edge features and prior knowledge) available in many real-world graphs, how to effectively and efficiently leverage all information to improve the ranking performance becomes a new challenging problem. Previous methods only utilize part of such information and attempt to rank graph nodes according to link-based methods, of which the ranking performances are severely affected by several well-known issues, e.g., over-fitting or high computational complexity, especially when the scale of graph is very large. In this paper, we address the large-scale graph-based ranking problem and focus on how to effectively exploit rich heterogeneous information of the graph to improve the ranking performance. Specifically, we propose an innovative and effective semi-supervised PageRank (SSP) approach to parameterize the derived information within a unified semi-supervised learning framework (SSLF-GR), then simultaneously optimize the parameters and the ranking scores of graph nodes. Experiments on the real-world large-scale graphs demonstrate that our method significantly outperforms the algorithms that consider such graph information only partially.
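For orientation, the snippet below shows the much simpler classical way of folding prior knowledge into link-based ranking, namely personalized PageRank in NetworkX; it is not the SSP/SSLF-GR method of the paper, and the toy graph and prior are invented.

```python
# Hedged sketch: link-based ranking with prior knowledge as a personalization vector.
import networkx as nx

G = nx.DiGraph([("a", "b"), ("b", "c"), ("c", "a"), ("d", "c"), ("a", "d")])

# Hypothetical prior knowledge: node "d" is believed to be important.
prior = {"a": 0.1, "b": 0.1, "c": 0.1, "d": 0.7}

scores = nx.pagerank(G, alpha=0.85, personalization=prior)
print(sorted(scores.items(), key=lambda kv: -kv[1]))
```

Semi-supervised approaches like the one in the abstract go further by learning how node/edge features and such priors should be weighted, rather than fixing the personalization vector by hand.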
An asynchronous traversal engine for graph-based rich metadata management
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dai, Dong; Carns, Philip; Ross, Robert B.
Rich metadata in high-performance computing (HPC) systems contains extended information about users, jobs, data files, and their relationships. Property graphs are a promising data model to represent heterogeneous rich metadata flexibly. Specifically, a property graph can use vertices to represent different entities and edges to record the relationships between vertices with unique annotations. The high-volume HPC use case, with millions of entities and relationships, naturally requires an out-of-core distributed property graph database, which must support live updates (to ingest production information in real time), low-latency point queries (for frequent metadata operations such as permission checking), and large-scale traversals (for provenance data mining). Among these needs, large-scale property graph traversals are particularly challenging for distributed graph storage systems. Most existing graph systems implement a "level synchronous" breadth-first search algorithm that relies on global synchronization in each traversal step. This performs well in many problem domains; but a rich metadata management system is characterized by imbalanced graphs, long traversal lengths, and concurrent workloads, each of which has the potential to introduce or exacerbate stragglers (i.e., abnormally slow steps or servers in a graph traversal) that lead to low overall throughput for synchronous traversal algorithms. Previous research indicated that the straggler problem can be mitigated by using asynchronous traversal algorithms, and many graph-processing frameworks have successfully demonstrated this approach. Such systems require the graph to be loaded into a separate batch-processing framework instead of being iteratively accessed, however. In this work, we investigate a general asynchronous graph traversal engine that can operate atop a rich metadata graph in its native format. We outline a traversal-aware query language and key optimizations (traversal-affiliate caching and execution merging) necessary for efficient performance. We further explore the effect of different graph partitioning strategies on the traversal performance for both synchronous and asynchronous traversal engines. Our experiments show that the asynchronous graph traversal engine is more efficient than its synchronous counterpart in the case of HPC rich metadata processing, where more servers are involved and larger traversals are needed. Furthermore, the asynchronous traversal engine is more adaptive to different graph partitioning strategies.
Expanding our understanding of students' use of graphs for learning physics
NASA Astrophysics Data System (ADS)
Laverty, James T.
It is generally agreed that the ability to visualize functional dependencies or physical relationships as graphs is an important step in modeling and learning. However, several studies in Physics Education Research (PER) have shown that many students in fact do not master this form of representation and even have misconceptions about the meaning of graphs that impede learning physics concepts. Working with graphs in classroom settings has been shown to improve student abilities with graphs, particularly when the students can interact with them. We introduce a novel problem type in an online homework system, which requires students to construct the graphs themselves in free form, and requires no hand-grading by instructors. A study of pre/post-test data using the Test of Understanding Graphs in Kinematics (TUG-K) over several semesters indicates that students learn significantly more from these graph construction problems than from the usual graph interpretation problems, and that graph interpretation alone may not have any significant effect. The interpretation of graphs, as well as the representation translation between textual, mathematical, and graphical representations of physics scenarios, are frequently listed among the higher order thinking skills we wish to convey in an undergraduate course. But to what degree do we succeed? Do students indeed employ higher order thinking skills when working through graphing exercises? We investigate students working through a variety of graph problems, and, using a think-aloud protocol, aim to reconstruct the cognitive processes that the students go through. We find that to a certain degree, these problems become commoditized and do not trigger the desired higher order thinking processes; simply translating ``textbook-like'' problems into the graphical realm will not achieve any additional educational goals. Whether the students have to interpret or construct a graph makes very little difference in the methods used by the students. We will also look at the results of using graph problems in an online learning environment. We will show evidence that construction problems lead to a higher degree of difficulty and degree of discrimination than other graph problems and discuss the influence the course has on these variables.
ERIC Educational Resources Information Center
Xi, Xiaoming
2010-01-01
Motivated by cognitive theories of graph comprehension, this study systematically manipulated characteristics of a line graph description task in a speaking test in ways to mitigate the influence of graph familiarity, a potential source of construct-irrelevant variance. It extends Xi (2005), which found that the differences in holistic scores on…
Building Scalable Knowledge Graphs for Earth Science
NASA Technical Reports Server (NTRS)
Ramachandran, Rahul; Maskey, Manil; Gatlin, Patrick; Zhang, Jia; Duan, Xiaoyi; Miller, J. J.; Bugbee, Kaylin; Christopher, Sundar; Freitag, Brian
2017-01-01
Knowledge Graphs link key entities in a specific domain with other entities via relationships. From these relationships, researchers can query knowledge graphs for probabilistic recommendations to infer new knowledge. Scientific papers are an untapped resource which knowledge graphs could leverage to accelerate research discovery. Goal: Develop an end-to-end (semi) automated methodology for constructing Knowledge Graphs for Earth Science.
Global dynamics for switching systems and their extensions by linear differential equations
NASA Astrophysics Data System (ADS)
Huttinga, Zane; Cummins, Bree; Gedeon, Tomáš; Mischaikow, Konstantin
2018-03-01
Switching systems use piecewise constant nonlinearities to model gene regulatory networks. This choice provides advantages in the analysis of behavior and allows the global description of dynamics in terms of Morse graphs associated to nodes of a parameter graph. The parameter graph captures spatial characteristics of a decomposition of parameter space into domains with identical Morse graphs. However, there are many cellular processes that do not exhibit threshold-like behavior and thus are not well described by a switching system. We consider a class of extensions of switching systems formed by a mixture of switching interactions and chains of variables governed by linear differential equations. We show that the parameter graphs associated to the switching system and any of its extensions are identical. For each parameter graph node, there is an order-preserving map from the Morse graph of the switching system to the Morse graph of any of its extensions. We provide counterexamples that show why possible stronger relationships between the Morse graphs are not valid.
Text categorization of biomedical data sets using graph kernels and a controlled vocabulary.
Bleik, Said; Mishra, Meenakshi; Huan, Jun; Song, Min
2013-01-01
Recently, graph representations of text have been showing improved performance over conventional bag-of-words representations in text categorization applications. In this paper, we present a graph-based representation for biomedical articles and use graph kernels to classify those articles into high-level categories. In our representation, common biomedical concepts and semantic relationships are identified with the help of an existing ontology and are used to build a rich graph structure that provides a consistent feature set and preserves additional semantic information that could improve a classifier's performance. We attempt to classify the graphs using both a set-based graph kernel that is capable of dealing with the disconnected nature of the graphs and a simple linear kernel. Finally, we report the results comparing the classification performance of the kernel classifiers to common text-based classifiers.
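A toy sketch of the overall pipeline: represent each document as a small concept graph and compare documents with a set-based kernel, here crudely approximated by Jaccard overlap of the node sets. The concepts are invented and no ontology is consulted, so this only mirrors the shape of the approach, not the ontology-derived graphs or kernels of the paper.

```python
# Hedged sketch: concept graphs per document and a simple set-based kernel.
import networkx as nx

def concept_graph(concept_edges):
    """Build a tiny graph whose nodes are concepts and edges are semantic relations."""
    G = nx.Graph()
    G.add_edges_from(concept_edges)
    return G

def set_kernel(G1, G2):
    """Jaccard overlap of the concept (node) sets of two document graphs."""
    a, b = set(G1.nodes), set(G2.nodes)
    return len(a & b) / len(a | b) if a | b else 0.0

doc1 = concept_graph([("insulin", "diabetes"), ("diabetes", "glucose")])
doc2 = concept_graph([("insulin", "glucose"), ("glucose", "metabolism")])
print("kernel value:", set_kernel(doc1, doc2))
```

The kernel matrix produced this way can be fed to any kernelized classifier (e.g., an SVM), which is the classification step the abstract refers to.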
Supermanifolds from Feynman graphs
NASA Astrophysics Data System (ADS)
Marcolli, Matilde; Rej, Abhijnan
2008-08-01
We generalize the computation of Feynman integrals of log divergent graphs in terms of the Kirchhoff polynomial to the case of graphs with both fermionic and bosonic edges, to which we assign a set of ordinary and Grassmann variables. This procedure gives a computation of the Feynman integrals in terms of a period on a supermanifold, for graphs admitting a basis of the first homology satisfying a condition generalizing the log divergence in this context. The analog in this setting of the graph hypersurfaces is a graph supermanifold given by the divisor of zeros and poles of the Berezinian of a matrix associated with the graph, inside a superprojective space. We introduce a Grothendieck group for supermanifolds and identify the subgroup generated by the graph supermanifolds. This can be seen as a general procedure for constructing interesting classes of supermanifolds with associated periods.
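For readers unfamiliar with the Kirchhoff (graph) polynomial mentioned above, here is a brute-force sketch that sums, over spanning trees, the product of the variables of edges not in the tree; conventions differ by duality, and this exponential enumeration is intended only for tiny graphs.

```python
# Hedged sketch: Kirchhoff/first Symanzik polynomial by spanning-tree enumeration.
import itertools
import networkx as nx
import sympy

def kirchhoff_polynomial(G):
    t = {e: sympy.Symbol(f"t{i}") for i, e in enumerate(G.edges)}
    n = G.number_of_nodes()
    poly = 0
    for tree_edges in itertools.combinations(G.edges, n - 1):
        T = nx.Graph(tree_edges)
        if T.number_of_nodes() == n and nx.is_tree(T):          # a spanning tree
            poly += sympy.Mul(*[t[e] for e in G.edges if e not in tree_edges])
    return sympy.expand(poly)

K3 = nx.cycle_graph(3)               # one-loop graph with three edges
print(kirchhoff_polynomial(K3))      # t0 + t1 + t2
```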
Function plot response: A scalable system for teaching kinematics graphs
NASA Astrophysics Data System (ADS)
Laverty, James; Kortemeyer, Gerd
2012-08-01
Understanding and interpreting graphs are essential skills in all sciences. While students are mostly proficient in plotting given functions and reading values off graphs, they frequently lack the ability to construct and interpret graphs in a meaningful way. Students can use graphs as representations of value pairs, but often fail to interpret them as the representation of functions, and mostly fail to use them as representations of physical reality. Working with graphs in classroom settings has been shown to improve student abilities with graphs, particularly when the students can interact with them. We introduce a novel problem type in an online homework system, which requires students to construct the graphs themselves in free form, and requires no hand-grading by instructors. Initial experiences using the new problem type in an introductory physics course are reported.
Many-core graph analytics using accelerated sparse linear algebra routines
NASA Astrophysics Data System (ADS)
Kozacik, Stephen; Paolini, Aaron L.; Fox, Paul; Kelmelis, Eric
2016-05-01
Graph analytics is a key component in identifying emerging trends and threats in many real-world applications. Large-scale graph analytics frameworks provide a convenient and highly scalable platform for developing algorithms to analyze large datasets. Although conceptually scalable, these techniques exhibit poor performance on modern computational hardware. Another model of graph computation has emerged that promises improved performance and scalability by using abstract linear algebra operations as the basis for graph analysis, as laid out by the GraphBLAS standard. By using sparse linear algebra as the basis, existing highly efficient algorithms can be adapted to perform computations on the graph. This approach, however, is often less intuitive to graph analytics experts, who are accustomed to vertex-centric APIs such as Giraph, GraphX, and Tinkerpop. We are developing an implementation of the high-level operations supported by these APIs in terms of linear algebra operations. This implementation is backed by many-core implementations of the fundamental GraphBLAS operations required, and offers the advantages of both the intuitive programming model of a vertex-centric API and the performance of a sparse linear algebra implementation. This technology can reduce the number of nodes required, as well as the run time, for a graph analysis problem, enabling customers to perform more complex analysis with less hardware at lower cost. All of this can be accomplished without requiring the customer to make any changes to their analytics code, thanks to compatibility with existing graph APIs.
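The following sketch illustrates the "graph algorithms as sparse linear algebra" idea behind GraphBLAS: one breadth-first-search level expansion is a sparse matrix-vector product. It uses plain SciPy rather than any GraphBLAS implementation and is purely illustrative.

```python
# Hedged sketch: BFS frontier expansion as sparse matrix-vector products.
import numpy as np
import scipy.sparse as sp

edges = [(0, 1), (0, 2), (1, 3), (2, 3), (3, 4)]
n = 5
rows, cols = zip(*edges)
A = sp.csr_matrix((np.ones(len(edges)), (rows, cols)), shape=(n, n))
A = A + A.T                                  # symmetric: undirected graph

frontier = np.zeros(n)
frontier[0] = 1.0                            # BFS starts at vertex 0
visited = frontier.copy()
while frontier.any():
    # One BFS level: multiply the frontier by the adjacency matrix,
    # then mask out vertices that were already visited.
    reached = np.asarray(A @ frontier).ravel() > 0
    frontier = reached * (visited == 0)
    visited = visited + frontier
print("vertices reachable from 0:", np.flatnonzero(visited))
```

In a true GraphBLAS setting the multiply would run over a Boolean or tropical semiring on a many-core backend; the structure of the computation is the same.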
Asada, Yukiko; Abel, Hannah; Skedgel, Chris; Warner, Grace
2017-12-01
Policy Points: Effective graphs can be a powerful tool in communicating health inequality. The choice of graphs is often based on preferences and familiarity rather than science. According to the literature on graph perception, effective graphs allow human brains to decode visual cues easily. Dot charts are easier to decode than bar charts, and thus they are more effective. Dot charts are a flexible and versatile way to display information about health inequality. Consistent with the health risk communication literature, the captions accompanying health inequality graphs should provide a numerical, explicitly calculated description of health inequality, expressed in absolute and relative terms, from carefully thought-out perspectives. Graphs are an essential tool for communicating health inequality, a key health policy concern. The choice of graphs is often driven by personal preferences and familiarity. Our article is aimed at health policy researchers developing health inequality graphs for policy and scientific audiences and seeks to (1) raise awareness of the effective use of graphs in communicating health inequality; (2) advocate for a particular type of graph (ie, dot charts) to depict health inequality; and (3) suggest key considerations for the captions accompanying health inequality graphs. Using composite review methods, we selected the prevailing recommendations for improving graphs in scientific reporting. To find the origins of these recommendations, we reviewed the literature on graph perception and then applied what we learned to the context of health inequality. In addition, drawing from the numeracy literature in health risk communication, we examined numeric and verbal formats to explain health inequality graphs. Many disciplines offer commonsense recommendations for visually presenting quantitative data. The literature on graph perception, which defines effective graphs as those allowing the easy decoding of visual cues in human brains, shows that with their more accurate and easier-to-decode visual cues, dot charts are more effective than bar charts. Dot charts can flexibly present a large amount of information in limited space. They also can easily accommodate typical health inequality information to describe a health variable (eg, life expectancy) by an inequality domain (eg, income) with domain groups (eg, poor and rich) in a population (eg, Canada) over time periods (eg, 2010 and 2017). The numeracy literature suggests that a health inequality graph's caption should provide a numerical, explicitly calculated description of health inequality expressed in absolute and relative terms, from carefully thought-out perspectives. Given the ubiquity of graphs, the health inequality field should learn from the vibrant multidisciplinary literature how to construct effective graphic communications, especially by considering to use dot charts. © 2017 Milbank Memorial Fund.
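As an illustration of the recommended display, the sketch below draws a dot chart of a health variable by domain group for two time points with matplotlib; the numbers are invented purely to show the layout, not taken from any real inequality data.

```python
# Hedged sketch: a dot chart of life expectancy by income group for two years.
import matplotlib.pyplot as plt

groups = ["Lowest income", "Second", "Middle", "Fourth", "Highest income"]
life_exp_2010 = [76.1, 78.0, 79.2, 80.5, 82.3]    # synthetic values
life_exp_2017 = [76.8, 78.9, 80.1, 81.6, 83.4]    # synthetic values

fig, ax = plt.subplots(figsize=(6, 3))
ax.plot(life_exp_2010, groups, "o", label="2010")
ax.plot(life_exp_2017, groups, "s", mfc="none", label="2017")
ax.set_xlabel("Life expectancy (years)")
ax.legend(frameon=False)
fig.tight_layout()
plt.show()
```

Positions along a common scale (the dots) are easier to decode than bar lengths, which is the graph-perception argument the article makes for preferring dot charts.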
The Quantity and Quality of Scientific Graphs in Pharmaceutical Advertisements
Cooper, Richelle J; Schriger, David L; Wallace, Roger C; Mikulich, Vladislav J; Wilkes, Michael S
2003-01-01
We characterized the quantity and quality of graphs in all pharmaceutical advertisements in 10 U.S. medical journals. Four hundred eighty-four unique advertisements (of 3,185 total advertisements) contained 836 glossy and 455 small-print pages. Forty-nine percent of glossy page area was nonscientific figures/images, 0.4% tables, and 1.6% scientific graphs (74 graphs in 64 advertisements). All 74 graphs were univariate displays, 4% were distributions, and 4% contained confidence intervals for summary measures. Extraneous decoration (66%) and redundancy (46%) were common. Fifty-eight percent of graphs presented an outcome relevant to the drug's indication. Numeric distortion, specifically prohibited by FDA regulations, occurred in 36% of graphs. PMID:12709097
Molecular graph convolutions: moving beyond fingerprints
NASA Astrophysics Data System (ADS)
Kearnes, Steven; McCloskey, Kevin; Berndl, Marc; Pande, Vijay; Riley, Patrick
2016-08-01
Molecular "fingerprints" encoding structural information are the workhorse of cheminformatics and machine learning in drug discovery applications. However, fingerprint representations necessarily emphasize particular aspects of the molecular structure while ignoring others, rather than allowing the model to make data-driven decisions. We describe molecular graph convolutions, a machine learning architecture for learning from undirected graphs, specifically small molecules. Graph convolutions use a simple encoding of the molecular graph—atoms, bonds, distances, etc.—which allows the model to take greater advantage of information in the graph structure. Although graph convolutions do not outperform all fingerprint-based methods, they (along with other graph-based methods) represent a new paradigm in ligand-based virtual screening with exciting opportunities for future improvement.
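As a rough illustration of the idea, the sketch below implements a single message-passing (graph-convolution) step over a toy molecular graph: each atom's feature vector is combined with an aggregate of its bonded neighbors' features through weight matrices. The feature sizes, random weights, and the toy molecule are assumptions for illustration, not the architecture evaluated in the paper.

```python
# Minimal sketch of one graph-convolution (message passing) step over a
# molecular graph. All sizes, weights, and the toy "molecule" are illustrative.
import numpy as np

def graph_conv_layer(A, X, W_self, W_neigh):
    """Combine each atom's features with the sum of its bonded neighbors'
    features, then apply a ReLU nonlinearity."""
    messages = A @ X @ W_neigh          # aggregate neighbor features
    updated = X @ W_self + messages     # combine with the atom's own features
    return np.maximum(updated, 0.0)     # ReLU

rng = np.random.default_rng(0)
# Toy molecule: 4 atoms in a linear chain, 8 input features per atom.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = rng.normal(size=(4, 8))
W_self = rng.normal(size=(8, 16)) * 0.1
W_neigh = rng.normal(size=(8, 16)) * 0.1
H = graph_conv_layer(A, X, W_self, W_neigh)
print(H.shape)  # (4, 16): one 16-dimensional embedding per atom
```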
Stereo matching using census cost over cross window and segmentation-based disparity refinement
NASA Astrophysics Data System (ADS)
Li, Qingwu; Ni, Jinyan; Ma, Yunpeng; Xu, Jinxin
2018-03-01
Stereo matching is a vital requirement for many applications, such as three-dimensional (3-D) reconstruction, robot navigation, object detection, and industrial measurement. To improve the practicability of stereo matching, a method using census cost over cross window and segmentation-based disparity refinement is proposed. First, a cross window is obtained using distance difference and intensity similarity in binocular images. Census cost over the cross window and color cost are combined as the matching cost, which is aggregated by the guided filter. Then, winner-takes-all strategy is used to calculate the initial disparities. Second, a graph-based segmentation method is combined with color and edge information to achieve moderate under-segmentation. The segmented regions are classified into reliable regions and unreliable regions by consistency checking. Finally, the two regions are optimized by plane fitting and propagation, respectively, to match the ambiguous pixels. The experimental results are on Middlebury Stereo Datasets, which show that the proposed method has good performance in occluded and discontinuous regions, and it obtains smoother disparity maps with a lower average matching error rate compared with other algorithms.
AMPX: a modular code system for generating coupled multigroup neutron-gamma libraries from ENDF/B
DOE Office of Scientific and Technical Information (OSTI.GOV)
Greene, N.M.; Lucius, J.L.; Petrie, L.M.
1976-03-01
AMPX is a modular system for producing coupled multigroup neutron-gamma cross section sets. Basic neutron and gamma cross-section data for AMPX are obtained from ENDF/B libraries. Most commonly used operations required to generate and collapse multigroup cross-section sets are provided in the system. AMPX is flexibly dimensioned; neutron group structures, gamma group structures, and expansion orders to represent anisotropic processes are all arbitrary and limited only by available computer core and budget. The basic processes provided will (1) generate multigroup neutron cross sections; (2) generate multigroup gamma cross sections; (3) generate gamma yields for gamma-producing neutron interactions; (4) combine neutron cross sections, gamma cross sections, and gamma yields into final "coupled sets"; (5) perform one-dimensional discrete ordinates transport or diffusion theory calculations for neutrons and gammas and, on option, collapse the cross sections to a broad-group structure, using the one-dimensional results as weighting functions; (6) plot cross sections, on option, to facilitate the "evaluation" of a particular multigroup set of data; (7) update and maintain multigroup cross section libraries in such a manner as to make it not only easy to combine new data with previously processed data but also to do it in a single pass on the computer; and (8) output multigroup cross sections in convenient formats for other codes. (auth)
Alternative Fuels Data Center: Maps and Data
Interactive graphs of S&FP AFV acquisitions by fleet type (1992-2014) and by fuel type (1992-2015), with downloadable data; last updated August 2016.
PuLP/XtraPuLP : Partitioning Tools for Extreme-Scale Graphs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Slota, George M; Rajamanickam, Sivasankaran; Madduri, Kamesh
2017-09-21
PuLP/XtraPuLP is software for partitioning graphs from several real-world problems. Graphs occur in many real-world settings, from road networks and social networks to scientific simulations. For efficient parallel processing, these graphs have to be partitioned (split) with respect to metrics such as computation and communication costs. Our software allows such partitioning for massive graphs.
Dynamics on Networks of Manifolds
NASA Astrophysics Data System (ADS)
DeVille, Lee; Lerman, Eugene
2015-03-01
We propose a precise definition of a continuous time dynamical system made up of interacting open subsystems. The interconnections of subsystems are coded by directed graphs. We prove that the appropriate maps of graphs called graph fibrations give rise to maps of dynamical systems. Consequently surjective graph fibrations give rise to invariant subsystems and injective graph fibrations give rise to projections of dynamical systems.
Graph edit distance from spectral seriation.
Robles-Kelly, Antonio; Hancock, Edwin R
2005-03-01
This paper is concerned with computing graph edit distance. One of the criticisms that can be leveled at existing methods for computing graph edit distance is that they lack some of the formality and rigor of the computation of string edit distance. Hence, our aim is to convert graphs to string sequences so that string matching techniques can be used. To do this, we use a graph spectral seriation method to convert the adjacency matrix into a string or sequence order. We show how the serial ordering can be established using the leading eigenvector of the graph adjacency matrix. We pose the problem of graph-matching as a maximum a posteriori probability (MAP) alignment of the seriation sequences for pairs of graphs. This treatment leads to an expression in which the edit cost is the negative logarithm of the a posteriori sequence alignment probability. We compute the edit distance by finding the sequence of string edit operations which minimizes the cost of the path traversing the edit lattice. The edit costs are determined by the components of the leading eigenvectors of the adjacency matrix and by the edge densities of the graphs being matched. We demonstrate the utility of the edit distance on a number of graph clustering problems.
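A minimal sketch of the seriation step described above, assuming an undirected graph with a symmetric adjacency matrix: vertices are ordered by the components of the leading eigenvector, producing a string-like sequence that standard edit-distance machinery can then compare. The example graph and the use of numpy are illustrative assumptions.

```python
# Sketch of spectral seriation: order a graph's vertices by the components of
# the leading eigenvector of its adjacency matrix.
import numpy as np

def spectral_seriation(A: np.ndarray) -> np.ndarray:
    """Return vertex indices sorted by the leading eigenvector of A."""
    eigvals, eigvecs = np.linalg.eigh(A)       # A is symmetric (undirected graph)
    leading = eigvecs[:, np.argmax(eigvals)]
    leading = leading if leading.sum() >= 0 else -leading  # fix sign ambiguity
    return np.argsort(-leading)                # descending component magnitude

A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
print(spectral_seriation(A))  # e.g. [2 0 1 3]: a serial ordering of the vertices
```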
NEFI: Network Extraction From Images
Dirnberger, M.; Kehl, T.; Neumann, A.
2015-01-01
Networks are amongst the central building blocks of many systems. Given a graph of a network, methods from graph theory enable a precise investigation of its properties. Software for the analysis of graphs is widely available and has been applied to study various types of networks. In some applications, graph acquisition is relatively simple. However, for many networks data collection relies on images where graph extraction requires domain-specific solutions. Here we introduce NEFI, a tool that extracts graphs from images of networks originating in various domains. Regarding previous work on graph extraction, theoretical results are fully accessible only to an expert audience and ready-to-use implementations for non-experts are rarely available or insufficiently documented. NEFI provides a novel platform allowing practitioners to easily extract graphs from images by combining basic tools from image processing, computer vision and graph theory. Thus, NEFI constitutes an alternative to tedious manual graph extraction and special purpose tools. We anticipate NEFI to enable time-efficient collection of large datasets. The analysis of these novel datasets may open up the possibility to gain new insights into the structure and function of various networks. NEFI is open source and available at http://nefi.mpi-inf.mpg.de. PMID:26521675
NASA Astrophysics Data System (ADS)
Kase, Sue E.; Vanni, Michelle; Knight, Joanne A.; Su, Yu; Yan, Xifeng
2016-05-01
Within operational environments decisions must be made quickly based on the information available. Identifying an appropriate knowledge base and accurately formulating a search query are critical tasks for decision-making effectiveness in dynamic situations. The spreading of graph data management tools to access large graph databases is a rapidly emerging research area of potential benefit to the intelligence community. A graph representation provides a natural way of modeling data in a wide variety of domains. Graph structures use nodes, edges, and properties to represent and store data. This research investigates the advantages of information search by graph query initiated by the analyst and interactively refined within the contextual dimensions of the answer space toward a solution. The paper introduces SLQ, a user-friendly graph querying system enabling the visual formulation of schemaless and structureless graph queries. SLQ is demonstrated with an intelligence analyst information search scenario focused on identifying individuals responsible for manufacturing a mosquito-hosted deadly virus. The scenario highlights the interactive construction of graph queries without prior training in complex query languages or graph databases, intuitive navigation through the problem space, and visualization of results in graphical format.
Metric learning with spectral graph convolutions on brain connectivity networks.
Ktena, Sofia Ira; Parisot, Sarah; Ferrante, Enzo; Rajchl, Martin; Lee, Matthew; Glocker, Ben; Rueckert, Daniel
2018-04-01
Graph representations are often used to model structured data at an individual or population level and have numerous applications in pattern recognition problems. In the field of neuroscience, where such representations are commonly used to model structural or functional connectivity between a set of brain regions, graphs have proven to be of great importance. This is mainly due to the capability of revealing patterns related to brain development and disease, which were previously unknown. Evaluating similarity between these brain connectivity networks in a manner that accounts for the graph structure and is tailored for a particular application is, however, non-trivial. Most existing methods fail to accommodate the graph structure, discarding information that could be beneficial for further classification or regression analyses based on these similarities. We propose to learn a graph similarity metric using a siamese graph convolutional neural network (s-GCN) in a supervised setting. The proposed framework takes into consideration the graph structure for the evaluation of similarity between a pair of graphs, by employing spectral graph convolutions that allow the generalisation of traditional convolutions to irregular graphs and operates in the graph spectral domain. We apply the proposed model on two datasets: the challenging ABIDE database, which comprises functional MRI data of 403 patients with autism spectrum disorder (ASD) and 468 healthy controls aggregated from multiple acquisition sites, and a set of 2500 subjects from UK Biobank. We demonstrate the performance of the method for the tasks of classification between matching and non-matching graphs, as well as individual subject classification and manifold learning, showing that it leads to significantly improved results compared to traditional methods. Copyright © 2017 Elsevier Inc. All rights reserved.
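The spectral convolutions referred to above generalize filtering to irregular graphs by operating on the graph Laplacian. The sketch below applies a low-order polynomial filter in the normalized Laplacian to node signals; the coefficients and the toy graph are assumptions, and the learned, siamese aspects of the paper's model are not reproduced here.

```python
# Sketch of a spectral graph filter: a polynomial in the normalized Laplacian
# applied to node signals. Coefficients and the toy graph are illustrative.
import numpy as np

def normalized_laplacian(A: np.ndarray) -> np.ndarray:
    d = A.sum(axis=1)
    d_inv_sqrt = np.where(d > 0, 1.0 / np.sqrt(d), 0.0)
    D_inv_sqrt = np.diag(d_inv_sqrt)
    return np.eye(A.shape[0]) - D_inv_sqrt @ A @ D_inv_sqrt

def spectral_filter(A: np.ndarray, X: np.ndarray, coeffs) -> np.ndarray:
    """Apply the polynomial filter sum_k coeffs[k] * L^k to the signal X."""
    L = normalized_laplacian(A)
    out = np.zeros_like(X)
    Lk = np.eye(A.shape[0])
    for c in coeffs:
        out += c * (Lk @ X)
        Lk = Lk @ L
    return out

A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
X = np.array([[1.0], [0.0], [-1.0]])              # one scalar signal per node
print(spectral_filter(A, X, coeffs=[0.5, -0.25, 0.1]))
```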
Proving relations between modular graph functions
NASA Astrophysics Data System (ADS)
Basu, Anirban
2016-12-01
We consider modular graph functions that arise in the low energy expansion of the four graviton amplitude in type II string theory. The vertices of these graphs are the positions of insertions of vertex operators on the toroidal worldsheet, while the links are the scalar Green functions connecting the vertices. Graphs with four and five links satisfy several non-trivial relations, which have been proved recently. We prove these relations by using elementary properties of Green functions and the details of the graphs. We also prove a relation between modular graph functions with six links.
Panconnectivity of Locally Connected K(1,3)-Free Graphs
1989-10-15
A Graph Based Interface for Representing Volume Visualization Results
NASA Technical Reports Server (NTRS)
Patten, James M.; Ma, Kwan-Liu
1998-01-01
This paper discusses a graph based user interface for representing the results of the volume visualization process. As images are rendered, they are connected to other images in a graph based on their rendering parameters. The user can take advantage of the information in this graph to understand how certain rendering parameter changes affect a dataset, making the visualization process more efficient. Because the graph contains more information than is contained in an unstructured history of images, the image graph is also helpful for collaborative visualization and animation.
Sions, Jaclyn Megan; Smith, Andrew Craig; Hicks, Gregory Evan; Elliott, James Matthew
2016-08-01
To evaluate intra- and inter-examiner reliability for the assessment of relative cross-sectional area, muscle-to-fat infiltration indices, and relative muscle cross-sectional area, i.e., total cross-sectional area minus intramuscular fat, from T1-weighted magnetic resonance images obtained in older adults with chronic low back pain. Reliability study. n = 13 (69.3 ± 8.2 years old). After lumbar magnetic resonance imaging, two examiners produced relative cross-sectional area measurements of multifidi, erector spinae, psoas, and quadratus lumborum by tracing regions of interest just inside fascial borders. Pixel-intensity summaries were used to determine muscle-to-fat infiltration indices; relative muscle cross-sectional area was calculated. Intraclass correlation coefficients were used to estimate intra- and inter-examiner reliability; standard error of measurement was calculated. Intra-examiner intraclass correlation coefficient point estimates for relative cross-sectional area, muscle-to-fat infiltration indices, and relative muscle cross-sectional area were excellent for multifidi and erector spinae across levels L2-L5 (ICC = 0.77-0.99). At L3, intra-examiner reliability was excellent for relative cross-sectional area, muscle-to-fat infiltration indices, and relative muscle cross-sectional area for both psoas and quadratus lumborum (ICC = 0.81-0.99). Inter-examiner intraclass correlation coefficients ranged from poor to excellent for relative cross-sectional area, muscle-to-fat infiltration indices, and relative muscle cross-sectional area. Assessment of relative cross-sectional area, muscle-to-fat infiltration indices, and relative muscle cross-sectional area in older adults with chronic low back pain can be reliably determined by one examiner from T1-weighted images. Such assessments provide valuable information, as muscle-to-fat infiltration indices and relative muscle cross-sectional area indicate that a substantial amount of relative cross-sectional area may be magnetic resonance-visible intramuscular fat in older adults with chronic low back pain. © 2015 American Academy of Pain Medicine. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
NASA Astrophysics Data System (ADS)
Singh, Suvam; Naghma, Rahla; Kaur, Jaspreet; Antony, Bobby
2016-07-01
The total and ionization cross sections for electron scattering by benzene, halobenzenes, toluene, aniline, and phenol are reported over a wide energy domain. The multi-scattering centre spherical complex optical potential method has been employed to find the total elastic and inelastic cross sections. The total ionization cross section is estimated from total inelastic cross section using the complex scattering potential-ionization contribution method. In the present article, the first theoretical calculations for electron impact total and ionization cross section have been performed for most of the targets having numerous practical applications. A reasonable agreement is obtained compared to existing experimental observations for all the targets reported here, especially for the total cross section.
Parameterized Cross Sections for Pion Production in Proton-Proton Collisions
NASA Technical Reports Server (NTRS)
Blattnig, Steve R.; Swaminathan, Sudha R.; Kruger, Adam T.; Ngom, Moussa; Norbury, John W.; Tripathi, R. K.
2000-01-01
An accurate knowledge of cross sections for pion production in proton-proton collisions finds wide application in particle physics, astrophysics, cosmic ray physics, and space radiation problems, especially in situations where an incident proton is transported through some medium and knowledge of the output particle spectrum is required when given the input spectrum. In these cases, accurate parameterizations of the cross sections are desired. In this paper much of the experimental data are reviewed and compared with a wide variety of different cross section parameterizations. Therefore, parameterizations of neutral and charged pion cross sections are provided that give a very accurate description of the experimental data. Lorentz invariant differential cross sections, spectral distributions, and total cross section parameterizations are presented.
What energy functions can be minimized via graph cuts?
Kolmogorov, Vladimir; Zabih, Ramin
2004-02-01
In the last few years, several new algorithms based on graph cuts have been developed to solve energy minimization problems in computer vision. Each of these techniques constructs a graph such that the minimum cut on the graph also minimizes the energy. Yet, because these graph constructions are complex and highly specific to a particular energy function, graph cuts have seen limited application to date. In this paper, we give a characterization of the energy functions that can be minimized by graph cuts. Our results are restricted to functions of binary variables. However, our work generalizes many previous constructions and is easily applicable to vision problems that involve large numbers of labels, such as stereo, motion, image restoration, and scene reconstruction. We give a precise characterization of what energy functions can be minimized using graph cuts, among the energy functions that can be written as a sum of terms containing three or fewer binary variables. We also provide a general-purpose construction to minimize such an energy function. Finally, we give a necessary condition for any energy function of binary variables to be minimized by graph cuts. Researchers who are considering the use of graph cuts to optimize a particular energy function can use our results to determine if this is possible and then follow our construction to create the appropriate graph. A software implementation is freely available.
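A small sketch of the construction in the common special case of a Potts-style binary energy (unary costs plus nonnegative penalties for disagreeing neighbors), which satisfies the regularity condition characterized in the paper. networkx's min-cut routine is used as the solver; the three-variable example and all costs are illustrative assumptions, not the authors' implementation.

```python
# Sketch: minimize a submodular binary (Potts) energy by building the standard
# s-t graph and taking a minimum cut. Source side = label 0, sink side = label 1.
import networkx as nx

def minimize_binary_energy(unary, pairwise):
    """unary: {i: (cost_if_0, cost_if_1)}, pairwise: {(i, j): weight >= 0}.
    Returns (labeling, energy) where labeling minimizes the total energy."""
    G = nx.DiGraph()
    for i, (c0, c1) in unary.items():
        G.add_edge("s", i, capacity=c1)   # this edge is cut if i takes label 1
        G.add_edge(i, "t", capacity=c0)   # this edge is cut if i takes label 0
    for (i, j), w in pairwise.items():
        G.add_edge(i, j, capacity=w)      # cut if the two labels disagree
        G.add_edge(j, i, capacity=w)
    cut_value, (source_side, _) = nx.minimum_cut(G, "s", "t")
    return {i: (0 if i in source_side else 1) for i in unary}, cut_value

unary = {0: (1.0, 4.0), 1: (2.0, 2.0), 2: (5.0, 1.0)}   # data costs
pairwise = {(0, 1): 1.5, (1, 2): 1.5}                    # smoothness costs
labels, energy = minimize_binary_energy(unary, pairwise)
print(labels, energy)
```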
Connectivity: Performance Portable Algorithms for graph connectivity v. 0.1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Slota, George; Rajamanickam, Sivasankaran; Madduri, Kamesh
Graphs occur in many real-world settings, from road networks and social networks to scientific simulations. Connectivity is graph analysis software for computing graph connectivity on modern architectures such as multicore CPUs, the Xeon Phi, and GPUs.
Graph wavelet alignment kernels for drug virtual screening.
Smalter, Aaron; Huan, Jun; Lushington, Gerald
2009-06-01
In this paper, we introduce a novel statistical modeling technique for target property prediction, with applications to virtual screening and drug design. In our method, we use graphs to model chemical structures and apply a wavelet analysis of graphs to summarize features capturing graph local topology. We design a novel graph kernel function to utilize the topology features to build predictive models for chemicals via Support Vector Machine classifier. We call the new graph kernel a graph wavelet-alignment kernel. We have evaluated the efficacy of the wavelet-alignment kernel using a set of chemical structure-activity prediction benchmarks. Our results indicate that the use of the kernel function yields performance profiles comparable to, and sometimes exceeding that of the existing state-of-the-art chemical classification approaches. In addition, our results also show that the use of wavelet functions significantly decreases the computational costs for graph kernel computation with more than ten fold speedup.
Flexibility in data interpretation: effects of representational format.
Braithwaite, David W; Goldstone, Robert L
2013-01-01
Graphs and tables differentially support performance on specific tasks. For tasks requiring reading off single data points, tables are as good as or better than graphs, while for tasks involving relationships among data points, graphs often yield better performance. However, the degree to which graphs and tables support flexibility across a range of tasks is not well-understood. In two experiments, participants detected main and interaction effects in line graphs and tables of bivariate data. Graphs led to more efficient performance, but also lower flexibility, as indicated by a larger discrepancy in performance across tasks. In particular, detection of main effects of variables represented in the graph legend was facilitated relative to detection of main effects of variables represented in the x-axis. Graphs may be a preferable representational format when the desired task or analytical perspective is known in advance, but may also induce greater interpretive bias than tables, necessitating greater care in their use and design.
Mathematical modeling of the malignancy of cancer using graph evolution.
Gunduz-Demir, Cigdem
2007-10-01
We report a novel computational method based on graph evolution process to model the malignancy of brain cancer called glioma. In this work, we analyze the phases that a graph passes through during its evolution and demonstrate strong relation between the malignancy of cancer and the phase of its graph. From the photomicrographs of tissues, which are diagnosed as normal, low-grade cancerous and high-grade cancerous, we construct cell-graphs based on the locations of cells; we probabilistically generate an edge between every pair of cells depending on the Euclidean distance between them. For a cell-graph, we extract connectivity information including the properties of its connected components in order to analyze the phase of the cell-graph. Working with brain tissue samples surgically removed from 12 patients, we demonstrate that cell-graphs generated for different tissue types evolve differently and that they exhibit different phase properties, which distinguish a tissue type from another.
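An illustrative sketch of the cell-graph construction described above, assuming an exponential decay of edge probability with Euclidean distance (the decay form, parameters, and simulated cell positions are assumptions): edges are sampled between cell pairs and connected-component sizes are read off as simple connectivity features.

```python
# Sketch of a probabilistic cell-graph: cells closer together are linked with
# higher probability, and connectivity features are then extracted.
import numpy as np
import networkx as nx

def build_cell_graph(points, alpha=1.0, rng=None):
    rng = rng or np.random.default_rng(0)
    g = nx.Graph()
    g.add_nodes_from(range(len(points)))
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            d = np.linalg.norm(points[i] - points[j])
            if rng.random() < np.exp(-alpha * d):   # closer cells link more often
                g.add_edge(i, j)
    return g

rng = np.random.default_rng(42)
cells = rng.uniform(0, 10, size=(50, 2))            # simulated cell positions
g = build_cell_graph(cells, alpha=0.8, rng=rng)
components = [len(c) for c in nx.connected_components(g)]
print(sorted(components, reverse=True))              # connected-component sizes
```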
Topological properties of the limited penetrable horizontal visibility graph family
NASA Astrophysics Data System (ADS)
Wang, Minggang; Vilela, André L. M.; Du, Ruijin; Zhao, Longfeng; Dong, Gaogao; Tian, Lixin; Stanley, H. Eugene
2018-05-01
The limited penetrable horizontal visibility graph algorithm was recently introduced to map time series in complex networks. In this work, we extend this algorithm to create a directed-limited penetrable horizontal visibility graph and an image-limited penetrable horizontal visibility graph. We define two algorithms and provide theoretical results on the topological properties of these graphs associated with different types of real-value series. We perform several numerical simulations to check the accuracy of our theoretical results. Finally, we present an application of the directed-limited penetrable horizontal visibility graph to measure real-value time series irreversibility and an application of the image-limited penetrable horizontal visibility graph that discriminates noise from chaos. We also propose a method to measure the systematic risk using the image-limited penetrable horizontal visibility graph, and the empirical results show the effectiveness of our proposed algorithms.
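A compact sketch of the limited penetrable horizontal visibility mapping, under the usual convention that two time points are linked when at most rho intermediate samples reach or exceed the smaller of the two values (rho = 0 recovers the ordinary horizontal visibility graph). The series and parameter values are illustrative assumptions.

```python
# Sketch: map a time series to its limited penetrable horizontal visibility graph.
import numpy as np
import networkx as nx

def lphvg(series, rho=0):
    g = nx.Graph()
    g.add_nodes_from(range(len(series)))
    for i in range(len(series)):
        for j in range(i + 1, len(series)):
            # Count intermediate samples that block the horizontal line of sight.
            blockers = sum(series[k] >= min(series[i], series[j])
                           for k in range(i + 1, j))
            if blockers <= rho:
                g.add_edge(i, j)
    return g

x = np.random.default_rng(1).random(200)   # a noisy real-valued series
for rho in (0, 1, 2):
    g = lphvg(x, rho)
    print(rho, g.number_of_edges(), 2 * g.number_of_edges() / g.number_of_nodes())
```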
Semi-Automated Annotation of Biobank Data Using Standard Medical Terminologies in a Graph Database.
Hofer, Philipp; Neururer, Sabrina; Goebel, Georg
2016-01-01
Data describing biobank resources frequently contains unstructured free-text information or insufficient coding standards. (Bio-) medical ontologies like Orphanet Rare Diseases Ontology (ORDO) or the Human Disease Ontology (DOID) provide a high number of concepts, synonyms and entity relationship properties. Such standard terminologies increase quality and granularity of input data by adding comprehensive semantic background knowledge from validated entity relationships. Moreover, cross-references between terminology concepts facilitate data integration across databases using different coding standards. In order to encourage the use of standard terminologies, our aim is to identify and link relevant concepts with free-text diagnosis inputs within a biobank registry. Relevant concepts are selected automatically by lexical matching and SPARQL queries against a RDF triplestore. To ensure correctness of annotations, proposed concepts have to be confirmed by medical data administration experts before they are entered into the registry database. Relevant (bio-) medical terminologies describing diseases and phenotypes were identified and stored in a graph database which was tied to a local biobank registry. Concept recommendations during data input trigger a structured description of medical data and facilitate data linkage between heterogeneous systems.
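A minimal sketch of the lexical-matching step, assuming a small in-memory RDF graph queried with SPARQL via rdflib; the namespace, example concepts, and the substring-matching rule are illustrative assumptions standing in for ORDO/DOID and a production triplestore.

```python
# Sketch: match free-text diagnosis input against concept labels in an RDF graph.
from rdflib import Graph, Literal, Namespace, RDFS

EX = Namespace("http://example.org/disease/")   # hypothetical namespace
g = Graph()
g.add((EX.D001, RDFS.label, Literal("Fabry disease")))
g.add((EX.D002, RDFS.label, Literal("Gaucher disease")))

def suggest_concepts(free_text: str):
    """Return candidate concepts whose label contains the free-text input."""
    query = """
        PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
        SELECT ?concept ?label WHERE {
            ?concept rdfs:label ?label .
            FILTER(CONTAINS(LCASE(STR(?label)), LCASE(?needle)))
        }
    """
    return [(str(c), str(lbl))
            for c, lbl in g.query(query, initBindings={"needle": Literal(free_text)})]

# Proposed annotations would still be confirmed by a data administrator.
print(suggest_concepts("fabry"))
```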
X-Graphs: Language and Algorithms for Heterogeneous Graph Streams
2017-09-01
Report sections cover methods, assumptions, and procedures, including software abstractions for graph analytic applications and high-performance platforms for graph processing, with data stored in a distributed file system. The work includes implementations of novel methods for network analysis, among them detection of overlapping communities, personalized PageRank, and node embeddings.
Olechnovic, Kliment; Margelevicius, Mindaugas; Venclovas, Ceslovas
2011-03-01
We present Voroprot, an interactive cross-platform software tool that provides a unique set of capabilities for exploring geometric features of protein structure. Voroprot allows the construction and visualization of the Apollonius diagram (also known as the additively weighted Voronoi diagram), the Apollonius graph, protein alpha shapes, interatomic contact surfaces, solvent accessible surfaces, pockets and cavities inside protein structure. Voroprot is available for Windows, Linux and Mac OS X operating systems and can be downloaded from http://www.ibt.lt/bioinformatics/voroprot/.
NASA Astrophysics Data System (ADS)
van Eijck, Michiel; Goedhart, Martin J.; Ellermeijer, Ton
2011-01-01
Polysemy in graph-related practices is the phenomenon that a single graph can sustain different meanings assigned to it. Considerable research has been done on polysemy in graph-related practices in school science in which graphs are rather used as scientific tools. However, graphs in science textbooks are also used rather pedagogically to illustrate domain-specific textbook content and less empirical work has been done in this respect. The aim of this study is therefore to better understand polysemy in the domain-specific pedagogical use of graphs in science textbooks. From socio-cultural and cultural-historical perspectives, we perceive polysemy as irreducible to either the meaning-making (semiotic) resources provided by the graph or its readers who assign meaning to it. Departing from this framework, we simultaneously investigated: (a) the meanings 44 pre-university biology students assigned to the Cartesian plane of a graph that is commonly used as a pedagogical tool in Dutch high school biology textbooks (an electrocardiogram); (b) the semiotic resources provided by this graph; and (c) the educational practices of which it is supposedly a part according to the actions constituted by the textbooks that were to be conducted by students. Drawing on this case, we show polysemy in the pedagogical use of graphs in science textbooks. In turn, we show how this polysemy can be explained dialectically as the result of both the meaning-making resources provided by the textbooks and the graph-related practices in which students supposedly engaged by using their textbooks. The educational implications of these findings are discussed.
Garzo, Elisa; Fernández-Pascual, Mercedes; Morcillo, Cesar; Fereres, Alberto; Gómez-Guillamón, M Luisa; Tjallingii, W Fred
2017-02-18
Resistance of the melon line TGR-1551 to the aphid Aphis gossypii is based on preventing aphids from ingesting phloem sap. In electrical penetration graphs (EPGs), this resistance has been characterized with A. gossypii showing unusually long phloem salivation periods (waveform E1) mostly followed by pathway activities (waveform C) or if followed by phloem ingestion (waveform E2), ingestion was not sustained for more than 10 min. Stylectomy with aphids on susceptible and resistant plants was performed during EPG recording while the stylet tips were phloem inserted. This was followed by dissection of the penetrated leaf section, plant tissue fixation, resin embedding, and ultrathin sectioning for transmission electron microscopic observation in order to study the resistance mechanism in TGR-1551. The most obvious aspect appeared to be the coagulation of phloem proteins inside the stylet canals and the punctured sieve elements. Stylets of 5 aphids per genotype were amputated during sieve element (SE) salivation (E1) and SE ingestion (E2). Cross-sections of stylet bundles in susceptible melon plants showed that the contents of the stylet canals were totally clear and also, no coagulated phloem proteins occurred in their punctured sieve elements. In contrast, electron-dense coagulations were found in both locations in the resistant plants. Due to calcium binding, aphid saliva has been hypothesized to play an essential role in preventing/suppressing such coagulations that cause occlusion of sieve plates and of the food canal of the aphid's stylets. Doubts about this role of E1 salivation are discussed on the basis of our results. © 2017 Institute of Zoology, Chinese Academy of Sciences.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brost, Randolph C.; McLendon, William Clarence,
2013-01-01
Modeling geospatial information with semantic graphs enables search for sites of interest based on relationships between features, without requiring strong a priori models of feature shape or other intrinsic properties. Geospatial semantic graphs can be constructed from raw sensor data with suitable preprocessing to obtain a discretized representation. This report describes initial work toward extending geospatial semantic graphs to include temporal information, and initial results applying semantic graph techniques to SAR image data. We describe an efficient graph structure that includes geospatial and temporal information, which is designed to support simultaneous spatial and temporal search queries. We also report a preliminary implementation of feature recognition, semantic graph modeling, and graph search based on input SAR data. The report concludes with lessons learned and suggestions for future improvements.
The Vertex Version of Weighted Wiener Number for Bicyclic Molecular Structures
Gao, Wei
2015-01-01
Graphs are used to model chemical compounds and drugs. In the graphs, each vertex represents an atom of the molecule and edges between the corresponding vertices are used to represent covalent bonds between atoms. We call such a graph, which is derived from a chemical compound, a molecular graph. Evidence shows that the vertex-weighted Wiener number, which is defined over this molecular graph, is strongly correlated to both the melting point and boiling point of the compounds. In this paper, we report the extremal vertex-weighted Wiener number of bicyclic molecular graphs in terms of molecular structural analysis and graph transformations. The promising prospects of the application for chemical and pharmacy engineering are illustrated by the theoretical results achieved in this paper. PMID:26640513
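For illustration, the ordinary Wiener number (the sum of shortest-path distances over all vertex pairs) can be computed directly with networkx, and a vertex-weighted variant sketched alongside it. The particular weighting used below (w(u)·w(v)·d(u,v)) is an assumption for demonstration and not necessarily the definition analyzed in the paper.

```python
# Sketch: Wiener number of a toy bicyclic molecular graph, plus one plausible
# vertex-weighted variant. The weighting scheme is illustrative.
import networkx as nx

def vertex_weighted_wiener(g, weights):
    total = 0.0
    dist = dict(nx.all_pairs_shortest_path_length(g))
    nodes = list(g.nodes())
    for a in range(len(nodes)):
        for b in range(a + 1, len(nodes)):
            u, v = nodes[a], nodes[b]
            total += weights[u] * weights[v] * dist[u][v]
    return total

# Toy bicyclic molecular graph: two triangles sharing an edge.
g = nx.Graph([(0, 1), (1, 2), (2, 0), (1, 3), (3, 2)])
weights = {n: 1.0 for n in g}                 # e.g. atomic weights or degrees
print(nx.wiener_index(g))                     # unweighted Wiener number
print(vertex_weighted_wiener(g, weights))     # equals the above when all weights are 1
```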
Graph traversals, genes, and matroids: An efficient case of the travelling salesman problem
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gusfield, D.; Stelling, P.; Wang, Lusheng
1996-12-31
In this paper the authors consider graph traversal problems that arise from a particular technology for DNA sequencing - sequencing by hybridization (SBH). They first explain the connection of the graph problems to SBH and then focus on the traversal problems. They describe a practical polynomial time solution to the Travelling Salesman Problem in a rich class of directed graphs (including edge weighted binary de Bruijn graphs), and provide a bounded-error approximation algorithm for the maximum weight TSP in a superset of those directed graphs. The authors also establish the existence of a matroid structure defined on the set of Euler and Hamilton paths in the restricted class of graphs. 8 refs., 5 figs.
Molecular graph convolutions: moving beyond fingerprints
Kearnes, Steven; McCloskey, Kevin; Berndl, Marc; Pande, Vijay; Riley, Patrick
2016-01-01
Molecular “fingerprints” encoding structural information are the workhorse of cheminformatics and machine learning in drug discovery applications. However, fingerprint representations necessarily emphasize particular aspects of the molecular structure while ignoring others, rather than allowing the model to make data-driven decisions. We describe molecular graph convolutions, a machine learning architecture for learning from undirected graphs, specifically small molecules. Graph convolutions use a simple encoding of the molecular graph—atoms, bonds, distances, etc.—which allows the model to take greater advantage of information in the graph structure. Although graph convolutions do not outperform all fingerprint-based methods, they (along with other graph-based methods) represent a new paradigm in ligand-based virtual screening with exciting opportunities for future improvement. PMID:27558503
An investigation of MCNP6.1 beryllium oxide S(α, β) cross sections
Sartor, Raymond F.; Glazener, Natasha N.
2016-03-08
In MCNP6.1, materials are constructed by identifying the constituent isotopes (or elements in a few cases) individually. This list selects the corresponding microscopic cross sections calculated from the free-gas model to create the material macroscopic cross sections. Furthermore, the free-gas model and the corresponding material macroscopic cross sections assume that the interactions of atoms do not affect the nuclear cross sections.
DBCC Software as Database for Collisional Cross-Sections
NASA Astrophysics Data System (ADS)
Moroz, Daniel; Moroz, Paul
2014-10-01
Interactions of species, such as atoms, radicals, molecules, electrons, and photons, in plasmas used for materials processing could be very complex, and many of them could be described in terms of collisional cross-sections. Researchers involved in plasma simulations must select reasonable cross-sections for collisional processes for implementing them into their simulation codes to be able to correctly simulate plasmas. However, collisional cross-section data are difficult to obtain, and, for some collisional processes, the cross-sections are still not known. Data on collisional cross-sections can be obtained from numerous sources including numerical calculations, experiments, journal articles, conference proceedings, scientific reports, various universities' websites, national labs and centers specifically devoted to collecting data on cross-sections. The cross-sections data received from different sources could be partial, corresponding to limited energy ranges, or could even not be in agreement. The DBCC software package was designed to help researchers in collecting, comparing, and selecting cross-sections, some of which could be constructed from others or chosen as defaults. This is important as different researchers may place trust in different cross-sections or in different sources. We will discuss the details of DBCC and demonstrate how it works and why it is beneficial to researchers working on plasma simulations.
Evolutionary dynamics on graphs
NASA Astrophysics Data System (ADS)
Lieberman, Erez; Hauert, Christoph; Nowak, Martin A.
2005-01-01
Evolutionary dynamics have been traditionally studied in the context of homogeneous or spatially extended populations. Here we generalize population structure by arranging individuals on a graph. Each vertex represents an individual. The weighted edges denote reproductive rates which govern how often individuals place offspring into adjacent vertices. The homogeneous population, described by the Moran process, is the special case of a fully connected graph with evenly weighted edges. Spatial structures are described by graphs where vertices are connected with their nearest neighbours. We also explore evolution on random and scale-free networks. We determine the fixation probability of mutants, and characterize those graphs for which fixation behaviour is identical to that of a homogeneous population. Furthermore, some graphs act as suppressors and others as amplifiers of selection. It is even possible to find graphs that guarantee the fixation of any advantageous mutant. We also study frequency-dependent selection and show that the outcome of evolutionary games can depend entirely on the structure of the underlying graph. Evolutionary graph theory has many fascinating applications ranging from ecology to multi-cellular organization and economics.
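A simulation sketch of the graph Moran process described above: a single mutant of relative fitness r is placed on a random vertex; at each step an individual reproduces with probability proportional to fitness and its offspring replaces a uniformly chosen neighbor. The graph, fitness value, and trial count are illustrative assumptions.

```python
# Sketch: estimate the fixation probability of a mutant under the Moran process
# on a graph by direct simulation.
import random
import networkx as nx

def fixation_probability(g, r=1.1, trials=2000, seed=0):
    rng = random.Random(seed)
    nodes = list(g.nodes())
    fixed = 0
    for _ in range(trials):
        mutants = {rng.choice(nodes)}                      # one initial mutant
        while 0 < len(mutants) < len(nodes):
            fitness = [r if v in mutants else 1.0 for v in nodes]
            parent = rng.choices(nodes, weights=fitness, k=1)[0]
            child = rng.choice(list(g.neighbors(parent)))  # offspring replaces a neighbor
            if parent in mutants:
                mutants.add(child)
            else:
                mutants.discard(child)
        fixed += len(mutants) == len(nodes)
    return fixed / trials

g = nx.complete_graph(10)   # evenly weighted complete graph ~ well-mixed population
print(fixation_probability(g, r=1.1))
# Well-mixed prediction for comparison: (1 - 1/r) / (1 - 1/r**N) with N = 10.
```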
DELTACON: A Principled Massive-Graph Similarity Function with Attribution
DOE Office of Scientific and Technical Information (OSTI.GOV)
Koutra, Danai; Shah, Neil; Vogelstein, Joshua T.
How much did a network change since yesterday? How different is the wiring between Bob's brain (a left-handed male) and Alice's brain (a right-handed female)? Graph similarity with known node correspondence, i.e. the detection of changes in the connectivity of graphs, arises in numerous settings. In this work, we formally state the axioms and desired properties of the graph similarity functions, and evaluate when state-of-the-art methods fail to detect crucial connectivity changes in graphs. We propose DeltaCon, a principled, intuitive, and scalable algorithm that assesses the similarity between two graphs on the same nodes (e.g. employees of a company, customers of a mobile carrier). In our experiments on various synthetic and real graphs we showcase the advantages of our method over existing similarity measures. We also employ DeltaCon to real applications: (a) we classify people to groups of high and low creativity based on their brain connectivity graphs, and (b) do temporal anomaly detection in the who-emails-whom Enron graph.
Nagoor Gani, A; Latha, S R
2016-01-01
A Hamiltonian cycle in a graph is a cycle that visits each node/vertex exactly once. A graph containing a Hamiltonian cycle is called a Hamiltonian graph. There has been considerable research on counting the Hamiltonian cycles of a Hamiltonian graph. As the number of vertices and edges grows, it becomes very difficult to keep track of all the different ways in which the vertices are connected. Hence, analysis of large graphs can be done efficiently with the assistance of a computer system that interprets graphs as matrices, and a well-written algorithm will expedite the analysis further. The most convenient way to quickly test whether there is an edge between two vertices is to represent graphs using adjacency matrices. In this paper, a new algorithm is proposed to find a fuzzy Hamiltonian cycle using the adjacency matrix and the degrees of the vertices of a fuzzy graph. A fuzzy graph structure is also modeled to illustrate the proposed algorithm with a selected air network of Indigo airlines.
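To illustrate how an adjacency matrix drives such a search, the sketch below runs a plain backtracking search for a Hamiltonian cycle in a crisp (unweighted) graph; the fuzzy membership degrees used in the paper are not modeled, and the example matrix is an assumption.

```python
# Sketch: backtracking search for a Hamiltonian cycle using an adjacency matrix.
def hamiltonian_cycle(adj):
    n = len(adj)
    path = [0]
    visited = [False] * n
    visited[0] = True

    def extend():
        if len(path) == n:
            return adj[path[-1]][path[0]] == 1     # close the cycle back to the start
        for v in range(n):
            if not visited[v] and adj[path[-1]][v] == 1:
                visited[v] = True
                path.append(v)
                if extend():
                    return True
                path.pop()                          # backtrack
                visited[v] = False
        return False

    return path + [0] if extend() else None

adj = [[0, 1, 0, 1],
       [1, 0, 1, 1],
       [0, 1, 0, 1],
       [1, 1, 1, 0]]
print(hamiltonian_cycle(adj))   # e.g. [0, 1, 2, 3, 0]
```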
DELTACON: A Principled Massive-Graph Similarity Function with Attribution
Koutra, Danai; Shah, Neil; Vogelstein, Joshua T.; ...
2014-05-22
How much did a network change since yesterday? How different is the wiring between Bob's brain (a left-handed male) and Alice's brain (a right-handed female)? Graph similarity with known node correspondence, i.e. the detection of changes in the connectivity of graphs, arises in numerous settings. In this work, we formally state the axioms and desired properties of the graph similarity functions, and evaluate when state-of-the-art methods fail to detect crucial connectivity changes in graphs. We propose DeltaCon, a principled, intuitive, and scalable algorithm that assesses the similarity between two graphs on the same nodes (e.g. employees of a company, customers of a mobile carrier). In our experiments on various synthetic and real graphs we showcase the advantages of our method over existing similarity measures. We also employ DeltaCon to real applications: (a) we classify people to groups of high and low creativity based on their brain connectivity graphs, and (b) do temporal anomaly detection in the who-emails-whom Enron graph.
Measuring Graph Comprehension, Critique, and Construction in Science
NASA Astrophysics Data System (ADS)
Lai, Kevin; Cabrera, Julio; Vitale, Jonathan M.; Madhok, Jacquie; Tinker, Robert; Linn, Marcia C.
2016-08-01
Interpreting and creating graphs plays a critical role in scientific practice. The K-12 Next Generation Science Standards call for students to use graphs for scientific modeling, reasoning, and communication. To measure progress on this dimension, we need valid and reliable measures of graph understanding in science. In this research, we designed items to measure graph comprehension, critique, and construction and developed scoring rubrics based on the knowledge integration (KI) framework. We administered the items to over 460 middle school students. We found that the items formed a coherent scale and had good reliability using both item response theory and classical test theory. The KI scoring rubric showed that most students had difficulty linking graphs features to science concepts, especially when asked to critique or construct graphs. In addition, students with limited access to computers as well as those who speak a language other than English at home have less integrated understanding than others. These findings point to the need to increase the integration of graphing into science instruction. The results suggest directions for further research leading to comprehensive assessments of graph understanding.
ERIC Educational Resources Information Center
Association for Education in Journalism and Mass Communication.
The newspaper section of the Proceedings contains the following 18 papers: "The Role of Headlines and Nut Graphs in Helping Readers Learn from News Stories" (Glen L. Bleske); "Daily Newspaper Reporters' Views of Journalistic Roles: An Integrated Perspective" (Dan Berkowitz and James TerKeurst); "'Cohen V. Cowles Media':…
The World of AWRT: A Profile of the Membership of American Women in Radio and Television, Inc.
ERIC Educational Resources Information Center
American Women in Radio and Television, Inc., Washington, DC.
Based on a survey of 40.8 percent (1,094) of the members of American Women in Radio and Television (AWRT), this report documents women's characteristics and their contributions to the broadcasting field. Sections of the report provide bar graphs depicting: (1) the types of companies and agencies where AWRT members work; (2) the types of jobs held…
Deformation quantization with separation of variables of an endomorphism bundle
NASA Astrophysics Data System (ADS)
Karabegov, Alexander
2014-01-01
Given a holomorphic Hermitian vector bundle E and a star-product with separation of variables on a pseudo-Kähler manifold, we construct a star product on the sections of the endomorphism bundle of the dual bundle E∗ which also has the appropriately generalized property of separation of variables. For this star product we prove a generalization of Gammelgaard's graph-theoretic formula.
Ryder, Robert T.; Swezey, Christopher S.; Crangle, Robert D.; Trippi, Michael H.
2008-01-01
Geologic cross section E-E' is the first in a series of cross sections planned by the U.S. Geological Survey (USGS) to document and improve understanding of the geologic framework and petroleum systems of the Appalachian basin. Cross section E-E' provides a regional view of the structural and stratigraphic framework of the basin from the Findlay arch in northwestern Ohio to the Valley and Ridge province in eastern West Virginia, a distance of approximately 380 miles (mi) (fig. 1, on sheet 1). Cross section E-E' updates earlier geologic cross sections through the central Appalachian basin by Renfro and Feray (1970), Bennison (1978), and Bally and Snelson (1980) and a stratigraphic cross section by Colton (1970). Although other published cross sections through parts of the basin show more structural detail (for example, Shumaker, 1985; Kulander and Dean, 1986) and stratigraphic detail (for example, Ryder, 1992; de Witt and others, 1993; Hettinger, 2001), these other cross sections are of more limited extent geographically and stratigraphically. Although specific petroleum systems in the Appalachian basin are not identified on the cross section, many of their key elements (such as source rocks, reservoir rocks, seals, and traps) can be inferred from lithologic units, unconformities, and geologic structures shown on the cross section. Other aspects of petroleum systems (such as the timing of petroleum generation and preferred migration pathways) may be evaluated by burial history, thermal history, and fluid flow models based on information shown on the cross section. Cross section E-E' lacks the detail to illustrate key elements of coal systems (such as paleoclimate, coal quality, and coal rank), but it does provide a general framework (stratigraphic units and general rock types) for the coal-bearing section. Also, cross section E-E' may be used as a reconnaissance tool to identify plausible geologic structures and strata for the subsurface storage of liquid waste (for example, Colton, 1961; Lloyd and Reid, 1990) or for the sequestration of carbon dioxide (for example, Smith and others, 2002; Lucier and others, 2006).
NASA Astrophysics Data System (ADS)
Wu, Jing; Waldstein, Sebastian M.; Gerendas, Bianca S.; Langs, Georg; Simader, Christian; Schmidt-Erfurth, Ursula
2015-03-01
Spectral-domain Optical Coherence Tomography (SD-OCT) is a non-invasive modality for acquiring high-resolution, three-dimensional (3D) cross-sectional volumetric images of the retina and the subretinal layers. SD-OCT also allows the detailed imaging of retinal pathology, aiding clinicians in the diagnosis of sight-degrading diseases such as age-related macular degeneration (AMD), glaucoma and retinal vein occlusion (RVO). Disease diagnosis, assessment, and treatment will require a patient to undergo multiple OCT scans, possibly using multiple scanners, to accurately and precisely gauge disease activity, progression and treatment success. However, cross-vendor imaging and patient movement may result in poor scan spatial correlation, potentially leading to incorrect diagnosis or treatment analysis. The retinal fovea is the location of the highest visual acuity and is present in all patients, thus it is critical to vision and highly suitable for use as a primary landmark for cross-vendor/cross-patient registration for precise comparison of disease states. However, the fovea in diseased eyes is extremely challenging to locate due to varying appearance and the presence of retinal-layer-destroying pathology. Thus categorising and detecting the fovea type is an important prior stage to automatically computing the fovea position. Presented here is an automated cross-vendor method for fovea distinction in 3D SD-OCT scans of patients suffering from RVO, categorising scans into three distinct types. OCT scans are preprocessed by motion correction and noise filtering followed by segmentation using a kernel graph-cut approach. A statistically derived mask is applied to the resulting scan creating an ROI around the probable fovea location from which the uppermost retinal surface is delineated. For a normal appearance retina, minimisation to zero thickness is computed using the top two retinal surfaces. 3D local minima detection and layer thickness analysis are used to differentiate between the remaining two fovea types. Validation employs ground truth fovea types identified by clinical experts at the Vienna Reading Center (VRC). The results presented here are intended to show the feasibility of this method for the accurate and reproducible distinction of retinal fovea types from multiple vendor 3D SD-OCT scans of patients suffering from RVO, and for use in fovea position detection systems as a landmark for intra- and cross-vendor 3D OCT registration.
RATGRAPH: Computer Graphing of Rational Functions.
ERIC Educational Resources Information Center
Minch, Bradley A.
1987-01-01
Presents an easy-to-use Applesoft BASIC program that graphs rational functions and any asymptotes that the functions might have. Discusses the nature of rational functions, graphing them manually, employing a computer to graph rational functions, and describes how the program works. (TW)
Groupies in multitype random graphs.
Shang, Yilun
2016-01-01
A groupie in a graph is a vertex whose degree is not less than the average degree of its neighbors. Under some mild conditions, we show that the proportion of groupies is very close to 1/2 in multitype random graphs (such as stochastic block models), which include Erdős-Rényi random graphs, random bipartite, and multipartite graphs as special examples. Numerical examples are provided to illustrate the theoretical results.
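A quick numerical check in the spirit of the result, assuming an Erdős-Rényi graph generated with networkx (graph size and edge probability are illustrative): count the vertices whose degree is at least the average degree of their neighbors.

```python
# Sketch: empirical fraction of groupies in an Erdős-Rényi random graph.
import networkx as nx

def groupie_fraction(g):
    groupies = 0
    counted = 0
    for v in g:
        nbrs = list(g.neighbors(v))
        if not nbrs:
            continue                      # isolated vertices are left out here
        counted += 1
        avg_nbr_degree = sum(g.degree(u) for u in nbrs) / len(nbrs)
        groupies += g.degree(v) >= avg_nbr_degree
    return groupies / counted

g = nx.gnp_random_graph(2000, 0.01, seed=1)
print(groupie_fraction(g))   # typically close to 1/2
```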
GraQL: A Query Language for High-Performance Attributed Graph Databases
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chavarría-Miranda, Daniel; Castellana, Vito G.; Morari, Alessandro
Graph databases have gained increasing interest in the last few years due to the emergence of data sources which are not easily analyzable in traditional relational models or for which a graph data model is the natural representation. In order to understand the design and implementation choices for an attributed graph database backend and query language, we have started to design our infrastructure for attributed graph databases. In this paper, we describe the design considerations of our in-memory attributed graph database system with a particular focus on the data definition and query language components.
Graph-based normalization and whitening for non-linear data analysis.
Aaron, Catherine
2006-01-01
In this paper we construct a graph-based normalization algorithm for non-linear data analysis. The principle of this algorithm is to get a spherical average neighborhood with unit radius. First we present a class of global dispersion measures used for "global normalization"; we then adapt these measures using a weighted graph to build a local normalization called "graph-based" normalization. Then we give details of the graph-based normalization algorithm and illustrate some results. In the second part we present a graph-based whitening algorithm built by analogy between the "global" and the "local" problem.
The investigation of social networks based on multi-component random graphs
NASA Astrophysics Data System (ADS)
Zadorozhnyi, V. N.; Yudin, E. B.
2018-01-01
Methods for calibrating non-homogeneous random graphs are developed for social network simulation. The graphs are calibrated by the degree distributions of the vertices and the edges. The mathematical foundation of the methods is formed by the theory of random graphs with the nonlinear preferential attachment rule and the theory of Erdős-Rényi random graphs. Well-calibrated network graph models, and computer experiments with these models, would help developers (owners) of the networks predict their development correctly and choose effective strategies for managing network projects.
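An illustrative sketch of growing a tree-like random graph under a nonlinear preferential attachment rule, where a new vertex attaches to an existing vertex with probability proportional to its degree raised to an exponent beta; the exponent, size, and seed graph are assumptions chosen only to show the mechanism, not the calibration procedure of the paper.

```python
# Sketch: grow a graph by nonlinear preferential attachment and inspect the
# resulting empirical degree distribution.
import random

def nonlinear_pa_graph(n, beta=1.0, seed=0):
    rng = random.Random(seed)
    edges = [(0, 1)]
    degree = {0: 1, 1: 1}
    for new in range(2, n):
        nodes = list(degree)
        weights = [degree[v] ** beta for v in nodes]   # attachment ~ degree**beta
        target = rng.choices(nodes, weights=weights, k=1)[0]
        edges.append((new, target))
        degree[new] = 1
        degree[target] += 1
    return edges, degree

edges, degree = nonlinear_pa_graph(3000, beta=0.8)
hist = {}
for d in degree.values():
    hist[d] = hist.get(d, 0) + 1
print(sorted(hist.items())[:10])   # empirical degree distribution (low degrees)
```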
Bipartite graphs as models of population structures in evolutionary multiplayer games.
Peña, Jorge; Rochat, Yannick
2012-01-01
By combining evolutionary game theory and graph theory, "games on graphs" study the evolutionary dynamics of frequency-dependent selection in population structures modeled as geographical or social networks. Networks are usually represented by means of unipartite graphs, and social interactions by two-person games such as the famous prisoner's dilemma. Unipartite graphs have also been used for modeling interactions going beyond pairwise interactions. In this paper, we argue that bipartite graphs are a better alternative to unipartite graphs for describing population structures in evolutionary multiplayer games. To illustrate this point, we make use of bipartite graphs to investigate, by means of computer simulations, the evolution of cooperation under the conventional and the distributed N-person prisoner's dilemma. We show that several implicit assumptions arising from the standard approach based on unipartite graphs (such as the definition of replacement neighborhoods, the intertwining of individual and group diversity, and the large overlap of interaction neighborhoods) can have a large impact on the resulting evolutionary dynamics. Our work provides a clear example of the importance of construction procedures in games on graphs, of the suitability of bigraphs and hypergraphs for computational modeling, and of the importance of concepts from social network analysis such as centrality, centralization and bipartite clustering for the understanding of dynamical processes occurring on networked population structures.
Graphing evolutionary pattern and process: a history of techniques in archaeology and paleobiology.
Lyman, R Lee
2009-02-01
Graphs displaying evolutionary patterns are common in paleontology and in United States archaeology. Both disciplines subscribed to a transformational theory of evolution and graphed evolution as a sequence of archetypes in the late nineteenth and early twentieth centuries. U.S. archaeologists in the second decade of the twentieth century, and paleontologists shortly thereafter, developed distinct graphic styles that reflected the Darwinian variational model of evolution. Paleobiologists adopted the view of a species as a set of phenotypically variant individuals and graphed those variations either as central tendencies or as histograms of frequencies of variants. Archaeologists presumed their artifact types reflected cultural norms of prehistoric artisans and the frequency of specimens in each type reflected human choice and type popularity. They graphed cultural evolution as shifts in frequencies of specimens representing each of several artifact types. Confusion of pattern and process is exemplified by a paleobiologist misinterpreting the process illustrated by an archaeological graph, and an archaeologist misinterpreting the process illustrated by a paleobiological graph. Each style of graph displays particular evolutionary patterns and implies particular evolutionary processes. Graphs of a multistratum collection of prehistoric mammal remains and a multistratum collection of artifacts demonstrate that many graph styles can be used for both kinds of collections.
Horizontal visibility graphs generated by type-I intermittency
NASA Astrophysics Data System (ADS)
Núñez, Ángel M.; Luque, Bartolo; Lacasa, Lucas; Gómez, Jose Patricio; Robledo, Alberto
2013-05-01
The type-I intermittency route to (or out of) chaos is investigated within the horizontal visibility (HV) graph theory. For that purpose, we address the trajectories generated by unimodal maps close to an inverse tangent bifurcation and construct their associated HV graphs. We show how the alternation of laminar episodes and chaotic bursts imprints a fingerprint in the resulting graph structure. Accordingly, we derive a phenomenological theory that predicts quantitative values for several network parameters. In particular, we predict that the characteristic power-law scaling of the mean length of laminar trend sizes is fully inherited by the variance of the graph degree distribution, in good agreement with the numerics. We also report numerical evidence on how the characteristic power-law scaling of the Lyapunov exponent as a function of the distance to the tangent bifurcation is inherited in the graph by an analogous scaling of block entropy functionals defined on the graph. Furthermore, we are able to recast the full set of HV graphs generated by intermittent dynamics into a renormalization-group framework, where the fixed points of its graph-theoretical renormalization-group flow account for the different types of dynamics. We also establish that the nontrivial fixed point of this flow coincides with the tangency condition and that the corresponding invariant graph exhibits extremal entropic properties.
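The HV graph construction itself is simple; the sketch below (plain Python, using only the standard definition) links two data points x_i and x_j whenever every intermediate value lies strictly below both. The logistic-map parameter is merely illustrative of a trajectory near the period-3 tangent bifurcation.

```python
def horizontal_visibility_graph(series):
    """Edge list of the horizontal visibility graph: i and j are linked if
    series[k] < min(series[i], series[j]) for every k strictly between them."""
    edges = []
    n = len(series)
    for i in range(n - 1):
        edges.append((i, i + 1))          # immediate neighbours always see each other
        bound = series[i + 1]             # running maximum of the intermediate values
        for j in range(i + 2, n):
            if bound < min(series[i], series[j]):
                edges.append((i, j))
            bound = max(bound, series[j])
            if bound >= series[i]:
                break                     # nothing further can be seen from i
    return edges

# illustrative trajectory of the logistic map near the period-3 tangency
x, traj = 0.3, []
for _ in range(200):
    traj.append(x)
    x = 3.8284 * x * (1 - x)
print(len(horizontal_visibility_graph(traj)))
```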
A comparison of total reaction cross section models used in particle and heavy ion transport codes
NASA Astrophysics Data System (ADS)
Sihver, Lembit; Lantz, M.; Takechi, M.; Kohama, A.; Ferrari, A.; Cerutti, F.; Sato, T.
The ability to calculate nucleon-nucleus and nucleus-nucleus total reaction cross sections precisely is very important for studies of basic nuclear properties, e.g. nuclear structure. It is also important for particle and heavy ion transport calculations because, in all particle and heavy ion transport codes, the probability that a projectile particle collides within a certain distance x in matter depends on the total reaction cross section. Furthermore, the total reaction cross sections also scale the calculated partial fragmentation cross sections. It is therefore crucial that accurate total reaction cross section models are used in the transport calculations. In this paper, different models for calculating nucleon-nucleus and nucleus-nucleus total reaction cross sections are compared and discussed.
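As a hedged illustration of the dependence mentioned above, the collision probability within a distance x follows from the macroscopic cross section, P(x) = 1 - exp(-n·σ_R·x). The target material and cross-section values below are placeholders, not numbers from the paper.

```python
import math

def collision_probability(x_cm, sigma_r_mb, density_g_cm3, mass_number):
    """P(collision within x) = 1 - exp(-n * sigma_R * x), where n is the
    number density of target nuclei and sigma_R the total reaction cross section."""
    avogadro = 6.02214076e23
    n = density_g_cm3 * avogadro / mass_number   # nuclei per cm^3
    sigma_cm2 = sigma_r_mb * 1e-27               # 1 mb = 1e-27 cm^2
    return 1.0 - math.exp(-n * sigma_cm2 * x_cm)

# placeholder numbers (aluminium-like target) for illustration only
print(collision_probability(x_cm=1.0, sigma_r_mb=450.0,
                            density_g_cm3=2.70, mass_number=27))
```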
Vertically stabilized elongated cross-section tokamak
Sheffield, George V.
1977-01-01
This invention provides a vertically stabilized, non-circular (minor) cross-section, toroidal plasma column characterized by an external separatrix. To this end, a specific poloidal coil means is added outside a toroidal plasma column containing an endless plasma current in a tokamak to produce a rectangular cross-section plasma column along the equilibrium axis of the plasma column. By elongating the spacing between the poloidal coil means, the plasma cross-section is vertically elongated while maintaining vertical stability, efficiently increasing the poloidal flux in linear proportion to the plasma cross-section height and achieving a much greater plasma volume than could be achieved with the previously known round cross-section plasma columns. Also, vertical stability is enhanced over an elliptical cross-section plasma column, and poloidal magnetic divertors are achieved.
Loyen, Anne; Clarke-Cornwell, Alexandra M; Anderssen, Sigmund A; Hagströmer, Maria; Sardinha, Luís B; Sundquist, Kristina; Ekelund, Ulf; Steene-Johannessen, Jostein; Baptista, Fátima; Hansen, Bjørge H; Wijndaele, Katrien; Brage, Søren; Lakerveld, Jeroen; Brug, Johannes; van der Ploeg, Hidde P
2017-07-01
The objective of this study was to pool, harmonise and re-analyse national accelerometer data from adults in four European countries in order to describe population levels of sedentary time and physical inactivity. Five cross-sectional studies were included from England, Portugal, Norway and Sweden. ActiGraph accelerometer count data were centrally processed using the same algorithms. Multivariable logistic regression analyses were conducted to study the associations of sedentary time and physical inactivity with sex, age, weight status and educational level, in both the pooled sample and the separate study samples. Data from 9509 participants were used. On average, participants were sedentary for 530 min/day, and accumulated 36 min/day of moderate to vigorous intensity physical activity. Twenty-three percent accumulated more than 10 h of sedentary time/day, and 72% did not meet the physical activity recommendations. Nine percent of all participants were classified as high sedentary and low active. Participants from Norway showed the highest levels of sedentary time, while participants from England were the least physically active. Age and weight status were positively associated with sedentary time and not meeting the physical activity recommendations. Men and higher-educated people were more likely to be highly sedentary, while women and lower-educated people were more likely to be inactive. We found high levels of sedentary time and physical inactivity in four European countries. Older people and obese people were most likely to display these behaviours and thus deserve special attention in interventions and policy planning. In order to monitor these behaviours, accelerometer-based cross-European surveillance is recommended.
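A compact sketch of the kind of minute-level classification such harmonised count processing involves, under assumptions the abstract does not state: the cutpoints below (100 and 2020 counts/min) are commonly cited ActiGraph thresholds used here purely for illustration, not the study's actual algorithms.

```python
import numpy as np

# illustrative cutpoints (counts per minute); not the study's thresholds
SEDENTARY_MAX = 100
MVPA_MIN = 2020

def summarise_day(counts_per_minute):
    """Classify each worn minute as sedentary, light, or MVPA and
    return the daily totals in minutes."""
    cpm = np.asarray(counts_per_minute)
    sedentary = int(np.sum(cpm < SEDENTARY_MAX))
    mvpa = int(np.sum(cpm >= MVPA_MIN))
    light = int(cpm.size - sedentary - mvpa)
    return {"sedentary": sedentary, "light": light, "mvpa": mvpa}

rng = np.random.default_rng(1)
day = rng.integers(0, 4000, size=14 * 60)   # synthetic 14-hour wear day
print(summarise_day(day))
```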
BACKSCAT Lidar Simulation Version 3.0: Technical Documentation and Users Guide
1992-12-03
Raman Cross Section of Some Simple Gases, J. Opt. Soc. Am., 63:73. Penny, C.M., St. Peters, R.L., and Lapp, M. (1974) Absolute Rotational Raman... of the molecule, and the remaining columns list the relative normalized cross sections for the respective excitation wavelength. The absolute Raman... cross section is obtained by simply multiplying the relative normalized cross section for a molecular species of interest by the absolute cross section
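The scaling described in this excerpt is a simple multiplication; the numbers below are placeholders for illustration only and should be taken from the cited tables in practice.

```python
def absolute_raman_cross_section(relative_normalized, reference_absolute):
    """Scale a tabulated relative normalized Raman cross section by the
    absolute cross section of the reference (placeholder values below)."""
    return relative_normalized * reference_absolute

# placeholder values for illustration only (cm^2/sr)
print(absolute_raman_cross_section(2.4, 3.5e-31))
```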
Exact and approximate graph matching using random walks.
Gori, Marco; Maggini, Marco; Sarti, Lorenzo
2005-07-01
In this paper, we propose a general framework for graph matching which is suitable for different problems of pattern recognition. The pattern representation we assume is at the same time highly structured, as in classic syntactic and structural approaches, and of subsymbolic nature with real-valued features, as in connectionist and statistical approaches. We show that random-walk-based models, inspired by Google's PageRank, give rise to a spectral theory that nicely enhances the graph topological features at the node level. As a straightforward consequence, we derive a polynomial algorithm for the classic graph isomorphism problem, under the restriction of dealing with Markovian spectrally distinguishable graphs (MSD), a class of graphs that does not seem to be easily reducible to others proposed in the literature. The experimental results we obtained on different test-beds of the TC-15 graph database show that the defined MSD class "almost always" covers the database, and that the proposed algorithm is significantly more efficient than the top-scoring VF algorithm on the same data. Most interestingly, the proposed approach is very well suited to partial and approximate graph matching problems, derived for instance from image retrieval tasks. We consider the objects of the COIL-100 visual collection and provide a graph-based representation whose node labels contain appropriate visual features. We show that the adoption of classic bipartite graph matching algorithms offers a straightforward generalization of the algorithm given for graph isomorphism and, finally, we report very promising experimental results on the COIL-100 visual collection.
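A rough sketch of the PageRank-flavoured intuition only, not the authors' MSD algorithm: score the nodes of each graph with a random-walk stationary measure and align nodes by sorting the scores. For graphs with symmetries the scores tie and the alignment is not unique (though any of the tied alignments may still be an isomorphism, as in the toy path-graph example).

```python
import networkx as nx

def node_signatures(g, alpha=0.85):
    """Random-walk (PageRank) scores used as node-level topological signatures."""
    return nx.pagerank(g, alpha=alpha)

def greedy_match(g1, g2):
    """Align the nodes of two graphs by sorting their random-walk signatures;
    unambiguous only when the scores are pairwise distinct."""
    s1 = sorted(node_signatures(g1).items(), key=lambda kv: kv[1])
    s2 = sorted(node_signatures(g2).items(), key=lambda kv: kv[1])
    return {u: v for (u, _), (v, _) in zip(s1, s2)}

g1 = nx.path_graph(5)
g2 = nx.relabel_nodes(nx.path_graph(5), {i: chr(97 + i) for i in range(5)})
print(greedy_match(g1, g2))   # one valid node correspondence for this toy pair
```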
Yu, Qingbao; Erhardt, Erik B.; Sui, Jing; Du, Yuhui; He, Hao; Hjelm, Devon; Cetin, Mustafa S.; Rachakonda, Srinivas; Miller, Robyn L.; Pearlson, Godfrey; Calhoun, Vince D.
2014-01-01
Graph theory-based analysis has been widely employed in brain imaging studies, and altered topological properties of brain connectivity have emerged as important features of mental diseases such as schizophrenia. However, most previous studies have focused on graph metrics of stationary brain graphs, ignoring that brain connectivity exhibits fluctuations over time. Here we develop a new framework for assessing dynamic graph properties of time-varying functional brain connectivity in resting-state fMRI data and apply it to healthy controls (HCs) and patients with schizophrenia (SZs). Specifically, nodes of brain graphs are defined by intrinsic connectivity networks (ICNs) identified by group independent component analysis (ICA). Dynamic graph metrics of the time-varying brain connectivity, estimated from the correlations of sliding time-windowed ICA time courses of ICNs, are calculated. First- and second-level connectivity states are detected based on the correlation of nodal connectivity strength between time-varying brain graphs. Our results indicate that SZs show decreased variance in the dynamic graph metrics. Consistent with prior stationary functional brain connectivity work, graph measures of the identified first-level connectivity states show lower values in SZs. In addition, more first-level connectivity states are dissociated from the second-level connectivity state, which resembles the stationary connectivity pattern computed over the entire scan. Collectively, the findings provide new evidence of altered dynamic brain graphs in schizophrenia, which may underlie the abnormal brain performance in this mental illness. PMID:25514514
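A compact sketch of the kind of pipeline described above, with hypothetical window length, step and correlation threshold (numpy/networkx assumed; this is not the authors' implementation): sliding-window correlations of component time courses are thresholded into graphs, a graph metric is computed per window, and its variance across windows is the quantity that could be compared between groups.

```python
import numpy as np
import networkx as nx

def dynamic_graph_metrics(timecourses, window=40, step=5, threshold=0.4):
    """timecourses: array of shape (n_timepoints, n_nodes), e.g. ICA time
    courses of intrinsic connectivity networks. Returns the global efficiency
    of the thresholded correlation graph in each sliding window."""
    n_t, n_nodes = timecourses.shape
    metrics = []
    for start in range(0, n_t - window + 1, step):
        corr = np.corrcoef(timecourses[start:start + window].T)
        g = nx.Graph()
        g.add_nodes_from(range(n_nodes))
        for i in range(n_nodes):
            for j in range(i + 1, n_nodes):
                if abs(corr[i, j]) > threshold:
                    g.add_edge(i, j)
        metrics.append(nx.global_efficiency(g))
    return np.array(metrics)

rng = np.random.default_rng(0)
tc = rng.standard_normal((200, 10))   # synthetic stand-in for ICN time courses
eff = dynamic_graph_metrics(tc)
print(eff.mean(), eff.var())          # variance across windows of a dynamic graph metric
```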
Energy and Mass-Number Dependence of Hadron-Nucleus Total Reaction Cross Sections
NASA Astrophysics Data System (ADS)
Kohama, Akihisa; Iida, Kei; Oyamatsu, Kazuhiro
2016-09-01
We thoroughly investigate how proton-nucleus total reaction cross sections depend on the target mass number A and the proton incident energy. In doing so, we systematically analyze nuclear reaction data that are sensitive to nuclear size, namely, proton-nucleus total reaction cross sections and differential elastic cross sections, using a phenomenological black-sphere approximation of nuclei that we are developing. In this framework, the radius of the black sphere is found to be a useful length scale that simultaneously accounts for the observed proton-nucleus total reaction cross section and first diffraction peak in the proton elastic differential cross section. This framework, which is shown here to be applicable to antiprotons, is expected to be applicable to any kind of projectile that is strongly attenuated in the nucleus. On the basis of a cross-section formula constructed within this framework, we find that a less familiar A1/6 dependence plays a crucial role in describing the energy dependence of proton-nucleus total reaction cross sections.
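In the black-sphere picture the total reaction cross section is essentially the geometric area of an absorbing disc, sigma_R ≈ π a², with the radius a growing roughly like A^(1/3). The snippet below is only a schematic illustration of that scaling, not the authors' cross-section formula; the coefficients are placeholders.

```python
import math

def black_sphere_cross_section(mass_number, r0_fm=1.2, delta_fm=0.0):
    """Schematic black-sphere estimate sigma_R ~ pi * a^2 with
    a = r0 * A**(1/3) + delta (fm); returns millibarns.
    r0 and delta are placeholder parameters, not fitted values."""
    a_fm = r0_fm * mass_number ** (1.0 / 3.0) + delta_fm
    sigma_fm2 = math.pi * a_fm ** 2
    return sigma_fm2 * 10.0            # 1 fm^2 = 10 mb

for A in (12, 27, 56, 208):
    print(A, round(black_sphere_cross_section(A)))
```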
Activation cross section and isomeric cross-section ratio for the 151Eu(n,2n)150m,gEu process
NASA Astrophysics Data System (ADS)
Luo, Junhua; Li, Suyuan; Jiang, Li
2018-07-01
The cross sections of the 151Eu(n,2n)150m,gEu reactions and their isomeric cross-section ratios σm/σt have been measured experimentally. The cross sections were measured relative to the reference 93Nb(n,2n)92mNb reaction cross section by means of the activation technique at three neutron energies: 13.5, 14.1, and 14.8 MeV. Monoenergetic neutron beams were produced via the 3H(d,n)4He reaction, and Eu2O3 samples and Nb monitor foils were activated together to determine the reaction cross sections and the incident neutron flux. The activities induced in the reaction products were measured using high-resolution gamma-ray spectroscopy. Cross sections were also calculated theoretically with the nuclear model code TALYS-1.8, using different level-density options, at neutron energies from the reaction threshold to 20 MeV. The results are discussed and compared with the corresponding literature values.
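Schematically, the monitor-relative activation method reduces to a ratio of photopeak counts weighted by the usual correction factors; the expressions and numbers below are a generic illustration of that bookkeeping, not the authors' analysis, and all values are placeholders.

```python
import math

def timing_factor(decay_const, t_irr, t_cool, t_meas):
    """Standard saturation-decay-counting factor
    (1 - e^{-l*t_irr}) * e^{-l*t_cool} * (1 - e^{-l*t_meas}) / l."""
    return ((1 - math.exp(-decay_const * t_irr)) * math.exp(-decay_const * t_cool)
            * (1 - math.exp(-decay_const * t_meas)) / decay_const)

def relative_activation_cross_section(sigma_monitor,
                                      counts_sample, counts_monitor,
                                      factor_sample, factor_monitor):
    """Generic relative-activation relation:
        sigma_sample = sigma_monitor * (C_sample / C_monitor)
                       * (F_monitor / F_sample),
    where each F lumps together the number of target nuclei, detector
    efficiency, gamma-ray intensity and the timing factor for that foil."""
    return sigma_monitor * (counts_sample / counts_monitor) * (factor_monitor / factor_sample)

# placeholder numbers for illustration only
print(relative_activation_cross_section(460.0, 1.2e4, 3.4e4, 2.1e5, 5.7e5))
```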
Total reaction cross sections in CEM and MCNP6 at intermediate energies
Kerby, Leslie M.; Mashnik, Stepan G.
2015-05-14
Accurate total reaction cross section models are important to achieving reliable predictions from spallation and transport codes. The latest version of the Cascade Exciton Model (CEM) as incorporated in the code CEM03.03, and the Monte Carlo N-Particle transport code (MCNP6), both developed at Los Alamos National Laboratory (LANL), each use such cross sections. Having accurate total reaction cross section models in the intermediate energy region (50 MeV to 5 GeV) is very important for different applications, including analysis of space environments, use in medical physics, and accelerator design, to name just a few. The current inverse cross sections used in the preequilibrium and evaporation stages of CEM are based on the Dostrovsky et al. model, published in 1959. Better cross section models are now available. Implementing better cross section models in CEM and MCNP6 should yield improved predictions for particle spectra and total production cross sections, among other results.
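As a hedged illustration only (this is neither the Dostrovsky et al. parameterization nor the updated CEM/MCNP6 models), inverse cross sections for charged ejectiles typically take the shape of a geometric term suppressed by the Coulomb barrier, as sketched below with placeholder parameters.

```python
import math

def schematic_inverse_cross_section(mass_number, channel_energy_mev,
                                    coulomb_barrier_mev, r0_fm=1.5):
    """Schematic inverse cross section: geometric area pi*(r0*A**(1/3))^2
    suppressed by (1 - V_C/E) for a charged ejectile; zero below the barrier.
    All parameters are placeholders for illustration."""
    if channel_energy_mev <= coulomb_barrier_mev:
        return 0.0
    radius_fm = r0_fm * mass_number ** (1.0 / 3.0)
    sigma_mb = math.pi * radius_fm ** 2 * 10.0      # 1 fm^2 = 10 mb
    return sigma_mb * (1.0 - coulomb_barrier_mev / channel_energy_mev)

print(schematic_inverse_cross_section(56, 20.0, 8.0))
```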
Some Applications of Graph Theory to Clustering
ERIC Educational Resources Information Center
Hubert, Lawrence J.
1974-01-01
The connection between graph theory and clustering is reviewed and extended. Major emphasis is on restating, in a graph-theoretic context, selected past work in clustering, and conversely, developing alternative strategies from several standard concepts used in graph theory per se. (Author/RC)
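One classical instance of this connection, sketched below under the usual assumptions (networkx assumed; the toy dissimilarities are hypothetical): single-link clusters at threshold t are exactly the connected components of the graph whose edges are all item pairs with dissimilarity at most t.

```python
import networkx as nx

def single_link_clusters(dissimilarity, threshold):
    """dissimilarity: dict mapping unordered item pairs to distances.
    Returns the single-link clusters at the given threshold as the
    connected components of the corresponding threshold graph."""
    g = nx.Graph()
    for (a, b), d in dissimilarity.items():
        g.add_node(a)
        g.add_node(b)
        if d <= threshold:
            g.add_edge(a, b)
    return [set(c) for c in nx.connected_components(g)]

dist = {("a", "b"): 0.2, ("a", "c"): 0.9, ("b", "c"): 0.8,
        ("c", "d"): 0.3, ("b", "d"): 0.95, ("a", "d"): 1.0}
print(single_link_clusters(dist, threshold=0.5))   # [{'a', 'b'}, {'c', 'd'}]
```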