US GeoData Available Through the Internet
,
2000-01-01
The U.S. Geological Survey (USGS) offers certain US GeoData data sets through the Internet. They can be retrieved using the World Wide Web or anonymous File Transfer Protocol (FTP). The data bases and their directory paths are as follows: * 1:24,000-scale digital line graph data in SDTS format (/pub/data/DLG/24K) * 1:2,000,000-scale digital line graph data in SDTS format (/pub/data/DLG/2M) * 1:100,000-scale digital line graph data (/pub/data/DLG/100K) * 1:100,000-scale land use and land cover data (/pub/data/LULC/100K) * 1:250,000-scale land use and land cover data (/pub/data/LULC/250K) * 1:24,000-scale digital elevation data (/pub/data/DEM/7.5min) * 1-degree digital elevation model data (/pub/data/DEM/250)
U.S. Geological Survey DLG-3 and Bureau of the Census TIGER data. Development and GIS applications
Batten, Lawrence G.
1990-01-01
The U.S. Geological Survey has been actively developing digital cartographic and geographic data and standards since the early 1970's. One product is Digital Line Graph data, which offer a consistently accurate source of base category geographic information. The Bureau of the Census has combined their Dual Independent Map Encoding data with the Geological Survey's 1:100,000-scale Digital Line Graph data to prepare for the 1990 decennial census. The resulting Topologically Integrated Geographic Encoding and Referencing data offer a wealth of information. A major area of research using these data is in transportation analysis. The attributes associated with Digital Line Graphs can be used to determine the average travel times along each segment. Geographic information system functions can then be used to optimize routes through the network and to generate street name lists. Additional aspects of the subject are discussed.
Digital line graphs from 1:24,000-scale maps
,
1990-01-01
The Earth Science Information Centers (ESIC) distribute digital cartographic/geographic data files produced by the U.S. Geological Survey (USGS) as part of the National Mapping Program. Digital cartographic data files are grouped into four basic types. The first of these, called a Digital Line Graph (DLG), is line map information in digital form. These data files include information on planimetric base categories, such as transportation, hydrography, and boundaries. The second type, called a Digital Elevation Model (DEM), consists of a sampled array of elevations for a number of ground positions that are usually at regularly spaced intervals. The third type is Land Use and Land Cover digital data, which provides information on nine major classes of land use such as urban, agricultural, or forest as well as associated map data such as political units and Federal land ownership. The fourth type, the Geographic Names Information System, provides primary information for all known places, features, and areas in the United States identified by a proper name.
Digital line graphs from 1:100,000-scale maps
,
1989-01-01
The National Cartographic Information Center (NCIC) distributes digital cartographic/geographic data files produced by the U.S. Geological Survey (USGS) as part of the National Mapping Program. Digital cartographic data files may be grouped into four basic types. The first of these, called a Digital Line Graph (DLG), is line map information in digital form. These data files include information on planimetric base categories, such as transportation, hydrography, and boundaries. The second form, called a Digital Elevation Model (DEM), consists of a sampled array of elevations for ground positions that are usually, but not always, at regularly spaced intervals. The third type is Land Use and Land Cover digital data, which provides information on nine major classes of land use such as urban, agricultural, or forest as well as associated map data such as political units and Federal land ownership. The fourth type, the Geographic Names Information System, provides primary information for known places, features, and areas in the United States identified by a proper name.
RAILROAD DIGITAL LINE GRAPHS FOR THE MID-ATLANTIC INTEGRATED ASSESSMENT (MAIA) STUDY AREA
This data set is a geographic information system (GIS) coverage of railroads for the United States Environmental Protection Agency (USEPA) Mid-Atlantic Integrated Assessment (MAIA) Project region. The coverage was produced using US Geological Survey transportation digital line ...
1:2,000,000-scale digital line graph data on CD-ROM
,
1995-01-01
Updated U.S. Geological Survey digital line graph (DLG) data collected at a scale of 1:2,000,000 are now available on two compact disc read-only memory (CD-ROM) discs. Each CD-ROM contains digital cartographic data for 49 States and the District of Columbia. The U.S. Virgin Islands, Puerto Rico, and Alaska will be ready within the next year. These DLG data were originally collected from maps published in 1970. Extensive revisions have been made, and no data source more than 5 years old was used in this update. In addition, text files containing information such as place names and population have been added for the first time. The records in these text files can be related to corresponding features in the DLG data files. Metadata that comply with the Federal Geographic Data Committee Content Standards for Digital Geospatial Metadata are included for each category of DLG data.
DIGITAL LINE GRAPHS - USGS 1:24,000
USGS DLGs are digital representations of program-quadrangle format and sectional maps. All DLG data distributed by the United States Geological Survey (USGS) are DLG-Level 3 (DLG-3), which means the data contain a full range of attribute codes, have full topological structuring, ...
DIGITAL LINE GRAPHS - USGS 1:100,000
USGS DLGs are digital representations of program-quadrangle format and sectional maps. All DLG data distributed by the United States Geological Survey (USGS) are DLG-Level 3 (DLG-3), which means the data contain a full range of attribute codes, have full topological structuring, ...
This data set is a geographic information system (GIS) coverage of pipelines, transmission lines, and miscellaneous transportation features for the United States Environmental Protection Agency (USEPA) Mid-Atlantic Integrated Assessment (MAIA) Project region. The coverage was p...
Digitization of a geologic map for the Quebec-Maine-Gulf of Maine global geoscience transect
Wright, Bruce E.; Stewart, David B.
1990-01-01
The Bedrock Geologic Map of Maine was digitized and combined with digital geologic data for Quebec and the Gulf of Maine for the Quebec-Maine-Gulf of Maine Geologic Transect Project. This map is being combined with digital geophysical data to produce three-dimensional depictions of the subsurface geology and to produce cross sections of the Earth's crust. It is an essential component of a transect that stretches from the craton near Quebec City, Quebec, to the Atlantic Ocean Basin south of Georges Bank. The transect is part of the Global Geosciences Transect Project of the International Lithosphere Program. The Digital Line Graph format is used for storage of the digitized data. A coding scheme similar to that used for base category planimetric data was developed to assign numeric codes to the digitized geologic data. These codes were used to assign attributes to polygon and line features to describe rock type, age, name, tectonic setting of original deposition, mineralogy, and composition of igneous plutonic rocks, as well as faults and other linear features. The digital geologic data can be readily edited, rescaled, and reprojected. The attribute codes allow generalization and selective retrieval of the geologic features. The codes allow assignment of map colors based on age, lithology, or other attribute. The Digital Line Graph format is a general transfer format that is supported by many software vendors and is easily transferred between systems.
,
2000-01-01
The U.S. Geological Survey's (USGS) Earth Explorer Web site provides access to millions of land-related products, including the following: Satellite images from Landsat, advanced very high resolution radiometer (AVHRR), and Corona data sets. Aerial photographs from the National Aerial Photography Program, NASA, and USGS data sets. Digital cartographic data from digital elevation models, digital line graphs, digital raster graphics, and digital orthophoto quadrangles. USGS paper maps. Digital, film, and paper products are available, and many products can be previewed before ordering.
An enhanced digital line graph design
Guptill, Stephen C.
1990-01-01
In response to increasing information demands on its digital cartographic data, the U.S. Geological Survey has designed an enhanced version of the Digital Line Graph, termed Digital Line Graph - Enhanced (DLG-E). In the DLG-E model, the phenomena represented by geographic and cartographic data are termed entities. Entities represent individual phenomena in the real world. A feature is an abstraction of a set of entities, with the feature description encompassing only selected properties of the entities (typically the properties that have been portrayed cartographically on a map). Buildings, bridges, roads, streams, grasslands, and counties are examples of features. A feature instance, that is, one occurrence of a feature, is described in the digital environment by feature objects and spatial objects. A feature object identifies a feature instance and its nonlocational attributes. Nontopological relationships are associated with feature objects. The locational aspects of the feature instance are represented by spatial objects. Four spatial objects (points, nodes, chains, and polygons) and their topological relationships are defined. To link the locational and nonlocational aspects of the feature instance, a given feature object is associated with (or is composed of) a set of spatial objects. These objects, attributes, and relationships are the components of the DLG-E data model. To establish a domain of features for DLG-E, an approach using a set of classes, or views, of spatial entities was adopted. The five views that were developed are cover, division, ecosystem, geoposition, and morphology. The views are exclusive; each view is a self-contained analytical approach to the entire range of world features. Because each view is independent of the others, a single point on the surface of the Earth can be represented under multiple views. Under the five views, over 200 features were identified and defined. This set constitutes an initial domain of DLG-E features.
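The linkage between feature objects (nonlocational attributes) and spatial objects (location and topology) described above can be sketched in a few lines of code. This is an illustrative data model only; the class and field names below are invented for the example and are not the USGS DLG-E schema.

```python
from dataclasses import dataclass, field

@dataclass
class SpatialObject:
    kind: str     # "point", "node", "chain", or "polygon"
    coords: list  # coordinate pairs defining the geometry

@dataclass
class FeatureObject:
    feature: str      # feature class, e.g. "road" or "stream"
    attributes: dict  # nonlocational attributes
    # A feature object is composed of a set of spatial objects.
    geometry: list = field(default_factory=list)

# One feature instance: a bridge represented by a single chain.
bridge = FeatureObject(
    feature="bridge",
    attributes={"name": "Main St Bridge", "lanes": 2},
    geometry=[SpatialObject("chain", [(0.0, 0.0), (0.0, 1.0)])],
)

print(bridge.feature, bridge.geometry[0].kind)
```

The point of the split is that the same chain could be shared by several feature instances (a road and a county boundary, say) without duplicating coordinates.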
Study of cryogenic propellant systems for loading the space shuttle. Part 2: Hydrogen systems
NASA Technical Reports Server (NTRS)
Steward, W. G.
1975-01-01
Computer simulation studies of liquid hydrogen fill and vent systems for the space shuttle are presented. The computer programs calculate maximum and minimum permissible flow rates during cooldown as limited by thermal stress considerations, fill line cooldown time, pressure drop, flow rates, vapor content, vent line pressure drop, and vent line discharge temperature. The input data for these programs are selected through graphic displays which schematically depict the part of the system being analyzed. The computed output is also displayed in the form of printed messages and graphs. Digital readouts of graph coordinates may also be obtained. Procedures are given for operation of the graphic display unit and the associated minicomputer and timesharing computer.
Usage of "Powergraph" software at laboratory lessons of "general physics" department of MEPhI
NASA Astrophysics Data System (ADS)
Klyachin, N. A.; Matronchik, A. Yu.; Khangulyan, E. V.
2017-01-01
The use of the "PowerGraph" software in the laboratory exercise "Study of sodium spectrum" of the physical experiment lessons is considered. Together with the design of the experimental setup, the sodium spectra digitized with a computer audio chip are discussed. The "PowerGraph" software allows efficient visualization of the sodium spectrum and analysis of its fine structure; in particular, it allows quantitative measurement of the wavelengths and relative line intensities.
,
1993-01-01
The Earth Science Information Center (ESIC) distributes digital cartographic/geographic data files produced by the U.S. Geological Survey (USGS) as part of the National Mapping Program. Digital cartographic data files may be grouped into four basic types. The first of these, called a Digital Line Graph (DLG), is the line map information in digital form. These data files include information on base data categories, such as transportation, hypsography, hydrography, and boundaries. The second type, called a Digital Elevation Model (DEM), consists of a sampled array of elevations for a number of ground positions at regularly spaced intervals. The third type is Land Use and Land Cover digital data which provides information on nine major classes of land use such as urban, agricultural, or forest as well as associated map data such as political units and Federal land ownership. The fourth type, the Geographic Names Information System, provides primary information for all known places, features, and areas in the United States identified by a proper name.
This data set is a geographic information system (GIS) coverage of the trails, footbridges, and perimeters of parking areas (Class 5 Roads) for the United States Environmental Protection Agency (USEPA) Mid-Atlantic Integrated Assessment (MAIA) Project region. The coverage was p...
This data set is a geographic information system (GIS) coverage of the lower level divided roads and streets (Class 3 Roads) for the United States Environmental Protection Agency (USEPA) Mid-Atlantic Integrated Assessment (MAIA) Project region. The coverage was produced using U...
This data set is a geographic information system (GIS) coverage of the Interstate and United States Highways (Class 1 Roads) for the United States Environmental Protection Agency (USEPA) Mid-Atlantic Integrated Assessment (MAIA) Project region. The coverage was produced using U...
This data set is a geographic information system (GIS) coverage of the lower level roads and streets (Class 4 Roads) for the United States Environmental Protection Agency (USEPA) Mid-Atlantic Integrated Assessment (MAIA) Project region. The coverage was produced using US Geolog...
This data set is a geographic information system (GIS) coverage of the state and county highways (Class 2 Roads) for the United States Environmental Protection Agency (USEPA) Mid-Atlantic Integrated Assessment (MAIA) Project region. The coverage was produced using US Geological...
US Geological Survey customers speak out
Gillespie, S.; Snyder, G.
1995-01-01
Provides results of a customer survey carried out in 1994 by the US Geological Survey. Uses of cartographic products are classified, as are application areas, accuracy satisfaction, media, Digital Line Graph requirements in update, and frequency of product use. USGS responses and plans for the future are noted. -M.Blakemore
Land use and land cover digital data from 1:250,000- and 1:100,000- scale maps
,
1990-01-01
The Earth Science Information Centers (ESIC) distribute digital cartographic/geographic data files produced by the U.S. Geological Survey (USGS) as part of the National Mapping Program. The data files are grouped into four basic types. The first type, called a Digital Line Graph (DLG), is line map information in digital form. These data files include information on planimetric base categories, such as transportation, hydrography, and boundaries. The second type, called a Digital Elevation Model (DEM), consists of a sampled array of elevations for ground positions that are usually at regularly spaced intervals. The third type, Land Use and Land Cover digital data, provide information on nine major classes of land use such as urban, agricultural, or forest as well as associated map data such as political units and Federal land ownership. The fourth type, the Geographic Names Information System, provides primary information for known places, features, and areas in the United States identified by a proper name.
Creating a standardized watersheds database for the Lower Rio Grande/Río Bravo, Texas
Brown, J.R.; Ulery, Randy L.; Parcher, Jean W.
2000-01-01
This report describes the creation of a large-scale watershed database for the lower Rio Grande/Río Bravo Basin in Texas. The watershed database includes watersheds delineated to all 1:24,000-scale mapped stream confluences and other hydrologically significant points, selected watershed characteristics, and hydrologic derivative datasets. Computer technology allows generation of preliminary watershed boundaries in a fraction of the time needed for manual methods. This automated process reduces development time and results in quality improvements in watershed boundaries and characteristics. These data can then be compiled in a permanent database, eliminating the time-consuming step of data creation at the beginning of a project and providing a stable base dataset that can give users greater confidence when further subdividing watersheds. A standardized dataset of watershed characteristics is a valuable contribution to the understanding and management of natural resources. Vertical integration of the input datasets used to automatically generate watershed boundaries is crucial to the success of such an effort. The optimum situation would be to use the digital orthophoto quadrangles as the source of all the input datasets. While the hydrographic data from the digital line graphs can be revised to match the digital orthophoto quadrangles, hypsography data cannot be revised to match the digital orthophoto quadrangles. Revised hydrography from the digital orthophoto quadrangle should be used to create an updated digital elevation model that incorporates the stream channels as revised from the digital orthophoto quadrangle. Computer-generated, standardized watersheds that are vertically integrated with existing digital line graph hydrographic data will continue to be difficult to create until revisions can be made to existing source datasets.
Until such time, manual editing will be necessary to make adjustments for man-made features and changes in the natural landscape that are not reflected in the digital elevation model data.
Raster and vector processing for scanned linework
Greenlee, David D.
1987-01-01
An investigation of raster editing techniques, including thinning, filling, and node detecting, was performed by using specialized software. The techniques were based on encoding the state of the 3-by-3 neighborhood surrounding each pixel into a single byte. A prototypical method for converting the edited raster linework into vectors was also developed. Once vector representations of the lines were formed, they were formatted as a Digital Line Graph, and further refined by deletion of nonessential vertices and by smoothing with a curve-fitting technique.
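The neighborhood-encoding idea can be sketched as follows: pack the state of the 8 pixels around a center pixel into one byte, so that thinning, filling, and node-detection rules become lookups on that byte. The bit ordering and the node test below are assumptions made for the example, not the exact scheme used in the investigation.

```python
def encode_neighborhood(img, r, c):
    """Return a byte whose bits are the 8 neighbors of img[r][c],
    read clockwise from the upper-left corner."""
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    byte = 0
    for bit, (dr, dc) in enumerate(offsets):
        if img[r + dr][c + dc]:
            byte |= 1 << bit
    return byte

def is_node(byte, min_branches=3):
    """A junction (node) on a thinned skeleton has 3+ set neighbors."""
    return bin(byte).count("1") >= min_branches

# A plus-shaped junction: the center pixel has 4 line branches.
grid = [[0, 1, 0],
        [1, 1, 1],
        [0, 1, 0]]
code = encode_neighborhood(grid, 1, 1)
print(code, is_node(code))  # 170 True
```

With all 256 neighborhood states precomputed into a lookup table, each editing pass reduces to one table access per pixel.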
3D Central Line Extraction of Fossil Oyster Shells
NASA Astrophysics Data System (ADS)
Djuricic, A.; Puttonen, E.; Harzhauser, M.; Mandic, O.; Székely, B.; Pfeifer, N.
2016-06-01
Photogrammetry provides a powerful tool to digitally document protected, inaccessible, and rare fossils. This saves manpower in relation to current documentation practice and makes the fragile specimens more available for paleontological analysis and public education. In this study, high resolution orthophoto (0.5 mm) and digital surface models (1 mm) are used to define fossil boundaries that are then used as an input to automatically extract fossil length information via central lines. In general, central lines are widely used in geosciences as they ease observation, monitoring and evaluation of object dimensions. Here, the 3D central lines are used in a novel paleontological context to study fossilized oyster shells with photogrammetric and LiDAR-obtained 3D point cloud data. 3D central lines of 1121 Crassostrea gryphoides oysters of various shapes and sizes were computed in the study. Central line calculation included: i) Delaunay triangulation between the fossil shell boundary points and formation of the Voronoi diagram; ii) extraction of Voronoi vertices and construction of a connected graph tree from them; iii) reduction of the graph to the longest possible central line via Dijkstra's algorithm; iv) extension of longest central line to the shell boundary and smoothing by an adjustment of cubic spline curve; and v) integration of the central line into the corresponding 3D point cloud. The resulting longest path estimate for the 3D central line is a size parameter that can be applied in oyster shell age determination both in paleontological and biological applications. Our investigation evaluates ability and performance of the central line method to measure shell sizes accurately by comparing automatically extracted central lines with manually collected reference data used in paleontological analysis. 
Our results show that the automatically obtained central line length overestimated the manually collected reference by 1.5% in the test set, which is deemed sufficient for the selected paleontological application, namely shell age determination.
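Step iii above, reducing the graph tree to its longest possible central line, can be illustrated with the standard two-pass trick for trees: a shortest-path search (Dijkstra's algorithm) from an arbitrary vertex finds one endpoint of the longest path, and a second search from that endpoint returns its length. The toy weighted graph below stands in for real Voronoi-vertex output.

```python
import heapq

def dijkstra(adj, start):
    """Shortest path lengths from start over a weighted adjacency map."""
    dist = {start: 0.0}
    heap = [(0.0, start)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist[u]:
            continue
        for v, w in adj[u]:
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

def longest_path_length(adj):
    """Tree diameter: run Dijkstra twice to find the longest path."""
    a = dijkstra(adj, next(iter(adj)))
    u = max(a, key=a.get)   # farthest vertex from an arbitrary start
    b = dijkstra(adj, u)    # farthest vertex from that endpoint
    return max(b.values())

# Toy tree with edge weights as Euclidean distances (illustrative).
adj = {
    "a": [("b", 1.0)],
    "b": [("a", 1.0), ("c", 2.0), ("d", 4.0)],
    "c": [("b", 2.0)],
    "d": [("b", 4.0)],
}
print(longest_path_length(adj))  # 6.0 (c -> b -> d)
```

On a tree, every path is a shortest path, so the second Dijkstra pass directly yields the longest central line rather than merely a lower bound.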
,
1990-01-01
The development of geographic information systems (GIS) is a rapidly growing industry that supports natural resources studies, land management, environmental analysis, and urban and transportation planning. The increasing use of computers for storing and analyzing earth science information has greatly expanded the demand for digital cartographic and geographic data. Digital cartography involves the collection, storage, processing, analysis, and display of map data with the aid of computers. The U.S. Geological Survey (USGS), the Nation's largest earth science research agency, through its National Mapping Program, has expanded digital cartography operations to include the collection of elevation, planimetric, land use and land cover, and geographic names information in digital form. This digital information is available on 9-track magnetic tapes and, in the case of 1:2,000,000-scale planimetric digital line graph data, in Compact Disc Read Only Memory (CD-ROM) format. Digital information can be used with all types of geographic and land information systems.
Effects of self-graphing and goal setting on the math fact fluency of students with disabilities.
Figarola, Patricia M; Gunter, Philip L; Reffel, Julia M; Worth, Susan R; Hummel, John; Gerber, Brian L
2008-01-01
We evaluated the impact of goal setting and students' participation in graphing their own performance data on the rate of math fact calculations. Participants were 3 students with mild disabilities in the first and second grades; 2 of the 3 students were also identified with Attention-Deficit/Hyperactivity Disorder (ADHD). They were taught to use Microsoft Excel® software to graph their rate of correct calculations when completing timed, independent practice sheets consisting of single-digit mathematics problems. Two students' rates of correct calculations nearly always met or exceeded the aim line established for their correct calculations. Additional interventions were required for the third student. Results are discussed in terms of implications and future directions for increasing the use of evaluation components in classrooms for students at risk for behavior disorders and academic failure.
A signal-flow-graph approach to on-line gradient calculation.
Campolucci, P; Uncini, A; Piazza, F
2000-08-01
A large class of nonlinear dynamic adaptive systems such as dynamic recurrent neural networks can be effectively represented by signal flow graphs (SFGs). By this method, complex systems are described as a general connection of many simple components, each of them implementing a simple one-input, one-output transformation, as in an electrical circuit. Even if graph representations are popular in the neural network community, they are often used for qualitative description rather than for rigorous representation and computational purposes. In this article, a method for both on-line and batch-backward gradient computation of a system output or cost function with respect to system parameters is derived by the SFG representation theory and its known properties. The system can be any causal, in general nonlinear and time-variant, dynamic system represented by an SFG, in particular any feedforward, time-delay, or recurrent neural network. In this work, we use discrete-time notation, but the same theory holds for the continuous-time case. The gradient is obtained in a straightforward way by the analysis of two SFGs, the original one and its adjoint (obtained from the first by simple transformations), without the complex chain rule expansions of derivatives usually employed. This method can be used for sensitivity analysis and for learning both off-line and on-line. On-line learning is particularly important since it is required by many real applications, such as digital signal processing, system identification and control, channel equalization, and predistortion.
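The backward sweep through the adjoint graph can be illustrated on a toy two-branch system: the gradient of the output with respect to each branch gain is obtained by propagating a unit signal through the reversed graph, so no chain-rule expansion is written out by hand. This is a minimal hand-worked instance, not the authors' general SFG machinery.

```python
import math

def forward(x, w1, w2):
    """Tiny SFG: branch gain w1, tanh node, branch gain w2."""
    s = w1 * x
    h = math.tanh(s)
    y = w2 * h
    return y, (x, s, h)

def adjoint_gradients(x, w1, w2):
    """Backward sweep on the reversed graph, seeded with dy/dy = 1."""
    _, (x, s, h) = forward(x, w1, w2)
    dy = 1.0
    dw2 = dy * h                 # adjoint signal at the w2 branch
    dh = dy * w2                 # propagate back along the w2 branch
    ds = dh * (1.0 - h * h)      # pass back through the tanh node
    dw1 = ds * x                 # adjoint signal at the w1 branch
    return dw1, dw2

dw1, dw2 = adjoint_gradients(x=0.5, w1=2.0, w2=3.0)
print(round(dw1, 4), round(dw2, 4))
```

The same backward sweep runs per time step for on-line learning; only the seeding of the adjoint graph changes between batch and on-line use.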
NASA Astrophysics Data System (ADS)
Huffmann, Master; Siegel, Edward Carl-Ludwig
2013-03-01
Newcomb-Benford(NeWBe)-Siegel log-law BEC Digit-Physics Network/Graph-Physics Barabasi et.al. evolving-``complex''-networks/graphs BEC JAMMING DOA attacks: Amazon(weekends: Microsoft I.E.-7/8(vs. Firefox): Memorial-day, Labor-day,...), MANY U.S.-Banks:WF,BoA,UB,UBS,...instantiations AGAIN militate for MANDATORY CONVERSION to PARALLEL ANALOG FAULT-TOLERANT but slow(er) SECURITY-ASSURANCE networks/graphs in parallel with faster ``sexy'' DIGITAL-Networks/graphs:``Cloud'', telecomm: n-G,..., because of common ACHILLES-HEEL VULNERABILITY: DIGITS!!! ``In fast-hare versus slow-tortoise race, Slow-But-Steady ALWAYS WINS!!!'' (Zeno). {Euler [#s(1732)] ∑- ∏()-Riemann[Monats. Akad. Berlin (1859)] ∑- ∏()- Kummer-Bernoulli (#s)}-Newcomb [Am.J.Math.4(1),39 (81) discovery of the QUANTUM!!!]-{Planck (01)]}-{Einstein (05)]-Poincar e [Calcul Probabilités,313(12)]-Weyl[Goett. Nach.(14); Math.Ann.77,313(16)]-(Bose (24)-Einstein(25)]-VS. -Fermi (27)-Dirac(27))-Menger [Dimensiontheorie(29)]-Benford [J.Am. Phil.Soc.78,115(38)]-Kac[Maths Stats.-Reason. (55)]- Raimi [Sci.Am.221,109(69)]-Jech-Hill [Proc.AMS,123,3,887(95)] log-function
Cartographic services contract...for everything geographic
,
2003-01-01
The U.S. Geological Survey's (USGS) Cartographic Services Contract (CSC) is used to award work for photogrammetric and mapping services under the umbrella of Architect-Engineer (A&E) contracting. The A&E contract is broad in scope and can accommodate any activity related to standard, nonstandard, graphic, and digital cartographic products. Services provided may include, but are not limited to, photogrammetric mapping and aerotriangulation; orthophotography; thematic mapping (for example, land characterization); analog and digital imagery applications; geographic information systems development; surveying and control acquisition, including ground-based and airborne Global Positioning System; analog and digital image manipulation, analysis, and interpretation; raster and vector map digitizing; data manipulations (for example, transformations, conversions, generalization, integration, and conflation); primary and ancillary data acquisition (for example, aerial photography, satellite imagery, multispectral, multitemporal, and hyperspectral data); image scanning and processing; metadata production, revision, and creation; and production or revision of standard USGS products defined by formal and informal specification and standards, such as those for digital line graphs, digital elevation models, digital orthophoto quadrangles, and digital raster graphics.
Graph-based layout analysis for PDF documents
NASA Astrophysics Data System (ADS)
Xu, Canhui; Tang, Zhi; Tao, Xin; Li, Yun; Shi, Cao
2013-03-01
To increase flexibility and enrich the reading experience of e-books on small portable screens, a graph-based method is proposed for layout analysis of Portable Document Format (PDF) documents. Digitally born documents have inherent advantages, such as representing text and fractional images in explicit form, which can be exploited straightforwardly. To integrate traditional image-based document analysis with the metadata provided by a PDF parser, the page primitives, including text, image, and path elements, are separated into text and non-text layers for independent analysis. The graph-based method operates at the superpixel representation level: page text elements, corresponding to vertices, are used to construct an undirected graph. The Euclidean distance between adjacent vertices is applied in a top-down manner to cut the graph tree formed by Kruskal's algorithm, and edge orientation is then used in a bottom-up manner to extract text lines from each subtree. Non-textual objects are segmented by connected-component analysis. For each segmented text and non-text composite, a 13-dimensional feature vector is extracted for labelling. Experimental results on selected pages from PDF books are presented.
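The top-down step the abstract describes (build a graph over text elements, form a minimum spanning tree, cut long edges to separate groups) can be sketched as follows. This is a minimal illustration, not the paper's implementation; the function name, the use of element centers as vertices, and the fixed cut threshold are assumptions.

```python
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree, connected_components
from scipy.spatial.distance import pdist, squareform

def cluster_text_elements(centers, cut_threshold):
    """Group text-element centers into clusters (candidate text lines/blocks)
    by building a Euclidean minimum spanning tree over the element graph
    and cutting, top-down, every edge longer than the threshold."""
    dist = squareform(pdist(np.asarray(centers, dtype=float)))
    mst = minimum_spanning_tree(dist).toarray()   # Kruskal-style MST
    mst[mst > cut_threshold] = 0.0                # cut long edges
    n_clusters, labels = connected_components(mst, directed=False)
    return n_clusters, labels
```

With two well-separated runs of elements, the long bridging edge is cut and two clusters remain; a real system would derive the threshold from the page's spacing statistics rather than fix it.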
An Atlas of Computed Equivalent Widths of Quasar Broad Emission Lines
NASA Astrophysics Data System (ADS)
Korista, Kirk; Baldwin, Jack; Ferland, Gary; Verner, Dima
We present graphically the results of several thousand photoionization calculations of broad emission-line clouds in quasars, spanning 7 orders of magnitude in hydrogen ionizing flux and particle density. The equivalent widths of 42 quasar emission lines are presented as contours in the particle density-ionizing flux plane for a typical incident continuum shape, solar chemical abundances, and a cloud column density of N(H) = 10^23 cm^-2. Results are similarly given for a small subset of emission lines for two other column densities (10^22 and 10^24 cm^-2), five other incident continuum shapes, and a gas metallicity of 5 Z⊙. These graphs should prove useful in the analysis of quasar emission-line data and in the detailed modeling of quasar broad emission-line regions. The digital results of these emission-line grids and many more are available over the Internet.
The Design and Product of National 1:1000000 Cartographic Data of Topographic Map
NASA Astrophysics Data System (ADS)
Wang, Guizhi
2016-06-01
The National Administration of Surveying, Mapping and Geoinformation launched the project of dynamic updating of the national fundamental geographic information database in 2012. Under this project, the 1:50,000 database was updated once a year, and the 1:250,000 database was generalized and updated in linkage with it. In 2014, the 1:1,000,000 digital line graph database was comprehensively updated using the latest 1:250,000 database achievements, and cartographic data of the topographic map and digital elevation model data were generated at the same time. This article mainly introduces the national 1:1,000,000 cartographic data of the topographic map, including feature content, database structure, database-driven mapping technology, and workflow.
The One Universal Graph — a free and open graph database
NASA Astrophysics Data System (ADS)
Ng, Liang S.; Champion, Corbin
2016-02-01
Recent developments in graph databases are mostly huge projects involving big organizations, big operations, and big capital, as the name Big Data attests. We propose the concept of the One Universal Graph (OUG), which states that all observable and known objects and concepts (physical, conceptual, or digitally represented) can be connected in one single graph; furthermore, the OUG can be implemented with a very simple text file format and free software, capable of being executed on Android or smaller devices. As such, the One Universal Graph Data Exchange (GOUDEX) modules can potentially be installed on the hundreds of millions of Android devices and Intel-compatible computers shipped annually. Coupled with its open nature and its ability to connect to the leading search engines and databases currently in operation, GOUDEX has the potential to become the largest graph and a better interface for users and programmers to interact with the data on the Internet. With a web user interface for users to use and program in a native Linux environment, Free Crowdware implemented on GOUDEX can help inexperienced users learn programming with better organized documentation for free software, and can manage a programmer's contribution down to a single line of code or a single variable in a software project. It can become the first practically realizable "Internet brain" on which a global artificial intelligence system can be implemented. Being practically free and open, the One Universal Graph can have significant applications in robotics and artificial intelligence as well as social networks.
Inserting Phase Change Lines into Microsoft Excel® Graphs.
Dubuque, Erick M
2015-10-01
Microsoft Excel® is a popular graphing tool used by behavior analysts to visually display data. However, this program is not always friendly to the graphing conventions used by behavior analysts. For example, adding phase change lines has typically been a cumbersome process involving the insertion of line objects that do not move when new data is added to a graph. The purpose of this article is to describe a novel way to add phase change lines that move when new data is added and when graphs are resized.
Restrepo, John F; Garcia-Sucerquia, Jorge
2013-01-01
The number of colloidal particles per unit of volume that can be imaged correctly with digital lensless holographic microscopy (DLHM) is determined numerically. Typical in-line DLHM holograms with controlled concentration are modeled and reconstructed numerically. By quantifying the ratio of the retrieved particles from the reconstructed hologram to the number of the seeding particles in the modeled intensity, the limit of concentration of the colloidal suspensions up to which DLHM can operate successfully is found numerically. A new shadow density parameter for spherical illumination is defined. The limit of performance of DLHM is determined from a graph of the shadow density versus the efficiency of the microscope.
Ali, Nadia; Peebles, David
2013-02-01
We report three experiments investigating the ability of undergraduate college students to comprehend 2 x 2 "interaction" graphs from two-way factorial research designs. Factorial research designs are an invaluable research tool widely used in all branches of the natural and social sciences, and the teaching of such designs lies at the core of many college curricula. Such data can be represented in bar or line graph form. Previous studies have shown, however, that people interpret these two graphical forms differently. In Experiment 1, participants were required to interpret interaction data in either bar or line graphs while thinking aloud. Verbal protocol analysis revealed that line graph users were significantly more likely to misinterpret the data or fail to interpret the graph altogether. The patterns of errors line graph users made were interpreted as arising from the operation of Gestalt principles of perceptual organization, and this interpretation was used to develop two modified versions of the line graph, which were then tested in two further experiments. One of the modifications resulted in a significant improvement in performance. Results of the three experiments support the proposed explanation and demonstrate the effects (both positive and negative) of Gestalt principles of perceptual organization on graph comprehension. We propose that our new design provides a more balanced representation of the data than the standard line graph for nonexpert users to comprehend the full range of relationships in two-way factorial research designs and may therefore be considered a more appropriate representation for use in educational and other nonexpert contexts.
Dim target detection method based on salient graph fusion
NASA Astrophysics Data System (ADS)
Hu, Ruo-lan; Shen, Yi-yan; Jiang, Jun
2018-02-01
Dim target detection is a key problem in the digital image processing field. With the development of multi-spectrum imaging sensors, it has become a trend to improve dim target detection performance by fusing information from different spectral images. In this paper, a dim target detection method based on salient graph fusion is proposed. In the method, Gabor filters with multiple directions and contrast filters with multiple scales are combined to construct a salient graph from the digital image. A maximum-salience fusion strategy is then designed to fuse the salient graphs from the different spectral images, and a top-hat filter is used to detect dim targets from the fused salient graph. Experimental results show that the proposed method improved the probability of target detection and reduced the probability of false alarm on images with cluttered backgrounds.
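The fusion-and-detection stage can be sketched with standard tools: take the pixelwise maximum of the per-spectrum salience maps, then apply a white top-hat (morphological opening residue) to isolate small bright blobs. This is an illustrative sketch under assumed parameters, not the paper's pipeline; the Gabor/contrast salience construction is omitted and the structuring-element size and threshold are hypothetical.

```python
import numpy as np
from scipy import ndimage

def fuse_and_detect(salient_maps, struct_size=5, threshold=0.5):
    """Maximum-salience fusion of several salience maps, followed by a
    white top-hat filter that keeps only features smaller than the
    structuring element (candidate dim targets)."""
    fused = np.maximum.reduce([np.asarray(m, dtype=float) for m in salient_maps])
    footprint = np.ones((struct_size, struct_size))
    tophat = ndimage.white_tophat(fused, footprint=footprint)
    return fused, tophat > threshold
```

A single bright pixel riding on a flat background survives the top-hat, while the background itself is suppressed by the opening.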
ERIC Educational Resources Information Center
Xi, Xiaoming
2010-01-01
Motivated by cognitive theories of graph comprehension, this study systematically manipulated characteristics of a line graph description task in a speaking test in ways to mitigate the influence of graph familiarity, a potential source of construct-irrelevant variance. It extends Xi (2005), which found that the differences in holistic scores on…
Sub-Nyquist Sampling and Moire-Like Waveform Distortions
NASA Technical Reports Server (NTRS)
Williams, Glenn L.
2000-01-01
Investigations of aliasing effects in digital waveform sampling have revealed the existence of a mathematical field and a pseudo-alias domain lying to the left of a "Nyquist line" in a plane defining the boundary between two domains of sampling. To the right of the line lies the classic alias domain. For signals band-limited below the Nyquist limit, displayed output may show a false modulation envelope. The effect occurs whenever the sample rate and the signal frequency are related by ratios of mutually prime integers. Belying the principle that a 10:1 sampling ratio is "good enough", this distortion easily occurs in graphed one-dimensional waveforms and two-dimensional images and occurs daily on television.
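The effect is easy to reproduce numerically: sample a tone well below the Nyquist limit at a rate related to it by a ratio of mutually prime integers, and the sample sequence only repeats after many samples, so a connect-the-dots plot traces a slow false envelope. A minimal sketch with hypothetical frequencies (7 Hz sampled at 50 Hz, ratio 50/7):

```python
import numpy as np

def sampled_tone(f_sig, f_samp, n_samples):
    """Samples of a unit-amplitude sine at frequency f_sig, taken at rate f_samp."""
    n = np.arange(n_samples)
    return np.sin(2 * np.pi * f_sig * n / f_samp)

# fs/f = 50/7 is a ratio of mutually prime integers, and 7 Hz is well below the
# 25 Hz Nyquist limit, so there is no classical aliasing. Yet the sample pattern
# only repeats every 50 samples, and a line plot through consecutive samples
# shows a slow, false "modulation envelope" on the constant-amplitude tone.
x = sampled_tone(7.0, 50.0, 200)
```

Plotting `x` with straight-line interpolation makes the moire-like envelope visible even though the underlying signal has constant amplitude.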
Around the Sun in a Graphing Calculator.
ERIC Educational Resources Information Center
Demana, Franklin; Waits, Bert K.
1989-01-01
Discusses the use of graphing calculators for polar and parametric equations. Presents eight lines of the program for the graph of a parametric equation and 11 lines of the program for a graph of a polar equation. Illustrates the application of the programs for planetary motion and free-fall motion. (YP)
Schruben, Paul G.
1996-01-01
This CD-ROM contains digital versions of the geology and resource assessment maps of Costa Rica originally published by the U.S. Geological Survey (USGS), the Direccion General de Geologia, Minas e Hidrocarburos, and the Universidad de Costa Rica in 1987 at a scale of 1:500,000 in USGS Folio I-1865. The following layers of the map are available on the CD-ROM: geology, favorable domains for selected deposit types, Bouguer gravity, isostatic gravity, mineral deposits, and rock geochemistry sample points. Some of the layers are provided in the following formats: ArcView 1 for Windows and UNIX, ARC/INFO 6.1.2 Export, Digital Line Graph (DLG) Optional, and Drawing Exchange File (DXF). This CD-ROM was produced in accordance with the ISO 9660 and Apple Computer's HFS standards.
Schruben, Paul G.
1997-01-01
This CD-ROM contains digital versions of the geology and resource assessment maps of Costa Rica originally published in USGS Folio I-1865 (U.S. Geological Survey, the Direccion General de Geologia, Minas e Hidrocarburos, and the Universidad de Costa Rica, 1987) at a scale of 1:500,000. The following layers are available on the CD-ROM: geology and faults; favorable domains for selected deposit types; Bouguer gravity data; isostatic gravity contours; mineral deposits, prospects, and occurrences; and rock geochemistry sample points. For DOS users, the CD-ROM contains MAPPER, a user-friendly map display program. Some of the maps are also provided in the following additional formats on the CD-ROM: (1) ArcView 1 and 3, (2) ARC/INFO 6.1.2 Export, (3) Digital Line Graph (DLG) Optional, and (4) Drawing Exchange File (DXF).
Comparison of Student Understanding of Line Graph Slope in Physics and Mathematics
ERIC Educational Resources Information Center
Planinic, Maja; Milin-Sipus, Zeljka; Katic, Helena; Susac, Ana; Ivanjek, Lana
2012-01-01
This study gives an insight into the differences between student understanding of line graph slope in the context of physics (kinematics) and mathematics. Two pairs of parallel physics and mathematics questions that involved estimation and interpretation of line graph slope were constructed and administered to 114 Croatian second year high school…
A Critical Review of Line Graphs in Behavior Analytic Journals
ERIC Educational Resources Information Center
Kubina, Richard M., Jr.; Kostewicz, Douglas E.; Brennan, Kaitlyn M.; King, Seth A.
2017-01-01
Visual displays such as graphs have played an instrumental role in psychology. One discipline relies almost exclusively on graphs in both applied and basic settings, behavior analysis. The most common graphic used in behavior analysis falls under the category of time series. The line graph represents the most frequently used display for visual…
ERIC Educational Resources Information Center
Boote, Stacy K.
2014-01-01
This study examined how 12- and 13-year-old students' mathematics and science background knowledge affected line graph interpretations and how interpretations were affected by graph question levels. A purposive sample of 14 students engaged in think aloud interviews while completing an excerpted Test of Graphing in Science. Data were…
NASA Technical Reports Server (NTRS)
Cheng, Thomas D.; Angelici, Gary L.; Slye, Robert E.; Ma, Matt
1991-01-01
The USDA presently uses labor-intensive photographic interpretation procedures to delineate large geographical areas into manageable size sampling units for the estimation of domestic crop and livestock production. Computer software to automate the boundary delineation procedure, called the computer-assisted stratification and sampling (CASS) system, was developed using a Hewlett Packard color-graphics workstation. The CASS procedures display Thematic Mapper (TM) satellite digital imagery on a graphics display workstation as the backdrop for the onscreen delineation of sampling units. USGS Digital Line Graph (DLG) data for roads and waterways are displayed over the TM imagery to aid in identifying potential sample unit boundaries. Initial analysis conducted with three Missouri counties indicated that CASS was six times faster than the manual techniques in delineating sampling units.
A perceptive method for handwritten text segmentation
NASA Astrophysics Data System (ADS)
Lemaitre, Aurélie; Camillerapp, Jean; Coüasnon, Bertrand
2011-01-01
This paper presents a new method for segmenting handwritten text into text lines and words. We propose an approach based on the cooperation among points of view, which enables localization of the text lines in a low-resolution image and then association of the pixels at a higher level of resolution. Thanks to this combination of levels of vision, we can detect overlapping characters and re-segment the connected components during the analysis. We then propose a segmentation of lines into words based on the cooperation between digital data and symbolic knowledge. The digital data are obtained from distances inside a Delaunay graph, which gives a precise distance between connected components at the pixel level. We introduce structural rules in order to take into account generic knowledge about the organization of a text page. This cooperation among information sources gives a greater power of expression and ensures the global coherence of the recognition. We validate this work using the metrics and the database proposed for the segmentation contest of ICDAR 2009, and show that our method obtains very good results compared to the other methods in the literature. In particular, we are able to deal with slope and curvature, overlapping text lines, and varied kinds of writing, which are the main difficulties met by the other methods.
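The Delaunay-graph distances used for the word segmentation step can be computed directly with SciPy. This is a generic sketch (function name and the use of component centroids are assumptions): neighboring components in the triangulation give candidate gaps, and a rule or threshold then separates intra-word from inter-word spacing.

```python
import numpy as np
from scipy.spatial import Delaunay

def delaunay_neighbor_distances(centroids):
    """Euclidean distances between connected-component centroids that are
    neighbors in the Delaunay triangulation. Small gaps suggest letters of
    one word; large gaps suggest word boundaries (thresholds are
    application-specific)."""
    pts = np.asarray(centroids, dtype=float)
    tri = Delaunay(pts)
    edges = set()
    for simplex in tri.simplices:
        for i in range(3):
            a, b = sorted((int(simplex[i]), int(simplex[(i + 1) % 3])))
            edges.add((a, b))
    return {e: float(np.linalg.norm(pts[e[0]] - pts[e[1]])) for e in edges}
```

In the paper these numeric distances are only one source of evidence; the structural rules about page organization act on top of them.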
A novel line segment detection algorithm based on graph search
NASA Astrophysics Data System (ADS)
Zhao, Hong-dan; Liu, Guo-ying; Song, Xu
2018-02-01
To overcome the problem of extracting line segments from an image, a line segment detection method based on a graph search algorithm is proposed. After edge detection of the image, candidate straight line segments are obtained in four directions. The adjacency relationships among the candidate segments are depicted by a graph model, on which a depth-first search algorithm determines which adjacent line segments need to be merged. Finally, the least squares method is used to fit the detected straight lines. Comparative experimental results verify that the proposed algorithm achieves better results than the line segment detector (LSD).
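The merge step can be sketched as an iterative depth-first search over the segment adjacency graph followed by a least-squares fit of each merged group. This is an illustrative sketch, not the paper's algorithm: the data layout (segments as point lists, adjacency as a dict) is assumed, and the simple `polyfit` fit cannot represent vertical lines.

```python
import numpy as np

def merge_segments(segments, adjacency):
    """DFS over the adjacency graph of candidate segments; each connected
    group is merged and fitted with a least-squares line y = m*x + b.
    `segments` maps id -> list of (x, y) points; `adjacency` maps id -> neighbors."""
    seen, lines = set(), []
    for start in segments:
        if start in seen:
            continue
        stack, group = [start], []
        while stack:                      # iterative depth-first search
            node = stack.pop()
            if node in seen:
                continue
            seen.add(node)
            group.append(node)
            stack.extend(adjacency.get(node, ()))
        pts = np.array([p for sid in group for p in segments[sid]], dtype=float)
        slope, intercept = np.polyfit(pts[:, 0], pts[:, 1], 1)
        lines.append((sorted(group), float(slope), float(intercept)))
    return lines
```

Two collinear candidate fragments linked in the graph come out as a single fitted line; an isolated fragment is fitted on its own.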
ERIC Educational Resources Information Center
Yoder, Sharon K.
This book discusses four kinds of graphs that are taught in mathematics at the middle school level: pictographs, bar graphs, line graphs, and circle graphs. The chapters on each of these types of graphs contain information such as starting, scaling, drawing, labeling, and finishing the graphs using "LogoWriter." The final chapter of the…
Digital Social Network Mining for Topic Discovery
NASA Astrophysics Data System (ADS)
Moradianzadeh, Pooya; Mohi, Maryam; Sadighi Moshkenani, Mohsen
Networked computers are expanding more and more around the world, and digital social networks are becoming of great importance for many people's work and leisure. This paper focuses on discovering the topics of information exchanged in digital social networks. In brief, our method uses a hierarchical dictionary of related topics and words mapped to a graph. By comparing keywords extracted from the social network context with the graph nodes, the probability of a relation between the context and the desired topics is computed. This model can be used in many applications such as advertising, viral marketing, and high-risk group detection.
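The core matching idea (compare extracted keywords against topic nodes and turn overlap into relation probabilities) can be illustrated with a flat dictionary in place of the paper's hierarchical graph. Everything here is a simplified assumption for illustration: the function name, the flat topic-to-words mapping, and the overlap-count scoring.

```python
def topic_probabilities(keywords, topic_words):
    """Score each topic by how many extracted keywords it covers, then
    normalize the scores so they sum to 1 (a crude relation probability)."""
    kw = {w.lower() for w in keywords}
    scores = {t: len(kw & set(ws)) for t, ws in topic_words.items()}
    total = sum(scores.values())
    if total == 0:
        return {t: 0.0 for t in topic_words}
    return {t: s / total for t, s in scores.items()}
```

A hierarchical version would propagate scores from word nodes up through parent topic nodes instead of using a flat intersection count.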
Dexter: Data Extractor for scanned graphs
NASA Astrophysics Data System (ADS)
Demleitner, Markus
2011-12-01
The NASA Astrophysics Data System (ADS) now holds 1.3 million scanned pages, containing numerous plots and figures for which the original data sets are lost or inaccessible. The availability of scans of the figures can significantly ease the regeneration of the data sets. For this purpose, the ADS has developed Dexter, a Java applet that supports the user in this process. Dexter's basic functionality is to let the user manually digitize a plot by marking points and defining the coordinate transformation from the logical to the physical coordinate system. Advanced features include automatic identification of axes, tracing lines and finding points matching a template.
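Dexter itself is a Java applet, but its basic operation (defining the transformation from the physical pixel coordinate system to the logical data coordinate system from user-marked reference points) reduces, for linear axes, to a one-dimensional linear map per axis. A hedged Python sketch; the function name and all calibration values are hypothetical:

```python
def make_axis_transform(pix0, val0, pix1, val1):
    """Linear map from a pixel coordinate to a data value, calibrated with
    two reference points on the axis: (pixel position, known data value)."""
    scale = (val1 - val0) / (pix1 - pix0)
    return lambda pix: val0 + (pix - pix0) * scale

# Calibrate each axis independently, then convert any marked point.
to_x = make_axis_transform(100, 0.0, 500, 10.0)   # pixels 100..500 -> 0..10
to_y = make_axis_transform(400, 0.0, 80, 1.0)     # y axis runs upward on screen
point = (to_x(300), to_y(240))                    # a marked pixel at (300, 240)
```

Logarithmic axes would apply the same map to log10 of the data values; Dexter's advanced features (axis auto-detection, line tracing, template matching) sit on top of this transformation.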
Structure and strategy in encoding simplified graphs
NASA Technical Reports Server (NTRS)
Schiano, Diane J.; Tversky, Barbara
1992-01-01
Tversky and Schiano (1989) found a systematic bias toward the 45-deg line in memory for the slopes of identical lines when embedded in graphs, but not in maps, suggesting the use of a cognitive reference frame specifically for encoding meaningful graphs. The present experiments explore this issue further using the linear configurations alone as stimuli. Experiments 1 and 2 demonstrate that perception and immediate memory for the slope of a test line within orthogonal 'axes' are predictable from purely structural considerations. In Experiments 3 and 4, subjects were instructed to use a diagonal-reference strategy in viewing the stimuli, which were described as 'graphs' only in Experiment 3. Results for both studies showed the diagonal bias previously found only for graphs. This pattern provides converging evidence for the diagonal as a cognitive reference frame in encoding linear graphs, and demonstrates that even in highly simplified displays, strategic factors can produce encoding biases not predictable from stimulus structure alone.
Bars, Lines, & Pies: A Graphing Skills Program. Expect the Unexpected with Math[R
ERIC Educational Resources Information Center
Actuarial Foundation, 2013
2013-01-01
"Bars, Lines, & Pies" is a dynamic math program designed to build graphing skills in students, while also showing them the relevance of math in their lives. Developed by The Actuarial Foundation along with Scholastic, the graphing lessons and activities involve engaging, real-world examples about the environment and recycling. In these lessons,…
NASA Astrophysics Data System (ADS)
Szyjka, Sebastian P.
The purpose of this study was to determine the extent to which six cognitive and attitudinal variables predicted pre-service elementary teachers' performance on line graphing. Predictors included Illinois teacher education basic skills sub-component scores in reading comprehension and mathematics, logical thinking performance scores, as well as measures of attitudes toward science, mathematics and graphing. This study also determined the strength of the relationship between each prospective predictor variable and the line graphing performance variable, as well as the extent to which measures of attitude towards science, mathematics and graphing mediated relationships between scores on mathematics, reading, logical thinking and line graphing. Ninety-four pre-service elementary education teachers enrolled in two different elementary science methods courses during the spring 2009 semester at Southern Illinois University Carbondale participated in this study. Each subject completed five different instruments designed to assess science, mathematics and graphing attitudes as well as logical thinking and graphing ability. Sixty subjects provided copies of primary basic skills score reports that listed subset scores for both reading comprehension and mathematics. The remaining scores were supplied by a faculty member who had access to a database from which the scores were drawn. Seven subjects, whose scores could not be found, were eliminated from final data analysis. Confirmatory factor analysis (CFA) was conducted in order to establish validity and reliability of the Questionnaire of Attitude Toward Line Graphs in Science (QALGS) instrument. CFA tested the statistical hypothesis that the five main factor structures within the Questionnaire of Attitude Toward Statistical Graphs (QASG) would be maintained in the revised QALGS. Stepwise Regression Analysis with backward elimination was conducted in order to generate a parsimonious and precise predictive model. 
This procedure allowed the researcher to explore the relationships among the affective and cognitive variables that were included in the regression analysis. The results for CFA indicated that the revised QALGS measure was sound in its psychometric properties when tested against the QASG. Reliability statistics indicated that the overall reliability for the 32 items in the QALGS was .90. The learning preferences construct had the lowest reliability (.67), while enjoyment (.89), confidence (.86) and usefulness (.77) constructs had moderate to high reliabilities. The first four measurement models fit the data well as indicated by the appropriate descriptive and statistical indices. However, the fifth measurement model did not fit the data well statistically, and only fit well with two descriptive indices. The results addressing the research question indicated that mathematical and logical thinking ability were significant predictors of line graph performance among the remaining group of variables. These predictors accounted for 41% of the total variability on the line graph performance variable. Partial correlation coefficients indicated that mathematics ability accounted for 20.5% of the variance on the line graphing performance variable when removing the effect of logical thinking. The logical thinking variable accounted for 4.7% of the variance on the line graphing performance variable when removing the effect of mathematics ability.
Study of Chromatic parameters of Line, Total, Middle graphs and Graph operators of Bipartite graph
NASA Astrophysics Data System (ADS)
Nagarathinam, R.; Parvathi, N.
2018-04-01
Chromatic parameters have been explored on the basis of the graph coloring process, in which every pair of adjacent nodes receives different colors, whereas Grundy and b-coloring use the maximum number of colors under certain restrictions. In this paper, the chromatic, b-chromatic, and Grundy numbers of some graph operators of bipartite graphs are investigated.
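Such parameters are easy to explore computationally on small instances. As a hedged, pure-Python sketch (not from the paper), here is greedy coloring of the line graph of the complete bipartite graph K_{m,n}: the line graph's vertices are the edges (i, j) of K_{m,n}, two being adjacent when they share an endpoint. Greedy coloring only upper-bounds the chromatic number, which for this line graph equals max(m, n) by Konig's edge-coloring theorem.

```python
def line_graph_greedy_colors(m, n):
    """Greedy coloring of the line graph of K_{m,n}; returns the number of
    colors used (an upper bound on the chromatic number, here max(m, n))."""
    verts = [(i, j) for i in range(m) for j in range(n)]
    adj = {v: [w for w in verts if w != v and (w[0] == v[0] or w[1] == v[1])]
           for v in verts}
    color = {}
    for v in verts:                       # first-fit greedy in listing order
        used = {color[w] for w in adj[v] if w in color}
        c = 0
        while c in used:
            c += 1
        color[v] = c
    return len(set(color.values()))
```

Note that the vertex ordering matters: for K_{2,3} this ordering makes greedy spend one color more than the optimum, which is exactly the kind of gap the Grundy number formalizes.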
ERIC Educational Resources Information Center
Conway, Lorraine
This packet of student materials contains a variety of worksheet activities dealing with science graphs and science word games. These reproducible materials deal with: (1) bar graphs; (2) line graphs; (3) circle graphs; (4) pictographs; (5) histograms; (6) artgraphs; (7) designing your own graphs; (8) medical prefixes; (9) color prefixes; (10)…
Modernization and multiscale databases at the U.S. geological survey
Morrison, J.L.
1992-01-01
The U.S. Geological Survey (USGS) has begun a digital cartographic modernization program. Keys to that program are the creation of a multiscale database, a feature-based file structure that is derived from a spatial data model, and a series of "templates" or rules that specify the relationships between instances of entities in reality and features in the database. The database will initially hold data collected from the USGS standard map products at scales of 1:24,000, 1:100,000, and 1:2,000,000. The spatial data model is called the digital line graph-enhanced model, and the comprehensive rule set consists of collection rules, product generation rules, and conflict resolution rules. This modernization program will affect the USGS mapmaking process because both digital and graphic products will be created from the database. In addition, non-USGS map users will have more flexibility in uses of the databases. These remarks are those of the session discussant made in response to the six papers and the keynote address given in the session. © 1992.
Intuitive color-based visualization of multimedia content as large graphs
NASA Astrophysics Data System (ADS)
Delest, Maylis; Don, Anthony; Benois-Pineau, Jenny
2004-06-01
Data visualization techniques are penetrating various technological areas. In the field of multimedia, such as information search and retrieval in multimedia archives or digital media production and post-production, data visualization methodologies based on large graphs offer an exciting alternative to conventional storyboard visualization. In this paper we develop a new approach to visualization of multimedia (video) documents based on both large-graph clustering and preliminary video segmentation and indexing.
QSPR modeling: graph connectivity indices versus line graph connectivity indices
Basak; Nikolic; Trinajstic; Amic; Beslo
2000-07-01
Five QSPR models of alkanes were reinvestigated. Properties considered were molecular surface-dependent properties (boiling points and gas chromatographic retention indices) and molecular volume-dependent properties (molar volumes and molar refractions). The vertex- and edge-connectivity indices were used as structural parameters. In each studied case we computed connectivity indices of alkane trees and alkane line graphs and searched for the optimum exponent. Models based on indices with an optimum exponent and on the standard value of the exponent were compared. Thus, for each property we generated six QSPR models (four for alkane trees and two for the corresponding line graphs). In all studied cases, QSPR models based on connectivity indices with optimum exponents have better statistical characteristics than the models based on connectivity indices with the standard value of the exponent. The comparison between models based on vertex- and edge-connectivity indices gave in two cases (molar volumes and molar refractions) better models based on edge-connectivity indices and in three cases (boiling points for octanes and nonanes and gas chromatographic retention indices) better models based on vertex-connectivity indices. Thus, it appears that the edge-connectivity index is more appropriate for use in modeling structure-molecular volume properties and the vertex-connectivity index in modeling structure-molecular surface properties. The use of line graphs did not improve the predictive power of the connectivity indices; only in one case (boiling points of nonanes) was a better model obtained with the use of line graphs.
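The vertex-connectivity index the abstract varies is the Randic-type sum over edges of (d_u * d_v) raised to an exponent, with -0.5 as the standard value and the "optimum exponent" found by search. A minimal sketch (edge-list input is an assumption; the edge-connectivity variant would apply the same formula to the line graph's edges):

```python
def connectivity_index(edges, exponent=-0.5):
    """Vertex-connectivity (Randic-type) index of a graph given as an edge
    list: sum over edges (u, v) of (d_u * d_v) ** exponent, where d is the
    vertex degree. exponent = -0.5 is the standard value."""
    degree = {}
    for u, v in edges:
        degree[u] = degree.get(u, 0) + 1
        degree[v] = degree.get(v, 0) + 1
    return sum((degree[u] * degree[v]) ** exponent for u, v in edges)
```

For the n-butane tree (a path on 4 vertices) the standard index is 2*(1*2)^-0.5 + (2*2)^-0.5 = sqrt(2) + 0.5.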
Tangent Lines without Calculus
ERIC Educational Resources Information Center
Rabin, Jeffrey M.
2008-01-01
This article presents a problem that can help high school students develop the concept of instantaneous velocity and connect it with the slope of a tangent line to the graph of position versus time. It also gives a method for determining the tangent line to the graph of a polynomial function at any point without using calculus. (Contains 1 figure.)
Men's interpretations of graphical information in a videotape decision aid
Pylar, Jan; Wills, Celia E.; Lillie, Janet; Rovner, David R.; Kelly‐Blake, Karen; Holmes‐Rovner, Margaret
2007-01-01
Abstract Objective To examine men's interpretations of graphical information types viewed in a high-quality, previously tested videotape decision aid (DA). Setting, participants, design A community-dwelling sample of men >50 years of age (N = 188) balanced by education (college/non-college) and race (Black/White) were interviewed just following their viewing of a videotape DA. A descriptive study design was used to examine men's interpretations of a representative sample of the types of graphs that were shown in the benign prostatic hyperplasia videotape DA. Main variables studied Men provided their interpretation of graph information presented in three formats that varied in complexity: pictograph, line and horizontal bar graph. Audiotape transcripts of men's responses were coded for meaning and content-related interpretation statements. Results Men provided both meaning and content-focused interpretations of the graphs. Accuracy of interpretation was lower than hypothesized on the basis of literature review (85.4% for pictograph, 65.7% for line graph, 47.8% for horizontal bar graph). Accuracy for pictograph and line graphs was associated with education level (3.94, P = 0.047, and 7.55, P = 0.006, respectively). Accuracy was uncorrelated with men's reported liking of the graphs (2.00, P = 0.441). Conclusion While men generally liked the DA, accuracy of graph interpretation was associated with format complexity and education level. Graphs are often recommended to improve comprehension of information in DAs. However, additional evaluation is needed in experimental and naturalistic observational settings to develop best practice standards for data representation. PMID:17524011
Tangent Lines without Derivatives for Quadratic and Cubic Equations
ERIC Educational Resources Information Center
Carroll, William J.
2009-01-01
In the quadratic equation y = ax^2 + bx + c, the equation y = bx + c is identified as the equation of the line tangent to the parabola at its y-intercept. This is extended to give a convenient method of graphing tangent lines at any point on the graph of a quadratic or a cubic equation. (Contains 5 figures.)
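One calculus-free route to such tangent lines, consistent with the article's observation, is polynomial division: writing p(x) = q(x)(x - a)^2 + (mx + b), the remainder mx + b agrees with p to first order at x = a, so it is the tangent line there (at a = 0 for a quadratic this recovers y = bx + c). A sketch of this technique, with coefficients listed highest degree first; the function name is my own:

```python
import numpy as np

def tangent_line(coeffs, a):
    """Tangent line to the polynomial with the given coefficients (highest
    degree first) at x = a, without derivatives: the remainder of dividing
    p(x) by (x - a)^2 is the tangent line. Returns (slope, intercept)."""
    divisor = np.polymul([1.0, -a], [1.0, -a])   # (x - a)^2
    _, r = np.polydiv(np.asarray(coeffs, dtype=float), divisor)
    r = np.atleast_1d(r)
    if r.size == 1:                              # horizontal tangent
        return 0.0, float(r[0])
    return float(r[0]), float(r[1])
```

For example, y = x^2 at x = 1 gives the tangent y = 2x - 1, and y = x^3 at x = 2 gives y = 12x - 16, matching the derivative-based answers.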
High-Resolution Dual-Comb Spectroscopy with Ultra-Low Noise Frequency Combs
NASA Astrophysics Data System (ADS)
Hänsel, Wolfgang; Giunta, Michele; Beha, Katja; Perry, Adam J.; Holzwarth, R.
2017-06-01
Dual-comb spectroscopy is a powerful tool for fast broad-band spectroscopy due to the parallel interrogation of thousands of spectral lines. Here we report on the spectroscopic analysis of acetylene vapor in a pressurized gas cell using two ultra-low noise frequency combs with a repetition rate around 250 MHz. Optical referencing to a high-finesse cavity yields a sub-Hertz stability of all individual comb lines (including the virtual comb lines between 0 Hz and the carrier) and permits one to pick a small difference of repetition rate for the two frequency combs on the order of 300 Hz, thus representing an optical spectrum of 100 THz (~3300 cm⁻¹) within half the free spectral range (125 MHz). The transmission signal is derived straight from a photodetector and recorded with a high-resolution spectrum analyzer or digitized with a computer-controlled AD converter. The figure to the right shows a schematic of the experimental setup, which is all fiber-coupled with polarization-maintaining fiber except for the spectroscopic cell. The graph on the lower right reveals a portion of the recorded radio-frequency spectrum, which has been scaled to the optical domain. The location of the measured absorption coincides well with data taken from the HITRAN database. Due to the intrinsic linewidth of all contributing comb lines, each sampling point in the transmission graph corresponds to probing at an optical frequency with sub-Hertz resolution. This resolution is maintained in coherent wavelength conversion processes such as difference-frequency generation (DFG), sum-frequency generation (SFG) or non-linear broadening (self-phase modulation), and is therefore easily transferred to a wide spectral range from the mid infrared up to the visible spectrum.
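The compression between optical and radio-frequency scales described above follows from the two repetition rates alone; a back-of-the-envelope check using only numbers quoted in the abstract (250 MHz repetition rate, ~300 Hz repetition-rate difference, 100 THz optical span):

```python
# Dual-comb down-conversion: the optical spectrum is compressed into
# the RF domain by the ratio f_rep / delta_f_rep.
f_rep = 250e6          # comb repetition rate, Hz
delta_f_rep = 300.0    # repetition-rate difference, Hz
optical_span = 100e12  # optical bandwidth interrogated, Hz

compression = f_rep / delta_f_rep     # ~8.3e5
rf_span = optical_span / compression  # RF image of the optical span

# The RF image fits within half the free spectral range (125 MHz),
# as the abstract states:
assert rf_span <= f_rep / 2
print(rf_span / 1e6)  # 120.0 (MHz)
```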
Comet Meteor Shower Put Magnesium and Iron into Martian Atmosphere
2014-11-07
The places where the red line on this graph extends higher than the blue line show detection of metals added to the Martian atmosphere from dust particles released by a passing comet on Oct. 19, 2014. The graphed data are from NASA's MAVEN spacecraft.
A simple 2D composite image analysis technique for the crystal growth study of L-ascorbic acid.
Kumar, Krishan; Kumar, Virender; Lal, Jatin; Kaur, Harmeet; Singh, Jasbir
2017-06-01
This work was aimed at 2D crystal growth studies of L-ascorbic acid using the composite image analysis technique. Growth experiments on the L-ascorbic acid crystals were carried out by standard (optical) microscopy, laser diffraction analysis, and composite image analysis. For image analysis, the growth of L-ascorbic acid crystals was captured as digital 2D RGB images, which were then processed into composite images. After processing, the crystal boundaries emerged as white lines against the black (cancelled) background. The crystal boundaries were well differentiated by peaks in the intensity graphs generated for the composite images. The lengths of crystal boundaries measured from the intensity graphs of composite images were in good agreement (correlation coefficient "r" = 0.99) with the lengths measured by standard microscopy. In contrast, the lengths measured by laser diffraction were poorly correlated with both techniques. Therefore, the composite image analysis can replace the standard microscopy technique for the crystal growth studies of L-ascorbic acid. © 2017 Wiley Periodicals, Inc.
Graphing as a Problem-Solving Strategy.
ERIC Educational Resources Information Center
Cohen, Donald
1984-01-01
The focus is on how line graphs can be used to approximate solutions to rate problems and to suggest equations that offer exact algebraic solutions to the problem. Four problems requiring progressively greater graphing sophistication are presented plus four exercises. (MNS)
Integrating multisource land use and land cover data
Wright, Bruce E.; Tait, Mike; Lins, K.F.; Crawford, J.S.; Benjamin, S.P.; Brown, Jesslyn F.
1995-01-01
As part of the U.S. Geological Survey's (USGS) land use and land cover (LULC) program, the USGS in cooperation with the Environmental Systems Research Institute (ESRI) is collecting and integrating LULC data for a standard USGS 1:100,000-scale product. The LULC data collection techniques include interpreting spectrally clustered Landsat Thematic Mapper (TM) images; interpreting 1-meter resolution digital panchromatic orthophoto images; and, for comparison, aggregating locally available large-scale digital data of urban areas. The area selected is the Vancouver, WA-OR quadrangle, which has a mix of urban, rural agriculture, and forest land. Anticipated products include an integrated LULC prototype data set in a standard classification scheme referenced to the USGS digital line graph (DLG) data of the area and prototype software to develop digital LULC data sets. This project will evaluate a draft standard LULC classification system developed by the USGS for use with various source material and collection techniques. Federal, State, and local governments, and private sector groups will have an opportunity to evaluate the resulting prototype software and data sets and to provide recommendations. It is anticipated that this joint research endeavor will increase future collaboration among interested organizations, public and private, for LULC data collection using common standards and tools.
Dowding, Dawn; Merrill, Jacqueline A; Onorato, Nicole; Barrón, Yolanda; Rosati, Robert J; Russell, David
2018-02-01
To explore home care nurses' numeracy and graph literacy and their relationship to comprehension of visualized data. A multifactorial experimental design using online survey software. Nurses were recruited from 2 Medicare-certified home health agencies. Numeracy and graph literacy were measured using validated scales. Nurses were randomized to 1 of 4 experimental conditions. Each condition displayed data for 1 of 4 quality indicators, in 1 of 4 different visualized formats (bar graph, line graph, spider graph, table). A mixed linear model measured the impact of numeracy, graph literacy, and display format on data understanding. In all, 195 nurses took part in the study. They were slightly more numerate and graph literate than the general population. Overall, nurses understood information presented in bar graphs most easily (88% correct), followed by tables (81% correct), line graphs (77% correct), and spider graphs (41% correct). Individuals with low numeracy and low graph literacy had poorer comprehension of information displayed across all formats. High graph literacy appeared to enhance comprehension of data regardless of numeracy capabilities. Clinical dashboards are increasingly used to provide information to clinicians in visualized format, under the assumption that visual display reduces cognitive workload. Results of this study suggest that nurses' comprehension of visualized information is influenced by their numeracy, graph literacy, and the display format of the data. Individual differences in numeracy and graph literacy skills need to be taken into account when designing dashboard technology. © The Author 2017. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com
ERIC Educational Resources Information Center
Earnest, Darrell
2015-01-01
This article reports on students' problem-solving approaches across three representations--number lines, coordinate planes, and function graphs--the axes of which conventional mathematics treats in terms of consistent geometric and numeric coordinations. I consider these representations to be a part of a "hierarchical representational…
ERIC Educational Resources Information Center
Earnest, Darrell Steven
2012-01-01
This dissertation explores fifth and eighth grade students' interpretations of three kinds of mathematical representations: number lines, the Cartesian plane, and graphs of linear functions. Two studies were conducted. In Study 1, I administered the paper-and-pencil Linear Representations Assessment (LRA) to examine students'…
NASA Astrophysics Data System (ADS)
Kobylkin, Konstantin
2016-10-01
Computational complexity and approximability are studied for the problem of intersecting a set of straight line segments with a smallest-cardinality set of disks of fixed radii r > 0, where the set of segments forms a straight-line embedding of a possibly non-planar geometric graph. This problem arises in physical network security analysis for telecommunication, wireless and road networks represented by specific geometric graphs defined by Euclidean distances between their vertices (proximity graphs). It can be formulated as the known Hitting Set problem over a set of Euclidean r-neighbourhoods of segments. Although of interest, the computational complexity and approximability of Hitting Set over such structured sets of geometric objects have not received much attention in the literature. Strong NP-hardness of the problem is reported over special classes of proximity graphs, namely Delaunay triangulations, some of their connected subgraphs, half-θ6 graphs and non-planar unit disk graphs, and APX-hardness is given for non-planar geometric graphs at different scales of r with respect to the longest graph edge length. A simple constant-factor approximation algorithm is presented for the case where r is at the same scale as the longest edge length.
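For orientation only, the Hitting Set formulation mentioned above can be attacked with the generic greedy heuristic (which carries a logarithmic, not constant, approximation factor and is not the paper's algorithm); restricting candidate disk centers to segment endpoints is likewise an assumption made here purely for illustration:

```python
import math

# Greedy heuristic: repeatedly place a disk of radius r (centered, for
# illustration, at a segment endpoint) hitting the most uncovered segments.
def point_segment_dist(p, a, b):
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == dy == 0:
        return math.hypot(px - ax, py - ay)
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def greedy_hitting_disks(segments, r):
    centers = [pt for seg in segments for pt in seg]  # candidate centers
    uncovered = set(range(len(segments)))
    chosen = []
    while uncovered:
        best = max(centers, key=lambda c: sum(
            1 for i in uncovered
            if point_segment_dist(c, *segments[i]) <= r))
        hit = {i for i in uncovered
               if point_segment_dist(best, *segments[i]) <= r}
        if not hit:
            break
        chosen.append(best)
        uncovered -= hit
    return chosen

segs = [((0, 0), (1, 0)), ((0.5, 0.1), (0.5, 1)), ((3, 3), (4, 3))]
disks = greedy_hitting_disks(segs, r=0.5)
print(len(disks))  # 2 disks suffice for this toy instance
```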
System for line drawings interpretation
NASA Astrophysics Data System (ADS)
Boatto, L.; Consorti, Vincenzo; Del Buono, Monica; Eramo, Vincenzo; Esposito, Alessandra; Melcarne, F.; Meucci, Mario; Mosciatti, M.; Tucci, M.; Morelli, Arturo
1992-08-01
This paper describes an automatic system that extracts information from line drawings, in order to feed CAD or GIS systems. The line drawings that we analyze contain interconnected thin lines, dashed lines, text, and symbols. Characters and symbols may overlap with lines. Our approach is based on the properties of the run representation of a binary image that allow giving the image a graph structure. Using this graph structure, several algorithms have been designed to identify, directly in the raster image, straight segments, dashed lines, text, symbols, hatching lines, etc. Straight segments and dashed lines are converted into vectors, with high accuracy and good noise immunity. Characters and symbols are recognized by means of a recognizer, specifically developed for this application, designed to be insensitive to rotation and scaling. Subsequent processing steps include an 'intelligent' search through the graph in order to detect closed polygons, dashed lines, text strings, and other higher-level logical entities, followed by the identification of relationships (adjacency, inclusion, etc.) between them. Relationships are further translated into a formal description of the drawing. The output of the system can be used as input to a Geographic Information System package. The system is currently used by the Italian Land Register Authority to process cadastral maps.
Resolution power in digital in-line holography
NASA Astrophysics Data System (ADS)
Garcia-Sucerquia, J.; Xu, W.; Jericho, S. K.; Jericho, M. H.; Klages, P.; Kreuzer, H. J.
2006-01-01
Digital in-line holographic microscopy (DIHM) can achieve wavelength resolution both laterally and in depth with the simple optical setup consisting of a laser illuminating a wavelength-sized pinhole and a CCD camera for recording the hologram. The reconstruction is done numerically on the basis of the Kirchhoff-Helmholtz transform which yields a three-dimensional image of the objects throughout the sample volume. Resolution in DIHM depends on several controllable factors or parameters: (1) pinhole size controlling spatial coherence, (2) numerical aperture given by the size and positioning of the recording CCD chip, (3) pixel density and dynamic range controlling fringe resolution and noise level in the hologram and (4) wavelength. We present a detailed study of the individual and combined effects of these factors by doing an analytical analysis coupled with numerical simulations of holograms and their reconstruction. The result of this analysis is a set of criteria, also in the form of graphs, which can be used for the optimum design of the DIHM setup. We will also present a series of experimental results that test and confirm our theoretical analysis. The ultimate resolution to date is the imaging of the motion of submicron spheres and bacteria, a few microns apart, with speeds of hundreds of microns per second.
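Factor (2) above, the numerical aperture set by chip size and position, bounds the lateral resolution; a rough estimate with the common λ/(2·NA) diffraction-limit form (the geometry numbers below are illustrative, not taken from the paper):

```python
import math

# Rough lateral-resolution estimate for a DIHM-like geometry, using the
# generic diffraction-limit form lambda / (2 * NA). All numbers here are
# illustrative assumptions, not the paper's design criteria.
wavelength = 532e-9      # laser wavelength, m (assumed)
chip_half_width = 4e-3   # half-size of the CCD chip, m (assumed)
distance = 10e-3         # pinhole-to-chip distance, m (assumed)

na = math.sin(math.atan(chip_half_width / distance))  # numerical aperture
lateral_res = wavelength / (2 * na)

print(f"NA = {na:.2f}, lateral resolution ~ {lateral_res * 1e9:.0f} nm")
```

Moving the chip closer or using a larger chip raises the numerical aperture and thus improves resolution, which is the trade-off the abstract's factor (2) describes.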
Self-organizing maps for learning the edit costs in graph matching.
Neuhaus, Michel; Bunke, Horst
2005-06-01
Although graph matching and graph edit distance computation have become areas of intensive research recently, the automatic inference of the cost of edit operations has remained an open problem. In the present paper, we address the issue of learning graph edit distance cost functions for numerically labeled graphs from a corpus of sample graphs. We propose a system of self-organizing maps (SOMs) that represent the distance measuring spaces of node and edge labels. Our learning process is based on the concept of self-organization. It adapts the edit costs in such a way that the similarity of graphs from the same class is increased, whereas the similarity of graphs from different classes decreases. The learning procedure is demonstrated on two different applications involving line drawing graphs and graphs representing diatoms, respectively.
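For very small numerically labeled graphs, graph edit distance with fixed costs can be computed exactly by exhausting node assignments. The sketch below uses hand-set constants where the paper's SOM system would supply learned costs; all cost values and example graphs are arbitrary illustrations:

```python
from itertools import permutations

# Fixed edit costs (the paper learns these; here they are constants).
# Node insertion/deletion and edge insertion/deletion are assumed symmetric.
NODE_INS = 1.0
EDGE_INS = EDGE_DEL = 0.5

def ged(nodes1, edges1, nodes2, edges2):
    """Exact edit distance for tiny graphs by exhausting node assignments.

    nodes: dict id -> numeric label; edges: set of frozenset node pairs.
    Node substitution cost is the absolute label difference.
    """
    if len(nodes1) > len(nodes2):        # ensure graph 1 is the smaller one
        return ged(nodes2, edges2, nodes1, edges1)
    ids1, ids2 = list(nodes1), list(nodes2)
    best = float("inf")
    for image in permutations(ids2, len(ids1)):
        m = dict(zip(ids1, image))
        cost = sum(abs(nodes1[u] - nodes2[m[u]]) for u in ids1)  # substitutions
        cost += NODE_INS * (len(ids2) - len(ids1))               # node insertions
        mapped = {frozenset(m[v] for v in e) for e in edges1}
        cost += EDGE_DEL * len(mapped - edges2)                  # edges only in g1
        cost += EDGE_INS * len(edges2 - mapped)                  # edges only in g2
        best = min(best, cost)
    return best

g1_nodes = {"a": 0.0, "b": 1.0}
g1_edges = {frozenset({"a", "b"})}
g2_nodes = {"x": 0.0, "y": 1.0, "z": 2.0}
g2_edges = {frozenset({"x", "y"}), frozenset({"y", "z"})}
print(ged(g1_nodes, g1_edges, g2_nodes, g2_edges))  # 1.5
```

The learning procedure described in the abstract would shrink these costs for same-class graph pairs and grow them for different-class pairs, instead of fixing them in advance.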
Exploring and Making Sense of Large Graphs
2015-08-01
…and bold) are n × n; vectors (lower-case bold) are n × 1 column vectors, and scalars (in lower-case plain font) typically correspond to strength of… The number of vertices of a graph is often denoted as |V| or n. Edges or links: a finite set E of lines between objects in a graph. The edges represent relationships between the… Adjacency matrix of a simple, unweighted and undirected graph. Adjacency matrix: the adjacency matrix of a graph G is an n × n matrix A, whose element aij…
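The definitions in this excerpt translate directly into code; a minimal sketch of the n × n adjacency matrix A of a simple, unweighted, undirected graph (a_ij = 1 exactly when {i, j} is an edge):

```python
# Build the n x n adjacency matrix A of a simple, unweighted,
# undirected graph: A[i][j] = 1 iff {i, j} is in the edge set E.
def adjacency_matrix(n, edges):
    A = [[0] * n for _ in range(n)]
    for i, j in edges:
        A[i][j] = A[j][i] = 1  # undirected: A is symmetric
    return A

edges = [(0, 1), (1, 2)]       # a 3-vertex path graph
A = adjacency_matrix(3, edges)
assert A == [[0, 1, 0], [1, 0, 1], [0, 1, 0]]
assert all(A[i][j] == A[j][i] for i in range(3) for j in range(3))
```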
NASA Astrophysics Data System (ADS)
Cruz-Roa, Angel; Xu, Jun; Madabhushi, Anant
2015-01-01
Nuclear architecture or the spatial arrangement of individual cancer nuclei on histopathology images has been shown to be associated with different grades and differential risk for a number of solid tumors such as breast, prostate, and oropharyngeal. Graph-based representations of individual nuclei (nuclei representing the graph nodes) allow for mining of quantitative metrics to describe tumor morphology. These graph features can be broadly categorized into global and local depending on the type of graph construction method. While a number of local graph (e.g. Cell Cluster Graphs) and global graph (e.g. Voronoi, Delaunay Triangulation, Minimum Spanning Tree) features have been shown to be associated with cancer grade, risk, and outcome for different cancer types, the sensitivity of the preceding segmentation algorithms in identifying individual nuclei can have a significant bearing on the discriminability of the resultant features. This raises the question of which features, while being discriminative of cancer grade and aggressiveness, are also the most resilient to segmentation errors. These properties are particularly desirable in the context of digital pathology images, where the method of slide preparation, staining, and type of nuclear segmentation algorithm employed can all dramatically affect the quality of the nuclear graphs and corresponding features. In this paper we evaluated the trade-off between discriminability and stability of both global and local graph-based features in conjunction with a few different segmentation algorithms and in the context of two different histopathology image datasets of breast cancer from whole-slide images (WSI) and tissue microarrays (TMA).
Specifically, in this paper we investigate a few different performance measures including stability, discriminability and the stability vs. discriminability trade-off, all of which are based on p-values from the Kruskal-Wallis one-way analysis of variance for local and global graph features. Apart from identifying the set of local and global features that satisfied the trade-off between stability and discriminability, our most interesting finding was that a simple segmentation method was sufficient to identify the most discriminant features for invasive tumour detection in TMAs, whereas for tumour grading in WSI, the graph-based features were more sensitive to the accuracy of the segmentation algorithm employed.
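As one concrete instance of the global graph features named above (Voronoi, Delaunay Triangulation, Minimum Spanning Tree), here is a pure-Python sketch of an MST-derived descriptor, the mean MST edge length over nuclear centroids; the centroid coordinates are invented for illustration:

```python
import math

# Mean Minimum Spanning Tree edge length over point centroids, computed
# with Prim's algorithm. A common global graph feature; the example
# coordinates below are made up for illustration.
def mst_edge_lengths(points):
    n = len(points)
    in_tree = {0}
    dist = [math.dist(points[0], p) for p in points]  # best link into the tree
    lengths = []
    while len(in_tree) < n:
        j = min((i for i in range(n) if i not in in_tree), key=dist.__getitem__)
        lengths.append(dist[j])
        in_tree.add(j)
        for i in range(n):
            if i not in in_tree:
                dist[i] = min(dist[i], math.dist(points[j], points[i]))
    return lengths

centroids = [(0, 0), (1, 0), (0, 1), (5, 5)]
edges = mst_edge_lengths(centroids)
print(round(sum(edges) / len(edges), 3))  # mean MST edge length: 2.801
```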
JavaGenes and Condor: Cycle-Scavenging Genetic Algorithms
NASA Technical Reports Server (NTRS)
Globus, Al; Langhirt, Eric; Livny, Miron; Ramamurthy, Ravishankar; Soloman, Marvin; Traugott, Steve
2000-01-01
A genetic algorithm code, JavaGenes, was written in Java and used to evolve pharmaceutical drug molecules and digital circuits. JavaGenes was run under the Condor cycle-scavenging batch system managing 100-170 desktop SGI workstations. Genetic algorithms mimic biological evolution by evolving solutions to problems using crossover and mutation. While most genetic algorithms evolve strings or trees, JavaGenes evolves graphs representing (currently) molecules and circuits. Java was chosen as the implementation language because the genetic algorithm requires random splitting and recombining of graphs, a complex data structure manipulation with ample opportunities for memory leaks, loose pointers, out-of-bound indices, and other hard to find bugs. Java garbage-collection memory management, lack of pointer arithmetic, and array-bounds index checking prevents these bugs from occurring, substantially reducing development time. While a run-time performance penalty must be paid, the only unacceptable performance we encountered was using standard Java serialization to checkpoint and restart the code. This was fixed by a two-day implementation of custom checkpointing. JavaGenes is minimally integrated with Condor; in other words, JavaGenes must do its own checkpointing and I/O redirection. A prototype Java-aware version of Condor was developed using standard Java serialization for checkpointing. For the prototype to be useful, standard Java serialization must be significantly optimized. JavaGenes is approximately 8700 lines of code and a few thousand JavaGenes jobs have been run. Most jobs ran for a few days. Results include proof that genetic algorithms can evolve directed and undirected graphs, development of a novel crossover operator for graphs, a paper in the journal Nanotechnology, and another paper in preparation.
NASA Astrophysics Data System (ADS)
Sharma, Harshita; Zerbe, Norman; Heim, Daniel; Wienert, Stephan; Lohmann, Sebastian; Hellwich, Olaf; Hufnagl, Peter
2016-03-01
This paper describes a novel graph-based method for efficient representation and subsequent classification in histological whole slide images of gastric cancer. Her2/neu immunohistochemically stained and haematoxylin and eosin stained histological sections of gastric carcinoma are digitized. Immunohistochemical staining is used in practice by pathologists to determine extent of malignancy, however, it is laborious to visually discriminate the corresponding malignancy levels in the more commonly used haematoxylin and eosin stain, and this study attempts to solve this problem using a computer-based method. Cell nuclei are first isolated at high magnification using an automatic cell nuclei segmentation strategy, followed by construction of cell nuclei attributed relational graphs of the tissue regions. These graphs represent tissue architecture comprehensively, as they contain information about cell nuclei morphology as vertex attributes, along with knowledge of neighborhood in the form of edge linking and edge attributes. Global graph characteristics are derived and ensemble learning is used to discriminate between three types of malignancy levels, namely, non-tumor, Her2/neu positive tumor and Her2/neu negative tumor. Performance is compared with state of the art methods including four texture feature groups (Haralick, Gabor, Local Binary Patterns and Varma Zisserman features), color and intensity features, and Voronoi diagram and Delaunay triangulation. Texture, color and intensity information is also combined with graph-based knowledge, followed by correlation analysis. Quantitative assessment is performed using two cross validation strategies. On investigating the experimental results, it can be concluded that the proposed method provides a promising way for computer-based analysis of histopathological images of gastric cancer.
Cardoso, Ricardo Lopes; Leite, Rodrigo Oliveira; de Aquino, André Carlos Busanelli
2016-01-01
Previous research supports that graphs are relevant decision aids in tasks related to the interpretation of numerical information. Moreover, the literature shows that different types of graphical information can help or harm the decision-making accuracy of accountants and financial analysts. We conducted a 4×2 mixed-design experiment to examine the effects of numerical information disclosure on financial analysts' accuracy, and investigated the role of overconfidence in decision making. Results show that compared to text, column graphs enhanced decision-making accuracy, followed by line graphs. No difference was found between table and textual disclosure. Overconfidence harmed accuracy, and both genders behaved overconfidently. Additionally, the type of disclosure (text, table, line graph and column graph) did not affect the overconfidence of individuals, providing evidence that overconfidence is a personal trait. This study makes three contributions. First, it provides evidence from a larger sample size (295) of financial analysts instead of a smaller sample size of students that graphs are relevant decision aids in tasks related to the interpretation of numerical information. Second, it uses text as a baseline comparison to test how different ways of information disclosure (line and column graphs, and tables) can enhance understandability of information. Third, it brings an internal factor to this process: overconfidence, a personal trait that harms the decision-making process of individuals. At the end of this paper several research paths are highlighted to further study the effect of internal factors (personal traits) on financial analysts' decision-making accuracy regarding numerical information presented in a graphical form. In addition, we offer suggestions concerning some practical implications for professional accountants, auditors, financial analysts and standard setters. PMID:27508519
ERIC Educational Resources Information Center
Hillman, Thomas
2014-01-01
This article examines mathematical activity with digital technology by tracing it from its development through its use in classrooms. Drawing on material-semiotic approaches from the field of Science and Technology Studies, it examines the visions of mathematical activity that developers had for an advanced graphing calculator. It then follows the…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rosmanis, Ansis
2011-02-15
I introduce a continuous-time quantum walk on graphs called the quantum snake walk, the basis states of which are fixed-length paths (snakes) in the underlying graph. First, I analyze the quantum snake walk on the line, and I show that, even though most states stay localized throughout the evolution, there are specific states that most likely move on the line as wave packets with momentum inversely proportional to the length of the snake. Next, I discuss how an algorithm based on the quantum snake walk might potentially be able to solve an extended version of the glued trees problem, which asks to find a path connecting both roots of the glued trees graph. To the best of my knowledge, no efficient quantum algorithm solving this problem is known yet.
Taking Advantage of Automated Assessment of Student-Constructed Graphs in Science
ERIC Educational Resources Information Center
Vitale, Jonathan M.; Lai, Kevin; Linn, Marcia C.
2015-01-01
We present a new system for automated scoring of graph construction items that address complex science concepts, feature qualitative prompts, and support a range of possible solutions. This system utilizes analysis of spatial features (e.g., slope of a line) to evaluate potential student ideas represented within graphs. Student ideas are then…
ERIC Educational Resources Information Center
Boote, Stacy K.; Boote, David N.
2017-01-01
Students often struggle to interpret graphs correctly, despite emphasis on graphic literacy in U.S. education standards documents. The purpose of this study was to describe challenges sixth graders with varying levels of science and mathematics achievement encounter when transitioning from interpreting graphs having discrete independent variables…
JavaGenes: Evolving Graphs with Crossover
NASA Technical Reports Server (NTRS)
Globus, Al; Atsatt, Sean; Lawton, John; Wipke, Todd
2000-01-01
Genetic algorithms usually use string or tree representations. We have developed a novel crossover operator for a directed and undirected graph representation, and used this operator to evolve molecules and circuits. Unlike strings or trees, a single point in the representation cannot divide every possible graph into two parts, because graphs may contain cycles. Thus, the crossover operator is non-trivial. A steady-state, tournament selection genetic algorithm code (JavaGenes) was written to implement and test the graph crossover operator. All runs were executed by cycle-scavenging on networked workstations using the Condor batch processing system. The JavaGenes code has evolved pharmaceutical drug molecules and simple digital circuits. Results to date suggest that JavaGenes can evolve moderate sized drug molecules and very small circuits in reasonable time. The algorithm has greater difficulty with somewhat larger circuits, suggesting that directed graphs (circuits) are more difficult to evolve than undirected graphs (molecules), although necessary differences in the crossover operator may also explain the results. In principle, JavaGenes should be able to evolve other graph-representable systems, such as transportation networks, metabolic pathways, and computer networks. However, large graphs evolve significantly slower than smaller graphs, presumably because the space-of-all-graphs explodes combinatorially with graph size. Since the representation strongly affects genetic algorithm performance, adding graphs to the evolutionary programmer's bag-of-tricks should be beneficial. Also, since graph evolution operates directly on the phenotype, the genotype-phenotype translation step, common in genetic algorithm work, is eliminated.
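The difficulty the abstract describes, that no single cut point splits a graph containing cycles, can be illustrated with a toy recombination: splitting each parent at a random node subset leaves dangling cut edges that must be patched when fragments are recombined. This is a simplified sketch, not the actual JavaGenes operator:

```python
import random

# Toy graph crossover: split each parent at a random node subset, keep
# one fragment from each, then patch every cut edge to a random node of
# the other fragment. Simplified illustration only.
def split(nodes, edges, rng):
    keep = set(rng.sample(sorted(nodes), k=len(nodes) // 2))
    kept_edges = {e for e in edges if set(e) <= keep}
    broken = [e for e in edges if len(set(e) & keep) == 1]  # cut by the split
    return keep, kept_edges, broken

def crossover(parent_a, parent_b, rng):
    frag_a, edges_a, cut_a = split(*parent_a, rng)
    frag_b, edges_b, cut_b = split(*parent_b, rng)
    # Tag nodes by parent so the two fragments cannot collide.
    child_nodes = {("a", n) for n in frag_a} | {("b", n) for n in frag_b}
    child_edges = {(("a", u), ("a", v)) for u, v in edges_a} | \
                  {(("b", u), ("b", v)) for u, v in edges_b}
    # Patch each dangling cut edge to a random node of the other fragment.
    for u, v in cut_a:
        inside = u if u in frag_a else v
        child_edges.add((("a", inside), ("b", rng.choice(sorted(frag_b)))))
    for u, v in cut_b:
        inside = u if u in frag_b else v
        child_edges.add((("b", inside), ("a", rng.choice(sorted(frag_a)))))
    return child_nodes, child_edges

rng = random.Random(0)
p1 = ({0, 1, 2, 3}, {(0, 1), (1, 2), (2, 3), (3, 0)})  # 4-cycle
p2 = ({0, 1, 2, 3}, {(0, 1), (1, 2), (2, 3)})          # path
nodes, edges = crossover(p1, p2, rng)
assert len(nodes) == 4  # two 2-node fragments recombined
```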
Fast Multiclass Segmentation using Diffuse Interface Methods on Graphs
2013-02-01
…000 28 × 28 images of handwritten digits 0 through 9. Examples of entries can be found in Figure 6. The task is to classify each of the images into the… corresponding digit. The images include digits from 0 to 9; thus, this is a 10-class segmentation problem. To construct the weight matrix, we used N…
Large-scale Graph Computation on Just a PC
2014-05-01
…edges for several vertices simultaneously). We compared the performance of GraphChi-DB to Neo4j using their Java API (we discuss the MySQL comparison in the… Windows method, GraphChi. The C++ implementation has circa 8,000 lines of code. We have also developed a Java version of GraphChi, but it does not…
FORTRAN plotting subroutines for the space plasma laboratory
NASA Technical Reports Server (NTRS)
Williams, R.
1983-01-01
The computer program known as PLOTRW was custom made to satisfy some of the graphics requirements for the data collected in the Space Plasma Laboratory at the Johnson Space Center (JSC). The general requirements for the program were as follows: (1) all subroutines shall be callable through a FORTRAN source program; (2) all graphs shall fill one page and be properly labeled; (3) there shall be options for linear axes and logarithmic axes; (4) each axis shall have tick marks equally spaced with numeric values printed at the beginning tick mark and at the last tick mark; and (5) there shall be three options for plotting. These are: (1) point plot, (2) line plot and (3) point-line plot. The subroutines were written in FORTRAN IV for the Digital Equipment Corporation (DEC) LSI-11 computer. The program is now operational and can be run on any TEKTRONIX graphics terminal that uses a DEC Real-Time-11 (RT-11) operating system.
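Requirement (4) above, equally spaced tick marks labeled only at the first and last positions, for either linear or logarithmic axes, can be sketched in a few lines (a modern Python illustration, not the original FORTRAN IV code):

```python
import math

# Equally spaced tick marks with numeric labels at only the first and
# last tick, supporting linear and logarithmic axes.
def ticks(lo, hi, n, log=False):
    if log:
        lo, hi = math.log10(lo), math.log10(hi)
    step = (hi - lo) / (n - 1)
    vals = [lo + i * step for i in range(n)]
    if log:
        vals = [10 ** v for v in vals]
    labels = [f"{v:g}" if i in (0, n - 1) else ""
              for i, v in enumerate(vals)]
    return vals, labels

vals, labels = ticks(1, 1000, 4, log=True)
print(labels)  # ['1', '', '', '1000']
```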
Distortions in memory for visual displays
NASA Technical Reports Server (NTRS)
Tversky, Barbara
1989-01-01
Systematic errors in perception and memory present a challenge to theories of perception and memory and to applied psychologists interested in overcoming them as well. A number of systematic errors in memory for maps and graphs are reviewed, and they are accounted for by an analysis of the perceptual processing presumed to occur in comprehension of maps and graphs. Visual stimuli, like verbal stimuli, are organized in comprehension and memory. For visual stimuli, the organization is a consequence of perceptual processing, which is bottom-up or data-driven in its earlier stages, but top-down and affected by conceptual knowledge later on. Segregation of figure from ground is an early process, and figure recognition later; for both, symmetry is a rapidly detected and ecologically valid cue. Once isolated, figures are organized relative to one another and relative to a frame of reference. Both perceptual (e.g., salience) and conceptual factors (e.g., significance) seem likely to affect selection of a reference frame. Consistent with the analysis, subjects perceived and remembered curves in graphs and rivers in maps as more symmetric than they actually were. Symmetry, useful for detecting and recognizing figures, distorts map and graph figures alike. Top-down processes also seem to operate in that calling attention to the symmetry vs. asymmetry of a slightly asymmetric curve yielded memory errors in the direction of the description. Conceptual frame of reference effects were demonstrated in memory for lines embedded in graphs. In earlier work, the orientation of map figures was distorted in memory toward horizontal or vertical. In recent work, graph lines, but not map lines, were remembered as closer to an imaginary 45 deg line than they had been. Reference frames are determined by both perceptual and conceptual factors, leading to selection of the canonical axes as a reference frame in maps, but selection of the imaginary 45 deg as a reference frame in graphs.
Digital Rights Management Implemented by RDF Graph Approach
ERIC Educational Resources Information Center
Yang, Jin Tan; Horng, Huai-Chien
2006-01-01
This paper proposes a design framework for constructing Digital Rights Management (DRM) that enables legal usage of learning objects. The central theme of this framework is that any DRM design must rest on theoretical foundations that make maintenance, extension, and interoperability easy. While a learning objective consists of learning…
NASA Technical Reports Server (NTRS)
2004-01-01
In these line graphs of laboratory spectra, it is evident that different minerals have different spectra. The graph on the left shows the typical minerals found in igneous rocks, which are rocks related to magma or volcanic activity. The graph on the right shows iron-bearing candidates for further study and comparison to spectra from the Mars Exploration Rover panoramic cameras on Mars.
Math Description Engine Software Development Kit
NASA Technical Reports Server (NTRS)
Shelton, Robert O.; Smith, Stephanie L.; Dexter, Dan E.; Hodgson, Terry R.
2010-01-01
The Math Description Engine Software Development Kit (MDE SDK) can be used by software developers to make computer-rendered graphs more accessible to blind and visually-impaired users. The MDE SDK generates alternative graph descriptions in two forms: textual descriptions and non-verbal sound renderings, or sonification. It also enables display of an animated trace of a graph sonification on a visual graph component, with color and line-thickness options for users having low vision or color-related impairments. A set of accessible graphical user interface widgets is provided for operation by end users and for control of accessible graph displays. Version 1.0 of the MDE SDK generates text descriptions for 2D graphs commonly seen in math and science curriculum (and practice). The mathematically rich text descriptions can also serve as a virtual math and science assistant for blind and sighted users, making graphs more accessible for everyone. The MDE SDK has a simple application programming interface (API) that makes it easy for programmers and Web-site developers to make graphs accessible with just a few lines of code. The source code is written in Java for cross-platform compatibility and to take advantage of Java's built-in support for building accessible software application interfaces. Compiled-library and NASA Open Source versions are available with API documentation and Programmer's Guide at http://prime.jsc.nasa.gov.
Bove, Dana J.; Knepper, Daniel H.
2000-01-01
This data set covering the western part of Colorado includes water quality data from eight different sources (points), nine U.S. Geological Survey Digital Raster Graph (DRG) files for topographic bases, a compilation of Tertiary age intrusions (polygons and lines), and two geotiff files showing areas of hydrothermally altered rock. These data were compiled for use with an ongoing mineral resource assessment of the Grand Mesa, Uncompahgre, and Gunnison National Forests (GMUG) and intervening Bureau of Land Management (BLM) lands. This compilation was assembled to give federal land managers a preliminary view of water within sub-basinal areas, and to show possible relationships to Tertiary age intrusions and areas of hydrothermal alteration.
Bathymetric map of the south part of Great Salt Lake, Utah, 2005
Baskin, Robert L.; Allen, David V.
2005-01-01
The U.S. Geological Survey, in cooperation with the Utah Department of Natural Resources, Division of Wildlife Resources, collected bathymetric data for the south part of Great Salt Lake during 2002–04 using a single beam, high-definition fathometer and real-time differential global positioning system. Approximately 7.6 million depth readings were collected along more than 1,050 miles of survey transects for construction of this map. Sound velocities were obtained in conjunction with the bathymetric data to provide time-of-travel corrections to the depth calculations. Data were processed with commercial hydrographic software and exported into geographic information system (GIS) software for mapping. Because of the shallow nature of the lake and the limitations of the instrumentation, contours above an altitude of 4,193 feet were digitized from existing USGS 1:24,000 source-scale digital line graph data. For additional information on methods used to derive the bathymetric contours for this map, please see Baskin, Robert L., 2005, Calculation of area and volume for the south part of Great Salt Lake, Utah, U.S. Geological Survey Open-File Report OFR–2005–1327.
NASA Astrophysics Data System (ADS)
Keller, Stacy Kathryn
This study examined how intermediate elementary students' mathematics and science background knowledge affected their interpretation of line graphs and how their interpretations were affected by graph question levels. A purposive sample of 14 6th-grade students engaged in think aloud interviews (Ericsson & Simon, 1993) while completing an excerpted Test of Graphing in Science (TOGS) (McKenzie & Padilla, 1986). Hand gestures were video recorded. Student performance on the TOGS was assessed using an assessment rubric created from previously cited factors affecting students' graphing ability. Factors were categorized using Bertin's (1983) three graph question levels. The assessment rubric was validated by Padilla and a veteran mathematics and science teacher. Observational notes were also collected. Data were analyzed using Roth and Bowen's semiotic process of reading graphs (2001). Key findings from this analysis included differences in the use of heuristics, self-generated questions, science knowledge, and self-motivation. Students with higher prior achievement used a greater number and variety of heuristics and more often chose appropriate heuristics. They also monitored their understanding of the question and the adequacy of their strategy and answer by asking themselves questions. Most used their science knowledge spontaneously to check their understanding of the question and the adequacy of their answers. Students with lower and moderate prior achievement favored one heuristic even when it was not useful for answering the question and rarely asked their own questions. In some cases, if students with lower prior achievement had thought about their answers in the context of their science knowledge, they would have been able to recognize their errors. One student with lower prior achievement motivated herself when she thought the questions were too difficult. 
In addition, students answered the TOGS in one of three ways: as mathematics word problems, as science data to be analyzed, or by guessing when confused. A second set of findings corroborated how science background knowledge affected graph interpretation: correct science knowledge supported students' reasoning, but it was not necessary to answer any question correctly; correct science knowledge could not compensate for incomplete mathematics knowledge; and incorrect science knowledge often distracted students when they tried to use it while answering a question. Finally, using Roth and Bowen's (2001) two-stage semiotic model of reading graphs, representative vignettes showed emerging patterns from the study. This study added to our understanding of the role of science content knowledge during line graph interpretation, highlighted the importance of heuristics and mathematics procedural knowledge, and documented the importance of perceptual attention, motivation, and students' self-generated questions. Recommendations were made for future research in line graph interpretation in mathematics and science education and for improving instruction in this area.
ERIC Educational Resources Information Center
Bodner, Todd E.
2016-01-01
This article revisits how the end points of plotted line segments should be selected when graphing interactions involving a continuous target predictor variable. Under the standard approach, end points are chosen at ±1 or 2 standard deviations from the target predictor mean. However, when the target predictor and moderator are correlated or the…
Evaluation of force-torque displays for use with space station telerobotic activities
NASA Technical Reports Server (NTRS)
Hendrich, Robert C.; Bierschwale, John M.; Manahan, Meera K.; Stuart, Mark A.; Legendre, A. Jay
1992-01-01
Recent experiments which addressed Space Station remote manipulation tasks found that tactile force feedback (reflecting forces and torques encountered at the end-effector through the manipulator hand controller) does not improve performance significantly. Subjective response from astronaut and non-astronaut test subjects indicated that force information, provided visually, could be useful. No research exists which specifically investigates methods of presenting force-torque information visually. This experiment was designed to evaluate seven different visual force-torque displays which were found in an informal telephone survey. The displays were prototyped in the HyperCard programming environment. In a within-subjects experiment, 14 subjects nullified forces and torques presented statically, using response buttons located at the bottom of the screen. Dependent measures included questionnaire data, errors, and response time. Subjective data generally demonstrate that subjects rated variations of pseudo-perspective displays consistently better than bar graph and digital displays. Subjects commented that the bar graph and digital displays could be used, but were not compatible with using hand controllers. Quantitative data show similar trends to the subjective data, except that the bar graph and digital displays both provided good performance, perhaps due to the mapping of response buttons to display elements. Results indicate that for this set of displays, the pseudo-perspective displays generally represent a more intuitive format for presenting force-torque information.
Rea, A.H.; Becker, C.J.
1997-01-01
This compact disc contains 25 digital map data sets covering the State of Oklahoma that may be of interest to the general public, private industry, schools, and government agencies. Fourteen data sets are statewide. These data sets include: administrative boundaries; 104th U.S. Congressional district boundaries; county boundaries; latitudinal lines; longitudinal lines; geographic names; indexes of U.S. Geological Survey 1:100,000, and 1:250,000-scale topographic quadrangles; a shaded-relief image; Oklahoma State House of Representatives district boundaries; Oklahoma State Senate district boundaries; locations of U.S. Geological Survey stream gages; watershed boundaries and hydrologic cataloging unit numbers; and locations of weather stations. Eleven data sets are divided by county and are located in 77 county subdirectories. These data sets include: census block group boundaries with selected demographic data; city and major highways text; geographic names; land surface elevation contours; elevation points; an index of U.S. Geological Survey 1:24,000-scale topographic quadrangles; roads, streets and address ranges; highway text; school district boundaries; streams, rivers, and lakes; and the public land survey system. All data sets are provided in a readily accessible format. Most data sets are provided in Digital Line Graph (DLG) format. The attributes for many of the DLG files are stored in related dBASE(R)-format files and may be joined to the data set polygon attribute or arc attribute tables using dBASE(R)-compatible software. (Any use of trade names in this publication is for descriptive purposes only and does not imply endorsement by the U.S. Government.) Point attribute tables are provided in dBASE(R) format only, and include the X and Y map coordinates of each point. Annotation (text plotted in map coordinates) is provided in AutoCAD Drawing Exchange format (DXF) files. The shaded-relief image is provided in TIFF format.
All data sets except the shaded-relief image also are provided in ARC/INFO export-file format.
Optimal graph based segmentation using flow lines with application to airway wall segmentation.
Petersen, Jens; Nielsen, Mads; Lo, Pechin; Saghir, Zaigham; Dirksen, Asger; de Bruijne, Marleen
2011-01-01
This paper introduces a novel optimal graph construction method that is applicable to multi-dimensional, multi-surface segmentation problems. Such problems are often solved by refining an initial coarse surface within the space given by graph columns. Conventional columns are not well suited for surfaces with high curvature or complex shapes but the proposed columns, based on properly generated flow lines, which are non-intersecting, guarantee solutions that do not self-intersect and are better able to handle such surfaces. The method is applied to segment human airway walls in computed tomography images. Comparison with manual annotations on 649 cross-sectional images from 15 different subjects shows significantly smaller contour distances and larger area of overlap than are obtained with recently published graph based methods. Airway abnormality measurements obtained with the method on 480 scan pairs from a lung cancer screening trial are reproducible and correlate significantly with lung function.
1998-02-05
This graph depicts the increased signal quality possible with optical fibers made from ZBLAN, a family of heavy-metal fluoride glasses (fluorine combined with zirconium, barium, lanthanum, aluminum, and sodium) as compared to silica fibers. NASA is conducting research on pulling ZBLAN fibers in the low-g environment of space to prevent crystallization that limits ZBLAN's usefulness in optical fiber-based communications. In the graph, a line closer to the black theoretical maximum line is better. Photo credit: NASA/Marshall Space Flight Center
The Area of a Surface Generated by Revolving a Graph about Any Line
ERIC Educational Resources Information Center
Goins, Edray Herber; Washington, Talitha M.
2013-01-01
We discuss a general formula for the area of the surface generated by a graph [t₀, t₁] → ℝ² sending t ↦ (x(t), y(t)) revolved around a general line L : Ax + By = C. As a corollary, we obtain a formula for the area of the surface formed by revolving y = f(x)…
Science 101: When Drawing Graphs from Collected Data, Why Don't You Just "Connect the Dots?"
ERIC Educational Resources Information Center
Robertson, William C.
2007-01-01
Using "error bars" on graphs is a good way to help students see that, within the inherent uncertainty of the measurements due to the instruments used for measurement, the data points do, in fact, lie along the line that represents the linear relationship. In this article, the author explains why connecting the dots on graphs of collected data is…
ERIC Educational Resources Information Center
Renton Vocational Inst., WA.
The teacher's guide and collection of transparency masters are designed for use in teaching adult basic education (ABE) students how to read and interpret graphs and charts. Covered in the individual lessons of the instructional unit are the reading and interpretation of charts as well as picture, line, bar, and circle graphs. Each unit contains a…
Flexibility in data interpretation: effects of representational format.
Braithwaite, David W; Goldstone, Robert L
2013-01-01
Graphs and tables differentially support performance on specific tasks. For tasks requiring reading off single data points, tables are as good as or better than graphs, while for tasks involving relationships among data points, graphs often yield better performance. However, the degree to which graphs and tables support flexibility across a range of tasks is not well-understood. In two experiments, participants detected main and interaction effects in line graphs and tables of bivariate data. Graphs led to more efficient performance, but also lower flexibility, as indicated by a larger discrepancy in performance across tasks. In particular, detection of main effects of variables represented in the graph legend was facilitated relative to detection of main effects of variables represented in the x-axis. Graphs may be a preferable representational format when the desired task or analytical perspective is known in advance, but may also induce greater interpretive bias than tables, necessitating greater care in their use and design.
Map showing location of observation wells in Massachusetts and Rhode Island
Rader, J.C.
1995-01-01
This map shows the locations of the 136 observation wells from the observation-well network maintained by the U.S. Geological Survey in Massachusetts and Rhode Island. The wells are identified by town name and well number. The map shows the location of the 10 observation wells that have digital recorders and the 126 observation wells that are measured by local observers. The aquifer material (sand, till, or bedrock) in which a well is located is noted. County and town boundaries are shown on the map. These features are presented at a scale of 1:400,000 (map size is about 38 by 30 inches). The map includes textual information describing the uses of observation-well data. The information is organized by construction, water supply, water quality, and statistical analysis. The map also presents observation well information, which was obtained from the annual data report of the Massachusetts--Rhode Island District. This information is presented in tabular form and includes town name, well number, aquifer material in which the well is located, and well depth below the land surface. The map was produced from a digital data base using a Geographic Information System. State boundaries were generated from digital line graphs maintained by the U.S. Geological Survey. Town and county boundaries were digitized from stable-base materials maintained by State agencies. The map was prepared in cooperation with State agencies of Massachusetts and Rhode Island.
Subtil, Fabien; Rabilloud, Muriel
2015-07-01
Receiver operating characteristic (ROC) curves are often used to compare continuous diagnostic tests or determine the optimal threshold of a test; however, they do not consider the costs of misclassifications or the disease prevalence. The ROC graph was extended to allow for these aspects. Two new lines are added to the ROC graph: a sensitivity line and a specificity line. Their slopes depend on the disease prevalence and on the ratio of the net benefit of treating a diseased subject to the net cost of treating a nondiseased one. First, these lines help researchers determine the range of specificities within which comparison of partial areas under the curves is clinically relevant. Second, the point of the ROC curve farthest from the specificity line is shown to be the optimal threshold in terms of expected utility. This method was applied: (1) to determine the optimal threshold of the ratio of specific immunoglobulin G (IgG) to total IgG for the diagnosis of congenital toxoplasmosis and (2) to select, among two markers, the most accurate for the diagnosis of left ventricular hypertrophy in hypertensive subjects. The two additional lines transform the statistically valid ROC graph into a clinically relevant tool for test selection and threshold determination. Copyright © 2015 Elsevier Inc. All rights reserved.
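The decision rule summarized in this abstract can be sketched numerically. A minimal illustration, assuming a toy three-point ROC curve and illustrative names (`optimal_roc_index`, `prevalence`, `benefit_cost_ratio`) not taken from the paper: the specificity line's slope is m = (1 - prevalence) / (prevalence * benefit_cost_ratio), and the expected-utility-optimal ROC point maximizes sensitivity - m * (1 - specificity).

```python
import numpy as np

# Hedged sketch, not the paper's implementation: pick the ROC point
# that maximizes expected utility for a given prevalence and
# benefit/cost ratio.

def optimal_roc_index(sens, spec, prevalence, benefit_cost_ratio):
    """Index of the ROC point with maximal expected utility."""
    m = (1.0 - prevalence) / (prevalence * benefit_cost_ratio)
    utility = sens - m * (1.0 - spec)
    return int(np.argmax(utility))

# Toy ROC curve: three candidate thresholds.
sens = np.array([0.95, 0.80, 0.60])
spec = np.array([0.50, 0.85, 0.95])

# Low prevalence, equal benefit and cost: the high-specificity point wins.
best = optimal_roc_index(sens, spec, prevalence=0.2, benefit_cost_ratio=1.0)
```

Raising `benefit_cost_ratio` flattens the slope and shifts the optimum toward higher sensitivity, which is exactly the trade-off the added lines make visible on the graph.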
Phase-change lines, scale breaks, and trend lines using Excel 2013.
Deochand, Neil; Costello, Mack S; Fuqua, R Wayne
2015-01-01
The development of graphing skills for behavior analysts is an ongoing process. Specialized graphing software is often expensive, is not widely disseminated, and may require specific training. Dixon et al. (2009) provided an updated task analysis for graph making in the widely used platform Excel 2007. Vanselow and Bourret (2012) provided online tutorials that outline some alternate methods also using Office 2007. This article serves as an update to those task analyses and includes some alternative and underutilized methods in Excel 2013. To examine the utility of our recommendations, 12 psychology graduate students were presented with the task analyses, and the experimenters evaluated their performance and noted feedback. The task analyses were rated favorably. © Society for the Experimental Analysis of Behavior.
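Excel itself is out of scope here, but the placement convention these task analyses implement is easy to state in code. A hedged sketch (the function name and the ABAB example are illustrative, not from the article): a phase-change line is conventionally drawn midway between the last session of one phase and the first session of the next.

```python
# Hedged illustration of the phase-change-line placement convention
# used in single-case-design graphs; not the article's Excel procedure.

def phase_change_positions(phase_lengths):
    """x-positions (in session numbers) of vertical phase-change lines."""
    positions, session = [], 0
    for length in phase_lengths[:-1]:   # no line after the final phase
        session += length
        positions.append(session + 0.5)
    return positions

# ABAB reversal design with 5, 4, 5, and 4 sessions per phase.
lines = phase_change_positions([5, 4, 5, 4])   # -> [5.5, 9.5, 14.5]
```

The returned x-positions can then be handed to any plotting tool (Excel, or a library's vertical-line primitive) to separate the phases.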
Automatic extraction of numeric strings in unconstrained handwritten document images
NASA Astrophysics Data System (ADS)
Haji, M. Mehdi; Bui, Tien D.; Suen, Ching Y.
2011-01-01
Numeric strings such as identification numbers carry vital pieces of information in documents. In this paper, we present a novel algorithm for automatic extraction of numeric strings in unconstrained handwritten document images. The algorithm has two main phases: pruning and verification. In the pruning phase, the algorithm first performs a new segment-merge procedure on each text line, and then using a new regularity measure, it prunes all sequences of characters that are unlikely to be numeric strings. The segment-merge procedure is composed of two modules: a new explicit character segmentation algorithm which is based on analysis of skeletal graphs and a merging algorithm which is based on graph partitioning. All the candidate sequences that pass the pruning phase are sent to a recognition-based verification phase for the final decision. The recognition is based on a coarse-to-fine approach using probabilistic RBF networks. We developed our algorithm for the processing of real-world documents where letters and digits may be connected or broken in a document. The effectiveness of the proposed approach is shown by extensive experiments done on a real-world database of 607 documents which contains handwritten, machine-printed and mixed documents with different types of layouts and levels of noise.
Brundage, Michael D; Smith, Katherine C; Little, Emily A; Bantug, Elissa T; Snyder, Claire F
2015-10-01
Patient-reported outcomes (PROs) promote patient-centered care by using PRO research results ("group-level data") to inform decision making and by monitoring individual patient's PROs ("individual-level data") to inform care. We investigated the interpretability of current PRO data presentation formats. This cross-sectional mixed-methods study randomized purposively sampled cancer patients and clinicians to evaluate six group-data or four individual-data formats. A self-directed exercise assessed participants' interpretation accuracy and ratings of ease-of-understanding and usefulness (0 = least to 10 = most) of each format. Semi-structured qualitative interviews explored helpful and confusing format attributes. We reached thematic saturation with 50 patients (44 % < college graduate) and 20 clinicians. For group-level data, patients rated simple line graphs highest for ease-of-understanding and usefulness (median 8.0; 33 % selected for easiest to understand/most useful) and clinicians rated simple line graphs highest for ease-of-understanding and usefulness (median 9.0, 8.5) but most often selected line graphs with confidence limits or norms (30 % for each format for easiest to understand/most useful). Qualitative results support that clinicians value confidence intervals, norms, and p values, but patients find them confusing. For individual-level data, both patients and clinicians rated line graphs highest for ease-of-understanding (median 8.0 patients, 8.5 clinicians) and usefulness (median 8.0, 9.0) and selected them as easiest to understand (50, 70 %) and most useful (62, 80 %). The qualitative interviews supported highlighting scores requiring clinical attention and providing reference values. This study has identified preferences and opportunities for improving on current formats for PRO presentation and will inform development of best practices for PRO presentation. 
Both patients and clinicians prefer line graphs for both group-level and individual-level data formats, but clinicians prefer greater detail (e.g., statistical details) for group-level data.
Water Vapor Reaches Mars' Middle Atmosphere During Global Dust Storm
2018-01-23
Rising air during a 2007 global dust storm on Mars lofted water vapor into the planet's middle atmosphere, researchers learned from data graphed here, derived from observations by the Mars Climate Sounder instrument on NASA's Mars Reconnaissance Orbiter. The two vertical black lines in the right half of the graph (at about 260 and 310 on the horizontal scale) mark the beginning and end of the most recent global dust storm on Mars, which burst from regional scale to globe-encircling scale in July 2007. The presence of more colored dots, particularly green ones, in the upper portion of the graph between those lines, compared to the upper portion of the graph outside those lines, documents the uplift of water vapor in connection with the global dust storm. The vertical scale is altitude, labeled at left in kilometers above the surface of Mars (50 kilometers is about 30 miles; 80 kilometers is about 50 miles). The color bar below the graph gives the key to how much water vapor each dot represents, in parts per million, by volume, in Mars' atmosphere. Note that green to yellow represents about 100 times as much water as purple does. The horizontal axis of the graph is time, from January 2006 to February 2008. It is labeled with numbers representing the 360 degrees of Mars' orbit around the Sun, from zero to 360 degrees and then further on to include the first 30 degrees of the following Martian year. (The zero point is autumnal equinox -- end of summer -- in Mars' northern hemisphere.) This graph, based on Mars Reconnaissance Orbiter observations, was used in a January 2018 paper in Nature Astronomy by Nicholas Heavens of Hampton University in Hampton, Virginia, and co-authors. The paper presents Martian dust storms' uplifting effect on water vapor as a factor in seasonal patterns that other spacecraft have detected in the rate of hydrogen escaping from the top of Mars' atmosphere. https://photojournal.jpl.nasa.gov/catalog/PIA22080
On a programming language for graph algorithms
NASA Technical Reports Server (NTRS)
Rheinboldt, W. C.; Basili, V. R.; Mesztenyi, C. K.
1971-01-01
An algorithmic language, GRAAL, is presented for describing and implementing graph algorithms of the type primarily arising in applications. The language is based on a set algebraic model of graph theory which defines the graph structure in terms of morphisms between certain set algebraic structures over the node set and arc set. GRAAL is modular in the sense that the user specifies which of these mappings are available with any graph. This allows flexibility in the selection of the storage representation for different graph structures. In line with its set theoretic foundation, the language introduces sets as a basic data type and provides for the efficient execution of all set and graph operators. At present, GRAAL is defined as an extension of ALGOL 60 (revised) and its formal description is given as a supplement to the syntactic and semantic definition of ALGOL. Several typical graph algorithms are written in GRAAL to illustrate various features of the language and to show its applicability.
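GRAAL itself extended ALGOL 60 and cannot be run here, but its set-algebraic model is easy to mirror. A minimal sketch in Python with illustrative names (`tail`, `head`, `out_arcs`): a graph is a node set and an arc set related by morphisms from arcs to nodes, and graph queries are expressed as set operations.

```python
# Hedged sketch of GRAAL's set-algebraic graph model, not GRAAL syntax:
# the graph structure is defined by morphisms between set structures
# over the node set and arc set.

nodes = {"a", "b", "c"}
arcs = {("a", "b"), ("b", "c"), ("a", "c")}
tail = {arc: arc[0] for arc in arcs}   # morphism: arc -> source node
head = {arc: arc[1] for arc in arcs}   # morphism: arc -> target node

def out_arcs(v):
    """The set of arcs leaving node v, obtained through the tail morphism."""
    return {arc for arc in arcs if tail[arc] == v}

def successors(v):
    """The image of out_arcs(v) under the head morphism; a set result."""
    return {head[arc] for arc in out_arcs(v)}
```

As in GRAAL, only the mappings actually supplied (`tail`, `head`) constrain the storage representation; a different representation could back the same set operators.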
Model-based morphological segmentation and labeling of coronary angiograms.
Haris, K; Efstratiadis, S N; Maglaveras, N; Pappas, C; Gourassas, J; Louridas, G
1999-10-01
A method for extraction and labeling of the coronary arterial tree (CAT) using minimal user supervision in single-view angiograms is proposed. The CAT structural description (skeleton and borders) is produced, along with quantitative information for the artery dimensions and assignment of coded labels, based on a given coronary artery model represented by a graph. The stages of the method are: 1) CAT tracking and detection; 2) artery skeleton and border estimation; 3) feature graph creation; and 4) artery labeling by graph matching. The approximate CAT centerline and borders are extracted by recursive tracking based on circular template analysis. The accurate skeleton and borders of each CAT segment are computed, based on morphological homotopy modification and watershed transform. The approximate centerline and borders are used for constructing the artery segment enclosing area (ASEA), where the defined skeleton and border curves are considered as markers. Using the marked ASEA, an artery gradient image is constructed where all the ASEA pixels (except the skeleton ones) are assigned the gradient magnitude of the original image. The artery gradient image markers are imposed as its unique regional minima by the homotopy modification method, the watershed transform is used for extracting the artery segment borders, and the feature graph is updated. Finally, given the created feature graph and the known model graph, a graph matching algorithm assigns the appropriate labels to the extracted CAT using weighted maximal cliques on the association graph corresponding to the two given graphs. Experimental results using clinical digitized coronary angiograms are presented.
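The final labeling stage, graph matching via maximal cliques of the association graph, can be sketched on a toy example. This is a hedged brute-force illustration, not the paper's weighted clinical implementation; the graphs, labels, and function name are invented. Vertices of the association graph are (feature node, model node) pairings, and two pairings are compatible when they preserve adjacency in both graphs.

```python
from itertools import combinations

# Hedged sketch of labeling by a maximal clique of the association graph.

def clique_matching(feat_nodes, feat_edges, model_nodes, model_edges):
    """Label feature nodes by the largest adjacency-preserving pairing."""
    f_edges = {frozenset(e) for e in feat_edges}
    m_edges = {frozenset(e) for e in model_edges}
    assoc = [(i, a) for i in feat_nodes for a in model_nodes]

    def compatible(p, q):
        (i, a), (j, b) = p, q
        if i == j or a == b:                 # one-to-one pairing only
            return False
        # adjacency in the feature graph must mirror the model graph
        return (frozenset({i, j}) in f_edges) == (frozenset({a, b}) in m_edges)

    # brute force, largest clique first (fine for tiny examples only)
    for r in range(min(len(feat_nodes), len(model_nodes)), 0, -1):
        for subset in combinations(assoc, r):
            if all(compatible(p, q) for p, q in combinations(subset, 2)):
                return dict(subset)          # feature node -> model label
    return {}

# Toy model path A-B-C matched against feature path 1-2-3.
labels = clique_matching([1, 2, 3], [(1, 2), (2, 3)],
                         ["A", "B", "C"], [("A", "B"), ("B", "C")])
```

The paper uses weighted maximal cliques and efficient clique search; the brute-force loop above only shows what a clique of the association graph represents.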
Heuristic-driven graph wavelet modeling of complex terrain
NASA Astrophysics Data System (ADS)
Cioacǎ, Teodor; Dumitrescu, Bogdan; Stupariu, Mihai-Sorin; Pǎtru-Stupariu, Ileana; Nǎpǎrus, Magdalena; Stoicescu, Ioana; Peringer, Alexander; Buttler, Alexandre; Golay, François
2015-03-01
We present a novel method for building a multi-resolution representation of large digital surface models. The surface points coincide with the nodes of a planar graph which can be processed using a critically sampled, invertible lifting scheme. To drive the lazy wavelet node partitioning, we employ an attribute aware cost function based on the generalized quadric error metric. The resulting algorithm can be applied to multivariate data by storing additional attributes at the graph's nodes. We discuss how the cost computation mechanism can be coupled with the lifting scheme and examine the results by evaluating the root mean square error. The algorithm is experimentally tested using two multivariate LiDAR sets representing terrain surface and vegetation structure with different sampling densities.
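The lifting machinery itself can be shown in one dimension, where the node partition reduces to an even/odd index split. This is a hedged toy, not the paper's graph construction or quadric-error cost function: the particular predict and update steps are illustrative choices, but invertibility holds for any choice, which is the critically sampled, invertible property the multi-resolution representation relies on.

```python
import numpy as np

# Hedged 1-D lifting-scheme toy: lazy-wavelet split into even/odd
# samples (periodic boundary via np.roll), predict odds from even
# neighbors, then update evens from the details.

def lift(signal):
    even, odd = signal[0::2].copy(), signal[1::2].copy()
    prediction = 0.5 * (even + np.roll(even, -1))         # predict odds
    detail = odd - prediction                             # detail coefficients
    approx = even + 0.25 * (detail + np.roll(detail, 1))  # update evens
    return approx, detail

def unlift(approx, detail):
    even = approx - 0.25 * (detail + np.roll(detail, 1))  # undo update
    odd = detail + 0.5 * (even + np.roll(even, -1))       # undo predict
    signal = np.empty(even.size + odd.size)
    signal[0::2], signal[1::2] = even, odd
    return signal

x = np.array([3.0, 1.0, 4.0, 1.0, 5.0, 9.0, 2.0, 6.0])
approx, detail = lift(x)
assert np.allclose(unlift(approx, detail), x)             # exact inversion
```

On a graph, the even/odd split becomes the cost-driven node partition described above, but the predict-then-update structure, and hence exact invertibility, is unchanged.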
Bathymetric map of the north part of Great Salt Lake, Utah, 2006
Baskin, Robert L.; Turner, Jane
2006-01-01
The U.S. Geological Survey, in cooperation with the Utah Department of Natural Resources, Division of Forestry, Fire, and State Lands, collected bathymetric data for the north part of Great Salt Lake during the spring and early summer of 2006 using a single beam, high-definition fathometer and real-time differential global positioning system. Approximately 5.2 million depth readings were collected along more than 765 miles of survey transects for construction of this map. Sound velocities were obtained in conjunction with the bathymetric data to provide time-of-travel corrections to the depth calculations. Data were processed using commercial hydrographic software and exported into geographic information system (GIS) software for mapping. Due to the shallow nature of the lake and the limitations of the instrumentation, contours above an altitude of 4,194 feet were digitized from existing USGS 1:24,000 source-scale digital line graph data. The approximate location of the Behrens Trench is shown. For additional information on methods used to derive the bathymetric contours for this map, please see Baskin, Robert L., 2006, Calculation of area and volume for the North Part of Great Salt Lake, Utah, U.S. Geological Survey Open-File Report OFR–2006–1359
Usery, E. Lynn; Varanka, Dalia; Finn, Michael P.
2009-01-01
The United States Geological Survey (USGS) entered the mainstream of developments in computer-assisted technology for mapping during the 1970s. The introduction by USGS of digital line graphs (DLGs), digital elevation models (DEMs), and land use data analysis (LUDA) nationwide land-cover data provided a base for the rapid expansion of the use of GIS in the 1980s. Whereas USGS had developed the topologically structured DLG data and the Geographic Information Retrieval and Analysis System (GIRAS) for land-cover data, the Map Overlay Statistical System (MOSS), a nontopologically structured GIS software package developed by Autometric, Inc., under contract to the U.S. Fish and Wildlife Service, dominated the use of GIS by federal agencies in the 1970s. Thus, USGS data was used in MOSS, but the topological structure, which later became a requirement for GIS vector datasets, was not used in early GIS applications. The introduction of Esri's ARC/INFO in 1982 changed that, and by the end of the 1980s, topological structure for vector data was essential, and ARC/INFO was the dominant GIS software package used by federal agencies.
Graph characterization via Ihara coefficients.
Ren, Peng; Wilson, Richard C; Hancock, Edwin R
2011-02-01
The novel contributions of this paper are twofold. First, we demonstrate how to characterize unweighted graphs in a permutation-invariant manner using the polynomial coefficients from the Ihara zeta function, i.e., the Ihara coefficients. Second, we generalize the definition of the Ihara coefficients to edge-weighted graphs. For an unweighted graph, the Ihara zeta function is the reciprocal of a quasi characteristic polynomial of the adjacency matrix of the associated oriented line graph. Since the Ihara zeta function has poles that give rise to infinities, the most convenient numerically stable representation is to work with the coefficients of the quasi characteristic polynomial. Moreover, the polynomial coefficients are invariant to vertex order permutations and also convey information concerning the cycle structure of the graph. To generalize the representation to edge-weighted graphs, we make use of the reduced Bartholdi zeta function. We prove that the computation of the Ihara coefficients for unweighted graphs is a special case of our proposed method for unit edge weights. We also present a spectral analysis of the Ihara coefficients and indicate their advantages over other graph spectral methods. We apply the proposed graph characterization method to capturing graph-class structure and clustering graphs. Experimental results reveal that the Ihara coefficients are more effective than methods based on Laplacian spectra.
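The construction described above can be sketched numerically: build the adjacency matrix T of the oriented line graph (an arc a can be followed by an arc b when b starts where a ends and b is not a's reversal), then read off the characteristic-polynomial coefficients of T. This is an illustrative sketch, not the authors' code; the coefficients of det(I - uT) used in the zeta function are, up to reversal and sign, those of this characteristic polynomial.

```python
import numpy as np

# Sketch: adjacency matrix T of the oriented line graph of a small
# unweighted graph, and the characteristic-polynomial coefficients.
# The example graph (a triangle) is illustrative.

def ihara_coefficients(edges):
    arcs = [(u, v) for u, v in edges] + [(v, u) for u, v in edges]
    m = len(arcs)
    T = np.zeros((m, m))
    for i, (u, v) in enumerate(arcs):
        for j, (x, y) in enumerate(arcs):
            if x == v and y != u:  # consecutive arcs, no backtracking
                T[i, j] = 1.0
    return np.poly(T)  # coefficients of det(uI - T), highest degree first

# Triangle graph C3: the characteristic polynomial of T is (u^3 - 1)^2,
# matching the known Ihara zeta reciprocal (1 - u^3)^2 of a 3-cycle.
coeffs = ihara_coefficients([(0, 1), (1, 2), (2, 0)])
assert len(coeffs) == 7 and abs(coeffs[3] + 2) < 1e-6
```

Because the coefficients are functions of the matrix spectrum alone, they are invariant to vertex relabelling, which is the permutation-invariance property the paper exploits.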
What Data Can Do: A Teacher's View of Digital Tools for Formative Assessment
ERIC Educational Resources Information Center
Gallagher, Kerry
2016-01-01
Digital tools are making it easier than ever for teachers to gather and analyze formative data. Paper exit slips can take a classroom teacher upward of an hour to sort and graph after just one day of classes. But now, that same teacher can pose a question out loud to the class and ask students to type answers on their mobile phones and hit send.…
ERIC Educational Resources Information Center
Pitts Bannister, Vanessa R.; Jamar, Idorenyin; Mutegi, Jomo W.
2007-01-01
In this article, the learning progress of one fifth-grade student is examined with regard to the development of her graph interpretation skills as she participated in the Junior Science Institute (JSI), a two-week, science intensive summer camp in which participants engaged in microbiology research and application. By showcasing the student's…
NASA Astrophysics Data System (ADS)
Yin, Y.; Sonka, M.
2010-03-01
A novel method is presented for definition of search lines in a variety of surface segmentation approaches. The method is inspired by properties of electric field direction lines and is applicable to general-purpose n-D shape-based image segmentation tasks. Its utility is demonstrated in graph construction and optimal segmentation of multiple mutually interacting objects. The properties of the electric field-based graph construction guarantee that inter-object graph connecting lines are non-intersecting and inherently cover the entire object-interaction space. When applied to inter-object cross-surface mapping, our approach generates one-to-one and all-to-all vertex correspondent pairs between the regions of mutual interaction. We demonstrate the benefits of the electric field approach in several examples ranging from relatively simple single-surface segmentation to complex multi-object, multi-surface segmentation of femur-tibia cartilage. The performance of our approach is demonstrated in 60 MR images from the Osteoarthritis Initiative (OAI), in which our approach achieved a very good performance as judged by surface positioning errors (average of 0.29 and 0.59 mm for signed and unsigned cartilage positioning errors, respectively).
NASA Astrophysics Data System (ADS)
Alberti, Michael; Weber, Roman; Mancini, Marco
2017-10-01
The line-by-line procedure developed in the associated paper (Part A) has been used to generate the total emissivity chart for pure CO and CO-N2/air mixtures at 1 bar total pressure, over the 300 to 3000 K temperature range and the 0.01 to 3000 bar cm pressure path length range. Methods of scaling the emissivity to pressures different from 1 bar, in the range 0.1 to 40 bar, are provided through pressure-correction graphs and an EXCEL interpolator (Supplementary Material). The interpolated emissivities are within a ±2% margin of the line-by-line calculated values. The newly developed emissivity graphs are substantially more accurate than the existing Ulrich (1936) & Hottel (1954) and Abu-Romia & Tien (1966) charts.
Investigating diffusion with technology
NASA Astrophysics Data System (ADS)
Miller, Jon S.; Windelborn, Augden F.
2013-07-01
The activities described here allow students to explore the concept of diffusion with the use of common equipment such as computers, webcams and analysis software. The procedure includes taking a series of digital pictures of a container of water with a webcam as a dye slowly diffuses. At known time points, measurements of the pixel densities (darkness) of the digital pictures are recorded and then plotted on a graph. The resulting graph of darkness versus time allows students to see the results of diffusion of the dye over time. Through modification of the basic lesson plan, students are able to investigate the influence of a variety of variables on diffusion. Furthermore, students are able to expand the boundaries of their thinking by formulating hypotheses and testing their hypotheses through experimentation. As a result, students acquire a relevant science experience through taking measurements, organizing data into tables, analysing data and drawing conclusions.
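The analysis step described above can be sketched in a few lines: compute a mean "darkness" value per webcam frame and pair it with its timestamp, giving the darkness-versus-time series students plot. The tiny synthetic frames below stand in for real webcam captures; all names and values are illustrative.

```python
# Illustrative sketch of the diffusion analysis: mean pixel darkness
# per frame, paired with the capture time, for plotting darkness vs. time.

def mean_darkness(frame):
    """Mean pixel darkness, with 0 = white and 255 = fully dark."""
    flat = [255 - p for row in frame for p in row]  # invert brightness
    return sum(flat) / len(flat)

# Three fake 2x2 grayscale frames (seconds -> pixel rows), getting
# darker as the dye diffuses through the imaged region.
frames = {0: [[255, 255], [255, 255]],
          60: [[200, 230], [220, 210]],
          120: [[150, 180], [170, 160]]}

series = [(t, mean_darkness(f)) for t, f in sorted(frames.items())]
# Darkness should increase monotonically as the dye spreads.
assert series[0][1] < series[1][1] < series[2][1]
```

Plotting `series` with any graphing tool reproduces the darkness-versus-time curve the lesson asks students to interpret.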
NASA Astrophysics Data System (ADS)
Peterman, Karen; Cranston, Kayla A.; Pryor, Marie; Kermish-Allen, Ruth
2015-11-01
This case study was conducted within the context of a place-based education project that was implemented with primary school students in the USA. The authors and participating teachers created a performance assessment of standards-aligned tasks to examine 6-10-year-old students' graph interpretation skills as part of an exploratory research project. Fifty-five students participated in a performance assessment interview at the beginning and end of a place-based investigation. Two forms of the assessment were created and counterbalanced within class at pre and post. In situ scoring was conducted such that responses were scored as correct versus incorrect during the assessment's administration. Criterion validity analysis demonstrated an age-level progression in student scores. Tests of discriminant validity showed that the instrument detected variability in interpretation skills across each of three graph types (line, bar, dot plot). Convergent validity was established by correlating in situ scores with those from the Graph Interpretation Scoring Rubric. Students' proficiency with interpreting different types of graphs matched expectations based on age and the standards-based progression of graphs across primary school grades. The assessment tasks were also effective at detecting pre-post gains in students' interpretation of line graphs and dot plots after the place-based project. The results of the case study are discussed in relation to the common challenges associated with performance assessment. Implications are presented in relation to the need for authentic and performance-based instructional and assessment tasks to respond to the Common Core State Standards and the Next Generation Science Standards.
Nuclear power plant digital system PRA pilot study with the dynamic flow-graph methodology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yau, M.; Motamed, M.; Guarro, S.
2006-07-01
Current Probabilistic Risk Assessment (PRA) methodology is well established in analyzing hardware and some of the key human interactions. However, processes for analyzing the software functions of digital systems within a plant PRA framework, and for accounting for the digital system contribution to the overall risk, are not generally available, nor are they well understood and established. A recent study reviewed a number of methodologies that have potential applicability to modeling and analyzing digital systems within a PRA framework. This study identified the Dynamic Flow-graph Methodology (DFM) and the Markov methodology as the most promising tools. As a result of this study, a task was defined under the framework of a collaborative agreement between the U.S. Nuclear Regulatory Commission (NRC) and the Ohio State Univ. (OSU). The objective of this task is to set up benchmark systems representative of digital systems used in nuclear power plants and to evaluate DFM and the Markov methodology with these benchmark systems. The first benchmark system is a typical Pressurized Water Reactor (PWR) Steam Generator (SG) Feedwater System (FWS) level control system based on earlier ASCA work with the U.S. NRC, upgraded with modern control laws. ASCA, Inc. is currently under contract to OSU to apply DFM to this benchmark system. The goal is to investigate the feasibility of using DFM to analyze and quantify digital system risk, and to integrate the DFM analytical results back into the plant event tree/fault tree PRA model. (authors)
Computer-aided boundary delineation of agricultural lands
NASA Technical Reports Server (NTRS)
Cheng, Thomas D.; Angelici, Gary L.; Slye, Robert E.; Ma, Matt
1989-01-01
The National Agricultural Statistics Service of the United States Department of Agriculture (USDA) presently uses labor-intensive aerial photographic interpretation techniques to divide large geographical areas into manageable-sized units for estimating domestic crop and livestock production. Prototype software, the computer-aided stratification (CAS) system, was developed to automate the procedure, and currently runs on a Sun-based image processing system. With a background display of LANDSAT Thematic Mapper and United States Geological Survey Digital Line Graph data, the operator uses a cursor to delineate agricultural areas, called sampling units, which are assigned to strata of land-use and land-cover types. The resultant stratified sampling units are used as input into subsequent USDA sampling procedures. As a test, three counties in Missouri were chosen for application of the CAS procedures. Subsequent analysis indicates that CAS was five times faster in creating sampling units than the manual techniques were.
Graphing in Groups: Learning about Lines in a Collaborative Classroom Network Environment
ERIC Educational Resources Information Center
White, Tobin; Wallace, Matthew; Lai, Kevin
2012-01-01
This article presents a design experiment in which we explore new structures for classroom collaboration supported by a classroom network of handheld graphing calculators. We describe a design for small group investigations of linear functions and present findings from its implementation in three high school algebra classrooms. Our coding of the…
The Effect of Emergent Features on Judgments of Quantity in Configural and Separable Displays
ERIC Educational Resources Information Center
Peebles, David
2008-01-01
Two experiments investigated effects of emergent features on perceptual judgments of comparative magnitude in three diagrammatic representations: kiviat charts, bar graphs, and line graphs. Experiment 1 required participants to compare individual values; whereas in Experiment 2 participants had to integrate several values to produce a global…
Van Norman, Ethan R; Christ, Theodore J
2016-10-01
Curriculum-based measurement of oral reading (CBM-R) is used to monitor the effects of academic interventions for individual students. Decisions to continue, modify, or terminate these interventions are made by interpreting time-series CBM-R data. Such interpretation is founded upon visual analysis or the application of decision rules. The purpose of this study was to compare the accuracy of visual analysis and decision rules. Visual analysts interpreted 108 CBM-R progress monitoring graphs one of three ways: (a) without graphic aids, (b) with a goal line, or (c) with a goal line and a trend line. Graphs differed along three dimensions, including trend magnitude, variability of observations, and duration of data collection. Automated trend line and data point decision rules were also applied to each graph. Inferential analyses permitted the estimation of the probability of a correct decision (i.e., the student is improving - continue the intervention, or the student is not improving - discontinue the intervention) for each evaluation method as a function of trend magnitude, variability of observations, and duration of data collection. All evaluation methods performed better when students made adequate progress. Visual analysis and decision rules performed similarly when observations were less variable. Results suggest that educators should collect data for more than six weeks, take steps to control measurement error, and visually analyze graphs when data are variable. Implications for practice and research are discussed. Copyright © 2016 Society for the Study of School Psychology. Published by Elsevier Ltd. All rights reserved.
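A trend-line decision rule of the kind the study compares can be sketched as follows: fit an ordinary-least-squares slope to the weekly CBM-R scores and continue the intervention only if the fitted trend meets the goal-line slope. The threshold logic, variable names, and scores below are illustrative assumptions, not the study's exact rule or data.

```python
# Hedged sketch of an automated trend-line decision rule for CBM-R data:
# compare the OLS slope of observed scores against the goal-line slope.

def ols_slope(scores):
    """Ordinary-least-squares slope of scores over equally spaced sessions."""
    n = len(scores)
    xs = range(n)
    mx, my = sum(xs) / n, sum(scores) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, scores))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

def decide(scores, baseline, goal, weeks):
    """Continue if the fitted trend is at least as steep as the goal line."""
    goal_slope = (goal - baseline) / weeks
    return "continue" if ols_slope(scores) >= goal_slope else "modify"

scores = [42, 45, 44, 48, 50, 53, 55, 57]  # words correct per minute, weekly
decision = decide(scores, baseline=42, goal=90, weeks=30)
assert decision == "continue"
```

Real decision rules (e.g., data-point rules comparing the last observations to the goal line) differ in detail, but all reduce visual judgment to an explicit comparison like the one above.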
Quantitative Literacy: Working with Log Graphs
NASA Astrophysics Data System (ADS)
Shawl, S.
2013-04-01
The need for working with and understanding different types of graphs is a common occurrence in everyday life. Examples include anything having to do with investments, being an educated juror in a case that involves evidence presented graphically, and understanding many aspects of our current political discourse. Within a science class, graphs play a crucial role in presenting and interpreting data. In astronomy, where the range of graphed values spans many orders of magnitude, log axes must be used and understood. Experience shows that students do not understand how to read and interpret log axes or how they differ from linear axes. Alters (1996), in a study of college students in an algebra-based physics class, found little understanding of log plotting. The purpose of this poster is to show the method and progression I have developed for use in my “ASTRO 101” class, with the goal of helping students better understand the H-R diagram, the mass-luminosity relationship, and digital spectra.
Loops in hierarchical channel networks
NASA Astrophysics Data System (ADS)
Katifori, Eleni; Magnasco, Marcelo
2012-02-01
Nature provides us with many examples of planar distribution and structural networks having dense sets of closed loops. An archetype of this form of network organization is the vasculature of dicotyledonous leaves, which showcases a hierarchically-nested architecture. Although a number of methods have been proposed to measure aspects of the structure of such networks, a robust metric to quantify their hierarchical organization is still lacking. We present an algorithmic framework that allows mapping loopy networks to binary trees, preserving in the connectivity of the trees the architecture of the original graph. We apply this framework to investigate computer generated and natural graphs extracted from digitized images of dicotyledonous leaves and animal vasculature. We calculate various metrics on the corresponding trees and discuss the relationship of these quantities to the architectural organization of the original graphs. This algorithmic framework decouples the geometric information from the metric topology (connectivity and edge weight) and it ultimately allows us to perform a quantitative statistical comparison between predictions of theoretical models and naturally occurring loopy graphs.
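One simple reading of the tree-mapping idea above can be sketched as follows: treat each loop (face) of the network as a leaf, then repeatedly merge the two loops separated by the weakest shared edge, recording each merge as an internal node of a binary tree. The weights and loop labels below are made up for illustration; the paper's actual algorithm and metrics are richer.

```python
# Illustrative sketch: map a loopy planar network to a binary tree by
# merging adjacent loops in ascending order of separating-edge weight
# (a union-find over loops, recording merges as nested tuples).

def loops_to_tree(shared_edges):
    """shared_edges: (weight, loop_a, loop_b) triples; returns nested tuples."""
    parent = {}
    sub = {}  # union-find root -> subtree built so far

    def find(x):
        while parent.get(x, x) != x:
            x = parent[x]
        return x

    root = None
    for w, a, b in sorted(shared_edges):
        ra, rb = find(a), find(b)
        if ra == rb:
            continue  # already in the same merged loop
        merged = (sub.get(ra, ra), sub.get(rb, rb))
        parent[rb] = ra
        sub[ra] = merged
        root = merged
    return root

# Four loops; lower weight = thinner channel separating the two loops.
edges = [(1.0, "A", "B"), (2.0, "C", "D"), (5.0, "B", "C")]
assert loops_to_tree(edges) == (("A", "B"), ("C", "D"))
```

The nesting of the resulting tuples encodes the hierarchy of loops, decoupled from the network's geometry, which is the kind of connectivity-only representation the framework compares across graphs.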
ERIC Educational Resources Information Center
Aguilar, Stephen J.
2018-01-01
This qualitative study focuses on capturing students' understanding of two visualizations often utilized by learning analytics-based educational technologies: bar graphs and line graphs. It is framed by Achievement Goal Theory--a prominent theory of students' academic motivation--and utilizes interviews (n = 60) to investigate how students at risk…
Teaching Slope of a Line Using the Graphing Calculator as a Tool for Discovery Learning
ERIC Educational Resources Information Center
Nichols, Fiona Costello
2012-01-01
Discovery learning is one of the instructional strategies sometimes used to teach Algebra I. However, little research is available that includes investigation of the effects of incorporating the graphing calculator technology with discovery learning. This study was initiated to investigate two instructional approaches for teaching slope of a line…
Cannon, W.F.; Ottke, Doug
1999-01-01
The data on this CD consist of geographic information system (GIS) coverages and tabular data on the geology of Early Proterozoic and Archean rocks in part of the Early Proterozoic Penokean orogeny. The map emphasizes metasedimentary and metavolcanic rocks that were deposited along the southern margin of the Superior craton and were later deformed during continental collision at about 1850 Ma. The area includes the famous iron ranges of the south shore region of the Lake Superior district. Base maps, both as digital raster graphics (DRG) and digital line graphs (DLG), are also provided for the convenience of users. The map has been compiled from many individual studies, mostly by USGS researchers, completed during the past 50 years, including many detailed (1:24,000 scale) geologic maps. Data were compiled at 1:100,000 scale and preserve most of the details of the source materials. This product is a preliminary release of the geologic map data bases during ongoing studies of the geology and metallogeny of the Penokean continental margin. Files are provided in three formats: Spatial Data Transfer Standard (SDTS) files, Arc export format (.e00) files, and Arc coverages. All files can be accessed directly from the CD-ROM using either ARC/INFO 7.1.2 or later or ArcView 3.0 or later software. ESRI's ArcExplorer, a free GIS data viewer available at the web site http://www.esri.com/software/arcexplorer/index.html, also provides display and querying capability for these files.
UAV Borne Low Altitude Photogrammetry System
NASA Astrophysics Data System (ADS)
Lin, Z.; Su, G.; Xie, F.
2012-07-01
In this paper, three major aspects of Unmanned Aerial Vehicle (UAV) systems for low altitude aerial photogrammetry are discussed: the flying platform, the imaging sensor system, and the data processing software. First of all, according to the technical requirements for minimum cruising speed, shortest taxiing distance, level of flight control, and performance in turbulent flight, the performance and suitability of the available UAV platforms (e.g., fixed-wing UAVs, unmanned helicopters, and unmanned airships) are compared and analyzed. Secondly, considering the restrictions on the load weight of a platform and the resolution of a sensor, together with the exposure equation and the theory of optical information, emphasis is placed on the principles of designing self-calibrating and self-stabilizing combined wide-angle digital cameras (e.g., double-combined and four-combined cameras). Finally, a software package named MAP-AT, which takes into account the specific characteristics of UAV platforms and sensors, is developed and introduced. Apart from the common functions of aerial image processing, MAP-AT puts more effort into automatic extraction, automatic checking, and operator-assisted addition of tie points for images with large tilt angles. Based on the process for low altitude photogrammetry with UAVs recommended in this paper, more than ten aerial photogrammetry missions have been accomplished; the accuracies of the aerial triangulation, digital orthophotos (DOM), and digital line graphs (DLG) meet the standard requirements of 1:2000, 1:1000, and 1:500 mapping.
Building Specialized Multilingual Lexical Graphs Using Community Resources
NASA Astrophysics Data System (ADS)
Daoud, Mohammad; Boitet, Christian; Kageura, Kyo; Kitamoto, Asanobu; Mangeot, Mathieu; Daoud, Daoud
We are describing methods for compiling domain-dedicated multilingual terminological data from various resources. We focus on collecting data from online community users as a main source; therefore, our approach depends on acquiring contributions from volunteers (the explicit approach) and on analyzing users' behavior to extract interesting patterns and facts (the implicit approach). As a generic repository that can handle the collected multilingual terminological data, we describe the concept of dedicated Multilingual Preterminological Graphs (MPGs), and some automatic approaches for constructing them by analyzing the behavior of online community users. A Multilingual Preterminological Graph is a special lexical resource that contains a massive number of terms related to a special domain. We call it preterminological because it is raw material that can be used to build a standardized terminological repository. Building such a graph is difficult using traditional approaches, as it requires huge efforts by domain specialists and terminologists. In our approach, we build such a graph by analyzing the access log files of the community's website, finding the important terms that have been used to search that website, and recording their associations with each other. We aim to make this graph a seed repository to which multilingual volunteers can contribute. We are experimenting with this approach on the Digital Silk Road Project. We have used its access log files since its beginning in 2003 and obtained an initial graph of around 116,000 terms. As an application, we used this graph to build a preterminological multilingual database that serves a CLIR system for the DSR project.
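The implicit, log-mining step described above can be sketched simply: extract the search terms from query strings and link terms that occur in the same query, with weights counting how often each term and each pair appears. The query strings and tokenization below are simplified assumptions, not the project's actual log format.

```python
# Hedged sketch of mining search queries (e.g., from access logs) into a
# weighted term graph: node weights = term frequency, edge weights =
# within-query co-occurrence counts.
from collections import Counter
from itertools import combinations

def build_term_graph(queries):
    nodes = Counter()
    edges = Counter()
    for q in queries:
        terms = sorted(set(q.lower().split()))  # canonical order for edge keys
        nodes.update(terms)
        edges.update(combinations(terms, 2))    # co-occurrence within a query
    return nodes, edges

queries = ["silk road caravan", "Silk Road maps", "caravan routes maps"]
nodes, edges = build_term_graph(queries)
assert nodes["silk"] == 2 and edges[("road", "silk")] == 2
```

Frequent, well-connected terms in such a graph are natural seeds for volunteers to translate and refine into a proper terminological repository.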
Optimizing Robinson Operator with Ant Colony Optimization As a Digital Image Edge Detection Method
NASA Astrophysics Data System (ADS)
Yanti Nasution, Tarida; Zarlis, Muhammad; K. M Nasution, Mahyuddin
2017-12-01
Edge detection serves to identify the boundaries of an object against an overlapping background. One of the classic methods for edge detection is the Robinson operator. The Robinson operator produces thin, faint, gray edge lines. To overcome these deficiencies, we propose an improved edge detection method based on a graph approach with the Ant Colony Optimization algorithm. The improvements that may be performed are thickening the edges and reconnecting broken edges. This research aims to optimize the Robinson operator with Ant Colony Optimization, compare the outputs, and infer the extent to which Ant Colony Optimization can improve unoptimized edge detection results and the accuracy of Robinson edge detection. The parameters used in measuring edge detection performance are the morphology of the resulting edge lines, MSE, and PSNR. The results show that the combined Robinson and Ant Colony Optimization method produces images with thicker, more definite edges. Ant Colony Optimization can thus serve as a method for optimizing the Robinson operator, improving on the classic Robinson detection result by 16.77% on average.
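For reference, the baseline operator being optimized can be sketched as follows: the classic Robinson compass operator convolves the image with eight 3x3 masks (the north mask rotated in 45-degree steps) and keeps the maximum response per pixel. The ACO refinement itself is not reproduced here; this is only the baseline, with an illustrative synthetic image.

```python
import numpy as np

# Sketch of the classic Robinson compass operator (the baseline the
# paper optimizes with ACO): max response over eight rotated masks.

NORTH = np.array([[1, 2, 1], [0, 0, 0], [-1, -2, -1]], dtype=float)

def rotate45(mask):
    """Rotate the 8 border cells of a 3x3 mask one step (45 degrees)."""
    ring = [(0, 0), (0, 1), (0, 2), (1, 2), (2, 2), (2, 1), (2, 0), (1, 0)]
    vals = [mask[r, c] for r, c in ring]
    out = mask.copy()
    for (r, c), v in zip(ring, vals[-1:] + vals[:-1]):
        out[r, c] = v
    return out

def robinson_edges(img):
    """Maximum response over the eight compass masks (valid region only)."""
    masks, m = [], NORTH
    for _ in range(8):
        masks.append(m)
        m = rotate45(m)
    h, w = img.shape
    out = np.zeros((h - 2, w - 2))
    for mask in masks:
        for i in range(h - 2):
            for j in range(w - 2):
                out[i, j] = max(out[i, j], np.sum(img[i:i+3, j:j+3] * mask))
    return out

# A sharp vertical step edge gives the strongest possible response, 255 * 4.
img = np.zeros((6, 6))
img[:, 3:] = 255.0
edges = robinson_edges(img)
assert edges.max() == 1020.0
```

Thresholding `edges` yields the thin, often broken edge map that the paper's ACO stage then thickens and reconnects.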
NASA Astrophysics Data System (ADS)
Kumar, Pradeep; Dutta, B. K.; Chattopadhyay, J.
2017-04-01
Miniaturized specimens are used to determine mechanical properties of materials, such as yield stress, ultimate stress, and fracture toughness. Use of such specimens is essential whenever only a limited quantity of material is available for testing, as with aged or irradiated materials. The miniaturized small punch test (SPT) is a technique widely used to determine changes in the mechanical properties of materials. Various empirical correlations have been proposed in the literature to determine the fracture toughness (JIC) using this technique: biaxial fracture strain is determined from SPT tests, and this parameter is then used to determine JIC through the available empirical correlations. The correlations between JIC and biaxial fracture strain quoted in the literature are based on experimental data acquired for a large number of materials. A number of such correlations are available in the literature, and they are generally not in agreement with each other. In the present work, an attempt has been made to determine the correlation between biaxial fracture strain (εqf) and crack initiation toughness (Ji) numerically. About one hundred materials were digitally generated by varying the yield stress, ultimate stress, hardening coefficient, and Gurson parameters. Each such material was then used to analyze an SPT specimen and a standard TPB specimen. Analysis of the SPT specimen yielded the biaxial fracture strain (εqf), and analysis of the TPB specimen yielded the value of Ji. A graph was then plotted between these two parameters for all the digitally generated materials; the best-fit straight line determines the correlation. It was also observed that Ji can vary, within a limit, for the same value of biaxial fracture strain (εqf); such variation in Ji was also ascertained using the graph. Experimental SPT data acquired earlier for three materials were then used to obtain Ji from the newly developed correlation. A reasonable comparison of the calculated Ji with the values quoted in the literature confirmed the usefulness of the correlation.
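The final fitting step above amounts to a least-squares straight line through (εqf, Ji) pairs. The sketch below shows that step with fabricated data points, purely to illustrate the fitting; the values are not results from the paper.

```python
# Hedged sketch: best-fit straight line Ji = a * eps_qf + b through
# (biaxial fracture strain, crack-initiation toughness) pairs.
# All data values below are fabricated for illustration only.

def fit_line(xs, ys):
    """Least-squares slope and intercept for y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return a, my - a * mx

eps_qf = [0.20, 0.35, 0.50, 0.65, 0.80]   # biaxial fracture strain
ji = [110.0, 185.0, 260.0, 335.0, 410.0]  # kJ/m^2, illustrative values
a, b = fit_line(eps_qf, ji)
# For this synthetic linear data the fit recovers slope ~500, intercept ~10.
ji_pred = a * 0.45 + b  # estimate Ji from a new SPT strain measurement
```

The scatter of real (εqf, Ji) points about such a line is what bounds the variation in Ji at a fixed strain that the paper discusses.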
Enabling Graph Mining in RDF Triplestores using SPARQL for Holistic In-situ Graph Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Sangkeun; Sukumar, Sreenivas R; Hong, Seokyong
Graph analysis is now considered a promising technique to discover useful knowledge in data from a new perspective. We envision two dimensions of graph analysis: OnLine Graph Analytic Processing (OLGAP) and Graph Mining (GM), which focus respectively on subgraph pattern matching and automatic knowledge discovery in graphs. Moreover, as these two dimensions aim to complementarily solve complex problems, holistic in-situ graph analysis that covers both OLGAP and GM in a single system is critical for minimizing the burdens of operating multiple graph systems and transferring intermediate result sets between those systems. Nevertheless, most existing graph analysis systems are capable of only one dimension of graph analysis. In this work, we take an approach to enabling GM capabilities (e.g., PageRank, connected-component analysis, node eccentricity, etc.) in RDF triplestores, which were originally developed to store RDF datasets and provide OLGAP capability. More specifically, to achieve our goal, we implemented six representative graph mining algorithms using SPARQL. The approach makes a wide range of available RDF datasets directly applicable for holistic graph analysis within a single system. To validate our approach, we evaluate the performance of our implementations with nine real-world datasets and three different computing environments: a laptop computer, an Amazon EC2 instance, and a shared-memory Cray XMT2 URIKA-GD graph-processing appliance. The experimental results show that our implementation provides promising and scalable performance for real-world graph analysis in all tested environments. The developed software is publicly available in an open-source project that we initiated.
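One of the graph-mining primitives named above, PageRank, can be sketched in plain Python for reference (the paper expresses it in SPARQL over an RDF triplestore; this is only the underlying algorithm, with an illustrative graph).

```python
# Sketch of power-iteration PageRank over a small directed graph given
# as adjacency lists. The damping value 0.85 is the conventional choice.

def pagerank(adj, damping=0.85, iters=50):
    nodes = list(adj)
    n = len(nodes)
    rank = {v: 1.0 / n for v in nodes}
    for _ in range(iters):
        new = {v: (1 - damping) / n for v in nodes}
        for v, outs in adj.items():
            if not outs:  # dangling node: spread its rank uniformly
                for u in nodes:
                    new[u] += damping * rank[v] / n
            else:
                for u in outs:
                    new[u] += damping * rank[v] / len(outs)
        rank = new
    return rank

adj = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
ranks = pagerank(adj)
assert abs(sum(ranks.values()) - 1.0) < 1e-9  # ranks form a distribution
assert ranks["c"] > ranks["b"]  # "c" receives links from both "a" and "b"
```

Expressing the same iteration declaratively in SPARQL (aggregating over incoming edges per iteration) is what lets the triplestore serve both OLGAP queries and mining workloads in situ.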
Automating Phase Change Lines and Their Labels Using Microsoft Excel(R).
Deochand, Neil
2017-09-01
Many researchers have rallied against drawn-in graphical elements and offered ways to avoid them, especially regarding the insertion of phase change lines (Deochand, Costello, & Fuqua, 2015; Dubuque, 2015; Vanselow & Bourret, 2012). However, few have offered a solution for automating the phase labels that are often utilized in behavior-analytic graphical displays (Deochand et al., 2015). Although Microsoft Excel® is extensively utilized by behavior analysts, solutions to issues in our graphing practices are not always apparent or user-friendly. Considering that inserting phase change lines and their labels constitutes a repetitious and laborious endeavor, any reduction in the steps needed to produce these graphical elements could offer substantial time savings to the field. The purpose of this report is to provide an updated way (and templates in the supplemental materials) to add phase change lines with their respective labels, which stay embedded in the graph when it is moved or updated.
ERIC Educational Resources Information Center
Averitt, Sallie D.
This instructor guide, which was developed for use in a manufacturing firm's advanced technical preparation program, contains the materials required to present a learning module that is designed to prepare trainees for the program's statistical process control module by improving their basic math skills in working with line graphs and teaching…
ERIC Educational Resources Information Center
Stewart, Kelise K.; Carr, James E.; Brandt, Charles W.; McHenry, Meade M.
2007-01-01
The present study evaluated the effects of both a traditional lecture and the conservative dual-criterion (CDC) judgment aid on the ability of 6 university students to visually inspect AB-design line graphs. The traditional lecture reliably failed to improve visual inspection accuracy, whereas the CDC method substantially improved the performance…
Moving beyond the Bar Plot and the Line Graph to Create Informative and Attractive Graphics
ERIC Educational Resources Information Center
Larson-Hall, Jenifer
2017-01-01
Graphics are often mistaken for a mere frill in the methodological arsenal of data analysis when in fact they can be one of the simplest and at the same time most powerful methods of communicating statistical information (Tufte, 2001). The first section of the article argues for the statistical necessity of graphs, echoing and amplifying similar…
Introduction to Statistics. Learning Packages in the Policy Sciences Series, PS-26. Revised Edition.
ERIC Educational Resources Information Center
Policy Studies Associates, Croton-on-Hudson, NY.
The primary objective of this booklet is to introduce students to basic statistical skills that are useful in the analysis of public policy data. A few, selected statistical methods are presented, and theory is not emphasized. Chapter 1 provides instruction for using tables, bar graphs, bar graphs with grouped data, trend lines, pie diagrams,…
The Use of Graphics to Communicate Findings of Longitudinal Data in Design-Based Research
ERIC Educational Resources Information Center
Francis, Krista; Jacobsen, Michele; Friesen, Sharon
2014-01-01
Visuals and graphics have been used for communicating complex ideas since 1786 when William Playfair first invented the line graph and bar chart. Graphs and charts are useful for interpretation and making sense of data. For instance, John Snow's scatter plot helped pinpoint the source of a cholera outbreak in London in 1854 and also changed…
NASA Astrophysics Data System (ADS)
Adami, Riccardo; Cacciapuoti, Claudio; Finco, Domenico; Noja, Diego
We define the Schrödinger equation with focusing, cubic nonlinearity on one-vertex graphs. We prove global well-posedness in the energy domain and conservation laws for some self-adjoint boundary conditions at the vertex, i.e. the Kirchhoff boundary condition and the so-called δ and δ′ boundary conditions. Moreover, in the same setting, we study the collision of a fast solitary wave with the vertex and we show that it splits into reflected and transmitted components. The outgoing waves preserve a soliton character over a time which depends on the logarithm of the velocity of the ingoing solitary wave. Over the same timescale, the reflection and transmission coefficients of the outgoing waves coincide with the corresponding coefficients of the linear problem. In the analysis of the problem, we follow ideas borrowed from the seminal paper [17] by Holmer, Marzuola and Zworski about scattering of fast solitons by a delta interaction on the line. The present paper represents an extension of their work to the case of graphs and, as a byproduct, it shows how to extend the analysis of soliton scattering by other point interactions on the line, interpreted as a degenerate graph.
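For orientation, the equation in question can be written out on each edge of the star graph; this is the standard formulation of the focusing cubic NLS with Kirchhoff vertex conditions (signs and normalization may differ from the paper's):

```latex
% Cubic focusing NLS on each edge e = 1, ..., N of the star graph:
i\,\partial_t \psi_e = -\partial_x^2 \psi_e - |\psi_e|^2 \psi_e ,
% Kirchhoff conditions at the vertex (x = 0): continuity,
\psi_1(0,t) = \psi_2(0,t) = \cdots = \psi_N(0,t),
% and vanishing total flux of the edge derivatives,
\sum_{e=1}^{N} \partial_x \psi_e(0,t) = 0 .
```

The δ and δ′ conditions replace the Kirchhoff relations with a coupling of the boundary values to the boundary derivatives at the vertex.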
Directional Agglomeration Multigrid Techniques for High Reynolds Number Viscous Flow Solvers
NASA Technical Reports Server (NTRS)
1998-01-01
A preconditioned directional-implicit agglomeration algorithm is developed for solving two- and three-dimensional viscous flows on highly anisotropic unstructured meshes of mixed-element types. The multigrid smoother consists of a pre-conditioned point- or line-implicit solver which operates on lines constructed in the unstructured mesh using a weighted graph algorithm. Directional coarsening or agglomeration is achieved using a similar weighted graph algorithm. A tight coupling of the line construction and directional agglomeration algorithms enables the use of aggressive coarsening ratios in the multigrid algorithm, which in turn reduces the cost of a multigrid cycle. Convergence rates which are independent of the degree of grid stretching are demonstrated in both two and three dimensions. Further improvement of the three-dimensional convergence rates through a GMRES technique is also demonstrated.
Directional Agglomeration Multigrid Techniques for High-Reynolds Number Viscous Flows
NASA Technical Reports Server (NTRS)
Mavriplis, Dimitri J.
1998-01-01
A preconditioned directional-implicit agglomeration algorithm is developed for solving two- and three-dimensional viscous flows on highly anisotropic unstructured meshes of mixed-element types. The multigrid smoother consists of a pre-conditioned point- or line-implicit solver which operates on lines constructed in the unstructured mesh using a weighted graph algorithm. Directional coarsening or agglomeration is achieved using a similar weighted graph algorithm. A tight coupling of the line construction and directional agglomeration algorithms enables the use of aggressive coarsening ratios in the multigrid algorithm, which in turn reduces the cost of a multigrid cycle. Convergence rates which are independent of the degree of grid stretching are demonstrated in both two and three dimensions. Further improvement of the three-dimensional convergence rates through a GMRES technique is also demonstrated.
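The line construction described above can be sketched as a greedy walk on the weighted mesh graph: from each unvisited node, repeatedly follow the strongest-weighted edge to an unvisited neighbour. This is an illustrative simplification, not the paper's actual algorithm; in practice the edge weights encode mesh anisotropy (e.g. cell aspect ratio) so that lines align with the direction of strong coupling:

```python
# Greedy "line" construction on a weighted graph: each line follows the
# locally strongest edge until no unvisited neighbour remains.

def build_lines(weights):
    """weights: dict mapping frozenset({u, v}) -> positive edge weight."""
    neighbours = {}
    for edge, w in weights.items():
        u, v = tuple(edge)
        neighbours.setdefault(u, []).append((v, w))
        neighbours.setdefault(v, []).append((u, w))
    visited, lines = set(), []
    for start in sorted(neighbours):
        if start in visited:
            continue
        line, node = [start], start
        visited.add(start)
        while True:
            # strongest edge to an unvisited neighbour, if any
            cand = [(w, v) for v, w in neighbours[node] if v not in visited]
            if not cand:
                break
            w, node = max(cand)
            line.append(node)
            visited.add(node)
        lines.append(line)
    return lines

# a chain of strong edges (10, 9, 8) with one weak cross-edge (1)
lines = build_lines({frozenset({"a", "b"}): 10, frozenset({"b", "c"}): 9,
                     frozenset({"a", "c"}): 1, frozenset({"c", "d"}): 8})
```

The directional agglomeration step in the abstract applies the same weighted-graph idea to coarsening, which is what lets the two phases be coupled tightly.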
Using graph approach for managing connectivity in integrative landscape modelling
NASA Astrophysics Data System (ADS)
Rabotin, Michael; Fabre, Jean-Christophe; Libres, Aline; Lagacherie, Philippe; Crevoisier, David; Moussa, Roger
2013-04-01
In cultivated landscapes, many landscape elements such as field boundaries, ditches or banks strongly impact water flows and mass and energy fluxes. At the watershed scale, these impacts are strongly conditioned by the connectivity of these landscape elements. An accurate representation of these elements and of their complex spatial arrangements is therefore of great importance for modelling and predicting these impacts. We developed, in the framework of the OpenFLUID platform (Software Environment for Modelling Fluxes in Landscapes), a digital landscape representation that takes into account the spatial variability and connectivity of diverse landscape elements through the application of graph theory concepts. The proposed landscape representation considers spatial units connected together to represent flux exchanges or any other information exchanges. Each spatial unit of the landscape is represented as a node of a graph and relations between units as graph connections. The connections are of two types - parent-child connections and up/downstream connections - which allows OpenFLUID to handle hierarchical graphs. Connections can also carry information, and the graph can evolve during simulation (modification of connections or elements). This graph approach allows greater genericity in landscape representation, management of complex connections, and easier development of new landscape representation algorithms. Graph management is fully operational in OpenFLUID for developers and modellers, and several graph tools are available, such as graph traversal algorithms and graph displays. The graph representation can be managed i) manually by the user (for example in simple catchments) through XML-based files in an easily editable and readable format, or ii) by using methods of the OpenFLUID-landr library, an OpenFLUID library relying on common open-source spatial libraries (ogr vector, geos topologic vector and gdal raster libraries).
The OpenFLUID-landr library has been developed i) to be usable without GIS expert skills (common GIS formats can be read and simplified spatial management is provided), ii) to make it easy to develop rules of landscape discretization and graph creation adapted to spatialized model requirements, and iii) to allow model developers to manage dynamic and complex spatial topology. Graph management in OpenFLUID is illustrated with i) examples of hydrological modelling of complex farmed landscapes and ii) the new implementation of the Geo-MHYDAS tool, based on the OpenFLUID-landr library, which discretizes a landscape and creates the graph structure required by the MHYDAS model.
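The two connection types described above can be sketched with a tiny node class holding both a hierarchy link and a flux link. The class and method names here are illustrative, not the actual OpenFLUID API:

```python
# Spatial units as graph nodes with two kinds of connections:
# parent-child (hierarchy) and up/downstream (flux routing).

class SpatialUnit:
    def __init__(self, name):
        self.name = name
        self.children = []    # parent-child connections
        self.downstream = []  # up/downstream connections

    def add_child(self, unit):
        self.children.append(unit)

    def connect_downstream(self, unit):
        self.downstream.append(unit)

def flow_path(unit):
    """Follow downstream links from a unit to the outlet."""
    path = [unit.name]
    while unit.downstream:
        unit = unit.downstream[0]
        path.append(unit.name)
    return path

catchment = SpatialUnit("catchment")
field, ditch, outlet = SpatialUnit("field"), SpatialUnit("ditch"), SpatialUnit("outlet")
for u in (field, ditch, outlet):
    catchment.add_child(u)          # hierarchical graph
field.connect_downstream(ditch)     # flux graph
ditch.connect_downstream(outlet)
```

Keeping both relation types on the same nodes is what lets a single graph answer both "which units belong to this catchment" and "where does water from this field go".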
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oppel, III, Fred; Hart, Brian; Hart, Derek
Umbra is a software package that has been in development at Sandia National Laboratories since 1995, under the name Umbra since 1997. Umbra is a software framework written in C++ and Tcl/Tk that has been applied to many operations, primarily dealing with robotics and simulation. Umbra executables are C++ libraries orchestrated with Tcl/Tk scripts. Two major feature upgrades occurred from 4.7 to 4.8: 1. a System Umbra Module with its own Update Graph within the C++ framework, and 2. a new terrain graph for fast line-of-sight calculations. All other changes were minor updates, such as later versions of Visual Studio, OpenSceneGraph and Boost.
On the structure of critical energy levels for the cubic focusing NLS on star graphs
NASA Astrophysics Data System (ADS)
Adami, Riccardo; Cacciapuoti, Claudio; Finco, Domenico; Noja, Diego
2012-05-01
We provide information on a non-trivial structure of phase space of the cubic nonlinear Schrödinger (NLS) on a three-edge star graph. We prove that, in contrast to the case of the standard NLS on the line, the energy associated with the cubic focusing Schrödinger equation on the three-edge star graph with a free (Kirchhoff) vertex does not attain a minimum value on any sphere of constant L2-norm. We moreover show that the only stationary state with prescribed L2-norm is indeed a saddle point.
Translations on Eastern Europe Scientific Affairs, Number 560
1977-10-04
Miklos Szilagyi. TAPNEG; prepares digitalized printed wiring diagram control punch tape on an ADMAP-2 graphing machine with reflection on the x axis...FOKAL 16 KE; BME, Dr Zsolt Illyefalvi-Vitez; BME, Dr Miklos Szilagyi. TESTOP-10; the program provides measurement and diagnostics for logic cards
NASA Astrophysics Data System (ADS)
Garciá-Arteaga, Juan D.; Corredor, Germán.; Wang, Xiangxue; Velcheti, Vamsidhar; Madabhushi, Anant; Romero, Eduardo
2017-11-01
Tumor-infiltrating lymphocytes (TILs) occur when various classes of white blood cells migrate from the blood stream toward the tumor, infiltrating it. The presence of TILs is predictive of the patient's response to therapy. In this paper, we show how the automatic detection of lymphocytes in digital H&E histopathology images, together with the quantitative evaluation of the global lymphocyte configuration (evaluated through global features extracted from non-parametric graphs constructed from the lymphocytes' detected positions), can be correlated to the patient's outcome in early-stage non-small cell lung cancer (NSCLC). The method was assessed on a tissue microarray cohort composed of 63 NSCLC cases. Of the evaluated graphs, minimum spanning trees and K-nn showed the highest predictive ability, yielding F1 scores of 0.75 and 0.72 and accuracies of 0.67 and 0.69, respectively. The predictive power of the proposed methodology indicates that graphs may be used to develop objective measures of the infiltration grade of tumors, which can, in turn, be used by pathologists to improve decision making and treatment planning.
Sharma, Harshita; Alekseychuk, Alexander; Leskovsky, Peter; Hellwich, Olaf; Anand, R S; Zerbe, Norman; Hufnagl, Peter
2012-10-04
Computer-based analysis of digitalized histological images has been gaining increasing attention, due to their extensive use in research and routine practice. The article aims to contribute towards the description and retrieval of histological images by employing a structural method using graphs. Due to their expressive ability, graphs are considered a powerful and versatile representation formalism and have gained growing consideration, especially in the image processing and computer vision community. The article describes a novel method for determining similarity between histological images through graph-theoretic description and matching, for the purpose of content-based retrieval. A higher-order (region-based) graph representation of breast biopsy images has been attained, and a tree-search based inexact graph matching technique has been employed that facilitates the automatic retrieval of images structurally similar to a given image from large databases. The results obtained and the evaluation performed demonstrate the effectiveness and superiority of graph-based image retrieval over a common histogram-based technique. The graph matching complexity has been reduced compared to state-of-the-art optimal inexact matching methods by applying a prerequisite criterion for matching of nodes and a sophisticated design of the estimation function, especially the prognosis function. The proposed method is suitable for the retrieval of similar histological images, as suggested by the experimental and evaluation results obtained in the study. It is intended for use in applications requiring Content-Based Image Retrieval (CBIR) in the areas of medical diagnostics and research, and can also be generalized for retrieval of different types of complex images. The virtual slide(s) for this article can be found here: http://www.diagnosticpathology.diagnomx.eu/vs/1224798882787923.
2012-01-01
Background Computer-based analysis of digitalized histological images has been gaining increasing attention, due to their extensive use in research and routine practice. The article aims to contribute towards the description and retrieval of histological images by employing a structural method using graphs. Due to their expressive ability, graphs are considered a powerful and versatile representation formalism and have gained growing consideration, especially in the image processing and computer vision community. Methods The article describes a novel method for determining similarity between histological images through graph-theoretic description and matching, for the purpose of content-based retrieval. A higher-order (region-based) graph representation of breast biopsy images has been attained, and a tree-search based inexact graph matching technique has been employed that facilitates the automatic retrieval of images structurally similar to a given image from large databases. Results The results obtained and the evaluation performed demonstrate the effectiveness and superiority of graph-based image retrieval over a common histogram-based technique. The graph matching complexity has been reduced compared to state-of-the-art optimal inexact matching methods by applying a prerequisite criterion for matching of nodes and a sophisticated design of the estimation function, especially the prognosis function. Conclusion The proposed method is suitable for the retrieval of similar histological images, as suggested by the experimental and evaluation results obtained in the study. It is intended for use in applications requiring Content-Based Image Retrieval (CBIR) in the areas of medical diagnostics and research, and can also be generalized for retrieval of different types of complex images. Virtual Slides The virtual slide(s) for this article can be found here: http://www.diagnosticpathology.diagnomx.eu/vs/1224798882787923. PMID:23035717
NASA Astrophysics Data System (ADS)
Buick, Otto; Falcon, Pat; Alexander, G.; Siegel, Edward Carl-Ludwig
2013-03-01
Einstein[Dover(03)] critical-slowing-down(CSD)[Pais, Subtle in The Lord; Life & Sci. of Albert Einstein(81)] is Siegel CyberWar denial-of-access(DOA) operations-research queuing theory/pinning/jamming/.../Read [Aikido, Aikibojitsu & Natural-Law(90)]/Aikido(!!!) phase-transition critical-phenomenon via Siegel DIGIT-Physics (Newcomb[Am.J.Math. 4,39(1881)]-{Planck[(1901)]-Einstein[(1905)])-Poincare[Calcul Probabilités(12)-p.313]-Weyl [Goett.Nachr.(14); Math.Ann.77,313 (16)]-{Bose[(24)-Einstein[(25)]-Fermi[(27)]-Dirac[(1927)]}-``Benford''[Proc.Am.Phil.Soc. 78,4,551 (38)]-Kac[Maths.Stat.-Reasoning(55)]-Raimi[Sci.Am. 221,109 (69)...]-Jech[preprint, PSU(95)]-Hill[Proc.AMS 123,3,887(95)]-Browne[NYT(8/98)]-Antonoff-Smith-Siegel[AMS Joint-Mtg.,S.-D.(02)] algebraic-inversion to yield ONLY BOSE-EINSTEIN QUANTUM-statistics (BEQS) with ZERO-digit Bose-Einstein CONDENSATION(BEC) ``INTERSECTION''-BECOME-UNION to Barabasi[PRL 876,5632(01); Rev.Mod.Phys.74,47(02)...] Network /Net/GRAPH(!!!)-physics BEC: Strutt/Rayleigh(1881)-Polya(21)-``Anderson''(58)-Siegel[J.Non-crystalline-Sol.40,453(80)
Graphs in kinematics—a need for adherence to principles of algebraic functions
NASA Astrophysics Data System (ADS)
Sokolowski, Andrzej
2017-11-01
Graphs in physics are central to the analysis of phenomena and to learning about a system's behavior. The ways students handle graphs are frequently researched: students' misconceptions are highlighted, and methods of improvement suggested. While kinematics graphs are meant to represent real motion, they are also algebraic entities that must satisfy the conditions for being algebraic functions; they must pass certain tests before they can be used to infer more about the motion. A preliminary survey of some physics resources revealed that little attention is paid to verifying whether the position, velocity and acceleration versus time graphs that are meant to depict real motion satisfy the most critical condition for being an algebraic function: the vertical line test. The lack of adherence shows up as vertical segments in piecewise graphs. Such graphs generate unrealistic interpretations and may confuse students. A group of 25 college physics students was provided with such a graph and asked to analyse its adherence to reality. The majority of the students (N = 16, 64%) questioned the graph's validity. It is inferred that such graphs may not only jeopardize the function principles studied in mathematics but also undermine the purpose of studying these principles. The aim of this study is to bring this idea forward and to suggest a better alignment of physics and mathematics methods.
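The vertical line test discussed above has a direct computational reading: a sampled graph represents a function of time only if no time value maps to two different positions. A minimal sketch (function name illustrative):

```python
# Vertical line test on a sampled kinematics graph: reject any relation
# that assigns two distinct positions to the same instant.

def passes_vertical_line_test(points, tol=1e-9):
    """points: iterable of (t, x) samples of a kinematics graph."""
    seen = {}
    for t, x in points:
        if t in seen and abs(seen[t] - x) > tol:
            return False  # a vertical segment: two positions at one time
        seen[t] = x
    return True

ok = passes_vertical_line_test([(0, 0), (1, 2), (2, 4)])
bad = passes_vertical_line_test([(0, 0), (1, 2), (1, 5)])  # jump at t = 1
```

The second example is exactly the kind of piecewise graph with a vertical segment that the study argues confuses students.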
Quantifying loopy network architectures.
Katifori, Eleni; Magnasco, Marcelo O
2012-01-01
Biology presents many examples of planar distribution and structural networks having dense sets of closed loops. An archetype of this form of network organization is the vasculature of dicotyledonous leaves, which showcases a hierarchically-nested architecture containing closed loops at many different levels. Although a number of approaches have been proposed to measure aspects of the structure of such networks, a robust metric to quantify their hierarchical organization is still lacking. We present an algorithmic framework, the hierarchical loop decomposition, that allows mapping loopy networks to binary trees, preserving in the connectivity of the trees the architecture of the original graph. We apply this framework to investigate computer generated graphs, such as artificial models and optimal distribution networks, as well as natural graphs extracted from digitized images of dicotyledonous leaves and vasculature of rat cerebral neocortex. We calculate various metrics based on the asymmetry, the cumulative size distribution and the Strahler bifurcation ratios of the corresponding trees and discuss the relationship of these quantities to the architectural organization of the original graphs. This algorithmic framework decouples the geometric information (exact location of edges and nodes) from the metric topology (connectivity and edge weight) and it ultimately allows us to perform a quantitative statistical comparison between predictions of theoretical models and naturally occurring loopy graphs.
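One of the tree metrics the abstract mentions is the Strahler bifurcation structure of the resulting binary trees. A minimal sketch of the Strahler order (the input convention here, nested tuples with `None` for leaves, is illustrative):

```python
# Strahler order of a binary tree: a leaf has order 1; an internal node
# has order k + 1 if both children have order k, else the larger order.

def strahler(tree):
    if tree is None:  # leaf
        return 1
    left, right = tree
    a, b = strahler(left), strahler(right)
    return a + 1 if a == b else max(a, b)

orders = [
    strahler(None),                          # single leaf
    strahler((None, None)),                  # one bifurcation
    strahler(((None, None), (None, None))),  # balanced depth-2 tree
    strahler(((None, None), None)),          # asymmetric tree
]
```

Bifurcation ratios are then obtained by counting branches of each Strahler order in the tree produced by the hierarchical loop decomposition.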
A Teacher's Journey with a New Generation Handheld: Decisions, Struggles, and Accomplishments
ERIC Educational Resources Information Center
Ozgun-Koca, S. Asli; Meagher, Michael; Edwards, Michael Todd
2011-01-01
In this technology-oriented age, teachers face daily decisions regarding the use of advanced digital technologies--graphing calculators, dynamic geometry software, blogs, wikis, podcasts and the like--to enhance student mathematical understanding in their classrooms. In this case study, the authors use the Technological, Pedagogical, and Content…
jSquid: a Java applet for graphical on-line network exploration.
Klammer, Martin; Roopra, Sanjit; Sonnhammer, Erik L L
2008-06-15
jSquid is a graph visualization tool for exploring graphs from protein-protein interaction or functional coupling networks. The tool was designed for the FunCoup web site, but can be used for any similar network exploration purpose. The program offers various visualization and graph manipulation techniques to increase its utility for the user. jSquid is available for direct usage and download at http://jSquid.sbc.su.se, including source code under the GPLv3 license and input examples. It requires Java version 5 or higher to run properly. Contact: erik.sonnhammer@sbc.su.se. Supplementary data are available at Bioinformatics online.
A Benes-like theorem for the shuffle-exchange graph
NASA Technical Reports Server (NTRS)
Schwabe, Eric J.
1992-01-01
One of the first theorems on permutation routing, proved by V. E. Benes (1965), shows that given a set of source-destination pairs in an N-node butterfly network with at most a constant number of sources or destinations in each column of the butterfly, there exists a set of paths of lengths O(log N) connecting each pair such that the total congestion is constant. An analogous theorem yielding constant-congestion paths for off-line routing in the shuffle-exchange graph is proved here. The necklaces of the shuffle-exchange graph play the same structural role as the columns of the butterfly in Benes' theorem.
Development of a 14-digit Hydrologic Unit Code Numbering System for South Carolina
Bower, David E.; Lowry, Claude; Lowery, Mark A.; Hurley, Noel M.
1999-01-01
A Hydrologic Unit Map showing the cataloging units, watersheds, and subwatersheds of South Carolina has been developed by the U.S. Geological Survey in cooperation with the South Carolina Department of Health and Environmental Control, funded through a U.S. Environmental Protection Agency 319 Grant, and the U.S. Department of Agriculture, Natural Resources Conservation Service. These delineations represent 8-, 11-, and 14-digit Hydrologic Unit Codes, respectively. This map presents information on drainage, hydrography, and hydrologic boundaries of the water-resources regions, subregions, accounting units, cataloging units, watersheds, and subwatersheds. The source maps for the basin delineations are 1:24,000-scale 7.5-minute series topographic maps and the base maps are from 1:100,000-scale Digital Line Graphs; however, the data are published at a scale of 1:500,000. In addition, an electronic version of the data is provided on a compact disc. Of the 1,022 subwatersheds delineated for this project, 1,004 range in size from 3,000 to 40,000 acres (4.69 to 62.5 square miles). Seventeen subwatersheds are smaller than 3,000 acres and one subwatershed, located on St. Helena Island, is larger than 40,000 acres. This map and its associated codes provide a standardized base for use by water-resource managers and planners in locating, storing, retrieving, and exchanging hydrologic data. In addition, the map can be used for cataloging water-data acquisition activities, geographically organizing hydrologic data, and planning and describing water-use and related land-use activities.
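The nesting of the 8-, 11-, and 14-digit codes can be sketched as a simple prefix split, assuming the standard USGS/NRCS layout (2+2+2+2 digits for region, subregion, accounting unit, and cataloging unit, then 3 digits each for watershed and subwatershed). The example code below is made up for illustration, not an actual South Carolina HUC:

```python
# Split a 14-digit hydrologic unit code into its nested levels; each
# level is a prefix of the next, which is what makes the codes usable
# as a hierarchical index for hydrologic data.

def split_huc14(code):
    assert len(code) == 14 and code.isdigit()
    return {
        "region": code[:2],
        "subregion": code[:4],
        "accounting_unit": code[:6],
        "cataloging_unit": code[:8],   # the 8-digit HUC
        "watershed": code[:11],        # the 11-digit HUC
        "subwatershed": code,          # the 14-digit HUC
    }

parts = split_huc14("03050101020010")  # hypothetical code
```

Because each level is a prefix, a plain string sort on the 14-digit codes groups records by region, then subregion, and so on down to subwatershed.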
Rate-Compatible Protograph LDPC Codes
NASA Technical Reports Server (NTRS)
Nguyen, Thuy V. (Inventor); Nosratinia, Aria (Inventor); Divsalar, Dariush (Inventor)
2014-01-01
Digital communication coding methods resulting in rate-compatible low density parity-check (LDPC) codes built from protographs. Described digital coding methods start with a desired code rate and a selection of the numbers of variable nodes and check nodes to be used in the protograph. Constraints are set to satisfy a linear minimum distance growth property for the protograph. All possible edges in the graph are searched for the minimum iterative decoding threshold and the protograph with the lowest iterative decoding threshold is selected. Protographs designed in this manner are used in decode and forward relay channels.
User’s Manual for the Modular Analysis-Package Libraries ANAPAC and TRANL
1977-09-01
Keywords: computer software, Fourier transforms, computer software library, interpolation software, digitized data. ...disregarded to give the user a simplified plot. (b) The last digit of ISPACE determines the type of line to be drawn, provided KODE is not negative. If the last digit of ISPACE is: 0, a solid line is drawn; 1, a dashed line (- - -); 2, a dotted line (....); 3, a dash-dot line is
GenomeGraphs: integrated genomic data visualization with R.
Durinck, Steffen; Bullard, James; Spellman, Paul T; Dudoit, Sandrine
2009-01-06
Biological studies involve a growing number of distinct high-throughput experiments to characterize samples of interest. There is a lack of methods to visualize these different genomic datasets in a versatile manner. In addition, genomic data analysis requires integrated visualization of experimental data along with constantly changing genomic annotation and statistical analyses. We developed GenomeGraphs, as an add-on software package for the statistical programming environment R, to facilitate integrated visualization of genomic datasets. GenomeGraphs uses the biomaRt package to perform on-line annotation queries to Ensembl and translates these to gene/transcript structures in viewports of the grid graphics package. This allows genomic annotation to be plotted together with experimental data. GenomeGraphs can also be used to plot custom annotation tracks in combination with different experimental data types together in one plot using the same genomic coordinate system. GenomeGraphs is a flexible and extensible software package which can be used to visualize a multitude of genomic datasets within the statistical programming environment R.
Computing Strongly Connected Components in the Streaming Model
NASA Astrophysics Data System (ADS)
Laura, Luigi; Santaroni, Federico
In this paper we present the first algorithm to compute the Strongly Connected Components of a graph in the data stream model (W-Stream), where the graph is represented by a stream of edges and we are allowed to produce intermediate output streams. The algorithm is simple, effective, and can be implemented with few lines of code: it looks at each edge in the stream, and selects the appropriate action with respect to a tree T representing the graph connectivity seen so far. We analyze the theoretical properties of the algorithm: correctness, memory occupation (O(n log n)), per-item processing time (bounded by the current height of T), and number of passes (bounded by the maximal height of T). We conclude by presenting a brief experimental evaluation of the algorithm against massive synthetic and real graphs that confirms its effectiveness: on graphs with up to 100M nodes and 4G edges, only a few passes are needed, and millions of edges per second are processed.
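The flavor of one-pass edge-stream processing can be illustrated with a much simpler relative of the paper's algorithm: undirected connectivity via union-find, which also keeps only O(n) state while each edge is seen once and discarded (the full SCC algorithm maintains a tree and needs several passes for directed components):

```python
# One-pass connectivity over an edge stream with union-find.
# Only the parent map (O(n) memory) persists; edges are not stored.

parent = {}

def find(v):
    parent.setdefault(v, v)
    while parent[v] != v:
        parent[v] = parent[parent[v]]  # path halving
        v = parent[v]
    return v

def union(u, v):
    ru, rv = find(u), find(v)
    if ru != rv:
        parent[ru] = rv

edge_stream = [(1, 2), (3, 4), (2, 3), (5, 6)]
for u, v in edge_stream:   # each edge is seen once, then discarded
    union(u, v)

components = {find(v) for v in parent}
```

Strong connectivity is harder precisely because edge direction matters, which is why the W-Stream algorithm needs intermediate output streams and multiple passes.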
Border and surface tracing--theoretical foundations.
Brimkov, Valentin E; Klette, Reinhard
2008-04-01
In this paper we define and study digital manifolds of arbitrary dimension, and provide (in particular) a general theoretical basis for curve or surface tracing in picture analysis. The studies involve properties such as one-dimensionality of digital curves and (n-1)-dimensionality of digital hypersurfaces that makes them discrete analogs of corresponding notions in continuous topology. The presented approach is fully based on the concept of adjacency relation and complements the concept of dimension as common in combinatorial topology. This work appears to be the first one on digital manifolds based on a graph-theoretical definition of dimension. In particular, in the n-dimensional digital space, a digital curve is a one-dimensional object and a digital hypersurface is an (n-1)-dimensional object, as it is in the case of curves and hypersurfaces in the Euclidean space. Relying on the obtained properties of digital hypersurfaces, we propose a uniform approach for studying good pairs defined by separations and obtain a classification of good pairs in arbitrary dimension. We also discuss possible applications of the presented definitions and results.
Min-cut segmentation of cursive handwriting in tabular documents
NASA Astrophysics Data System (ADS)
Davis, Brian L.; Barrett, William A.; Swingle, Scott D.
2015-01-01
Handwritten tabular documents, such as census, birth, death and marriage records, contain a wealth of information vital to genealogical and related research. Much work has been done in segmenting freeform handwriting, however, segmentation of cursive handwriting in tabular documents is still an unsolved problem. Tabular documents present unique segmentation challenges caused by handwriting overlapping cell-boundaries and other words, both horizontally and vertically, as "ascenders" and "descenders" overlap into adjacent cells. This paper presents a method for segmenting handwriting in tabular documents using a min-cut/max-flow algorithm on a graph formed from a distance map and connected components of handwriting. Specifically, we focus on line, word and first letter segmentation. Additionally, we include the angles of strokes of the handwriting as a third dimension to our graph to enable the resulting segments to share pixels of overlapping letters. Word segmentation accuracy is 89.5% evaluating lines of the data set used in the ICDAR2013 Handwriting Segmentation Contest. Accuracy is 92.6% for a specific application of segmenting first and last names from noisy census records. Accuracy for segmenting lines of names from noisy census records is 80.7%. The 3D graph cutting shows promise in segmenting overlapping letters, although highly convoluted or overlapping handwriting remains an ongoing challenge.
Safeguarding End-User Military Software
2014-12-04
product lines using compositional symbolic execution [17]. Software product lines are families of products defined by feature commonality and variability, with a well-managed asset base. Recent work in testing of software product lines has exploited similarities across development phases to reuse...feature dependence graph to extract the set of possible interaction trees in a product family. It composes these to incrementally and symbolically
Graph drawing using tabu search coupled with path relinking.
Dib, Fadi K; Rodgers, Peter
2018-01-01
Graph drawing, or the automatic layout of graphs, is a challenging problem. There are several search based methods for graph drawing which are based on optimizing an objective function which is formed from a weighted sum of multiple criteria. In this paper, we propose a new neighbourhood search method which uses a tabu search coupled with path relinking to optimize such objective functions for general graph layouts with undirected straight lines. To our knowledge, before our work, neither of these methods have been previously used in general multi-criteria graph drawing. Tabu search uses a memory list to speed up searching by avoiding previously tested solutions, while the path relinking method generates new solutions by exploring paths that connect high quality solutions. We use path relinking periodically within the tabu search procedure to speed up the identification of good solutions. We have evaluated our new method against the commonly used neighbourhood search optimization techniques: hill climbing and simulated annealing. Our evaluation examines the quality of the graph layout (objective function's value) and the speed of layout in terms of the number of evaluated solutions required to draw a graph. We also examine the relative scalability of each method. Our experimental results were applied to both random graphs and a real-world dataset. We show that our method outperforms both hill climbing and simulated annealing by producing a better layout in a lower number of evaluated solutions. In addition, we demonstrate that our method has greater scalability as it can layout larger graphs than the state-of-the-art neighbourhood search methods. Finally, we show that similar results can be produced in a real world setting by testing our method against a standard public graph dataset.
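The tabu search component described above can be sketched as a hill-climb over node positions with a bounded memory list of recently visited layouts. This is a bare toy with a single criterion (squared edge length); the paper's method optimizes a weighted sum of multiple criteria and adds path relinking between elite solutions:

```python
# Toy tabu search for graph layout: perturb one node at a time, accept
# only improving moves whose layout is not on the tabu list.

import random

def edge_length_cost(pos, edges):
    return sum((pos[u][0] - pos[v][0]) ** 2 + (pos[u][1] - pos[v][1]) ** 2
               for u, v in edges)

def tabu_layout(nodes, edges, steps=500, tabu_size=50, seed=0):
    rng = random.Random(seed)
    pos = {v: (rng.uniform(0, 10), rng.uniform(0, 10)) for v in nodes}
    best, best_cost = dict(pos), edge_length_cost(pos, edges)
    tabu = []
    for _ in range(steps):
        v = rng.choice(nodes)
        old = pos[v]
        pos[v] = (old[0] + rng.uniform(-1, 1), old[1] + rng.uniform(-1, 1))
        # coarse rounding so "the same layout" is recognizable
        key = tuple(sorted((u, (round(p[0], 1), round(p[1], 1)))
                           for u, p in pos.items()))
        cost = edge_length_cost(pos, edges)
        if key in tabu or cost >= best_cost:
            pos[v] = old               # reject tabu or worsening moves
        else:
            best, best_cost = dict(pos), cost
            tabu.append(key)
            if len(tabu) > tabu_size:  # bounded memory list
                tabu.pop(0)
    return best, best_cost

layout, cost = tabu_layout(["a", "b", "c"], [("a", "b"), ("b", "c")])
```

Path relinking would periodically generate intermediate layouts along a path between two good solutions found this way, which is what speeds up the identification of high-quality layouts in the paper.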
Digitizing the Past: A History Book on CD-ROM.
ERIC Educational Resources Information Center
Rosenzweig, Roy
1993-01-01
Describes the development of an American history book with interactive CD-ROM technology that includes text, pictures, graphs and charts, audio, and film. Topics discussed include the use of HyperCard software to link information; access to primary sources of information; greater student control over learning; and the concept of collaborative…
Probeware: Illuminating the Invisible
ERIC Educational Resources Information Center
Brunsell, Eric; Horejsi, Martin
2010-01-01
Probeware is the combination of sensors and software connected to a computer or handheld device. At the heart of probeware is a digital sensor that measures a particular physical parameter. One or more sensors are plugged into an interface connected to a computer, a calculator, or a handheld device. With the click of a button, probeware collects and graphs data in real…
ERIC Educational Resources Information Center
Chen, Nian-Shing; Teng, Daniel Chia-En; Lee, Cheng-Han; Kinshuk
2011-01-01
Comprehension is the goal of reading. However, students often encounter reading difficulties due to the lack of background knowledge and proper reading strategy. Unfortunately, print text provides very limited assistance to one's reading comprehension through its static knowledge representations such as symbols, charts, and graphs. Integrating…
Computational Fact Checking by Mining Knowledge Graphs
ERIC Educational Resources Information Center
Shiralkar, Prashant
2017-01-01
Misinformation and rumors have become rampant on online social platforms with adverse consequences for the real world. Fact-checking efforts are needed to mitigate the risks associated with the spread of digital misinformation. However, the pace at which information is generated online limits the capacity to fact-check claims at the same rate…
False Reality or Hidden Messages: Reading Graphs Obtained in Computerized Biological Experiments
ERIC Educational Resources Information Center
Sorgo, Andrej; Kocijancic, Slavko
2012-01-01
Information and communication technology (ICT) has become an inseparable part of schoolwork and a goal of education to prepare scientifically literate and digitally competent citizens. Yet the introduction of computers into school work has been much slower than its introduction in other spheres of life. Teachers' lack of knowledge/skills and…
Saund, Eric
2013-10-01
Effective object and scene classification and indexing depend on extraction of informative image features. This paper shows how large families of complex image features in the form of subgraphs can be built out of simpler ones through construction of a graph lattice—a hierarchy of related subgraphs linked in a lattice. Robustness is achieved by matching many overlapping and redundant subgraphs, which allows the use of inexpensive exact graph matching, instead of relying on expensive error-tolerant graph matching to a minimal set of ideal model graphs. Efficiency in exact matching is gained by exploitation of the graph lattice data structure. Additionally, the graph lattice enables methods for adaptively growing a feature space of subgraphs tailored to observed data. We develop the approach in the domain of rectilinear line art, specifically for the practical problem of document forms recognition. We are especially interested in methods that require only one or very few labeled training examples per category. We demonstrate two approaches to using the subgraph features for this purpose. Using a bag-of-words feature vector we achieve essentially single-instance learning on a benchmark forms database, following an unsupervised clustering stage. Further performance gains are achieved on a more difficult dataset using a feature voting method and feature selection procedure.
Symposium on Chemical Applications of Topology and Graph Theory, April 18-22, 1983.
1983-04-01
illustrated by application to the Lotka-Volterra oscillator. ELECTRICAL NETWORK REPRESENTATION OF n-DIMENSIONAL CHEMICAL MANIFOLDS L. Peusner P.O. Box 380...like molecules and others; the original formulas by Cayley were extended by Polya in a general enumeration theorem, simplified by Otter, and also studied...Gutman, leading to a joint paper which generalized it, using line graphs. Finally, electronegativity considerations tell the strength of a chemical
Digital Model of Railway Electric Traction Lines
NASA Astrophysics Data System (ADS)
Garg, Rachana; Mahajan, Priya; Kumar, Parmod
2017-08-01
The characteristic impedance and propagation constant define the behavior of signal propagation over transmission lines. A digital model for railway traction lines, including the railway tracks, is developed using a curve-fitting technique in MATLAB. The sensitivity of this model has been computed with respect to frequency, and the digital sensitivity values are compared with the analog sensitivity values. The developed model is useful for digital protection, integrated operation, control, and planning of the system.
Temporal dynamics and impact of event interactions in cyber-social populations
NASA Astrophysics Data System (ADS)
Zhang, Yi-Qing; Li, Xiang
2013-03-01
The advance of information technologies provides powerful measures to digitize social interactions and facilitate quantitative investigations. To explore large-scale indoor interactions of a social population, we analyze 18 715 users' Wi-Fi access logs recorded in a Chinese university campus during 3 months, and define event interaction (EI) to characterize the concurrent interactions of multiple users inferred by their geographic coincidences—co-locating in the same small region at the same time. We propose three rules to construct a transmission graph, which depicts the topological and temporal features of event interactions. The vertex dynamics of the transmission graph show that the active durations of EIs fall into truncated power-law distributions, independent of the number of involved individuals. The edge dynamics of the transmission graph show that the transmission durations present a truncated power-law pattern independent of the daily and weekly periodicities. Moreover, in the aggregated transmission graph, low-degree vertices previously neglected in aggregated static networks may participate in large-degree EIs, which is verified by three data sets covering different sizes of social populations with various rendezvouses. This work highlights the temporal significance of event interactions in cyber-social populations.
NASA Astrophysics Data System (ADS)
Zhou, Lifan; Chai, Dengfeng; Xia, Yu; Ma, Peifeng; Lin, Hui
2018-01-01
Phase unwrapping (PU) is one of the key processes in reconstructing the digital elevation model of a scene from its interferometric synthetic aperture radar (InSAR) data. It is known that two-dimensional (2-D) PU problems can be formulated as maximum a posteriori estimation of Markov random fields (MRFs). However, considering that the traditional MRF algorithm is usually defined on a rectangular grid, it fails easily if large parts of the wrapped data are dominated by noise caused by large low-coherence area or rapid-topography variation. A PU solution based on sparse MRF is presented to extend the traditional MRF algorithm to deal with sparse data, which allows the unwrapping of InSAR data dominated by high phase noise. To speed up the graph cuts algorithm for sparse MRF, we designed dual elementary graphs and merged them to obtain the Delaunay triangle graph, which is used to minimize the energy function efficiently. The experiments on simulated and real data, compared with other existing algorithms, both confirm the effectiveness of the proposed MRF approach, which suffers less from decorrelation effects caused by large low-coherence area or rapid-topography variation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
McVea, G.G.; Power, A.J.
1995-04-01
USA Military Specification MIL-D-22612 provides a procedure for measurement of particulate levels in Naval aviation gas turbine engine JP5 fuel (F44; RAN AVCAT) using the contaminated fuel detector (CFD). Evaluation of this procedure within the specification has revealed significant shortcomings in the application of the theoretical principles upon which the method is based. CFD measurements have been compared to gravimetric results from ASTM D2276, which provides accurate determination of concentrations of particulate matter in JP5. Inaccuracies evident in the CFD readings have been found to relate to the high sensitivity of the CFD to variations in fuel particulate extinction coefficients (ECs) (relating to fuel sediment colour) and to an error in the application of light transmittance theory in the recommended method. This report demonstrates that accurate CFD determination of JP5 particulate concentrations depends on spectrophotometric measurement of a narrow range of ECs of particulate matter. A range of fuel sediments derived from Australian naval ship and shore fuel storages was studied. It was observed that the CFD plot, which is in light transmittance mode, in theory provides a curved line graph against the gravimetric test results, whereas MIL-D-22612 describes a straight line graph. It was concluded that this must be an approximation. However, conversion of light transmittance data derived from the CFD into the reciprocal logarithm to give light absorbance data was shown to give a straight line graph which corresponded well with the gravimetric results. This relationship depended on construction of the graph on the basis of a narrow range of known particulate ECs. The conversion to absorbance gave improved correlation for JP5 particulate measurements with gravimetric procedures, using the CFD.
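The report's central correction (plotting light absorbance rather than transmittance to obtain a straight-line relation with gravimetric concentration) follows from the Beer-Lambert law, A = -log10(T). A minimal sketch, with hypothetical readings chosen only to illustrate the linearization:

```python
import math

def transmittance_to_absorbance(t):
    """Beer-Lambert: absorbance is the negative base-10 log of transmittance."""
    return -math.log10(t)

# Hypothetical CFD transmittance readings paired with gravimetric
# concentrations (mg/L); illustrative values only, not report data.
readings = [(0.90, 0.5), (0.81, 1.0), (0.66, 2.0), (0.53, 3.0)]

# In absorbance form the relation with concentration is approximately
# linear (A ~ k * c), so a straight-line graph applies.
pairs = [(transmittance_to_absorbance(t), c) for t, c in readings]
ratios = [a / c for a, c in pairs]  # roughly constant if the relation is linear
```

Plotting transmittance directly against concentration gives a curve; the log conversion is what straightens it, matching the report's conclusion.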
Interactive graphics for the Macintosh: software review of FlexiGraphs.
Antonak, R F
1990-01-01
While this product is clearly unique, its usefulness to individuals outside small business environments is somewhat limited. FlexiGraphs is, however, a reasonable first attempt to design a microcomputer software package that controls data through interactive editing within a graph. Although the graphics capabilities of mainframe programs such as MINITAB (Ryan, Joiner, & Ryan, 1981) and the graphic manipulations available through exploratory data analysis (e.g., Velleman & Hoaglin, 1981) will not be surpassed anytime soon by this program, a researcher may want to add this program to a software library containing other Macintosh statistics, drawing, and graphics programs, if only for its easy-to-use curve-fitting and line-smoothing options. I welcome the opportunity to review the enhanced "scientific" version of FlexiGraphs that the author of the program indicates is currently under development. An MS-DOS version of the program should be available within the year.
Test and Evaluation of the Time/Frequency Collision Avoidance System Concept.
1973-09-01
cumulative distributions were then plotted on "normal" graph paper, i.e., graph paper on which a normal distribution will plot as a straight line...apparent problems. CHAPTER SEVEN: CONCLUSIONS AND RECOMMENDATIONS 7.1 CONCLUSIONS The time/frequency technique for...instrumentation due to waiting for an event that will not occur, there are time-outs that cause the process to step past the event in question. In this
Graph theoretical model of a sensorimotor connectome in zebrafish.
Stobb, Michael; Peterson, Joshua M; Mazzag, Borbala; Gahtan, Ethan
2012-01-01
Mapping the detailed connectivity patterns (connectomes) of neural circuits is a central goal of neuroscience. The best quantitative approach to analyzing connectome data is still unclear but graph theory has been used with success. We present a graph theoretical model of the posterior lateral line sensorimotor pathway in zebrafish. The model includes 2,616 neurons and 167,114 synaptic connections. Model neurons represent known cell types in zebrafish larvae, and connections were set stochastically following rules based on biological literature. Thus, our model is a uniquely detailed computational representation of a vertebrate connectome. The connectome has low overall connection density, with 2.45% of all possible connections, a value within the physiological range. We used graph theoretical tools to compare the zebrafish connectome graph to small-world, random and structured random graphs of the same size. For each type of graph, 100 randomly generated instantiations were considered. Degree distribution (the number of connections per neuron) varied more in the zebrafish graph than in same size graphs with less biological detail. There was high local clustering and a short average path length between nodes, implying a small-world structure similar to other neural connectomes and complex networks. The graph was found not to be scale-free, in agreement with some other neural connectomes. An experimental lesion was performed that targeted three model brain neurons, including the Mauthner neuron, known to control fast escape turns. The lesion decreased the number of short paths between sensory and motor neurons analogous to the behavioral effects of the same lesion in zebrafish. This model is expandable and can be used to organize and interpret a growing database of information on the zebrafish connectome.
Axial Tomography from Digitized Real Time Radiography
DOE R&D Accomplishments Database
Zolnay, A. S.; McDonald, W. M.; Doupont, P. A.; McKinney, R. L.; Lee, M. M.
1985-01-18
Axial tomography from digitized real time radiographs provides a useful tool for industrial radiography and tomography. The components of this system are: x-ray source, image intensifier, video camera, video line extractor and digitizer, data storage and reconstruction computers. With this system it is possible to view a two dimensional x-ray image in real time at each angle of rotation and select the tomography plane of interest by choosing which video line to digitize. The digitization of a video line requires less than a second making data acquisition relatively short. Further improvements on this system are planned and initial results are reported.
Land Treatment Digital Library
Pilliod, David S.; Welty, Justin L.
2013-01-01
The Land Treatment Digital Library (LTDL) was created by the U.S. Geological Survey to catalog legacy land treatment information on Bureau of Land Management lands in the western United States. The LTDL can be used by federal managers and scientists for compiling information for data-calls, producing maps, generating reports, and conducting analyses at varying spatial and temporal scales. The LTDL currently houses thousands of treatments from BLM lands across 10 states. Users can browse a map to find information on individual treatments, perform more complex queries to identify a set of treatments, and view graphs of treatment summary statistics.
Teaching and Learning with a Visualiser in the Primary Classroom: Modelling Graph-Making
ERIC Educational Resources Information Center
Mavers, Diane
2009-01-01
This paper examines the technological affordances of the visualiser, and what teachers actually do with it in the primary (elementary) classroom, followed by an investigation into one example of teaching and learning with this whole-class technology. A visualiser is a digital display device. Connected to a data projector, whatever is in view of…
A New Spin on Miscue Analysis: Using Spider Charts to Web Reading Processes
ERIC Educational Resources Information Center
Wohlwend, Karen E.
2012-01-01
This article introduces a way of seeing miscue analysis data through a "spider chart", a readily available digital graphing tool that provides an effective way to visually represent readers' complex coordination of interrelated cueing systems. A spider chart is a standard feature in recent spreadsheet software that puts a new spin on miscue…
NASA Astrophysics Data System (ADS)
Shi, Y.; Long, Y.; Wi, X. L.
2014-04-01
When tourists visit multiple scenic spots, the actual travel line is usually the most effective route through the road network, and it may differ from the planned travel line. In the field of navigation, a proposed travel line is normally generated automatically by a path-planning algorithm that considers the scenic spots' positions and the road network. But when a scenic spot covers a certain area and has multiple entrances or exits, the traditional description by a single point coordinate cannot reflect these structural features. To solve this problem, this paper focuses on how scenic spots' structural features, such as multiple entrances or exits, influence path planning, and then proposes a double-weighted graph model in which the weights of both vertices and edges can be selected dynamically. It then discusses the model-building method and an optimal path-planning algorithm based on the Dijkstra and Prim algorithms. Experimental results show that the optimal planned travel line derived from the proposed model and algorithm is more reasonable, and that the travelling order and distance are further optimized.
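The multiple-entrance problem the paper addresses can be illustrated with plain Dijkstra: one common modelling trick (an assumption here, not necessarily the paper's double-weighted construction) is to join every entrance of a spot to a zero-weight supernode, so the planner picks the best gate automatically.

```python
import heapq

def dijkstra(adj, src):
    # adj: {node: [(neighbor, weight), ...]}; returns shortest distances.
    dist = {src: 0.0}
    pq = [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry
        for v, w in adj.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return dist

# A scenic spot with several entrances is modelled by linking each
# entrance to a zero-weight supernode ("spot"), so the route is free
# to enter through whichever gate yields the shortest path.
# All node names and weights below are hypothetical.
adj = {
    "hotel": [("gateA", 4.0), ("gateB", 7.0)],
    "gateA": [("spot", 0.0)],
    "gateB": [("spot", 0.0)],
    "spot": [],
}
```

Here the planner reaches the spot via gateA (cost 4.0) rather than gateB (cost 7.0), which a single-point description of the spot could not express.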
Deriving the Regression Line with Algebra
ERIC Educational Resources Information Center
Quintanilla, John A.
2017-01-01
Exploration with spreadsheets and reliance on previous skills can lead students to determine the line of best fit. To perform linear regression on a set of data, students in Algebra 2 (or, in principle, Algebra 1) do not have to settle for using the mysterious "black box" of their graphing calculators (or other classroom technologies).…
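The algebra-only approach the article advocates comes down to the closed-form least-squares formulas, which students can derive and evaluate by hand or in a spreadsheet. A minimal sketch:

```python
def best_fit_line(points):
    """Least-squares slope and intercept from the standard algebraic
    formulas: m = (n*Sxy - Sx*Sy) / (n*Sxx - Sx**2), b = (Sy - m*Sx) / n."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxy = sum(x * y for x, y in points)
    sxx = sum(x * x for x, _ in points)
    m = (n * sxy - sx * sy) / (n * sxx - sx ** 2)
    b = (sy - m * sx) / n
    return m, b
```

For the collinear points (0, 1), (1, 3), (2, 5) the formulas return slope 2 and intercept 1, exactly the line through the data, with no graphing calculator required.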
NASA Astrophysics Data System (ADS)
Viana, Ilisio; Orteu, Jean-José; Cornille, Nicolas; Bugarin, Florian
2015-11-01
We focus on quality control of mechanical parts in aeronautical context using a single pan-tilt-zoom (PTZ) camera and a computer-aided design (CAD) model of the mechanical part. We use the CAD model to create a theoretical image of the element to be checked, which is further matched with the sensed image of the element to be inspected, using a graph theory-based approach. The matching is carried out in two stages. First, the two images are used to create two attributed graphs representing the primitives (ellipses and line segments) in the images. In the second stage, the graphs are matched using a similarity function built from the primitive parameters. The similarity scores of the matching are injected in the edges of a bipartite graph. A best-match-search procedure in the bipartite graph guarantees the uniqueness of the match solution. The method achieves promising performance in tests with synthetic data including missing elements, displaced elements, size changes, and combinations of these cases. The results open good prospects for using the method with realistic data.
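The final best-match search on the bipartite similarity graph can be approximated greedily: repeatedly take the highest remaining similarity score and retire its row and column, so each primitive is matched at most once. This is a simplified stand-in, since the paper's exact procedure is not spelled out in the abstract:

```python
def best_match(scores):
    """Greedy best-match search on a bipartite similarity matrix.

    scores[i][j] is the similarity between primitive i of the CAD
    (theoretical) image and primitive j of the sensed image; each
    primitive is matched at most once, guaranteeing uniqueness.
    """
    cells = sorted(((s, i, j)
                    for i, row in enumerate(scores)
                    for j, s in enumerate(row)), reverse=True)
    used_i, used_j, match = set(), set(), {}
    for s, i, j in cells:
        if i not in used_i and j not in used_j:
            match[i] = j
            used_i.add(i)
            used_j.add(j)
    return match
```

An optimal assignment (e.g. the Hungarian algorithm) would maximize the total score globally; the greedy pass above only illustrates how uniqueness of the match solution can be enforced.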
The use of control charts by laypeople and hospital decision-makers for guiding decision making.
Schmidtke, K A; Watson, D G; Vlaev, I
2017-07-01
Graphs presenting healthcare data are increasingly available to support laypeople and hospital staff's decision making. When making these decisions, hospital staff should consider the role of chance-that is, random variation. Given random variation, decision-makers must distinguish signals (sometimes called special-cause data) from noise (common-cause data). Unfortunately, many graphs do not facilitate the statistical reasoning necessary to make such distinctions. Control charts are a less commonly used type of graph that support statistical thinking by including reference lines that separate data more likely to be signals from those more likely to be noise. The current work demonstrates for whom (laypeople and hospital staff) and when (treatment and investigative decisions) control charts strengthen data-driven decision making. We present two experiments that compare people's use of control and non-control charts to make decisions between hospitals (funnel charts vs. league tables) and to monitor changes across time (run charts with control lines vs. run charts without control lines). As expected, participants more accurately identified the outlying data using a control chart than using a non-control chart, but their ability to then apply that information to more complicated questions (e.g., where should I go for treatment?, and should I investigate?) was limited. The discussion highlights some common concerns about using control charts in hospital settings.
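The statistical reasoning a control chart encodes can be sketched simply: estimate a centre line and control lines from baseline data, then flag points outside the lines as candidate signals. The 3-sigma convention below is a common choice, not a claim about the specific charts used in the study:

```python
import statistics

def control_limits(baseline, sigmas=3):
    """Shewhart-style limits: centre line at the mean of baseline data,
    control lines at mean +/- `sigmas` standard deviations."""
    mean = statistics.mean(baseline)
    sd = statistics.pstdev(baseline)
    return mean - sigmas * sd, mean, mean + sigmas * sd

def signals(data, lcl, ucl):
    # Points outside the control lines are candidate signals
    # (special-cause variation); points inside are treated as noise.
    return [x for x in data if x < lcl or x > ucl]
```

Given a stable baseline around 10, a new reading of 14 falls outside the upper control line and is flagged, while ordinary fluctuations are not, which is the signal/noise distinction the study tests.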
NASA Astrophysics Data System (ADS)
Smit, Jantien; Bakker, Arthur; van Eerde, Dolly; Kuijpers, Maggie
2016-09-01
The importance of language in mathematics learning has been widely acknowledged. However, little is known about how to make this insight productive in the design and enactment of language-oriented mathematics education. In a design-based research project, we explored how language-oriented mathematics education can be designed and enacted. We drew on genre pedagogy to promote student proficiency in the language required for interpreting line graphs. In the intervention, the teacher used scaffolding strategies to focus students' attention on the structure and linguistic features of the language involved in this particular domain. The research question addressed in this paper is how student proficiency in this language may be promoted. The study comprised nine lessons involving 22 students in grades 5 and 6 (aged 10-12); of these students, 19 had a migrant background. In light of the research aim, we first describe the rationale behind our design. Next, we illustrate how the design was enacted by means of a case study focusing on one student in the classroom practice of developing proficiency in the language required for interpreting line graphs. On the basis of pre- and posttest scores, we conclude that overall their proficiency has increased. Together, the results indicate that and how genre pedagogy may be used to help students become more proficient in the language required in a mathematical domain.
Project SQUID. Quarterly Progress Report
1949-07-01
the sodium line reversal method for flame temperature determination. Determination of Point Temperatures in Turbulent Flames Using the Sodium Line...taken to determine the approximate position of the line. Then, with the G-M tube in position and using the photograph as an indicator, the region... beams are wide, the latter yielding a greater source of X-rays. Hence, by using that window yielding the broadest beam, greater intensity of X-rays
Online graphic symbol recognition using neural network and ARG matching
NASA Astrophysics Data System (ADS)
Yang, Bing; Li, Changhua; Xie, Weixing
2001-09-01
This paper proposes a novel method for on-line recognition of line-based graphic symbols. The input strokes are usually warped into a cursive form due to varied drawing styles, and classifying them is very difficult. To deal with this, an ART-2 neural network is used to classify the input strokes. It has the advantages of a high recognition rate, short recognition time, and forming classes in a self-organized manner. The symbol recognition is achieved by an Attributed Relational Graph (ARG) matching algorithm. The ARG is very efficient for representing complex objects, but its computation cost is very high. To overcome this, we suggest a fast graph-matching algorithm using symbol structure information. The experimental results show that the proposed method is effective for recognition of symbols with hierarchical structure.
NASA Astrophysics Data System (ADS)
Golharani, Saeedeh; Jazi, Bahram; Jahanbakht, Sajad; Moeini-Nashalji, Azam
2018-07-01
In this paper, a plasma waveguide made of two eccentric cylindrical metallic walls has been studied according to the theory of transmission lines. The inductance per unit length L, the capacitance per unit length C, the resistance per unit length R, and the shunt conductance per unit length G are obtained. Graphs of the variation of these parameters with the geometrical dimensions of the waveguide are investigated, for two different types of plasma waveguide. In the first stage, the plasma region is considered in the cold, collisional approximation; in the second stage, a drift plasma in the cold, collisionless approximation is considered. Graphs of the phase velocity variation with the main parameters of the waveguide are also presented.
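Given the per-unit-length R, L, G and C that the paper derives, the standard transmission-line formulas give the characteristic impedance Z0 = sqrt((R + jωL)/(G + jωC)) and propagation constant γ = sqrt((R + jωL)(G + jωC)). A sketch of that step (the numeric check below uses illustrative values, not the paper's waveguide parameters):

```python
import cmath

def line_parameters(R, L, G, C, f):
    """Characteristic impedance and propagation constant of a uniform
    transmission line from its per-unit-length R, L, G, C at frequency f."""
    w = 2 * cmath.pi * f
    z = R + 1j * w * L   # series impedance per unit length
    y = G + 1j * w * C   # shunt admittance per unit length
    Z0 = cmath.sqrt(z / y)
    gamma = cmath.sqrt(z * y)  # alpha + j*beta
    return Z0, gamma
```

For a lossless line with L = 250 nH/m and C = 100 pF/m, Z0 comes out purely real at 50 ohms and γ purely imaginary, as the theory requires.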
[Baseflow separation methods in hydrological process research: a review].
Xu, Lei-Lei; Liu, Jing-Lin; Jin, Chang-Jie; Wang, An-Zhi; Guan, De-Xin; Wu, Jia-Bing; Yuan, Feng-Hui
2011-11-01
Baseflow separation is regarded as one of the most important and difficult issues in hydrology and ecohydrology, but it lacks unified standards in both concepts and methods. This paper introduces the theories of baseflow separation based on the definitions of baseflow components, and analyzes the development of different baseflow separation methods. Among the methods developed, the graphical separation method is simple and applicable but arbitrary; the balance method accords with hydrological mechanisms but is difficult to apply; whereas the time-series separation method and the isotopic method can overcome the subjectivity and arbitrariness of graphical separation, and thus obtain the baseflow record quickly and efficiently. In recent years, hydrological modeling, digital filtering, and isotopic methods have been the main methods used for baseflow separation.
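Of the methods surveyed, digital filtering is the easiest to sketch. A common one-parameter recursive filter is the Lyne-Hollick filter, in which quickflow follows f_t = α·f_{t-1} + ((1 + α)/2)·(Q_t - Q_{t-1}) and baseflow is the remainder. This is one representative filter, not necessarily the one the review emphasizes, and the single forward pass below omits the usual multiple-pass smoothing:

```python
def lyne_hollick_baseflow(q, alpha=0.925):
    """One forward pass of the Lyne-Hollick digital filter over a
    streamflow series q; alpha around 0.9-0.95 is typical in practice."""
    quick = [0.0] * len(q)
    base = [q[0]]  # simplifying assumption: first value is all baseflow
    for t in range(1, len(q)):
        f = alpha * quick[t - 1] + 0.5 * (1 + alpha) * (q[t] - q[t - 1])
        quick[t] = max(f, 0.0)  # quickflow cannot be negative
        # Baseflow is constrained to lie between zero and total flow.
        base.append(max(min(q[t] - quick[t], q[t]), 0.0))
    return base
```

On a constant-flow record the filter attributes everything to baseflow, while sharp rises in flow are routed into quickflow, which matches the intuition behind automated separation.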
Magnetogate: using an iPhone magnetometer for measuring kinematic variables
NASA Astrophysics Data System (ADS)
Kağan Temiz, Burak; Yavuz, Ahmet
2016-01-01
This paper presents a method to measure the movement of an object from specific locations on a straight line using an iPhone’s magnetometer. In this method, called ‘magnetogate’, an iPhone is placed on a moving object (in this case a toy car) and small neodymium magnets are arranged at equal intervals on one side of a straight line. The magnetometer sensor of the iPhone is switched on and then the car starts moving. The iPhone’s magnetometer is stimulated throughout its movement along the straight line. A ‘Sensor Kinetics’ application on the iPhone saves the magnetic stimulations and produces a graph of the changing magnetic field near the iPhone. At the end of the motion, data from the magnetometer are interpreted and peaks on the graph are detected. Thus, position-time changes can be analysed and comments about the motion of the object can be made. The position, velocity and acceleration of the object can be easily measured with this method.
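The peak-detection step can be sketched as follows: find local maxima of the magnetometer trace above a threshold, then, since the magnets are equally spaced, convert the time gaps between peaks into average speeds. The sample values and threshold are illustrative, not from the paper:

```python
def find_peaks(signal, threshold):
    """Indices of local maxima above a threshold: samples where the
    magnetometer passes closest to one of the track-side magnets."""
    return [i for i in range(1, len(signal) - 1)
            if signal[i] > threshold
            and signal[i] >= signal[i - 1]
            and signal[i] > signal[i + 1]]

def speeds(peak_times, spacing):
    # Magnets sit at equal intervals, so the average speed between
    # consecutive peaks is spacing / time difference.
    return [spacing / (t2 - t1)
            for t1, t2 in zip(peak_times, peak_times[1:])]
```

With magnets 0.1 m apart and peaks 0.04 s apart, the car's average speed over that interval is 2.5 m/s; closer-together peaks on the time axis mean the car was moving faster.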
Image Segmentation Using Minimum Spanning Tree
NASA Astrophysics Data System (ADS)
Dewi, M. P.; Armiati, A.; Alvini, S.
2018-04-01
This research aims to segment a digital image. Segmentation separates the object from the background so that the main object can be processed for other purposes. Along with the development of digital image processing applications, segmentation becomes increasingly necessary, and the segmented image must be accurate because subsequent processing interprets the information in the image. This article discusses the application of the minimum spanning tree of a graph to the segmentation of digital images. The method separates an object from the background, converting the image into a binary image. In this case, the object of interest is set to white, while the background is black, or vice versa.
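A minimal version of the MST idea (Kruskal-order merging on a 4-connected pixel graph, stopping when edge weights exceed a threshold) can be sketched as follows; the fixed threshold rule is a simplification of full MST-based segmentation:

```python
def segment(width, height, pixels, threshold):
    """MST-style segmentation sketch: pixels (a flat row-major list of
    intensities) form a 4-connected graph with edges weighted by
    intensity difference; merging in Kruskal order stops at `threshold`,
    leaving each remaining component as one segment label."""
    n = width * height
    parent = list(range(n))

    def find(a):  # union-find with path halving
        while parent[a] != a:
            parent[a] = parent[parent[a]]
            a = parent[a]
        return a

    edges = []
    for y in range(height):
        for x in range(width):
            i = y * width + x
            if x + 1 < width:
                edges.append((abs(pixels[i] - pixels[i + 1]), i, i + 1))
            if y + 1 < height:
                edges.append((abs(pixels[i] - pixels[i + width]), i, i + width))
    for w, a, b in sorted(edges):
        if w > threshold:
            break  # remaining edges cross segment boundaries
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[ra] = rb
    return [find(i) for i in range(n)]
```

On a 2x2 image with a dark top row and a bright bottom row, the low-weight edges inside each row are merged while the high-contrast edges between rows are not, yielding the two-segment (binary) result the abstract describes.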
Image Analysis and Classification Based on Soil Strength
2016-08-01
Satellite imagery classification is useful for a variety of commonly used applications, such as land use classification, agriculture, wetland...required use of a coincident digital elevation model (DEM) and a high-resolution orthophotograph collected by the National Agriculture Imagery Program...
ERIC Educational Resources Information Center
Urban, Michael J.
2013-01-01
Using an ALTA II reflectance spectrometer, the USGS digital spectral library, graphs of planetary spectra, and a few mineral hand samples, one can teach how light can be used to study planets and moons. The author created the hands-on, inquiry-based activity for an undergraduate planetary science course consisting of freshman to senior level…
What to Use for Mathematics in High School: PC, Tablet or Graphing Calculator?
ERIC Educational Resources Information Center
Korenova, Lilla
2015-01-01
Digital technologies have made their way not only into our everyday lives, but nowadays they are also commonly used in schools. Computers, tablets and smartphones are now part of the lives of this new generation of students, so it's only natural that they are used for educational purposes as well. Besides the interactive whiteboards, computers and…
Digital Workflows for a 3d Semantic Representation of AN Ancient Mining Landscape
NASA Astrophysics Data System (ADS)
Hiebel, G.; Hanke, K.
2017-08-01
The ancient mining landscape of Schwaz/Brixlegg in the Tyrol, Austria witnessed mining from prehistoric times to modern times creating a first order cultural landscape when it comes to one of the most important inventions in human history: the production of metal. In 1991 a part of this landscape was lost due to an enormous landslide that reshaped part of the mountain. With our work we want to propose a digital workflow to create a 3D semantic representation of this ancient mining landscape with its mining structures to preserve it for posterity. First, we define a conceptual model to integrate the data. It is based on the CIDOC CRM ontology and CRMgeo for geometric data. To transform our information sources to a formal representation of the classes and properties of the ontology we applied semantic web technologies and created a knowledge graph in RDF (Resource Description Framework). Through the CRMgeo extension coordinate information of mining features can be integrated into the RDF graph and thus related to the detailed digital elevation model that may be visualized together with the mining structures using Geoinformation systems or 3D visualization tools. The RDF network of the triple store can be queried using the SPARQL query language. We created a snapshot of mining, settlement and burial sites in the Bronze Age. The results of the query were loaded into a Geoinformation system and a visualization of known bronze age sites related to mining, settlement and burial activities was created.
Integrating concepts and skills: Slope and kinematics graphs
NASA Astrophysics Data System (ADS)
Tonelli, Edward P., Jr.
The concept of force is a foundational idea in physics. To predict the results of applying forces to objects, a student must be able to interpret data representing changes in distance, time, speed, and acceleration. Comprehension of kinematics concepts requires students to interpret motion graphs, where rates of change are represented as slopes of line segments. Studies have shown that a majority of students who show proficiency with mathematical concepts fail to interpret motion graphs accurately. The primary aim of this study was to examine how students apply their knowledge of slope when interpreting kinematics graphs. To answer the research questions, a mixed-methods research design, which included a survey and interviews, was adopted. Ninety-eight (N=98) high school students completed surveys, which were quantitatively analyzed along with qualitative information collected from interviews of students (N=15) and teachers (N=2). The study showed that students who recalled methods for calculating slopes and speeds calculated slopes accurately but speeds inaccurately. When comparing slopes and speeds, most students resorted to calculating instead of visual inspection, recalling and applying memorized rules. Students who calculated slopes and speeds inaccurately failed to recall methods for calculating them, but when comparing speeds these students connected the concepts of distance and time to the line segments and the rates of change they represented. These findings should help mathematics and science educators better assist their students in applying their knowledge of slope to kinematics concepts.
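The core skill the study probes, reading a speed off a distance-time graph as the slope of a segment, reduces to a one-line computation; the data points below are invented for illustration.

```python
def slope(p1, p2):
    """Slope of the segment from p1 to p2: rise over run."""
    (t1, d1), (t2, d2) = p1, p2
    return (d2 - d1) / (t2 - t1)

# On a distance-time graph the slope of a segment *is* the speed:
# an object moves from 10 m at t = 2 s to 40 m at t = 8 s.
speed = slope((2.0, 10.0), (8.0, 40.0))
print(speed)  # 5.0  (m/s)
```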
A digital retina-like low-level vision processor.
Mertoguno, S; Bourbakis, N G
2003-01-01
This correspondence presents the basic design and simulation of a low-level multilayer vision processor that emulates, to some degree, the functional behavior of the human retina. This retina-like multilayer processor is the lower part of an autonomous self-organized vision system, called Kydon, that could be used by visually impaired people with a damaged visual cerebral cortex. The Kydon vision system, however, is not presented in this paper. The retina-like processor consists of four major layers, each of which is an array processor based on hexagonal, autonomous processing elements that perform a certain set of low-level vision tasks, such as smoothing and light adaptation, edge detection, segmentation, line recognition, and region-graph generation. At each layer, the array processor is a 2D array of k×m hexagonal, identical, autonomous cells that simultaneously execute these tasks. The hardware design and transistor-level simulation of the processing elements (PEs) of the retina-like processor, together with its simulated functionality and illustrative examples, are provided in this paper.
Electronic Current Transducer (ECT) for high voltage dc lines
NASA Astrophysics Data System (ADS)
Houston, J. M.; Peters, P. H., Jr.; Summerayes, H. R., Jr.; Carlson, G. J.; Itani, A. M.
1980-02-01
The development of a bipolar electronic current transducer (ECT) for measuring the current in a high-voltage dc (HVDC) power line at line potential is discussed. The design and construction of a free-standing ECT for use on a 400 kV line having a nominal line current of 2000 A is described. Line current is measured by a 0.0001 ohm shunt whose voltage output is sampled by a 14-bit digital data link. The high-voltage interface between line and ground is traversed by optical fibers which carry digital light signals as far as 300 m to a control room where the digital signal is converted back to an analog representation of the shunt voltage. Two redundant electronic and optical data links are used in the prototype. Power to operate the digital and optical electronics and temperature-controlling heaters at the line is supplied by a resistively and capacitively graded 10-stage cascade of ferrite-core transformers located inside the hollow, SF6-filled porcelain support insulator. The cascade is driven by a silicon controlled rectifier inverter which supplies about 100 W of power at 30 kHz.
NASA Astrophysics Data System (ADS)
Mackaness, William; Duchateau, Rica; Cross, Jamie
2018-05-01
Land registration is important to land tenure security and often resolves land-related issues. Volunteered geographic information is a cheap and quick alternative to formal and traditional approaches to land registration. This research investigates the extent to which this tool is meaningful for land registration, with the Scottish crofting community as a case study. CroftCappture was developed to record points along boundaries and save geotagged photographs and descriptions. The project raised interesting questions over usability, functionality and accuracy, as well as issues of privacy, crofting practices and digital competency, and highlighted the fractal nature of the digital divide.
VizieR Online Data Catalog: HI 21-cm absorption in redshifted galaxies (Curran+, 2016)
NASA Astrophysics Data System (ADS)
Curran, S. J.; Duchesne, S. W.; Divoli, A.; Allison, J. R.
2018-04-01
Unlike at lower redshifts, most of the z >= 0.1 detections are already compiled (in Tables 1 and 2, which are updated from Curran & Whiting, 2010ApJ...712..303C and Curran, 2010MNRAS.402.2657C, respectively). However, the raw data were generally unavailable, and so the spectra were acquired from the literature by digitizing the available figures. For this we used the GetData Graph Digitizer package for all the spectra except those in Srianand et al. (2015MNRAS.451..917S) and Yan et al. (2016AJ....151...74Y), which were reconstructed from the Gaussian parameters presented there. (2 data files).
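Reconstructing a spectrum from published Gaussian parameters, as was done for the Srianand et al. and Yan et al. entries, amounts to summing Gaussian components on a velocity grid. A minimal sketch with invented component parameters:

```python
import math

def gaussian_profile(v, depth, v0, fwhm):
    """Absorption depth at velocity v (km/s) for one Gaussian component."""
    sigma = fwhm / (2.0 * math.sqrt(2.0 * math.log(2.0)))  # FWHM -> sigma
    return depth * math.exp(-((v - v0) ** 2) / (2.0 * sigma ** 2))

def spectrum(velocities, components):
    """Sum of Gaussian components evaluated on a velocity grid."""
    return [sum(gaussian_profile(v, *c) for c in components) for v in velocities]

# Two hypothetical components: (peak depth, centre km/s, FWHM km/s).
comps = [(0.05, -20.0, 30.0), (0.02, 15.0, 45.0)]
grid = [float(v) for v in range(-100, 101)]
spec = spectrum(grid, comps)
print(max(spec))  # deepest absorption, near the -20 km/s component
```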
Manchester visual query language
NASA Astrophysics Data System (ADS)
Oakley, John P.; Davis, Darryl N.; Shann, Richard T.
1993-04-01
We report a database language for visual retrieval which allows queries on image feature information that has been computed and stored along with images. The language is novel in that it provides facilities for dealing with feature data which has actually been obtained from image analysis. Each line in the Manchester Visual Query Language (MVQL) takes a set of objects as input and produces another, usually smaller, set as output. The MVQL constructs are mainly based on proven operators from the field of digital image analysis. An example is the Hough-group operator, which takes as input a specification for the objects to be grouped, a specification for the relevant Hough space, and a definition of the voting rule. The output is a ranked list of high-scoring bins. A query can be directed towards one particular image or an entire image database; in the latter case the bins in the output list would in general be associated with different images. We have implemented MVQL in two layers. The command interpreter is a Lisp program which maps each MVQL line to a sequence of commands used to control a specialized database engine. The latter is a hybrid graph/relational system which provides low-level support for inheritance and schema evolution. In the paper we outline the language and provide examples of useful queries. We also describe our solution to the engineering problems associated with the implementation of MVQL.
Digital adaptive optics line-scanning confocal imaging system.
Liu, Changgeng; Kim, Myung K
2015-01-01
A digital adaptive optics line-scanning confocal imaging (DAOLCI) system is proposed by applying digital holographic adaptive optics to a digital form of line-scanning confocal imaging. In DAOLCI, each line scan is recorded by a digital hologram, which gives access, through digital holography, to the complex optical field from one slice of the sample. This complex optical field contains both the information of one slice of the sample and the optical aberration of the system; the latter can be sensed by a complex guide-star hologram, allowing us to compensate for its effect. After numerical aberration compensation, the corrected optical fields of a sequence of line scans are stitched into the final corrected confocal image. In DAOLCI, a numerical slit is applied to realize confocality at the sensor end; the width of this slit can be adjusted to control image contrast and speckle noise for scattering samples. DAOLCI dispenses with hardware such as the Shack–Hartmann wavefront sensor and deformable mirror, and with the closed-loop feedback adopted in conventional adaptive optics confocal imaging systems, thus reducing optomechanical complexity and cost. Numerical simulations and proof-of-principle experiments are presented that demonstrate the feasibility of this idea.
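The principle of numerical aberration compensation can be sketched in a few lines: if a guide-star hologram senses the system's aberration phase, multiplying the measured complex field by its conjugate recovers the object field. This is a simplified illustration with a synthetic, unit-amplitude aberration, not the paper's actual reconstruction pipeline.

```python
import numpy as np

n = 64
x = np.linspace(-1.0, 1.0, n)
X, Y = np.meshgrid(x, x)

# Synthetic object field and a smooth unit-amplitude aberration phase
# (an invented mix of defocus and astigmatism, for illustration only).
obj = 1.0 + 0.5 * np.cos(6 * np.pi * X)
aberr = np.exp(1j * (3.0 * (X ** 2 + Y ** 2) + 1.5 * X * Y))

measured = obj * aberr                       # field recorded through the aberrated system
guide_star = aberr                           # guide-star hologram senses the same aberration
corrected = measured * np.conj(guide_star)   # phase conjugation cancels the aberration

err = np.max(np.abs(corrected - obj))
print(err)  # essentially zero: the object field is recovered
```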
NASA Astrophysics Data System (ADS)
Kohler, Sophie; Far, Aïcha Beya; Hirsch, Ernest
2007-01-01
This paper presents an original approach for the optimal 3D reconstruction of manufactured workpieces based on a priori planning of the task, enhanced on-line through dynamic adjustment of the lighting conditions, and built around a cognitive intelligent sensory system using so-called Situation Graph Trees. The system explicitly takes into account structural knowledge related to image acquisition conditions, type of illumination sources, contents of the scene (e.g., CAD models and tolerance information), etc. The approach relies on two steps. First, a so-called initialization phase, leading to the a priori task plan, collects this structural knowledge. This knowledge is conveniently encoded, as a sub-part, in the Situation Graph Tree that forms the backbone of the planning system and exhaustively specifies the behavior of the application. Second, the image is iteratively evaluated under the control of this Situation Graph Tree. Information describing the quality of the piece under analysis is thus extracted and further exploited for, e.g., inspection tasks. Lastly, the approach enables dynamic adjustment of the Situation Graph Tree, allowing the system to adapt itself to the actual run-time conditions of the application, thus providing it with a self-learning capability.
A componential model of human interaction with graphs: 1. Linear regression modeling
NASA Technical Reports Server (NTRS)
Gillan, Douglas J.; Lewis, Robert
1994-01-01
Task analyses served as the basis for developing the Mixed Arithmetic-Perceptual (MA-P) model, which proposes (1) that people interacting with common graphs to answer common questions apply a set of component processes (searching for indicators, encoding the values of indicators, performing arithmetic operations on the values, making spatial comparisons among indicators, and responding); and (2) that the type of graph and the user's task determine the combination and order of the components applied (i.e., the processing steps). Two experiments investigated the prediction that response time will be linearly related to the number of processing steps according to the MA-P model. Subjects used line graphs, scatter plots, and stacked bar graphs to answer comparison questions and questions requiring arithmetic calculations. A one-parameter version of the model (with equal weights for all components) and a two-parameter version (with different weights for arithmetic and nonarithmetic processes) accounted for 76%-85% of individual subjects' variance in response time and 61%-68% of the variance taken across all subjects. The discussion addresses possible modifications of the MA-P model, alternative models, and design implications of the MA-P model.
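The model's central prediction, that response time is linear in the number of processing steps, is the kind of relationship an ordinary least-squares fit tests directly. A minimal sketch with invented data:

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit y = a*x + b; returns (a, b)."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return a, my - a * mx

# Hypothetical data: processing steps per question vs. mean response time (s).
steps = [2, 3, 4, 5, 6]
rt    = [1.1, 1.6, 2.0, 2.6, 3.0]
a, b = fit_line(steps, rt)
print(a, b)  # slope a is the estimated time cost per processing step
```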
Coined quantum walks on weighted graphs
NASA Astrophysics Data System (ADS)
Wong, Thomas G.
2017-11-01
We define a discrete-time, coined quantum walk on weighted graphs that is inspired by Szegedy's quantum walk. Using this, we prove that many lackadaisical quantum walks, where each vertex has ℓ integer self-loops, can be generalized to a quantum walk where each vertex has a single self-loop of real-valued weight ℓ. We apply this real-valued lackadaisical quantum walk to two problems. First, we analyze it on the line, or one-dimensional lattice, showing that it is exactly equivalent to a continuous deformation of the three-state Grover walk with faster ballistic dispersion. Second, we generalize Grover's algorithm, or search on the complete graph, to have a weighted self-loop at each vertex, yielding an improved success probability when ℓ < 3 + 2√2 ≈ 5.828.
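The ballistic dispersion mentioned above (spread growing linearly with step count, versus √t for a classical random walk) can be seen in the simplest coined walk, the two-state Hadamard walk on the line. Note this is the generic coined walk, not the paper's weighted lackadaisical variant.

```python
import numpy as np

def hadamard_walk(steps):
    """Two-state coined quantum walk on the line; returns position probabilities."""
    n = 2 * steps + 1                      # positions -steps..steps
    amp = np.zeros((n, 2), dtype=complex)  # amplitude[position, coin]
    amp[steps, 0] = 1 / np.sqrt(2)         # symmetric initial coin state
    amp[steps, 1] = 1j / np.sqrt(2)
    h = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    for _ in range(steps):
        amp = amp @ h.T                    # coin flip at every position
        new = np.zeros_like(amp)
        new[:-1, 0] = amp[1:, 0]           # coin 0 shifts left
        new[1:, 1] = amp[:-1, 1]           # coin 1 shifts right
        amp = new
    return np.abs(amp[:, 0]) ** 2 + np.abs(amp[:, 1]) ** 2

def std_dev(steps):
    p = hadamard_walk(steps)
    x = np.arange(-steps, steps + 1)
    mean = np.sum(x * p)
    return np.sqrt(np.sum((x - mean) ** 2 * p))

# Ballistic spreading: doubling the step count roughly doubles the spread,
# whereas a classical random walk's spread would only grow by sqrt(2).
print(std_dev(50), std_dev(100))
```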
NASA Astrophysics Data System (ADS)
Cheng, Shaobo; Zhang, Dong; Deng, Shiqing; Li, Xing; Li, Jun; Tan, Guotai; Zhu, Yimei; Zhu, Jing
2018-04-01
Topological defects and their interactions often give rise to multiple types of emergent phenomena, from edge states in skyrmions to disclination pairs in liquid crystals. In hexagonal manganites, partial edge dislocations, a prototypical topological defect, are ubiquitous, and they significantly alter the topologically protected domains and their behaviors. Here, combining electron microscopy experiments with graph theory analysis, we report a systematic study of the connections and configurations of domains in this dislocation-embedded system. Rules for domain arrangement are established. The dividing line between domains, which can be attributed to the strain field of the dislocations, is accurately described by a genus model from a higher dimension in graph theory. Our results open a door to understanding domain patterns in topologically protected multiferroic systems.
Converting analog interpretive data to digital formats for use in database and GIS applications
Flocks, James G.
2004-01-01
There is a growing need by researchers and managers for comprehensive and unified nationwide datasets of scientific data. These datasets must be in a digital format that is easily accessible using database and GIS applications, providing the user with access to a wide variety of current and historical information. Although most data currently being collected by scientists are already in a digital format, there is still a large repository of information in the literature and paper archives. Converting this information into a format accessible by computer applications is typically very difficult and can result in loss of data. However, since scientific data are commonly collected in a repetitive, concise manner (i.e., forms, tables, graphs, etc.), these data can be recovered digitally by using a conversion process that relates the position of an attribute in two-dimensional space to the information that the attribute signifies. For example, if a table contains a certain piece of information in a specific row and column, then the space that the row and column occupies becomes an index of that information. An index key is used to identify the relation between the physical location of the attribute and the information the attribute contains. The conversion process can be achieved rapidly, easily and inexpensively using widely available digitizing and spreadsheet software and simple programming code. In the geological sciences, sedimentary character is commonly interpreted from geophysical profiles and descriptions of sediment cores. In the field and laboratory, these interpretations were typically transcribed to paper. The information in these paper archives is still relevant and increasingly important for scientists, engineers and managers seeking to understand geologic processes affecting our environment. Direct scanning of this information produces a raster facsimile of the data, which allows it to be linked to the electronic world.
But true integration of the content with database and GIS software as point, vector or text information is commonly lost. Sediment core descriptions and interpretations of geophysical profiles are usually portrayed as lines, curves, symbols and text. They have vertical and horizontal dimensions associated with depth, category, time, or geographic position. These dimensions are displayed in consistent positions, which can be digitized and converted to a digital format, such as a spreadsheet. Once these data are in digital, tabulated form, they can easily be made available to a wide variety of imaging and data-manipulation software for compilation and worldwide dissemination.
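The position-to-attribute conversion described above can be sketched directly: an index key maps each digitized (row, column) location to the field it signifies. All field names and values below are invented for illustration.

```python
# Index key: which field lives at which (row, column) of the scanned table.
# The positions are illustrative; a real key comes from the digitizing step.
index_key = {
    (1, 0): "core_id",
    (1, 1): "depth_m",
    (1, 2): "sediment_type",
}

# Digitized content, keyed by its position in the table's 2-D space.
digitized = {
    (1, 0): "MC-07",
    (1, 1): "3.2",
    (1, 2): "silty clay",
}

# Relate each attribute's physical location to the information it signifies.
record = {index_key[pos]: value for pos, value in digitized.items()}
print(record)  # {'core_id': 'MC-07', 'depth_m': '3.2', 'sediment_type': 'silty clay'}
```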
Digital PCM bit synchronizer and detector
NASA Astrophysics Data System (ADS)
Moghazy, A. E.; Maral, G.; Blanchard, A.
1980-08-01
A theoretical analysis of a digital self-bit synchronizer and detector is presented and supported by the implementation of an experimental model built from standard TTL logic circuits. The synchronizer is based on generating spectral line components by nonlinear filtering of the received bit stream and extracting the line with a digital phase-locked loop (DPLL). The extracted reference signal drives a digital matched-filter (DMF) data detector. This realization features a short acquisition time and an all-digital structure.
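The spectral-line generation step can be illustrated numerically: an NRZ bit stream has no discrete spectral component at the bit rate, but a simple nonlinearity acting on the bit transitions creates one, which a DPLL can then lock to as the timing reference. A sketch with invented sampling parameters:

```python
import numpy as np

rng = np.random.default_rng(1)
spb = 16                                  # samples per bit (illustrative)
nbits = 512
bits = rng.integers(0, 2, nbits) * 2 - 1  # random ±1 NRZ data
x = np.repeat(bits, spb).astype(float)    # rectangular NRZ waveform

# NRZ data itself carries no discrete component at the bit rate; a nonlinearity
# (here a rectified difference, which marks the bit transitions) generates one.
y = np.abs(np.diff(x, append=x[-1]))

mag = np.abs(np.fft.rfft(y))
bit_bin = x.size // spb                   # FFT bin at the bit-rate frequency
print(mag[bit_bin], np.median(mag[1:]))   # strong spectral line vs. background
```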
A Phase-Based Approach to Satellite Constellation Analysis and Design
1991-01-01
…and ψ is a phase angle representing true anomaly, as measured from the line of nodes. For a spherical earth, the orbital parameters are related… [The remainder of this excerpt is a fragmentary Pascal source listing: output-data arrays, graphics initialization (InitGraph), and file output of the constellation optimization results.]
Geologic map of the Chewelah 30' x 60' Quadrangle, Washington and Idaho
Miller, F.K.
2001-01-01
This data set maps and describes the geology of the Chewelah 30' X 60' quadrangle, Washington and Idaho. Created using Environmental Systems Research Institute's ARC/INFO software, the data base consists of the following items: (1) a map coverage containing geologic contacts and units, (2) a point coverage containing site-specific geologic structural data, (3) two coverages derived from 1:100,000 Digital Line Graphs (DLG); one of which represents topographic data, and the other, cultural data, (4) two line coverages that contain cross-section lines and unit-label leaders, respectively, and (5) attribute tables for geologic units (polygons), contacts (arcs), and site-specific data (points). In addition, the data set includes the following graphic and text products: (1) A PostScript graphic plot-file containing the geologic map, topography, cultural data, and two cross sections, and on a separate sheet, a Correlation of Map Units (CMU) diagram, an abbreviated Description of Map Units (DMU), modal diagrams for granitic rocks, an index map, a regional geologic and structure map, and a key for point and line symbols; (2) PDF files of the Readme text-file and expanded Description of Map Units (DMU), and (3) this metadata file. The geologic map database contains original U.S. Geological Survey data generated by detailed field observation and by interpretation of aerial photographs. The map was compiled from geologic maps of eight 1:48,000 15' quadrangle blocks, each of which was made by mosaicing and reducing the four constituent 7.5' quadrangles. These 15' quadrangle blocks were mapped chiefly at 1:24,000 scale, but the detail of the mapping was governed by the intention that it was to be compiled at 1:48,000 scale. The compilation at 1:100,000 scale entailed necessary simplification in some areas and combining of some geologic units. 
Overall, however, despite a greater than two times reduction in scale, most geologic detail found on the 1:48,000 maps is retained on the 1:100,000 map. Geologic contacts across boundaries of the eight constituent quadrangles required minor adjustments, but none significant at the final 1:100,000 scale. The geologic map was compiled on a base-stable cronoflex copy of the Chewelah 30' X 60' topographic base and then scribed. The scribe guide was used to make a 0.007 mil-thick blackline clear-film, which was scanned at 1200 DPI by Optronics Specialty Company, Northridge, California. This image was converted to vector and polygon GIS layers and minimally attributed by Optronics Specialty Company. Minor hand-digitized additions were made at the USGS. Lines, points, and polygons were subsequently edited at the USGS by using standard ARC/INFO commands. Digitizing and editing artifacts significant enough to display at a scale of 1:100,000 were corrected. Within the database, geologic contacts are represented as lines (arcs), geologic units as polygons, and site-specific data as points. Polygon, arc, and point attribute tables (.pat, .aat, and .pat, respectively) uniquely identify each geologic datum.
Method of Error Floor Mitigation in Low-Density Parity-Check Codes
NASA Technical Reports Server (NTRS)
Hamkins, Jon (Inventor)
2014-01-01
A digital communication decoding method for low-density parity-check coded messages. The decoding method decodes the low-density parity-check coded messages within a bipartite graph having check nodes and variable nodes. Messages from check nodes are partially hard-limited, so that every message which would otherwise have a magnitude at or above a certain level is re-assigned the maximum magnitude.
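The partial hard-limiting rule can be stated in a few lines: any check-node message whose magnitude is at or above a chosen level is re-assigned the maximum magnitude, with its sign preserved. A sketch with invented threshold and message values:

```python
def partial_hard_limit(messages, threshold, max_mag):
    """Check-node messages with |m| >= threshold are clipped to ±max_mag;
    smaller messages pass through unchanged."""
    out = []
    for m in messages:
        if abs(m) >= threshold:
            out.append(max_mag if m > 0 else -max_mag)
        else:
            out.append(m)
    return out

# Hypothetical LLR messages leaving a check node.
msgs = [0.7, -3.9, 2.1, -1.2, 5.5]
print(partial_hard_limit(msgs, threshold=2.0, max_mag=4.0))
# [0.7, -4.0, 4.0, -1.2, 4.0]
```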
Video on phone lines: technology and applications
NASA Astrophysics Data System (ADS)
Hsing, T. Russell
1996-03-01
Recent advances in communications signal processing and VLSI technology are fostering tremendous interest in transmitting high-speed digital data over ordinary telephone lines at bit rates substantially above the ISDN Basic Access rate (144 kbit/s). Two new technologies, high-bit-rate digital subscriber lines and asymmetric digital subscriber lines, promise transmission over most of the embedded loop plant at 1.544 Mbit/s and beyond. Stimulated by these research promises and by rapid advances in video coding techniques and standards activity, information networks around the globe are now exploring possible business opportunities in offering quality video services (such as distance learning, telemedicine, and telecommuting) through this high-speed digital transport capability in the copper loop plant. Visual communications for residential customers have become more feasible than ever, both technically and economically.
In-line digital holography with phase-shifting Greek-ladder sieves
NASA Astrophysics Data System (ADS)
Xie, Jing; Zhang, Junyong; Zhang, Yanli; Zhou, Shenlei; Zhu, Jianqiang
2018-04-01
Phase shifting is the key technique in in-line digital holography, but traditional phase shifters have their own limitations in short-wavelength regions. Here, phase-shifting Greek-ladder sieves with amplitude-only modulation are introduced into in-line digital holography; these are essentially a kind of diffraction lens with a three-dimensional array of diffraction-limited foci. In an in-line digital holographic experiment, we fabricate two kinds of sieves by lithography and verify the validity of their phase-shifting function by measuring a 1951 U.S. Air Force resolution test target and the three-dimensional array of foci. With the advantages of high resolving power, low cost, and no limitation at shorter wavelengths, phase-shifting Greek-ladder sieves have great potential in X-ray holography and biochemical microscopy for the next generation of synchrotron light sources.
Application research for 4D technology in flood forecasting and evaluation
NASA Astrophysics Data System (ADS)
Li, Ziwei; Liu, Yutong; Cao, Hongjie
1998-08-01
To monitor regions of China where disastrous floods occur frequently, to satisfy the great need of provincial governments for high-accuracy monitoring and evaluation data, and to improve the efficiency of disaster response, a method for flood forecasting and evaluation using satellite and aerial remotely sensed imagery together with ground monitoring data was researched under the Ninth Five-Year National Key Technologies Programme. An effective and practicable flood forecasting and evaluation system was established, with DongTing Lake selected as the test site. Modern digital photogrammetry, remote sensing and GIS technology are used in this system; disastrous floods can be forecast and losses evaluated based on a '4D' disaster background database (DEM: Digital Elevation Model; DOQ: Digital Orthophoto Quads; DRG: Digital Raster Graph; DTI: Digital Thematic Information). The technology for gathering and establishing the '4D' disaster environment background database, the application technology for flood forecasting and evaluation based on the '4D' background data, and experimental results for the DongTing Lake test site are introduced in detail in this paper.
Radiometric Survey in Western Afghanistan: A Website for Distribution of Data
Sweeney, Ronald E.; Kucks, Robert P.; Hill, Patricia L.; Finn, Carol A.
2007-01-01
Radiometric (uranium content, thorium content, potassium content, and gamma-ray intensity) and related data were digitized from radiometric and survey route location maps of western Afghanistan published in 1976. The uranium content data were digitized along contour lines from 33 maps in a series entitled 'Map of Uranium (Radium) Contents of Afghanistan (Western Area),' compiled by V. N. Kirsanov and R. S. Dershimanov. The thorium content data were digitized along contour lines from 33 maps in a series entitled 'Map of Thorium Contents of Afghanistan (Western Area),' compiled by V. N. Kirsanov and R. S. Dershimanov. The potassium content data were digitized along contour lines from 33 maps in a series entitled 'Map of Potassium Contents of Afghanistan (Western Area),' compiled by V. N. Kirsanov and R. S. Dershimanov. The gamma-ray intensity data were digitized along contour lines from 33 maps in a series entitled 'Map of Gamma-Field of Afghanistan (Western Area),' compiled by V. N. Kirsanov and R. S. Dershimanov. The survey route location data were digitized along flight-lines located on 33 maps in a series entitled 'Survey Routes Location and Contours of Flight Equal Altitudes. Western Area of Afghanistan,' compiled by Z. A. Alpatova, V. G. Kurnosov, and F. A. Grebneva.
[Procedural analysis of acid-base balance disorders: a case series of 4 patients].
Ma, Chunyuan; Wang, Guijie
2017-05-01
To establish a standardized process for analyzing acid-base balance, and to analyze cases of acid-base disorder with the aid of an acid-base balance coordinate graph, recent research progress in acid-base balance theory was reviewed systematically, and the important concepts, definitions, formulas, parameters, regularities and inferences used in the analysis of acid-base balance were studied. The processes and steps for analyzing acid-base disorders were diagrammed, and the application of the acid-base balance coordinate graph to the cases was introduced. A "four parameters-four steps" method was put forward to analyze acid-base disorders completely. The four parameters are pH, arterial partial pressure of carbon dioxide (PaCO2), HCO3- and the anion gap (AG). The four steps are: (1) according to pH, PaCO2 and HCO3-, determine the primary or main type of acid-base disorder; (2) use the primary or main type to choose the appropriate compensation formula and determine whether a double mixed acid-base disorder is present; (3) divide the primary disorders into respiratory acidosis or respiratory alkalosis, calculate the potential HCO3-, and replace the measured HCO3- with the potential HCO3- to determine whether a triple mixed acid-base disorder is present; (4) for data judged by the above analysis to be simple increased-AG metabolic acidosis, calculate the ratio ΔAG↑/ΔHCO3-↓ to determine whether normal-AG metabolic acidosis or metabolic alkalosis is also present.
In clinical practice, PaCO2 (as the abscissa) and HCO3- (as the ordinate) are used to establish a rectangular coordinate system; the straight line through the origin (0, 0) and the point (40, 24) has pH equal to 7.40 at every point. The acid-base balance coordinate graph can then be divided into seven areas by three straight lines [the pH = 7.40 isoline, the PaCO2 = 40 mmHg (1 mmHg = 0.133 kPa) line, and the HCO3- = 24 mmol/L line]: main respiratory alkalosis, main metabolic alkalosis, respiratory + metabolic alkalosis, main respiratory acidosis, main metabolic acidosis, respiratory + metabolic acidosis, and normal. It is easy to determine the type of acid-base disorder by locating the (PaCO2, HCO3-) or (PaCO2, potential HCO3-) point on the graph. The "four parameters-four steps" method is systematic and comprehensive, and with the coordinate graph it is simpler to determine the types of acid-base disorders. It is worth popularizing.
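The seven-area partition can be coded directly from the three lines named above: the pH = 7.40 isoline (HCO3- = (24/40)·PaCO2), the PaCO2 = 40 mmHg line, and the HCO3- = 24 mmol/L line. A simplified sketch that treats "normal" as the single point (40, 24) rather than a physiological band:

```python
def classify(paco2, hco3, tol=1e-9):
    """Locate the point (PaCO2 mmHg, HCO3- mmol/L) among the seven areas of the
    acid-base coordinate graph. Simplified sketch: 'normal' is the point (40, 24)."""
    if abs(paco2 - 40) < tol and abs(hco3 - 24) < tol:
        return "normal"
    alkalemic = hco3 > (24.0 / 40.0) * paco2   # above the pH = 7.40 isoline
    if alkalemic:
        if paco2 < 40 and hco3 > 24:
            return "respiratory + metabolic alkalosis"
        return "metabolic alkalosis" if hco3 > 24 else "respiratory alkalosis"
    if paco2 > 40 and hco3 < 24:
        return "respiratory + metabolic acidosis"
    return "respiratory acidosis" if paco2 > 40 else "metabolic acidosis"

# High PaCO2 with HCO3- below the isoline (0.6 * 60 = 36): acidemic, CO2-driven.
print(classify(60, 26))  # respiratory acidosis
```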
Metacoder: An R package for visualization and manipulation of community taxonomic diversity data.
Foster, Zachary S L; Sharpton, Thomas J; Grünwald, Niklaus J
2017-02-01
Community-level data, the type generated by an increasing number of metabarcoding studies, is often graphed as stacked bar charts or pie graphs that use color to represent taxa. These graph types do not convey the hierarchical structure of taxonomic classifications and are limited by the use of color for categories. As an alternative, we developed metacoder, an R package for easily parsing, manipulating, and graphing publication-ready plots of hierarchical data. Metacoder includes a dynamic and flexible function that can parse most text-based formats that contain taxonomic classifications, taxon names, taxon identifiers, or sequence identifiers. Metacoder can then subset, sample, and order this parsed data using a set of intuitive functions that take into account the hierarchical nature of the data. Finally, an extremely flexible plotting function enables quantitative representation of up to 4 arbitrary statistics simultaneously in a tree format by mapping statistics to the color and size of tree nodes and edges. Metacoder also allows exploration of barcode primer bias by integrating functions to run digital PCR. Although it has been designed for data from metabarcoding research, metacoder can easily be applied to any data that has a hierarchical component such as gene ontology or geographic location data. Our package complements currently available tools for community analysis and is provided open source with an extensive online user manual.
On Quantifying Diffusion of Health Information on Twitter.
Bakal, Gokhan; Kavuluru, Ramakanth
2017-02-01
With the increasing use of digital technologies, online social networks are emerging as a major means of communication. Recently, social networks such as Facebook and Twitter have also been used by consumers, care providers (physicians, hospitals), and government agencies to share health-related information. The asymmetric user network and the short message size have made Twitter particularly popular for propagating health-related content on the Web. Besides tweeting on their own, users can choose to retweet particular tweets from other users (even if they do not follow them on Twitter); thus, a tweet can diffuse through the Twitter network via the follower-friend connections. In this paper, we report results of a pilot study conducted to quantitatively assess how health-related tweets diffuse in the directed follower-friend Twitter graph through retweeting activity. Our effort includes (1) the development of a retweet collection and Twitter retweet graph formation framework and (2) a preliminary analysis of retweet graphs and associated diffusion metrics for health tweets. Given the ambiguous nature (due to polysemy and sarcasm) of the health relatedness of tweets collected with keyword-based matches, our initial study is limited to ≈ 200 health-related tweets (manually verified to be on health topics), each with at least 25 retweets. To our knowledge, this is the first attempt to study health information diffusion on Twitter through retweet graph analysis.
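Two diffusion metrics naturally read off a retweet graph, cascade size and cascade depth, can be computed with a breadth-first traversal. A sketch over an invented cascade (edges point from the retweeter to the account whose tweet was retweeted; a real graph would come from the collection framework):

```python
from collections import deque

# Toy retweet cascade: hypothetical users; u2 and u3 retweeted u1, etc.
retweeted_from = {
    "u2": "u1", "u3": "u1", "u4": "u2",
    "u5": "u2", "u6": "u4", "u7": "u3",
}

def cascade_metrics(origin, edges):
    """Size and depth of the diffusion tree rooted at the original tweeter."""
    children = {}
    for rt, src in edges.items():
        children.setdefault(src, []).append(rt)
    depth, size = 0, 1
    q = deque([(origin, 0)])
    while q:  # breadth-first traversal of the cascade
        node, d = q.popleft()
        depth = max(depth, d)
        for c in children.get(node, []):
            size += 1
            q.append((c, d + 1))
    return size, depth

print(cascade_metrics("u1", retweeted_from))  # (7, 3): 7 users reached, depth 3
```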
Metacoder: An R package for visualization and manipulation of community taxonomic diversity data
Foster, Zachary S. L.; Sharpton, Thomas J.
2017-01-01
Community-level data, the type generated by an increasing number of metabarcoding studies, is often graphed as stacked bar charts or pie graphs that use color to represent taxa. These graph types do not convey the hierarchical structure of taxonomic classifications and are limited by the use of color for categories. As an alternative, we developed metacoder, an R package for easily parsing, manipulating, and graphing publication-ready plots of hierarchical data. Metacoder includes a dynamic and flexible function that can parse most text-based formats that contain taxonomic classifications, taxon names, taxon identifiers, or sequence identifiers. Metacoder can then subset, sample, and order this parsed data using a set of intuitive functions that take into account the hierarchical nature of the data. Finally, an extremely flexible plotting function enables quantitative representation of up to 4 arbitrary statistics simultaneously in a tree format by mapping statistics to the color and size of tree nodes and edges. Metacoder also allows exploration of barcode primer bias by integrating functions to run digital PCR. Although it has been designed for data from metabarcoding research, metacoder can easily be applied to any data that has a hierarchical component such as gene ontology or geographic location data. Our package complements currently available tools for community analysis and is provided open source with an extensive online user manual. PMID:28222096
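Metacoder itself is an R package; as a rough illustration of the parsing step it describes (turning text-based taxonomic classifications into a taxon hierarchy with per-taxon counts), here is a minimal Python sketch. The semicolon-delimited lineage format is an invented example, not metacoder's actual input specification.

```python
from collections import Counter

def parse_lineages(lineages):
    """Build per-taxon counts and parent->child edges from
    semicolon-delimited lineage strings (one string per observation)."""
    counts = Counter()
    edges = set()
    for lineage in lineages:
        taxa = lineage.split(";")
        for i, taxon in enumerate(taxa):
            counts[taxon] += 1          # count at every rank of the hierarchy
            if i:
                edges.add((taxa[i - 1], taxon))
    return counts, edges

lineages = ["Bacteria;Firmicutes;Bacilli",
            "Bacteria;Firmicutes;Clostridia",
            "Bacteria;Proteobacteria"]
counts, edges = parse_lineages(lineages)
print(counts["Bacteria"], counts["Firmicutes"])  # 3 2
```

The counts could then be mapped to node size and color in a tree plot, which is essentially what metacoder's plotting function does.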
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eides, M.I.; Karshenboim, S.G.; Shelyuto, V.A.
1994-12-31
Contributions to HFS and to the Lamb shift intervals of order α^2(Zα)^5 induced by a gauge-invariant set of nineteen topologically different graphs with two radiative photons inserted in the electron line are considered. Corrections both to HFS and to the Lamb shift induced by nine diagrams are calculated in the Fried-Yennie gauge.
ERIC Educational Resources Information Center
Stacey, Kaye; Price, Beth; Steinle, Vicki
2012-01-01
This paper discusses issues arising in the design of questions to use in an on-line computer-based formative assessment system, focussing on how best to identify the stages of a learning hierarchy for reporting to teachers. Data from several hundred students is used to illustrate how design decisions have been made for a test on interpreting line…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cheng, Shaobo; Zhang, Dong; Deng, Shiqing
Topological defects and their interactions often arouse multiple types of emerging phenomena from edge states in Skyrmions to disclination pairs in liquid crystals. In hexagonal manganites, partial edge dislocations, a prototype topological defect, are ubiquitous and they significantly alter the topologically protected domains and their behaviors. In this work, combining electron microscopy experiment and graph theory analysis, we report a systematic study of the connections and configurations of domains in this dislocation embedded system. Rules for domain arrangement are established. The dividing line between domains, which can be attributed by the strain field of dislocations, is accurately described by a genus model from a higher dimension in the graph theory. In conclusion, our results open a door for the understanding of domain patterns in topologically protected multiferroic systems.
Generalized teleportation by quantum walks
NASA Astrophysics Data System (ADS)
Wang, Yu; Shang, Yun; Xue, Peng
2017-09-01
We develop a generalized teleportation scheme based on quantum walks with two coins. For an unknown qubit state, we use two-step quantum walks on the line and quantum walks on the cycle with four vertices for teleportation. For any d-dimensional states, quantum walks on complete graphs and quantum walks on d-regular graphs can be used for implementing teleportation. Compared with existing d-dimensional state teleportation, a prior entangled state is not required; the necessary maximal entanglement resource is generated by the first step of the quantum walk. Moreover, two projective measurements with d elements are needed by quantum walks on the complete graph, rather than one joint measurement with d^2 basis states. Quantum walks have many applications in quantum computation and quantum simulations. This is the first scheme to realize a communication protocol with quantum walks, opening wider applications.
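A discrete-time coined quantum walk on the line, the basic ingredient of such schemes, can be simulated directly. The sketch below uses the standard Hadamard coin with a symmetric initial coin state; it illustrates the walk dynamics only, not the paper's teleportation protocol.

```python
import math

def hadamard_walk(steps):
    """Discrete-time quantum walk on the integer line with a Hadamard coin.
    State: position -> [left_amplitude, right_amplitude].
    Returns the position -> probability distribution after `steps` steps."""
    h = 1 / math.sqrt(2)
    # symmetric initial coin state (|L> + i|R>)/sqrt(2) at the origin
    state = {0: [complex(h, 0), complex(0, h)]}
    for _ in range(steps):
        new = {}
        for x, (l, r) in state.items():
            # Hadamard coin flip, then conditional shift left/right
            nl, nr = h * (l + r), h * (l - r)
            new.setdefault(x - 1, [0j, 0j])[0] += nl
            new.setdefault(x + 1, [0j, 0j])[1] += nr
        state = new
    return {x: abs(l) ** 2 + abs(r) ** 2 for x, (l, r) in state.items()}

probs = hadamard_walk(10)
print(round(sum(probs.values()), 10))  # 1.0
```

Unlike a classical random walk, the probability mass spreads ballistically, concentrating near the edges of the reachable interval rather than around the origin.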
Cheng, Shaobo; Zhang, Dong; Deng, Shiqing; ...
2018-04-19
Topological defects and their interactions often arouse multiple types of emerging phenomena from edge states in Skyrmions to disclination pairs in liquid crystals. In hexagonal manganites, partial edge dislocations, a prototype topological defect, are ubiquitous and they significantly alter the topologically protected domains and their behaviors. In this work, combining electron microscopy experiment and graph theory analysis, we report a systematic study of the connections and configurations of domains in this dislocation embedded system. Rules for domain arrangement are established. The dividing line between domains, which can be attributed by the strain field of dislocations, is accurately described by a genus model from a higher dimension in the graph theory. In conclusion, our results open a door for the understanding of domain patterns in topologically protected multiferroic systems.
Taking Digital Creativity to the Art Classroom: Mystery Box Swap
ERIC Educational Resources Information Center
Shin, Ryan
2010-01-01
Today's students are the first generation to grow up with computers, cell-phones, video games, music and video players, and other digital technologies. As "digital natives", a new term coined by Prensky (2001), they spend more time reading text messaging lines than lines from books, and they spend more time on Facebook than putting their energies…
Digital geologic map database of the Nevada Test Site area, Nevada
Wahl, R.R.; Sawyer, D.A.; Minor, S.A.; Carr, M.D.; Cole, J.C.; Swadley, W.C.; Laczniak, R.J.; Warren, R.G.; Green, K.S.; Engle, C.M.
1997-01-01
Forty years of geologic investigations at the Nevada Test Site (NTS) have been digitized. The digital coverages include all geologic information that (1) has been collected and (2) can be represented on the map within the map borders at the map scale. The following coverages are included with this dataset:

Coverage   Type     Description
geolpoly   Polygon  Geologic outcrops
geolflts   Line     Fault traces
geolatts   Point    Bedding attitudes, etc.
geolcald   Line     Caldera boundaries
geollins   Line     Interpreted lineaments
geolmeta   Line     Metamorphic gradients

The above coverages are attributed with numeric values and interpreted information. The entity files documented below show the data associated with each coverage.
System for memorizing maximum values
NASA Technical Reports Server (NTRS)
Bozeman, Richard J., Jr. (Inventor)
1992-01-01
The invention discloses a system capable of memorizing maximum sensed values. The system includes conditioning circuitry which receives the analog output signal from a sensor transducer. The conditioning circuitry rectifies and filters the analog signal and provides an input signal to a digital driver, which may be either linear or logarithmic. The driver converts the analog signal to discrete digital values, which in turn trigger an output signal on one of a plurality of driver output lines n. The particular output line selected is dependent on the converted digital value. A microfuse memory device connects across the driver output lines, with n segments. Each segment is associated with one driver output line, and includes a microfuse that is blown when a signal appears on the associated driver output line.
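In software terms, the microfuse memory behaves like a quantize-and-latch peak detector: each blown fuse records that its driver line was ever triggered, and the highest blown fuse is the memorized maximum. The sketch below is an illustrative analogue of that behavior, not the patented circuit; the level count and full-scale value are arbitrary.

```python
def quantize(value, levels, full_scale):
    """Map an analog value to one of `levels` discrete driver output lines."""
    idx = int(value / full_scale * levels)
    return max(0, min(levels - 1, idx))

def max_memory(samples, levels=8, full_scale=1.0):
    """Software analogue of the microfuse memory: a 'blown fuse' records that
    the corresponding driver line was triggered at least once; the highest
    blown fuse index is the memorized maximum."""
    blown = [False] * levels
    for v in samples:
        blown[quantize(v, levels, full_scale)] = True
    return max(i for i, b in enumerate(blown) if b)

print(max_memory([0.1, 0.7, 0.4]))  # 5
```

Like the hardware, this retains the peak even after the signal drops, and the resolution is limited by the number of discrete lines.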
System for memorizing maximum values
NASA Astrophysics Data System (ADS)
Bozeman, Richard J., Jr.
1992-08-01
The invention discloses a system capable of memorizing maximum sensed values. The system includes conditioning circuitry which receives the analog output signal from a sensor transducer. The conditioning circuitry rectifies and filters the analog signal and provides an input signal to a digital driver, which may be either linear or logarithmic. The driver converts the analog signal to discrete digital values, which in turn trigger an output signal on one of a plurality of driver output lines n. The particular output line selected is dependent on the converted digital value. A microfuse memory device connects across the driver output lines, with n segments. Each segment is associated with one driver output line, and includes a microfuse that is blown when a signal appears on the associated driver output line.
System for Memorizing Maximum Values
NASA Technical Reports Server (NTRS)
Bozeman, Richard J., Jr. (Inventor)
1996-01-01
The invention discloses a system capable of memorizing maximum sensed values. The system includes conditioning circuitry which receives the analog output signal from a sensor transducer. The conditioning circuitry rectifies and filters the analog signal and provides an input signal to a digital driver, which may be either linear or logarithmic. The driver converts the analog signal to discrete digital values, which in turn trigger an output signal on one of a plurality of driver output lines n. The particular output line selected is dependent on the converted digital value. A microfuse memory device connects across the driver output lines, with n segments. Each segment is associated with one driver output line, and includes a microfuse that is blown when a signal appears on the associated driver output line.
Digital terrain tapes: user guide
,
1980-01-01
DMATC's digital terrain tapes are a by-product of the agency's efforts to streamline the production of raised-relief maps. In the early 1960's DMATC developed the Digital Graphics Recorder (DGR) system that introduced new digitizing techniques and processing methods into the field of three-dimensional mapping. The DGR system consisted of an automatic digitizing table and a computer system that recorded a grid of terrain elevations from traces of the contour lines on standard topographic maps. A sequence of computer accuracy checks was performed and then the elevations of grid points not intersected by contour lines were interpolated. The DGR system produced computer magnetic tapes which controlled the carving of plaster forms used to mold raised-relief maps. It was realized almost immediately that this relatively simple tool for carving plaster molds had enormous potential for storing, manipulating, and selectively displaying (either graphically or numerically) a vast number of terrain elevations. As the demand for the digital terrain tapes increased, DMATC began developing increasingly advanced digitizing systems and now operates the Digital Topographic Data Collection System (DTDCS). With DTDCS, two types of data (elevations as contour lines and points, and stream and ridge lines) are sorted, matched, and resorted to obtain a grid of elevation values for every 0.01 inch on each map (approximately 200 feet on the ground). Undefined points on the grid are found by either linear or planar interpolation.
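The gap-filling step (interpolating grid points not intersected by contour lines) can be illustrated for the linear case. The one-row function below is a deliberate simplification of the DGR/DTDCS processing, using None to mark undefined elevations.

```python
def fill_row(row):
    """Linearly interpolate None gaps between known elevations in one grid
    row; a simplified stand-in for the terrain-tape gap filling."""
    out = row[:]
    known = [i for i, v in enumerate(row) if v is not None]
    # interpolate between each consecutive pair of known elevations
    for a, b in zip(known, known[1:]):
        for i in range(a + 1, b):
            t = (i - a) / (b - a)
            out[i] = row[a] + t * (row[b] - row[a])
    return out

print(fill_row([100, None, 120]))  # [100, 110.0, 120]
```

Planar interpolation, the other mode mentioned, would instead fit a plane through three or more neighboring known points in two dimensions.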
NASA Astrophysics Data System (ADS)
Patel, K. C.; Ruiz, R.; Lille, J.; Wan, L.; Dobiz, E.; Gao, H.; Robertson, N.; Albrecht, T. R.
2012-03-01
Directed self-assembly is emerging as a promising technology to define sub-20nm features. However, a straightforward path to scale block copolymer lithography to single-digit fabrication remains challenging given the diverse material properties found in the wide spectrum of self-assembling materials. A vast amount of block copolymer research for industrial applications has been dedicated to polystyrene-b-methyl methacrylate (PS-b-PMMA), a model system that displays multiple properties making it ideal for lithography, but that is limited by a weak interaction parameter that prevents it from scaling to single-digit lithography. Other block copolymer materials have shown scalability to much smaller dimensions, but at the expense of other material properties that could delay their insertion into industrial lithographic processes. We report on a line doubling process applied to block copolymer patterns to double the frequency of PS-b-PMMA line/space features, demonstrating the potential of this technique to reach single-digit lithography. We demonstrate a line-doubling process that starts with directed self-assembly of PS-b-PMMA to define line/space features. This pattern is transferred into an underlying sacrificial hard-mask layer followed by a growth of self-aligned spacers which subsequently serve as hard-masks for transferring the 2x frequency doubled pattern to the underlying substrate. We applied this process to two different block copolymer materials to demonstrate line-space patterns with a half pitch of 11nm and 7nm underscoring the potential to reach single-digit critical dimensions. A subsequent patterning step with perpendicular lines can be used to cut the fine line patterns into a 2-D array of islands suitable for bit patterned media. Several integration challenges such as line width control and line roughness are addressed.
Digital video system for on-line portal verification
NASA Astrophysics Data System (ADS)
Leszczynski, Konrad W.; Shalev, Shlomo; Cosby, N. Scott
1990-07-01
A digital system has been developed for on-line acquisition, processing and display of portal images during radiation therapy treatment. A metal/phosphor screen combination is the primary detector, where the conversion from high-energy photons to visible light takes place. A mirror angled at 45 degrees reflects the primary image to a low-light-level camera, which is removed from the direct radiation beam. The image registered by the camera is digitized, processed and displayed on a CRT monitor. Advanced digital techniques for processing of on-line images have been developed and implemented to enhance image contrast and suppress the noise. Some elements of automated radiotherapy treatment verification have been introduced.
Graph Theoretical Analysis Reveals: Women's Brains Are Better Connected than Men's.
Szalkai, Balázs; Varga, Bálint; Grolmusz, Vince
2015-01-01
Deep graph-theoretic ideas in the context of the graph of the World Wide Web led to the definition of Google's PageRank and the subsequent rise of the most popular search engine to date. Brain graphs, or connectomes, are being widely explored today. We believe that non-trivial graph theoretic concepts, just as happened with the World Wide Web, will lead to discoveries enlightening the structural and also the functional details of the animal and human brains. When scientists examine large networks of tens or hundreds of millions of vertices, only fast algorithms can be applied because of the size constraints. In the case of diffusion MRI-based structural human brain imaging, the effective vertex number of the connectomes, or brain graphs derived from the data, is on the scale of several hundred today. That size facilitates applying strict mathematical graph algorithms even for some hard-to-compute (or NP-hard) quantities like vertex cover or balanced minimum cut. In the present work we have examined brain graphs, computed from the data of the Human Connectome Project, recorded from male and female subjects between ages 22 and 35. Significant differences were found between the male and female structural brain graphs: we show that the average female connectome has more edges, is a better expander graph, has larger minimal bisection width, and has more spanning trees than the average male connectome. Since the average female brain weighs less than the brain of males, these properties show that the female brain has better graph theoretical properties, in a sense, than the brain of males. It is known that the female brain has a smaller gray matter/white matter ratio than males, that is, a larger white matter/gray matter ratio than the brain of males; this observation is in line with our findings concerning the number of edges, since the white matter consists of myelinated axons, which, in turn, roughly correspond to the connections in the brain graph. 
We have also found that the minimum bisection width, normalized with the edge number, is also significantly larger in the right and the left hemispheres in females: therefore, the differing bisection widths are independent from the difference in the number of edges.
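One of the exactly computable quantities compared above, the number of spanning trees, can be obtained for small graphs via Kirchhoff's matrix-tree theorem: it equals the determinant of any cofactor of the graph Laplacian. This sketch uses exact rational arithmetic and is illustrative only; it is not the study's analysis pipeline.

```python
from fractions import Fraction

def spanning_trees(n, edges):
    """Count spanning trees of an undirected graph on vertices 0..n-1 via
    Kirchhoff's matrix-tree theorem (determinant of a Laplacian cofactor)."""
    lap = [[Fraction(0)] * n for _ in range(n)]
    for u, v in edges:
        lap[u][u] += 1
        lap[v][v] += 1
        lap[u][v] -= 1
        lap[v][u] -= 1
    # determinant of the Laplacian with row 0 and column 0 deleted
    m = [row[1:] for row in lap[1:]]
    det = Fraction(1)
    for i in range(n - 1):
        pivot = next((r for r in range(i, n - 1) if m[r][i]), None)
        if pivot is None:
            return 0  # singular minor: graph is disconnected
        if pivot != i:
            m[i], m[pivot] = m[pivot], m[i]
            det = -det
        det *= m[i][i]
        for r in range(i + 1, n - 1):
            f = m[r][i] / m[i][i]
            for c in range(i, n - 1):
                m[r][c] -= f * m[i][c]
    return int(det)

# a 4-cycle has exactly 4 spanning trees
print(spanning_trees(4, [(0, 1), (1, 2), (2, 3), (3, 0)]))  # 4
```

Minimum bisection width, by contrast, is NP-hard in general; for connectomes of a few hundred vertices it is still tractable with exact solvers, as the abstract notes.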
Graph Theoretical Analysis Reveals: Women’s Brains Are Better Connected than Men’s
Szalkai, Balázs; Varga, Bálint; Grolmusz, Vince
2015-01-01
Deep graph-theoretic ideas in the context with the graph of the World Wide Web led to the definition of Google’s PageRank and the subsequent rise of the most popular search engine to date. Brain graphs, or connectomes, are being widely explored today. We believe that non-trivial graph theoretic concepts, similarly as it happened in the case of the World Wide Web, will lead to discoveries enlightening the structural and also the functional details of the animal and human brains. When scientists examine large networks of tens or hundreds of millions of vertices, only fast algorithms can be applied because of the size constraints. In the case of diffusion MRI-based structural human brain imaging, the effective vertex number of the connectomes, or brain graphs derived from the data is on the scale of several hundred today. That size facilitates applying strict mathematical graph algorithms even for some hard-to-compute (or NP-hard) quantities like vertex cover or balanced minimum cut. In the present work we have examined brain graphs, computed from the data of the Human Connectome Project, recorded from male and female subjects between ages 22 and 35. Significant differences were found between the male and female structural brain graphs: we show that the average female connectome has more edges, is a better expander graph, has larger minimal bisection width, and has more spanning trees than the average male connectome. Since the average female brain weighs less than the brain of males, these properties show that the female brain has better graph theoretical properties, in a sense, than the brain of males. It is known that the female brain has a smaller gray matter/white matter ratio than males, that is, a larger white matter/gray matter ratio than the brain of males; this observation is in line with our findings concerning the number of edges, since the white matter consists of myelinated axons, which, in turn, roughly correspond to the connections in the brain graph. 
We have also found that the minimum bisection width, normalized with the edge number, is also significantly larger in the right and the left hemispheres in females: therefore, the differing bisection widths are independent from the difference in the number of edges. PMID:26132764
Fisher metric, geometric entanglement, and spin networks
NASA Astrophysics Data System (ADS)
Chirco, Goffredo; Mele, Fabio M.; Oriti, Daniele; Vitale, Patrizia
2018-02-01
Starting from recent results on the geometric formulation of quantum mechanics, we propose a new information geometric characterization of entanglement for spin network states in the context of quantum gravity. For the simple case of a single-link fixed graph (Wilson line), we detail the construction of a Riemannian Fisher metric tensor and a symplectic structure on the graph Hilbert space, showing how these encode the whole information about separability and entanglement. In particular, the Fisher metric defines an entanglement monotone which provides a notion of distance among states in the Hilbert space. In the maximally entangled gauge-invariant case, the entanglement monotone is proportional to a power of the area of the surface dual to the link thus supporting a connection between entanglement and the (simplicial) geometric properties of spin network states. We further extend such analysis to the study of nonlocal correlations between two nonadjacent regions of a generic spin network graph characterized by the bipartite unfolding of an intertwiner state. Our analysis confirms the interpretation of spin network bonds as a result of entanglement and to regard the same spin network graph as an information graph, whose connectivity encodes, both at the local and nonlocal level, the quantum correlations among its parts. This gives a further connection between entanglement and geometry.
Delay-time distribution in the scattering of time-narrow wave packets (II)—quantum graphs
NASA Astrophysics Data System (ADS)
Smilansky, Uzy; Schanz, Holger
2018-02-01
We apply the framework developed in the preceding paper in this series (Smilansky 2017 J. Phys. A: Math. Theor. 50 215301) to compute the time-delay distribution in the scattering of ultra short radio frequency pulses on complex networks of transmission lines which are modeled by metric (quantum) graphs. We consider wave packets which are centered at high wave number and comprise many energy levels. In the limit of pulses of very short duration we compute upper and lower bounds to the actual time-delay distribution of the radiation emerging from the network using a simplified problem where time is replaced by the discrete count of vertex-scattering events. The classical limit of the time-delay distribution is also discussed and we show that for finite networks it decays exponentially, with a decay constant which depends on the graph connectivity and the distribution of its edge lengths. We illustrate and apply our theory to a simple model graph where an algebraic decay of the quantum time-delay distribution is established.
Accelerometer Method and Apparatus for Integral Display and Control Functions
NASA Technical Reports Server (NTRS)
Bozeman, Richard J., Jr. (Inventor)
1996-01-01
Method and apparatus for detecting mechanical vibrations and outputting a signal in response thereto. An accelerometer package having integral display and control functions is suitable for mounting upon the machinery to be monitored. Display circuitry provides signals to a bar graph display which may be used to monitor machine conditions over a period of time. Control switches may be set which correspond to elements in the bar graph to provide an alert if vibration signals increase in amplitude over a selected trip point. The circuitry is shock mounted within the accelerometer housing. The method provides for outputting a broadband analog accelerometer signal, integrating this signal to produce a velocity signal, integrating and calibrating the velocity signal before application to a display driver, and selecting a trip point at which a digitally compatible output signal is generated.
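The signal path described (integrate acceleration to velocity, then compare against a trip point) can be sketched numerically. The sample values, time step, and trapezoidal integration here are illustrative choices, not the patented circuitry.

```python
def velocity_alerts(accel_samples, dt, trip_point):
    """Integrate a sampled acceleration signal to velocity (trapezoidal rule)
    and flag every sample index where |velocity| exceeds the trip point."""
    v, velocities, alerts = 0.0, [], []
    for i in range(1, len(accel_samples)):
        v += 0.5 * (accel_samples[i - 1] + accel_samples[i]) * dt
        velocities.append(v)
        if abs(v) > trip_point:
            alerts.append(i)
    return velocities, alerts

velocities, alerts = velocity_alerts([0.0, 1.0, 1.0, 1.0],
                                     dt=1.0, trip_point=1.2)
print(alerts)  # [2, 3]
```

In the hardware, the alert would drive the digitally compatible output signal rather than return an index list.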
Integrated segmentation and recognition of connected Ottoman script
NASA Astrophysics Data System (ADS)
Yalniz, Ismet Zeki; Altingovde, Ismail Sengor; Güdükbay, Uğur; Ulusoy, Özgür
2009-11-01
We propose a novel context-sensitive segmentation and recognition method for connected letters in Ottoman script. This method first extracts a set of segments from a connected script and determines the candidate letters to which extracted segments are most similar. Next, a function is defined for scoring each different syntactically correct sequence of these candidate letters. To find the candidate letter sequence that maximizes the score function, a directed acyclic graph is constructed. The letters are finally recognized by computing the longest path in this graph. Experiments using a collection of printed Ottoman documents reveal that the proposed method provides >90% precision and recall figures in terms of character recognition. In a further set of experiments, we also demonstrate that the framework can be used as a building block for an information retrieval system for digital Ottoman archives.
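The core search described, finding the highest-scoring syntactically valid letter sequence as a longest path in a directed acyclic graph, can be sketched with a topological-order dynamic program. The node names, scores, and graph layout below are invented placeholders, not the authors' actual segmentation features.

```python
def best_sequence(graph, scores, start, end):
    """Highest-scoring path through a DAG of candidate letters.
    graph: node -> list of successors; scores: node -> match score.
    Returns (total_score, path)."""
    # topological sort (Kahn's algorithm)
    indeg = {n: 0 for n in graph}
    for n in graph:
        for m in graph[n]:
            indeg[m] += 1
    order = [n for n in graph if indeg[n] == 0]
    for n in order:
        for m in graph[n]:
            indeg[m] -= 1
            if indeg[m] == 0:
                order.append(m)
    # dynamic program: best score and path reaching each node from start
    best = {start: (scores[start], [start])}
    for n in order:
        if n not in best:
            continue
        s, path = best[n]
        for m in graph[n]:
            cand = (s + scores[m], path + [m])
            if m not in best or cand[0] > best[m][0]:
                best[m] = cand
    return best[end]

graph = {"s": ["a", "b"], "a": ["e"], "b": ["e"], "e": []}
scores = {"s": 0, "a": 0.9, "b": 0.4, "e": 0}
print(best_sequence(graph, scores, "s", "e"))  # (0.9, ['s', 'a', 'e'])
```

Because the graph is acyclic, the longest (maximum-score) path is computable in linear time in the number of edges, unlike in general graphs where it is NP-hard.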
Accelerometer Method and Apparatus for Integral Display and Control Functions
NASA Technical Reports Server (NTRS)
Bozeman, Richard J., Jr. (Inventor)
1998-01-01
Method and apparatus for detecting mechanical vibrations and outputting a signal in response thereto is discussed. An accelerometer package having integral display and control functions is suitable for mounting upon the machinery to be monitored. Display circuitry provides signals to a bar graph display which may be used to monitor machine conditions over a period of time. Control switches may be set which correspond to elements in the bar graph to provide an alert if vibration signals increase in amplitude over a selected trip point. The circuitry is shock mounted within the accelerometer housing. The method provides for outputting a broadband analog accelerometer signal, integrating this signal to produce a velocity signal, integrating and calibrating the velocity signal before application to a display driver, and selecting a trip point at which a digitally compatible output signal is generated.
Coding for Single-Line Transmission
NASA Technical Reports Server (NTRS)
Madison, L. G.
1983-01-01
Digital transmission code combines data and clock signals into single waveform. MADCODE needs four standard integrated circuits in generator and converter plus five small discrete components. MADCODE allows simple coding and decoding for transmission of digital signals over a single line.
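MADCODE's exact waveform is not given in the abstract; Manchester coding is a standard, well-known example of the same idea (embedding the clock in the data waveform so one line suffices) and serves here purely as an illustration.

```python
def manchester_encode(bits):
    """Manchester encoding: each bit becomes a mid-bit transition
    (1 -> high/low, 0 -> low/high), so the receiver can recover the
    clock from the single combined waveform."""
    return [half for b in bits for half in ((1, 0) if b else (0, 1))]

def manchester_decode(signal):
    """Recover bits by sampling the first half of each bit period."""
    return [1 if signal[i] == 1 else 0 for i in range(0, len(signal), 2)]

bits = [1, 0, 1, 1]
encoded = manchester_encode(bits)
print(manchester_decode(encoded) == bits)  # True
```

The cost of any such scheme, including MADCODE presumably, is extra bandwidth on the line in exchange for eliminating the separate clock signal.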
Real-time community detection in full social networks on a laptop
Chamberlain, Benjamin Paul; Levy-Kramer, Josh; Humby, Clive
2018-01-01
For a broad range of research and practical applications it is important to understand the allegiances, communities and structure of key players in society. One promising direction towards extracting this information is to exploit the rich relational data in digital social networks (the social graph). As global social networks (e.g., Facebook and Twitter) are very large, most approaches make use of distributed computing systems for this purpose. Distributing graph processing requires solving many difficult engineering problems, which has led some researchers to look at single-machine solutions that are faster and easier to maintain. In this article, we present an approach for analyzing full social networks on a standard laptop, allowing for interactive exploration of the communities in the locality of a set of user-specified query vertices. The key idea is that the aggregate actions of large numbers of users can be compressed into a data structure that encapsulates the edge weights between vertices in a derived graph. Local communities can be constructed by selecting vertices that are connected to the query vertices with high edge weights in the derived graph. This compression is robust to noise and allows for interactive queries of local communities in real-time, which we define to be less than the average human reaction time of 0.25s. We achieve single-machine real-time performance by compressing the neighborhood of each vertex using minhash signatures and facilitate rapid queries through Locality Sensitive Hashing. These techniques reduce query times from hours using industrial desktop machines operating on the full graph to milliseconds on standard laptops. Our method allows exploration of strongly associated regions (i.e., communities) of large graphs in real-time on a laptop. 
It has been deployed in software that is actively used by social network analysts and offers another channel for media owners to monetize their data, helping them to continue to provide free services that are valued by billions of people globally. PMID:29342158
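The compression idea at the heart of the approach, minhash signatures over vertex neighborhoods whose slot-wise agreement estimates Jaccard similarity, can be sketched as follows. This shows only the signature and similarity estimate, not the paper's full LSH query pipeline, and the seeded-SHA1 hash family is an illustrative choice.

```python
import hashlib

def minhash_signature(neighbors, num_hashes=16):
    """MinHash signature of a vertex neighborhood: for each of num_hashes
    seeded hash functions, keep the minimum hash over the neighbor set."""
    sig = []
    for seed in range(num_hashes):
        sig.append(min(
            hashlib.sha1(f"{seed}:{n}".encode()).hexdigest()
            for n in neighbors))
    return sig

def estimated_jaccard(sig_a, sig_b):
    """Fraction of matching signature slots is an unbiased estimate of the
    Jaccard similarity of the two underlying neighbor sets."""
    return sum(a == b for a, b in zip(sig_a, sig_b)) / len(sig_a)

a = minhash_signature({"u1", "u2", "u3", "u4"})
b = minhash_signature({"u1", "u2", "u3", "u5"})
print(estimated_jaccard(a, a))  # 1.0
print(0.0 <= estimated_jaccard(a, b) <= 1.0)  # True
```

Locality Sensitive Hashing then banks these signatures so that only vertices whose bands collide with the query's are compared, which is what makes millisecond queries feasible.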
Securing Provenance of Distributed Processes in an Untrusted Environment
NASA Astrophysics Data System (ADS)
Syalim, Amril; Nishide, Takashi; Sakurai, Kouichi
Recently, there is much concern about the provenance of distributed processes, that is about the documentation of the origin and the processes to produce an object in a distributed system. The provenance has many applications in the forms of medical records, documentation of processes in the computer systems, recording the origin of data in the cloud, and also documentation of human-executed processes. The provenance of distributed processes can be modeled by a directed acyclic graph (DAG) where each node represents an entity, and an edge represents the origin and causal relationship between entities. Without sufficient security mechanisms, the provenance graph suffers from integrity and confidentiality problems, for example changes or deletions of the correct nodes, additions of fake nodes and edges, and unauthorized accesses to the sensitive nodes and edges. In this paper, we propose an integrity mechanism for provenance graph using the digital signature involving three parties: the process executors who are responsible in the nodes' creation, a provenance owner that records the nodes to the provenance store, and a trusted party that we call the Trusted Counter Server (TCS) that records the number of nodes stored by the provenance owner. We show that the mechanism can detect the integrity problem in the provenance graph, namely unauthorized and malicious “authorized” updates even if all the parties, except the TCS, collude to update the provenance. In this scheme, the TCS only needs a very minimal storage (linear with the number of the provenance owners). To protect the confidentiality and for an efficient access control administration, we propose a method to encrypt the provenance graph that allows access by paths and compartments in the provenance graph. We argue that encryption is important as a mechanism to protect the provenance data stored in an untrusted environment. 
We analyze the security of the integrity mechanism, and perform experiments to measure the performance of both mechanisms.
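A much-simplified integrity mechanism in the same spirit, hash-linking provenance nodes and checking a count held by a trusted party, can be sketched as follows. The per-party digital signatures of the actual scheme are replaced here by plain hashes for brevity, so this illustrates tamper-evidence only, not the paper's full protocol.

```python
import hashlib

def add_node(chain, payload):
    """Append a provenance node whose hash commits to its payload and to the
    previous node, so later modification or deletion breaks verification."""
    prev = chain[-1]["hash"] if chain else ""
    h = hashlib.sha256((prev + payload).encode()).hexdigest()
    chain.append({"payload": payload, "hash": h})

def verify(chain, trusted_count):
    """Check the hash links and the node count held by the trusted counter
    (the TCS role): a wrong count reveals truncation or insertion."""
    if len(chain) != trusted_count:
        return False
    prev = ""
    for node in chain:
        expected = hashlib.sha256((prev + node["payload"]).encode()).hexdigest()
        if node["hash"] != expected:
            return False
        prev = node["hash"]
    return True

chain = []
for p in ["ingest", "filter", "publish"]:
    add_node(chain, p)
print(verify(chain, 3))       # True
chain[1]["payload"] = "evil"  # tampering breaks the chain
print(verify(chain, 3))       # False
```

As in the paper's design, the trusted party needs only constant storage per provenance owner (the count), while the provenance store itself can remain untrusted.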
"Battleship Numberline": A Digital Game for Improving Estimation Accuracy on Fraction Number Lines
ERIC Educational Resources Information Center
Lomas, Derek; Ching, Dixie; Stampfer, Eliane; Sandoval, Melanie; Koedinger, Ken
2011-01-01
Given the strong relationship between number line estimation accuracy and math achievement, might a computer-based number line game help improve math achievement? In one study by Rittle-Johnson, Siegler and Alibali (2001), a simple digital game called "Catch the Monster" provided practice in estimating the location of decimals on a…
Multigraph: Reusable Interactive Data Graphs
NASA Astrophysics Data System (ADS)
Phillips, M. B.
2010-12-01
There are surprisingly few good software tools available for presenting time series data on the internet. The most common practice is to use a desktop program such as Excel or Matlab to save a graph as an image which can be included in a web page like any other image. This disconnects the graph from the data in a way that makes updating a graph with new data a cumbersome manual process, and it limits the user to one particular view of the data. The Multigraph project defines an XML format for describing interactive data graphs, and software tools for creating and rendering those graphs in web pages and other internet connected applications. Viewing a Multigraph graph is extremely simple and intuitive, and requires no instructions; the user can pan and zoom by clicking and dragging, in a familiar "Google Maps" kind of way. Creating a new graph for inclusion in a web page involves writing a simple XML configuration file. Multigraph can read data in a variety of formats, and can display data from a web service, allowing users to "surf" through large data sets, downloading only those parts of the data that are needed for display. The Multigraph XML format, or "MUGL" for short, provides a concise description of the visual properties of a graph, such as axes, plot styles, data sources, labels, etc., as well as interactivity properties such as how and whether the user can pan or zoom along each axis. Multigraph reads a file in this format, draws the described graph, and allows the user to interact with it. Multigraph software currently includes a Flash application for embedding graphs in web pages, a Flex component for embedding graphs in larger Flex/Flash applications, and a plugin for creating graphs in the WordPress content management system. Plans for the future include a Java version for desktop viewing and editing, a command line version for batch and server side rendering, and possibly Android and iPhone versions. 
Multigraph is currently in use on several web sites including the US Drought Portal (www.drought.gov), the NOAA Climate Services Portal (www.climate.gov), the Climate Reference Network (www.ncdc.noaa.gov/crn), NCDC's State of the Climate Report (www.ncdc.noaa.gov/sotc), and the US Forest Service's Forest Change Assessment Viewer (ews.forestthreats.org/NPDE/NPDE.html). More information about Multigraph is available from the web site www.multigraph.org. Interactive Multigraph Display of Real Time Weather Data
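As a sketch of the kind of declarative graph description the abstract mentions, a MUGL-style file could be assembled programmatically. The element and attribute names below are illustrative stand-ins, not the actual MUGL schema:

```python
import xml.etree.ElementTree as ET

def build_graph_config(csv_url, x_label, y_label):
    # Assemble a MUGL-style XML graph description (hypothetical schema).
    mugl = ET.Element("mugl")
    axes = ET.SubElement(mugl, "axes")
    ET.SubElement(axes, "axis", id="x", label=x_label, pan="yes", zoom="yes")
    ET.SubElement(axes, "axis", id="y", label=y_label, pan="yes", zoom="yes")
    data = ET.SubElement(mugl, "data")
    ET.SubElement(data, "csv", url=csv_url)  # remote data source for "surfing"
    plot = ET.SubElement(mugl, "plot")
    ET.SubElement(plot, "line", x="x", y="y")
    return ET.tostring(mugl, encoding="unicode")

xml_text = build_graph_config("https://example.org/temps.csv", "Time", "Temperature")
```

The point of such a format is that axes, data sources, and per-axis pan/zoom permissions are all plain declarative attributes that a renderer can interpret.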
Exactly solved models on planar graphs with vertices in Z^3
NASA Astrophysics Data System (ADS)
Kels, Andrew P.
2017-12-01
It is shown how exactly solved edge interaction models on the square lattice may be extended onto more general planar graphs, with edges connecting a subset of next-nearest-neighbour vertices of Z^3. This is done by using local deformations of the square lattice that arise through the use of the star-triangle relation. Similar to Baxter's Z-invariance property, these local deformations leave the partition function invariant up to some simple factors coming from the star-triangle relation. The deformations used here extend the usual formulation of Z-invariance by requiring the introduction of oriented rapidity lines, which form directed closed paths in the rapidity graph of the model. The quasi-classical limit is also considered, in which case the deformations imply a classical Z-invariance property, as well as a related local closure relation, for the action functional of a system of classical discrete Laplace equations.
Graph-based real-time fault diagnostics
NASA Technical Reports Server (NTRS)
Padalkar, S.; Karsai, G.; Sztipanovits, J.
1988-01-01
A real-time fault detection and diagnosis capability is absolutely crucial in the design of large-scale space systems. Some of the existing AI-based fault diagnostic techniques, such as expert systems and qualitative modelling, are frequently ill-suited for this purpose. Expert systems are often inadequately structured, difficult to validate, and suffer from knowledge acquisition bottlenecks. Qualitative modelling techniques sometimes generate a large number of failure source alternatives, thus hampering speedy diagnosis. In this paper we present a graph-based technique which is well suited for real-time fault diagnosis; structured knowledge representation and acquisition; and testing and validation. A Hierarchical Fault Model of the system to be diagnosed is developed. At each level of the hierarchy there exist fault propagation digraphs denoting causal relations between failure modes of subsystems. The edges of such a digraph are weighted with fault propagation time intervals. Efficient and restartable graph algorithms are used for on-line speedy identification of failure source components.
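The core idea of a fault propagation digraph with time-interval weights can be sketched as follows. The component names, edges, and time bounds are invented for illustration, and a single-hop model is used for brevity; the paper's hierarchical, multi-level propagation is more involved:

```python
# Edges: (source failure mode, observable alarm, (t_min, t_max)) where the
# interval bounds the fault propagation time. All values are illustrative.
EDGES = [
    ("pump",  "pressure_low", (1, 3)),
    ("pump",  "flow_low",     (2, 5)),
    ("valve", "flow_low",     (1, 2)),
]

def candidate_sources(alarms):
    # Return failure sources whose outgoing edges can explain every observed
    # alarm within the edge time bounds (single-hop model for brevity).
    graph = {}
    for src, dst, bounds in EDGES:
        graph.setdefault(src, {})[dst] = bounds
    sources = []
    for src, out in graph.items():
        ok = all(dst in out and out[dst][0] <= dt <= out[dst][1]
                 for dst, dt in alarms.items())
        if ok:
            sources.append(src)
    return sources

# Alarms observed at given delays after the first anomaly:
print(candidate_sources({"pressure_low": 2, "flow_low": 4}))  # -> ['pump']
```

Timing information is what prunes the alternatives: a "flow_low" alarm only one time unit after the anomaly is consistent with the valve but not with the pump.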
Improved segmentation of abnormal cervical nuclei using a graph-search based approach
NASA Astrophysics Data System (ADS)
Zhang, Ling; Liu, Shaoxiong; Wang, Tianfu; Chen, Siping; Sonka, Milan
2015-03-01
Reliable segmentation of abnormal nuclei in cervical cytology is of paramount importance in automation-assisted screening techniques. This paper presents a general method for improving the segmentation of abnormal nuclei using a graph-search based approach. More specifically, the proposed method focuses on the improvement of the coarse (initial) segmentation. The improvement relies on a transform that maps round-like borders in the Cartesian coordinate system into lines in the polar coordinate system. Costs consisting of nucleus-specific edge and region information are assigned to the nodes. The globally optimal path in the constructed graph is then identified by dynamic programming. We have tested the proposed method on abnormal nuclei from two cervical cell image datasets, Herlev and H and E stained liquid-based cytology (HELBC), and comparative experiments with recent state-of-the-art approaches demonstrate the superior performance of the proposed method.
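The graph-search step described above amounts to column-wise dynamic programming in the polar domain: one radius is chosen per angle, with a smoothness constraint between neighbouring angles. A minimal sketch (the circular wrap-around constraint that closes the border, and the nucleus-specific cost design, are omitted):

```python
import numpy as np

def optimal_radial_border(cost):
    # cost[r, a]: node cost at radius index r and angle index a. The path
    # picks one radius per angle column and may move at most one radius
    # step between adjacent columns.
    n_r, n_a = cost.shape
    acc = cost.astype(float).copy()
    back = np.zeros((n_r, n_a), dtype=int)
    for a in range(1, n_a):
        for r in range(n_r):
            lo, hi = max(0, r - 1), min(n_r, r + 2)
            k = int(np.argmin(acc[lo:hi, a - 1]))
            acc[r, a] = cost[r, a] + acc[lo + k, a - 1]
            back[r, a] = lo + k
    r = int(np.argmin(acc[:, -1]))    # globally cheapest endpoint
    path = [r]
    for a in range(n_a - 1, 0, -1):   # trace the optimal path back
        r = int(back[r, a])
        path.append(r)
    return path[::-1]

cost = np.full((3, 4), 5.0)
cost[1, :] = 1.0                      # a cheap "border" at the middle radius
path = optimal_radial_border(cost)
```

Because round-like borders become nearly horizontal lines after the polar transform, this simple per-column recurrence finds the globally optimal border efficiently.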
[Health for All-Italia: an indicator system on health].
Burgio, Alessandra; Crialesi, Roberta; Loghi, Marzia
2003-01-01
The Health for All - Italia information system collects health data from several sources. It is intended to be a cornerstone for achieving an overview of health in Italy. Health is analyzed at different levels, ranging from health services and health needs to lifestyles and demographic, social, economic and environmental contexts. The software associated with the database allows users to turn statistical data into graphs and tables and to carry out simple statistical analyses. It is therefore possible to view the indicators' time series, make simple projections, and compare the various indicators over the years for each territorial unit. This is possible by means of tables, graphs (histograms, line graphs, frequencies, linear regression with calculation of correlation coefficients, etc.) and maps. These charts can be exported to other programs (e.g. Word, Excel, PowerPoint), or they can be directly printed in color or black and white.
NASA Astrophysics Data System (ADS)
Raymond, M.
1982-06-01
The Karasek Home is a single family Massachusetts residence whose active-solar-energy system is equipped with 640 square feet of trickle-down liquid flat-plate collectors, storage in a 300-gallon tank and a 2000-gallon tank embedded in a rock bin in the basement, and an oil-fired glass-lined 40-gallon domestic hot water tank for auxiliary water and space heating. Monthly performance data are tabulated for the overall system and for the collector, storage, space heating, and domestic hot water subsystems. For each month a graph is presented of collector array efficiency versus the difference between the inlet water temperature and ambient temperature divided by insolation. Typical system operation is illustrated by graphs of insolation and temperatures at different parts of the system versus time for a typical day. The typical system operating sequence for a day is also graphed as well as solar energy utilization and heat losses.
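The monthly collector graph described above plots efficiency against (inlet temperature minus ambient temperature) divided by insolation. One such point could be computed as follows; the numerical values and units are illustrative only, not taken from the Karasek report:

```python
def efficiency_point(q_collected, area_ft2, insolation, t_in, t_amb):
    # One point for the collector characteristic graph:
    # y = collector array efficiency, x = (T_inlet - T_ambient) / insolation.
    eta = q_collected / (area_ft2 * insolation)   # fraction of incident energy captured
    x = (t_in - t_amb) / insolation
    return x, eta

# Illustrative numbers for a 640 ft^2 array (values assumed, not measured):
x, eta = efficiency_point(q_collected=64000, area_ft2=640, insolation=250,
                          t_in=120, t_amb=40)
```

Higher inlet-minus-ambient temperature relative to insolation means larger thermal losses, which is why efficiency falls from left to right on such plots.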
Breast histopathology image segmentation using spatio-colour-texture based graph partition method.
Belsare, A D; Mushrif, M M; Pangarkar, M A; Meshram, N
2016-06-01
This paper proposes a novel integrated spatio-colour-texture based graph partitioning method for segmentation of nuclear arrangement in tubules with a lumen, or in solid islands without a lumen, from digitized Hematoxylin-Eosin stained breast histology images, in order to automate the process of histology breast image analysis and assist pathologists. We propose a new similarity-based superpixel generation method and integrate it with texton representation to form a spatio-colour-texture map of the breast histology image. A new weighted-distance based similarity measure is then used for generation of the graph, and the final segmentation is obtained using the normalized cuts method. The extensive experiments carried out show that the proposed algorithm can segment nuclear arrangement in normal as well as malignant ducts in breast histology tissue images. For evaluation of the proposed method, a ground-truth image database of 100 malignant and nonmalignant breast histology images was created with the help of two expert pathologists, and a quantitative evaluation of the proposed breast histology image segmentation has been performed. It shows that the proposed method outperforms other methods.
Jelicic Kadic, Antonia; Vucic, Katarina; Dosenovic, Svjetlana; Sapunar, Damir; Puljak, Livia
2016-06-01
To compare the speed and accuracy of graphical data extraction using manual estimation and open source software. Data points from eligible graphs/figures published in randomized controlled trials (RCTs) from 2009 to 2014 were extracted by two authors independently, both by manual estimation and with Plot Digitizer, open source software. Corresponding authors of each RCT were contacted up to four times via e-mail to obtain the exact numbers that were used to create the graphs. Accuracy of each method was compared against the source data from which the original graphs were produced. Software data extraction was significantly faster, reducing extraction time by 47%. Percent agreement between the two raters was 51% for manual and 53.5% for software data extraction. Percent agreement between the raters and the original data was 66% vs. 75% for the first rater and 69% vs. 73% for the second rater, for manual and software extraction, respectively. Data extraction from figures should be conducted using software, whereas manual estimation should be avoided. Using software for extraction of data presented only in figures is faster and enables higher interrater reliability.
Comparing morphologies of drainage basins on Mars and Earth using integral-geometry and neural maps
NASA Technical Reports Server (NTRS)
Stepinski, T. F.; Coradetti, S.
2004-01-01
We compare morphologies of drainage basins on Mars and Earth in order to constrain the formation process of Martian valley networks. Basins on both planets are computationally extracted from digital topography. Integral-geometry methods are used to represent each basin by a circularity function that encapsulates its internal structure. The shape of such a function is an indicator of the style of fluvial erosion. We use the self-organizing map technique to construct a similarity graph for all basins. The graph reveals systematic differences between morphologies of basins on the two planets. This dichotomy indicates that terrestrial and Martian surfaces were eroded differently. We argue that morphologies of Martian basins are incompatible with runoff from sustained, homogeneous rainfall. Fluvial environments compatible with observed morphologies are discussed. We also construct a similarity graph based on the comparison of the basins' hypsometric curves to demonstrate that hypsometry is incapable of discriminating between terrestrial and Martian basins. INDEX TERMS: 1824 Hydrology: Geomorphology (1625); 1886 Hydrology: Weathering (1625); 5415 Planetology: Solid Surface Planets: Erosion and weathering; 6225 Planetology: Solar System Objects: Mars. Citation: Stepinski, T. F., and S. Coradetti (2004), Comparing morphologies of drainage basins on Mars and Earth using integral-geometry and neural maps.
A soft kinetic data structure for lesion border detection.
Kockara, Sinan; Mete, Mutlu; Yip, Vincent; Lee, Brendan; Aydin, Kemal
2010-06-15
Medical imaging and image processing techniques, ranging from microscopic to macroscopic, have become one of the main components of diagnostic procedures to assist dermatologists in their medical decision-making processes. Computer-aided segmentation and border detection on dermoscopic images is one of the core components of diagnostic procedures and therapeutic interventions for skin cancer. Automated assessment tools for dermoscopic images have become an important research field mainly because of inter- and intra-observer variations in human interpretations. In this study, a novel approach for automatic border detection in dermoscopic images, the graph spanner, is proposed. In this approach, a proximity graph representation of dermoscopic images is presented in order to detect regions and borders in skin lesions. The graph spanner approach is examined on a set of 100 dermoscopic images whose borders, manually drawn by a dermatologist, are used as the ground truth. Error rates, false positives and false negatives, along with true positives and true negatives, are quantified by digitally comparing results with the manually determined borders. The results show that the highest precision and recall rates obtained in determining lesion boundaries are 100%. However, accuracy averages out at 97.72%, and the mean border error is 2.28% for the whole dataset.
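The evaluation quantities mentioned above follow directly from the pixel-level confusion counts. A small sketch with invented counts:

```python
def border_metrics(tp, fp, tn, fn):
    # Pixel-level agreement between an automatic and a manual lesion border.
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    error = 1.0 - accuracy
    return precision, recall, accuracy, error

# Illustrative counts for one image (not from the paper's dataset):
p, r, acc, err = border_metrics(tp=9500, fp=0, tn=90000, fn=500)
```

With zero false positives, precision is 100% even though some lesion pixels are missed, which is why precision/recall and overall accuracy/error are reported separately.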
Automation in photogrammetry: Recent developments and applications (1972-1976)
Thompson, M.M.; Mikhail, E.M.
1976-01-01
An overview of recent developments in the automation of photogrammetry in various countries is presented. Conclusions regarding automated photogrammetry reached at the 1972 Congress in Ottawa are reviewed first as a background for examining the developments of 1972-1976. Applications are described for each country reporting significant developments. Among fifteen conclusions listed are statements concerning: the widespread practice of equipping existing stereoplotters with simple digitizers; the growing tendency to use minicomputers on-line with stereoplotters; the optimization of production of digital terrain models by progressive sampling in stereomodels; the potential of digitization of a photogrammetric model by density correlation on epipolar lines; the capabilities and economic aspects of advanced systems which permit simultaneous production of orthophotos, contours, and digital terrain models; the economy of off-line orthophoto systems; applications of digital image processing; automation by optical techniques; applications of sensors other than photographic imagery, and the role of photogrammetric phases in a completely automated cartographic system. © 1976.
ERIC Educational Resources Information Center
Padula, Janice
2014-01-01
If educators want to interest students in mathematics (and science), they must engage them in the lower forms of high school or even earlier (Fisher, 2012). So, teachers should always consider a topic's ability to interest students in the early years of instruction in high school and its topicality. Networks have come into prominence recently with…
An automated system for creep testing
NASA Technical Reports Server (NTRS)
Spiegel, F. Xavier; Weigman, Bernard J.
1992-01-01
A completely automated data collection system was devised to measure, analyze, and graph creep versus time using a PC, a 16-channel multiplexed analog-to-digital converter, and low-friction potentiometers to measure length. The sampling rate for each experiment can be adjusted in the software to meet the needs of the material tested. Data are collected and stored on a diskette for permanent record and also for later data analysis on a different machine.
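The logging step of such a system can be sketched as below: sampled channel voltages are converted to elongation and written out as a time-stamped record. The calibration factor and sample values are assumed for illustration:

```python
import csv
import io

def log_creep(samples, dt):
    # Convert channel voltage samples into elongation vs. time and write a
    # CSV record, mimicking the PC + multiplexed A/D + potentiometer setup.
    MM_PER_VOLT = 2.5  # potentiometer calibration (assumed, not from the paper)
    buf = io.StringIO()
    w = csv.writer(buf)
    w.writerow(["time_s", "elongation_mm"])
    for i, v in enumerate(samples):
        w.writerow([i * dt, round(v * MM_PER_VOLT, 4)])
    return buf.getvalue()

record = log_creep([0.00, 0.10, 0.18], dt=60.0)  # one sample per minute
```

An adjustable `dt` corresponds to the software-selectable sampling rate mentioned in the abstract: creep tests on slow-creeping materials need far fewer samples per hour than fast ones.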
Multi-Disciplinary Techniques for Understanding Time-Varying Space-Based Imagery.
1985-05-10
problem, and discuss the importance of this work to Air Force technology and to related Air Force programs. Section 1.5 provides a summary of...development of new algorithms and their realization in a hybrid optical/digital architecture. However, devices and architectures being developed in related ...and relate these representations to object and surface contour properties of the scene. The techniques studied included Probabilistic Graph Matching
Data Rescue for precipitation station network in Slovak Republic
NASA Astrophysics Data System (ADS)
Fasko, Pavel; Bochníček, Oliver; Švec, Marek; Paľušová, Zuzana; Markovič, Ladislav
2016-04-01
Transparency of archive catalogues is very important for data rescue: it supports further activities such as digitisation and homogenization. Visualization of the continuity of time series at precipitation stations (approximately 1250 stations) is currently under way in the Slovak Republic, covering the whole period of observation (meteorological stations gradually began to operate in Slovakia during the second half of the 19th century). The visualization is accompanied by activities such as verification and retrieval of the data listed in the archive catalogue, station localization according to the historical yearbooks, conversion of coordinates into x-JTSK, y-JTSK, and assignment of stations to hydrological catchments. Clustering the precipitation stations of a specific hydrological catchment on a map and visualizing the duration of their records (as a line graph) will allow effective assignment of corresponding precipitation stations for the prolongation of time series. This process should be followed by change-point or trend detection and homogenization. The risks and problems in verifying records from archive catalogues, digitising them, repairing them, and visualizing the results are shown in the poster. While searching through the historical and often short time series, we realized the importance mainly of those stations located at middle and higher altitudes. They might be used as replacements for the fictive points quoted up to now in the construction of precipitation maps. Supplementing and enhancing the time series of individual stations will make it possible to follow changes in precipitation totals over a given period, as well as areal totals for individual catchments in various time periods, which will be appreciated mainly by hydrologists and agro-climatologists.
Jargon that Computes: Today's PC Terminology.
ERIC Educational Resources Information Center
Crawford, Walt
1997-01-01
Discusses PC (personal computer) and telecommunications terminology in context: Integrated Services Digital Network (ISDN); Asymmetric Digital Subscriber Line (ADSL); cable modems; satellite downloads; T1 and T3 lines; magnitudes ("giga-,""nano-"); Central Processing Unit (CPU); Random Access Memory (RAM); Universal Serial Bus…
NASA Astrophysics Data System (ADS)
Iwakuni, Kana; Okubo, Sho; Inaba, Hajime; Onae, Atsushi; Hong, Feng-Lei; Sasada, Hiroyuki; Yamada, Koichi MT
2016-06-01
We observe that the pressure-broadening coefficients depend on the ortho-para levels. The spectrum is taken with a dual-comb spectrometer which has a resolution of 48 MHz and a frequency accuracy of eight digits when the signal-to-noise ratio is more than 20. In this study, spectra about 4.4 THz wide, covering the P(31) to R(31) transitions in the ν1+ν3 vibration band of 12C2H2, are observed at pressures of 25, 60, 396, 1047, 1962 and 2654 Pa. Each rotation-vibration absorption line is fitted to a Voigt function, and we determine the pressure-broadening coefficient for each rotation-vibration transition. The figure shows the pressure-broadening coefficient as a function of m, where m is J"+1 for the R branch and -J" for the P branch. The graph shows an obvious dependence on the ortho and para levels. We fit it to a Padé function considering the population ratio of three-to-one for the ortho and para levels. This should lead to a detailed understanding of the pressure-broadening mechanism. S. Okubo et al., Applied Physics Express 8, 082402 (2015)
1991-12-01
2.6.1 Multi-Shape Detection; 2.6.2 Line Segment Extraction and Re-Combination; 2.6.3 Planimetric Feature Extraction; 2.6.4 Line Segment Extraction From Statistical Texture Analysis; 2.6.5 Edge Following as Graph... ...image after image, could benefit due to the fact that major spatial characteristics of subregions could be extracted, and minor spatial changes could be...
Liquid rocket booster study. Volume 2, book 2, appendix 1: Trade studies
NASA Technical Reports Server (NTRS)
1988-01-01
A list is presented of the trade studies that were planned and the status to which they have been carried out. Full descriptions of the trade studies are also given, along with line drawings and graphs illustrating the studies.
Synthetic aperture in terahertz in-line digital holography for resolution enhancement.
Huang, Haochong; Rong, Lu; Wang, Dayong; Li, Weihua; Deng, Qinghua; Li, Bin; Wang, Yunxin; Zhan, Zhiqiang; Wang, Xuemin; Wu, Weidong
2016-01-20
Terahertz digital holography is a combination of terahertz technology and digital holography. In digital holography, the imaging resolution is the key parameter determining the detail quality of a reconstructed wavefront. In this paper, the synthetic aperture method is applied to terahertz digital holography, and an in-line arrangement is built to perform the detection. The limited resolving power of previous terahertz digital holographic systems has prevented the technique from meeting the requirements of practical detection. In contrast, the experimental resolving power of the present method reaches 125 μm, which is the best resolution in terahertz digital holography to date. Furthermore, a basic examination of a biological specimen is conducted to show the practical application. In all, the results of the proposed method demonstrate the enhancement of experimental imaging resolution, and that the amplitude and phase distributions of the fine structure of samples can be reconstructed using terahertz digital holography.
NASA Astrophysics Data System (ADS)
Zhou, Hang
Quantum walks are the quantum mechanical analogue of classical random walks. Discrete-time quantum walks have been introduced and studied mostly on the line Z or on higher-dimensional spaces Z^d, but rarely defined on graphs with fractal dimensions, because the coin operator depends on the position and the Fourier transform is not defined on fractals. Inspired by the nature of classical walks, different quantum walks can be defined by choosing different shift and coin operators. When the coin operator is uniform, the results of classical walks are obtained upon measurement at each step. Moreover, with measurement at each step, our results reveal more information about the classical random walks. In this dissertation, two graphs with fractal dimensions are considered. The first is the Sierpinski gasket, a degree-4 regular graph with Hausdorff dimension df = ln 3 / ln 2. The second is the Cantor graph, derived like the Cantor set, with Hausdorff dimension df = ln 2 / ln 3. The definitions and amplitude functions of the quantum walks are introduced. The main part of this dissertation derives a recursive formula to compute the amplitude Green function. The exit probability is computed and compared with the classical results. As the generation of the graphs goes to infinity, the recursion of the walks is investigated and the convergence rates are obtained and compared with their classical counterparts.
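For orientation, the standard discrete-time quantum walk on the line Z (the well-studied base case the dissertation contrasts with its fractal graphs) can be sketched in a few lines; the fractal-graph walks themselves are the dissertation's contribution and are not reproduced here:

```python
import numpy as np

def hadamard_walk(steps):
    # Discrete-time quantum walk on the line Z with the Hadamard coin.
    # Returns the position probability distribution after `steps` steps.
    n = 2 * steps + 1                        # reachable positions -steps..steps
    amp = np.zeros((n, 2), dtype=complex)    # amplitude[position, coin state]
    amp[steps, 0] = 1.0                      # start at the origin, coin "up"
    h = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2.0)
    for _ in range(steps):
        amp = amp @ h.T                      # coin toss at every position
        new = np.zeros_like(amp)
        new[:-1, 0] = amp[1:, 0]             # coin state 0 shifts left
        new[1:, 1] = amp[:-1, 1]             # coin state 1 shifts right
        amp = new
    return (np.abs(amp) ** 2).sum(axis=1)    # measure the position

prob = hadamard_walk(10)
```

Unlike the classical walk's binomial spreading, the coherent evolution produces the characteristic two-peaked, ballistically spreading distribution; measuring at every step instead collapses the walk back to classical behaviour, as the abstract notes for the uniform coin.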
Bell, M R; Britson, P J; Chu, A; Holmes, D R; Bresnahan, J F; Schwartz, R S
1997-01-01
We describe a method of validation of computerized quantitative coronary arteriography and report the results of a new UNIX-based quantitative coronary arteriography software program developed for rapid on-line (digital) and off-line (digital or cinefilm) analysis. The UNIX operating system is widely available in computer systems using very fast processors and has excellent graphics capabilities. The system is potentially compatible with any cardiac digital x-ray system for on-line analysis and has been designed to incorporate an integrated database, have on-line and immediate recall capabilities, and provide digital access to all data. The accuracy (mean signed difference of the observed minus the true dimensions) and precision (pooled standard deviation of the measurements) of the program were determined using x-ray vessel phantoms. Intra- and interobserver variabilities were assessed from in vivo studies during routine clinical coronary arteriography. Precision from the x-ray phantom studies (6-in. field of view) was 0.066 mm for digital images and 0.060 mm for digitized cine images. Accuracy was 0.076 mm (overestimation) for digital images compared to 0.008 mm for digitized cine images. Diagnostic coronary catheters were also used for calibration; accuracy varied according to the size of the catheter and whether or not it was filled with iodinated contrast. Intra- and interobserver variabilities were excellent and indicated that coronary lesion measurements were relatively user-independent. Thus, this easy-to-use and very fast UNIX-based program appears to be robust, with optimal accuracy and precision for clinical and research applications.
Army Medical Imaging System - ARMIS
1992-08-08
modems, scanners, hard disk drives, dot matrix printers, erasable-optical disc drives, CD-ROM drives, WORM disc drives and tape drives are fully...can use 56K leased lines, T1 links, digital data circuits, or public telephone lines. 3. ISDN The Integrated Services Digital Network, ISDN, is a
Graph cuts for curvature based image denoising.
Bae, Egil; Shi, Juan; Tai, Xue-Cheng
2011-05-01
Minimization of total variation (TV) is a well-known method for image denoising. Recently, the relationship between TV minimization problems and binary MRF models has been much explored. This has resulted in some very efficient combinatorial optimization algorithms for the TV minimization problem in the discrete setting via graph cuts. To overcome limitations, such as staircasing effects, of the relatively simple TV model, variational models based upon higher order derivatives have been proposed. The Euler's elastica model is one such higher order model of central importance, which minimizes the curvature of all level lines in the image. Traditional numerical methods for minimizing the energy in such higher order models are complicated and computationally complex. In this paper, we will present an efficient minimization algorithm based upon graph cuts for minimizing the energy in the Euler's elastica model, by simplifying the problem to that of solving a sequence of easy graph representable problems. This sequence has connections to the gradient flow of the energy function, and converges to a minimum point. The numerical experiments show that our new approach is more effective in maintaining smooth visual results while preserving sharp features better than TV models.
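For context, the baseline TV model that the elastica work builds on can be sketched as a smoothed gradient-flow minimization of sum |grad u| + (lam/2) * sum (u - f)^2. This is only the simple TV baseline for intuition; the paper's graph-cut construction and the elastica curvature term are far more involved and are not reproduced here:

```python
import numpy as np

def tv_denoise(img, lam=0.1, tau=0.01, iters=300, eps=0.1):
    # Explicit gradient descent on the smoothed TV energy
    #   E(u) = sum sqrt(|grad u|^2 + eps^2) + lam/2 * sum (u - img)^2.
    u = img.astype(float).copy()
    for _ in range(iters):
        ux = np.roll(u, -1, axis=1) - u                  # forward differences
        uy = np.roll(u, -1, axis=0) - u
        mag = np.sqrt(ux ** 2 + uy ** 2 + eps ** 2)
        px, py = ux / mag, uy / mag                      # normalized gradient field
        div = (px - np.roll(px, 1, axis=1)) + (py - np.roll(py, 1, axis=0))
        u -= tau * (lam * (u - img) - div)               # descent step
    return u
```

The smoothing parameter eps and the small step size tau keep the explicit scheme stable; the graph-cut formulation discussed in the paper avoids such step-size tuning entirely by solving combinatorial subproblems exactly.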
Do, Hongdo; Molania, Ramyar
2017-01-01
The identification of genomic rearrangements with high sensitivity and specificity using massively parallel sequencing remains a major challenge, particularly in precision medicine and cancer research. Here, we describe a new method for detecting rearrangements, GRIDSS (Genome Rearrangement IDentification Software Suite). GRIDSS is a multithreaded structural variant (SV) caller that performs efficient genome-wide break-end assembly prior to variant calling using a novel positional de Bruijn graph-based assembler. By combining assembly, split read, and read pair evidence using a probabilistic scoring, GRIDSS achieves high sensitivity and specificity on simulated, cell line, and patient tumor data, recently winning SV subchallenge #5 of the ICGC-TCGA DREAM8.5 Somatic Mutation Calling Challenge. On human cell line data, GRIDSS halves the false discovery rate compared to other recent methods while matching or exceeding their sensitivity. GRIDSS identifies nontemplate sequence insertions, microhomologies, and large imperfect homologies, estimates a quality score for each breakpoint, stratifies calls into high or low confidence, and supports multisample analysis. PMID:29097403
Integrating the ECG power-line interference removal methods with rule-based system.
Kumaravel, N; Senthil, A; Sridhar, K S; Nithiyanandam, N
1995-01-01
The power-line frequency interference in electrocardiographic signals is eliminated to enhance the signal characteristics for diagnosis. The power-line frequency normally varies by ±1.5 Hz from its standard value of 50 Hz. In the present work, the performances of the linear FIR filter, the wave digital filter (WDF) and the adaptive filter are studied for power-line frequency variations from 48.5 to 51.5 Hz in steps of 0.5 Hz. The advantage of the LMS adaptive filter over fixed-frequency filters in removing power-line interference, even when the interference frequency deviates by ±1.5 Hz from its nominal value of 50 Hz, is clearly demonstrated. A novel method of integrating a rule-based system approach with the linear FIR filter, and also with the wave digital filter, is proposed. The performances of the rule-based FIR filter and the rule-based wave digital filter are compared with the LMS adaptive filter.
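A minimal sketch of the LMS adaptive cancellation idea (the signal, step size, and sampling rate below are invented for illustration; the paper's rule-based integration is not shown):

```python
import numpy as np

def lms_cancel(signal, fs, f_hum=50.0, mu=0.01):
    # Two quadrature references at the nominal hum frequency let the LMS
    # filter track amplitude and phase, and small frequency deviations,
    # around 50 Hz.
    n = len(signal)
    t = np.arange(n) / fs
    refs = np.stack([np.sin(2 * np.pi * f_hum * t),
                     np.cos(2 * np.pi * f_hum * t)])
    w = np.zeros(2)
    out = np.empty(n)
    for i in range(n):
        y = w @ refs[:, i]           # current interference estimate
        e = signal[i] - y            # cleaned sample = error signal
        w += 2 * mu * e * refs[:, i] # LMS weight update
        out[i] = e
    return out

fs = 500.0
t = np.arange(2000) / fs
hum = 0.8 * np.sin(2 * np.pi * 50.0 * t + 0.3)  # pure interference input
clean = lms_cancel(hum, fs)                     # residual decays as LMS converges
```

Because the weights adapt continuously, the canceller follows a hum that drifts within the ±1.5 Hz band, which is exactly where fixed-frequency notch designs degrade.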
2016 Resembles Past Global Dust Storm Years on Mars
2016-10-05
This graphic indicates a similarity between 2016 (dark blue line) and five past years in which Mars has experienced a global dust storm (orange lines and band), compared to years with no global dust storm (blue-green lines and band). The arrow nearly midway across in the dark blue line indicates the Mars time of year in late September 2016. A key factor in the graph is the orbital angular momentum of Mars, which would be steady in a system of only one planet orbiting the sun, but varies due to relatively small effects of having other planets in the solar system. The horizontal scale is time of year on Mars, starting at left with the planet's farthest distance from the sun in each orbit. This point in the Mars year, called "Mars aphelion," corresponds to late autumn in the southern hemisphere. Numeric values on the horizontal axis are in Earth years; each Mars year lasts for about 1.9 Earth years. The vertical scale bar at left applies only to the black-line curve on the graph. The amount of solar energy entering Mars' atmosphere (in watts per square meter) peaks at the time of year when Mars is closest to the sun, corresponding to late spring in the southern hemisphere. The duration of Mars' dust storm season, as indicated, brackets the time of maximum solar input to the atmosphere. The scale bar at right, for orbital angular momentum, applies to the blue, brown and blue-green curves on the graph. The values are based on mass, velocity and distance from the gravitational center of the solar system. Additional information on the units is in a 2015 paper in the journal Icarus, from which this graph is derived. The band shaded in orange is superimposed on the curves of angular momentum for five Mars years that were accompanied by global dust storms in 1956, 1971, 1982, 1994 and 2007. Brown diamond symbols on the curves for these years indicate the times when the global storms began. 
The band shaded blue-green lies atop angular momentum curves for six years when no global dust storms occurred: 1939, 1975, 1988, 1998, 2000 and 2011. Note that in 2016, as in the pattern of curves for years with global dust storms, the start of the dust storm season corresponded to a period of increasing orbital angular momentum. In years with no global storm, angular momentum was declining at that point. Observations of whether dust from regional storms on Mars spreads globally in late 2016 or early 2017 will determine whether this correspondence holds up for the current Mars year. http://photojournal.jpl.nasa.gov/catalog/PIA20855
Modelling prehistoric terrain Models using LiDAR-data: a geomorphological approach
NASA Astrophysics Data System (ADS)
Höfler, Veit; Wessollek, Christine; Karrasch, Pierre
2015-10-01
Terrain surfaces preserve human activities in the form of textures and structures. With reference to archaeological questions, this geological archive is investigated by means of models regarding anthropogenic traces. In doing so, the high-resolution digital terrain model is of inestimable value for decoding the archive. The evaluation of these terrain models and the reconstruction of historical surfaces is still a challenging issue. Despite data collection by means of LiDAR systems (light detection and ranging) and subsequent pre-processing and filtering, present-day anthropogenic artefacts are still found in the digital terrain model. Analyses have shown that elements such as contour lines and channels can be extracted well from a high-resolution digital terrain model. Channels in settlement areas show a clear anthropogenic character. The same can be observed for contour lines: some contour lines represent a possibly natural ground surface and avoid anthropogenic artefacts, while, comparably to channels, noticeable patterns of contour lines become visible in areas with anthropogenic artefacts. The presented workflow uses functionalities of ArcGIS and the programming language R. The method starts with the extraction of contour lines from the digital terrain model. Through macroscopic analyses based on geomorphological expert knowledge, contour lines are selected that represent the natural geomorphological character of the surface. In a first step, points are determined along each contour line at regular intervals. These points, together with the corresponding height information taken from the original digital terrain model, are saved as a point cloud. 
Using the programme library gstat, a variographic analysis is then performed, followed by a Kriging procedure based on it.2-4 The result is a digital terrain model, filtered according to geomorphological expert knowledge, that shows no human degradation in terms of artefacts, preserves the landscape-genetic character, and can be called a prehistoric terrain model.
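The interpolation step of such a workflow can be sketched in code. The paper uses R's gstat; below is a minimal NumPy sketch of ordinary kriging with an assumed exponential variogram (the sill and range parameters would normally come out of the variographic analysis, and are invented here):

```python
import numpy as np

def exp_variogram(h, sill, rng):
    # Exponential variogram model; sill and range would normally be fitted
    # during the variographic analysis (gstat does this in the paper).
    return sill * (1.0 - np.exp(-h / rng))

def ordinary_kriging(xy, z, xy0, sill=1.0, rng=50.0):
    # Solve the ordinary-kriging system for one prediction location xy0.
    n = len(z)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = exp_variogram(d, sill, rng)
    A[n, n] = 0.0                      # Lagrange-multiplier row/column
    b = np.ones(n + 1)
    b[:n] = exp_variogram(np.linalg.norm(xy - xy0, axis=1), sill, rng)
    w = np.linalg.solve(A, b)
    return float(w[:n] @ z)            # weights sum to 1 by construction
```

Because ordinary kriging is an exact interpolator, predicting at a sampled contour point returns that point's height.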
Fully digital programmable optical frequency comb generation and application.
Yan, Xianglei; Zou, Xihua; Pan, Wei; Yan, Lianshan; Azaña, José
2018-01-15
We propose a fully digital programmable optical frequency comb (OFC) generation scheme based on binary phase-sampling modulation, wherein an optimized bit sequence is applied to phase modulate a narrow-linewidth light wave. Programming the bit sequence enables us to tune both the comb spacing and comb-line number (i.e., number of comb lines). The programmable OFCs are also characterized by ultra-flat spectral envelope, uniform temporal envelope, and stable bias-free setup. Target OFCs are digitally programmed to have 19, 39, 61, 81, 101, or 201 comb lines and to have a 100, 50, 20, 10, 5, or 1 MHz comb spacing. As a demonstration, a scanning-free temperature sensing system using a proposed OFC with 1001 comb lines was also implemented with a sensitivity of 0.89°C/MHz.
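The comb-spacing tuning can be illustrated numerically. This is only a toy model (an ideal 0/π phase modulator driven by a periodic bit pattern, not the authors' optimized sequences): the spectrum of a periodically phase-modulated field contains lines only at multiples of the pattern repetition rate, so reprogramming the symbol rate or pattern period retunes the comb spacing.

```python
import numpy as np

rng = np.random.default_rng(0)
pattern = rng.integers(0, 2, 32)                  # one period of a hypothetical bit sequence
field = np.exp(1j * np.pi * np.tile(pattern, 8))  # ideal binary (0/pi) phase modulation
spec = np.abs(np.fft.fft(field)) ** 2

# A signal periodic with a 32-sample period has power only in every 8th bin
# of the 256-point FFT: these bins are the comb lines.
comb_bins = spec[::8]
other_bins = np.delete(spec, np.arange(0, spec.size, 8))
```

Doubling the pattern length halves the line spacing, which is the digital-programmability idea in miniature.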
Coverage Maximization Using Dynamic Taint Tracing
2007-03-28
we do not have source code are handled, incompletely, via models of taint transfer. We use a little language to specify how taint transfers across a...n) 2.3.7 Implementation and Runtime Issues The taint graph instrumentation is a 2K line Ocaml module extending CIL and is supported by 5K lines of...modern scripting languages such as Ruby have taint modes that work similarly; however, all propagate taint at the variable rather than the byte level and
EROS Data Center Landsat digital enhancement techniques and imagery availability
Rohde, Wayne G.; Lo, Jinn Kai; Pohl, Russell A.
1978-01-01
The US Geological Survey's EROS Data Center (EDC) is experimenting with the production of digitally enhanced Landsat imagery. Advanced digital image processing techniques are used to perform geometric and radiometric corrections and to perform contrast and edge enhancements. The enhanced image product is produced from digitally preprocessed Landsat computer compatible tapes (CCTs) on a laser beam film recording system. Landsat CCT data have several geometric distortions which are corrected when NASA produces the standard film products. When producing film images from CCTs, geometric correction of the data is required. The EDC Digital Image Enhancement System (EDIES) compensates for geometric distortions introduced by Earth's rotation, variable line length, non-uniform mirror scan velocity, and detector misregistration. Radiometric anomalies such as bad data lines and striping are common to many Landsat film products and are also present in the CCT data. Bad data lines or line segments with more than 150 contiguous bad pixels are corrected by inserting data from the previous line in place of the bad data. Striping, caused by variations in detector gain and offset, is removed with a destriping algorithm applied after digitally enhancing the data. Image enhancement is performed by applying a linear contrast stretch and an edge enhancement algorithm. The linear contrast enhancement algorithm is designed to digitally expand the full range of useful data recorded on the CCT over the range of 256 digital counts. This minimizes the effect of atmospheric scattering and saturates the relative brightness of highly reflecting features such as clouds or snow. The intent is that no meaningful terrain data be eliminated by the digital processing. The edge enhancement algorithm is designed to enhance boundaries between terrain features that exhibit subtle differences in brightness values along their edges.
After the digital data have been processed, data for each Landsat band are recorded on black-and-white film with a laser beam film recorder (LBR). The LBR corrects for aspect ratio distortions as the digital data are recorded on the recording film over a preselected density range. Positive transparencies of MSS bands 4, 5, and 7 produced by the LBR are used to make color composite transparencies. Color film positives are made photographically from first generation black-and-white products generated on the LBR.
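Two of the radiometric steps described above, bad-line replacement and the linear contrast stretch, are simple to express in code. This is a schematic NumPy sketch, not the EDIES implementation, and the percentile cutoffs are assumed values:

```python
import numpy as np

def replace_bad_lines(img, bad_rows):
    # EDIES-style fix: substitute the previous scan line for a bad line.
    out = img.copy()
    for r in sorted(bad_rows):
        if r > 0:
            out[r] = out[r - 1]
    return out

def linear_stretch(img, low_pct=1, high_pct=99):
    # Expand the useful data range over the full 0-255 count range,
    # saturating the brightest features (e.g., clouds or snow).
    lo, hi = np.percentile(img, [low_pct, high_pct])
    out = (img.astype(float) - lo) / (hi - lo)
    return np.clip(out * 255.0, 0, 255).astype(np.uint8)
```

The stretch maps the chosen percentile range onto the full 8-bit count range; anything above the upper cutoff clips to 255.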
Dual-channel in-line digital holographic double random phase encryption
Das, Bhargab; Yelleswarapu, Chandra S; Rao, D V G L N
2012-01-01
We present a robust encryption method for the encoding of 2D/3D objects using digital holography and virtual optics. Using our recently developed dual-plane in-line digital holography technique, two in-line digital holograms are recorded at two different planes and are encrypted using two different double random phase encryption configurations, independently. The process of using two mutually exclusive encryption channels makes the system more robust against attacks since both the channels should be decrypted accurately in order to get a recognizable reconstruction. Results show that the reconstructed object is unrecognizable even when the portion of the correct phase keys used during decryption is close to 75%. The system is verified against blind decryptions by evaluating the SNR and MSE. Validation of the proposed method and sensitivities of the associated parameters are quantitatively analyzed and illustrated. PMID:23471012
Ghanma, M A; Rider, R V; Sirageldin, I
1984-01-01
The Lorenz Curve, originally developed to measure the concentration of wealth in a population, was used to describe the distribution of contraceptive practice in Jordan. Data from the 1976 Jordan Fertility Study, carried out as part of the World Fertility Survey program, were used in the analysis. The application of the Automatic Interaction Detector program to the survey's sample population of 3611 women of reproductive age divided the sample into 6 mutually exclusive groups on the basis of residence, education, and whether desired family size was attained or not attained. These 3 characteristics accounted for a major portion of the variation in contraceptive practice. These subgroups, in ascending order by the proportion practicing contraception, were: 1) rural women with unattained desired family size; 2) urban, illiterate women with unattained desired family size; 3) rural women with attained desired family size; 4) urban, literate women with unattained desired family size; 5) urban, illiterate women with attained desired family size; and 6) urban, literate women with attained desired family size. The cumulative proportion of the sample in each ordered subdivision was plotted on the X axis of a graph, and the cumulative proportion of those practicing contraception was plotted on the Y axis. A line connecting these points was then drawn. The resultant line was a concave ascending line. If contraceptive practice were evenly distributed in the population, the line would be a straight diagonal. The plotted curved line indicated that contraceptive practice was unevenly distributed in the population. 2 indexes measuring the area between the diagonal and the line resulting from plotting the observed distribution for each subgroup were used to assess the degree of concentration of contraceptive practice in the population. The indexes also indicated that contraceptive practice was unequally distributed.
When separate curves were plotted for the subgroups with attained desired family size and the subgroups without attained desired family size, it was apparent that the distribution of contraceptive practice was more uniform among those with attained desired family size than among the other 3 subgroups. A curve for the distribution of births was then plotted on the same graph. This curve was not a true application of the Lorenz Curve since it was based on the order of the subdivisions by birth rates. The resultant line approached the straight diagonal and indicated that births were fairly evenly distributed in the population. The uneven distribution of contraceptive practice and the uniform distribution of births suggest that contraceptive practice in this population is ineffective. This may be a characteristic of populations in the early stages of fertility control.
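The construction described above reduces to a short computation: order the subgroups by contraceptive prevalence, accumulate population and user shares, and take twice the area between the diagonal and the curve as the concentration index. A sketch with invented illustrative numbers, not the Jordan survey data:

```python
import numpy as np

def concentration_index(group_sizes, users):
    # Order subgroups by proportion practicing contraception (ascending),
    # as in the Lorenz construction described in the abstract.
    rates = users / group_sizes
    order = np.argsort(rates)
    x = np.concatenate([[0.0], np.cumsum(group_sizes[order]) / group_sizes.sum()])
    y = np.concatenate([[0.0], np.cumsum(users[order]) / users.sum()])
    # Trapezoid area under the Lorenz curve; index = 2 * (0.5 - area).
    area_under_curve = np.sum((y[1:] + y[:-1]) / 2.0 * np.diff(x))
    return 2.0 * (0.5 - area_under_curve)
```

An index of 0 means practice is evenly distributed (the curve coincides with the diagonal); larger values mean practice is concentrated in fewer subgroups.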
Bridging the Digital Divide with Off-Line E-Learning
ERIC Educational Resources Information Center
Hillier, Mathew
2018-01-01
This paper explores a proposal for an off-line e-learning platform that will provide a bridge for digitally unconnected students and educators to join the contemporary information and communications technology (ICT) intensive world. Individual remote and unconnected learners face a chicken and egg problem for engagement with contemporary…
DOT National Transportation Integrated Search
1999-02-01
This paper focuses on various digital subscriber line (xDSL) technologies and their potential application to Intelligent Transportation Systems (ITS). A summary of some of the features of xDSL technologies is given, followed by a description of a suc...
NASA Technical Reports Server (NTRS)
Hoffer, R. M. (Principal Investigator)
1980-01-01
To facilitate comparison between the four different spatial resolutions of the NS-001 MSS data sets, a supervised approach was taken in defining training blocks for each of the different cover types. The training fields representing each cover type category were grouped, and this group was clustered to determine the individual spectral classes within each cover type category which would effectively characterize the entire test site. Graphs show the variation in spectral response level with respect to distance in the across-track dimension for four sampling intervals. Radar digitization procedures were developed. Flight characteristics and parameters for digitization of radar imagery are tabulated. The statement of work for phase 3 was reviewed and modifications were suggested to meet funding reduction.
Graph theory data for topological quantum chemistry.
Vergniory, M G; Elcoro, L; Wang, Zhijun; Cano, Jennifer; Felser, C; Aroyo, M I; Bernevig, B Andrei; Bradlyn, Barry
2017-08-01
Topological phases of noninteracting particles are distinguished by the global properties of their band structure and eigenfunctions in momentum space. On the other hand, group theory as conventionally applied to solid-state physics focuses only on properties that are local (at high-symmetry points, lines, and planes) in the Brillouin zone. To bridge this gap, we have previously [Bradlyn et al., Nature (London) 547, 298 (2017)NATUAS0028-083610.1038/nature23268] mapped the problem of constructing global band structures out of local data to a graph construction problem. In this paper, we provide the explicit data and formulate the necessary algorithms to produce all topologically distinct graphs. Furthermore, we show how to apply these algorithms to certain "elementary" band structures highlighted in the aforementioned reference, thus identifying and tabulating all orbital types and lattices that can give rise to topologically disconnected band structures. Finally, we show how to use the newly developed bandrep program on the Bilbao Crystallographic Server to access the results of our computation.
NASA Astrophysics Data System (ADS)
Chen, Jie; Hu, Jiangnan
2017-06-01
Industry 4.0 and lean production have become the focus of manufacturing. A current issue is to analyse the performance of assembly line balancing. This study focuses on distinguishing the factors influencing assembly line balancing. The one-way ANOVA method is applied to explore the significance of the distinguished factors, and a regression model is built to find key points. The maximal task time (tmax), the quantity of tasks (n), and the degree of convergence of the precedence graph (conv) are critical for the performance of assembly line balancing. These conclusions will benefit lean production in manufacturing.
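The kind of analysis described, a one-way ANOVA over levels of a balancing factor, can be sketched with SciPy. The factor levels and efficiency numbers below are invented for illustration, not data from the study:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Hypothetical line-balance efficiencies observed under three levels of one
# factor (e.g., small / medium / large maximal task time t_max).
small  = rng.normal(0.92, 0.02, 30)
medium = rng.normal(0.88, 0.02, 30)
large  = rng.normal(0.80, 0.02, 30)

# One-way ANOVA: does the factor level significantly affect balance efficiency?
f_stat, p_value = stats.f_oneway(small, medium, large)
```

A small p-value indicates the factor is significant and worth carrying into the regression step.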
DOE Office of Scientific and Technical Information (OSTI.GOV)
Winlaw, Manda; De Sterck, Hans; Sanders, Geoffrey
In very simple terms a network can be defined as a collection of points joined together by lines. Thus, networks can be used to represent connections between entities in a wide variety of fields including engineering, science, medicine, and sociology. Many large real-world networks share a surprising number of properties, leading to a strong interest in model development research, and techniques for building synthetic networks have been developed that capture these similarities and replicate real-world graphs. Modeling these real-world networks serves two purposes. First, building models that mimic the patterns and properties of real networks helps to understand the implications of these patterns and helps determine which patterns are important. If we develop a generative process to synthesize real networks we can also examine which growth processes are plausible and which are not. Secondly, high-quality, large-scale network data is often not available, because of economic, legal, technological, or other obstacles [7]. Thus, there are many instances where the systems of interest cannot be represented by a single exemplar network. As one example, consider the field of cybersecurity, where systems require testing across diverse threat scenarios and validation across diverse network structures. In these cases, where there is no single exemplar network, the systems must instead be modeled as a collection of networks in which the variation among them may be just as important as their common features. By developing processes to build synthetic models, so-called graph generators, we can build synthetic networks that capture both the essential features of a system and realistic variability. Then we can use such synthetic graphs to perform tasks such as simulations, analysis, and decision making. We can also use synthetic graphs to performance-test graph analysis algorithms, including clustering algorithms and anomaly detection algorithms.
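As an illustration of what such a graph generator looks like, here is a minimal preferential-attachment sketch in pure Python. It is a toy Barabási-Albert-style process, not one of the generators discussed in the report:

```python
import random

def preferential_attachment(n, m, seed=7):
    # Toy generator: each new node links to m existing nodes chosen with
    # probability proportional to their current degree (rich get richer).
    rnd = random.Random(seed)
    targets = list(range(m))   # start from m seed nodes
    degree_weighted = []       # each node appears once per edge endpoint
    edges = set()
    for v in range(m, n):
        for t in set(targets):
            edges.add((min(v, t), max(v, t)))
            degree_weighted.extend((v, t))
        targets = [rnd.choice(degree_weighted) for _ in range(m)]
    return edges
```

Repeated runs with different seeds give a family of structurally similar but distinct graphs, which is exactly the "realistic variability" a graph generator is meant to supply.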
Parametric Equations: Push 'Em Back, Push 'Em Back, Way Back!
ERIC Educational Resources Information Center
Cieply, Joseph F.
1993-01-01
Stresses using the features of graphing calculators to teach parametric equations much earlier in the curriculum than is presently done. Examples using parametric equations to teach slopes and lines in beginning algebra, inverse functions in advanced algebra, the wrapping function, and simulations of physical phenomena are presented. (MAZ)
Modeling Nuclear Decay: A Point of Integration between Chemistry and Mathematics.
ERIC Educational Resources Information Center
Crippen, Kent J.; Curtright, Robert D.
1998-01-01
Describes four activities that use graphing calculators to model nuclear-decay phenomena. Students ultimately develop a notion about the radioactive waste produced by nuclear fission. These activities are in line with national educational standards and allow for the integration of science and mathematics. Contains 13 references. (Author/WRM)
Computational Models for Belief Revision, Group Decision-Making and Cultural Shifts
2010-10-25
"social" networks; the green numbers are pseudo-trees or artificial (non-social) constructions. The dashed blue line indicates the range of Erdos-Renyi ...non-social networks such as Erdos-Renyi random graphs or the more passive non-cognitive spreading of disease or information flow. As mentioned
The "No Crossing Constraint" in Autosegmental Phonology.
ERIC Educational Resources Information Center
Coleman, John; Local, John
A discussion of autosegmental phonology (AP), a theory of phonological representation that uses graphs rather than strings as the central data structure, considers its principal constraint, the "No Crossing Constraint" (NCC). The NCC is the statement that in a well-formed autosegmental diagram, lines of association may not cross. After…
Testing Understanding and Understanding Testing.
ERIC Educational Resources Information Center
Pedersen, Jean; Ross, Peter
1985-01-01
Provides examples in which graphs are used in the statements of problems or in their solutions as a means of testing understanding of mathematical concepts. Examples (appropriate for a beginning course in calculus and analytic geometry) include slopes of lines and curves, quadratic formula, properties of the definite integral, and others. (JN)
1982-05-01
FACTORS: U.S. CUSTOMARY TO METRIC (SI) UNITS OF MEASUREMENT. These conversion factors include all the significant digits given in the conversion... where suspended... ships pass through narrow channels (Wuebben et al. 1978a). This disruption of river bottom sediments can... Also, the rapid water level... graphs that showed sites in the middle of the picture. 29 of these reaches (5.2 miles) showed evidence of... The average photographic scale was determined...
Automated rejection of parasitic frequency sidebands in heterodyne-detection LIDAR applications
NASA Technical Reports Server (NTRS)
Esproles, Carlos; Tratt, David M.; Menzies, Robert T.
1989-01-01
A technique is described for the detection of the sporadic onset of multiaxial mode behavior of a normally single-mode TEA CO2 laser. The technique is implemented using primarily commercial circuit modules; it incorporates a peak detector that displays the RF detector output on a digital voltmeter, and an LED bar graph. The technique was successfully demonstrated with an existing coherent atmospheric LIDAR facility utilizing an injection-seeded single-mode TEA CO2 laser. The block schematic diagram is included.
SPROC: A multiple-processor DSP IC
NASA Technical Reports Server (NTRS)
Davis, R.
1991-01-01
A large, single-chip, multiple-processor, digital signal processing (DSP) integrated circuit (IC) fabricated in HP-Cmos34 is presented. The innovative architecture is best suited for analog and real-time systems characterized by both parallel signal data flows and concurrent logic processing. The IC is supported by a powerful development system that transforms graphical signal flow graphs into production-ready systems in minutes. Automatic compiler partitioning of tasks among four on-chip processors gives the IC the signal processing power of several conventional DSP chips.
NASA Technical Reports Server (NTRS)
Thomann, G. C.
1973-01-01
Experiments to remotely determine sea water salinity from measurements of the sea surface radiometric temperature over the Mississippi Sound were conducted. The line was flown six times at an altitude of 244 meters. The radiometric temperature of the sea surface was measured in two spectral intervals. The specifications of the equipment and the conditions under which the tests were conducted are described. Results of the tests are presented in the form of graphs.
Short paths in expander graphs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kleinberg, J.; Rubinfeld, R.
Graph expansion has proved to be a powerful general tool for analyzing the behavior of routing algorithms and the interconnection networks on which they run. We develop new routing algorithms and structural results for bounded-degree expander graphs. Our results are unified by the fact that they are all based upon, and extend, a body of work asserting that expanders are rich in short, disjoint paths. In particular, our work has consequences for the disjoint paths problem, multicommodity flow, and graph minor containment. We show: (i) A greedy algorithm for approximating the maximum disjoint paths problem achieves a polylogarithmic approximation ratio in bounded-degree expanders. Although our algorithm is both deterministic and on-line, its performance guarantee is an improvement over previous bounds in expanders. (ii) For a multicommodity flow problem with arbitrary demands on a bounded-degree expander, there is a (1 + {epsilon})-optimal solution using only flow paths of polylogarithmic length. It follows that the multicommodity flow algorithm of Awerbuch and Leighton runs in nearly linear time per commodity in expanders. Our analysis is based on establishing the following: given edge weights on an expander G, one can increase some of the weights very slightly so that the resulting shortest-path metric is smooth: the min-weight path between any pair of nodes uses a polylogarithmic number of edges. (iii) Every bounded-degree expander on n nodes contains every graph with O(n/log{sup O(1)} n) nodes and edges as a minor.
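The greedy strategy in (i) is easy to state concretely: route each terminal pair along a shortest path in the remaining graph, then delete the edges used so later paths are edge-disjoint. A small Python sketch of that generic strategy (the paper's contribution is, of course, its analysis on bounded-degree expanders, not this routine):

```python
from collections import deque

def bfs_path(adj, s, t, dead):
    # Shortest path from s to t avoiding edges listed in `dead`.
    prev = {s: None}
    q = deque([s])
    while q:
        u = q.popleft()
        if u == t:
            break
        for v in adj[u]:
            if v not in prev and frozenset((u, v)) not in dead:
                prev[v] = u
                q.append(v)
    if t not in prev:
        return None
    path = [t]
    while prev[path[-1]] is not None:
        path.append(prev[path[-1]])
    return path[::-1]

def greedy_disjoint_paths(adj, pairs):
    # Greedy approximation: take the current shortest path for each pair,
    # then remove its edges to enforce edge-disjointness.
    used, out = set(), []
    for s, t in pairs:
        p = bfs_path(adj, s, t, used)
        if p:
            out.append(p)
            used.update(frozenset(e) for e in zip(p, p[1:]))
    return out
```

On an expander, the paper shows this kind of greedy routing achieves a polylogarithmic approximation ratio because short alternative paths remain plentiful.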
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zapata, Francisco; Kreinovich, Vladik; Joslyn, Cliff A.
2013-08-01
To make a decision, we need to compare the values of quantities. In many practical situations, we know the values with interval uncertainty, and so we need to compare intervals. Allen's algebra describes all possible relations between intervals on the real line, and ordering relations between such intervals are well studied. In this paper, we extend this description to intervals in an arbitrary partially ordered set (poset). In particular, we explicitly describe ordering relations between intervals that generalize relations between points. As auxiliary results, we provide a logical interpretation of the relation between intervals, and extend the results about interval graphs to intervals over posets.
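For intervals on the real line, the thirteen Allen relations reduce to endpoint comparisons. A Python sketch for non-degenerate closed intervals (the paper's contribution, extending these relations to arbitrary posets, is not attempted here):

```python
def allen_relation(a, b):
    # Classify the Allen relation of interval a = (a1, a2) relative to
    # b = (b1, b2), assuming a1 < a2 and b1 < b2 on the real line.
    (a1, a2), (b1, b2) = a, b
    if a2 < b1: return "before"
    if b2 < a1: return "after"
    if a2 == b1: return "meets"
    if b2 == a1: return "met-by"
    if a1 == b1 and a2 == b2: return "equals"
    if a1 == b1: return "starts" if a2 < b2 else "started-by"
    if a2 == b2: return "finishes" if a1 > b1 else "finished-by"
    if b1 < a1 and a2 < b2: return "during"
    if a1 < b1 and b2 < a2: return "contains"
    return "overlaps" if a1 < b1 else "overlapped-by"
```

Each of the thirteen relations is mutually exclusive, which is what makes Allen's algebra a complete case analysis for interval comparison.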
Bandwidth tunable microwave photonic filter based on digital and analog modulation
NASA Astrophysics Data System (ADS)
Zhang, Qi; Zhang, Jie; Li, Qiang; Wang, Yubing; Sun, Xian; Dong, Wei; Zhang, Xindong
2018-05-01
A bandwidth-tunable microwave photonic filter based on combined digital and analog modulation is proposed and experimentally demonstrated. The digital modulation is used to broaden the effective gain spectrum, and the analog modulation is used to generate the optical lines. By changing the symbol rate of the data pattern, the bandwidth is tunable from 50 MHz to 700 MHz. The interval of the optical lines is set according to the bandwidth of the gain spectrum, which is related to the symbol rate. A severalfold bandwidth increase is achieved compared with analog modulation alone, and the selectivity of the response is increased by 3.7 dB compared with digital modulation alone.
Teaching and Learning Physics in a 1:1 Laptop School
NASA Astrophysics Data System (ADS)
Zucker, Andrew A.; Hug, Sarah T.
2008-12-01
1:1 laptop programs, in which every student is provided with a personal computer to use during the school year, permit increased and routine use of powerful, user-friendly computer-based tools. Growing numbers of 1:1 programs are reshaping the roles of teachers and learners in science classrooms. At the Denver School of Science and Technology, a public charter high school where a large percentage of students come from low-income families, 1:1 laptops are used often by teachers and students. This article describes the school's use of laptops, the Internet, and related digital tools, especially for teaching and learning physics. The data are from teacher and student surveys, interviews, classroom observations, and document analyses. Physics students and teachers use an interactive digital textbook; Internet-based simulations (some developed by a Nobel Prize winner); word processors; digital drop boxes; email; formative electronic assessments; computer-based and stand-alone graphing calculators; probes and associated software; and digital video cameras to explore hypotheses, collaborate, engage in scientific inquiry, and to identify strengths and weaknesses of students' understanding of physics. Technology provides students at DSST with high-quality tools to explore scientific concepts and the experiences of teachers and students illustrate effective uses of digital technology for high school physics.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-28
... appropriate advisory committee hot line/ phone line to learn about possible modifications before coming to the... premarket approval application for the Selenia C Digital Breast Tomosynthesis System, sponsored by Hologic, Inc. The Selenia C Digital Breast Tomosynthesis System is intended for use in the same clinical...
A non-iterative twin image elimination method with two in-line digital holograms
NASA Astrophysics Data System (ADS)
Kim, Jongwu; Lee, Heejung; Jeon, Philjun; Kim, Dug Young
2018-02-01
We propose a simple non-iterative in-line holographic measurement method which can effectively eliminate a twin image in digital holographic 3D imaging. It is shown that a twin image can be effectively eliminated with only two measured holograms by using a simple numerical propagation algorithm and arithmetic calculations.
Bicalho, R C; Machado, V S; Caixeta, L S
2009-07-01
Lameness is the most significant challenge for the dairy industry to overcome, given its obvious disruption of animal welfare and severe economic losses. Sole ulcers and white line abscesses are ubiquitous chronic diseases with the highest associated economic losses among all foot lesions. Their underlying causes are still not fully understood. An observational cross-sectional study was carried out to investigate the association between claw horn lesions and the thickness of the digital cushion. The thickness of the digital cushion was evaluated by ultrasonographic examination of the sole at the typical ulcer site. A total of 501 lactating Holstein dairy cows were enrolled in the study. The prevalence of sole ulcers was 4.2 and 27.8% for parity 1 and parity >1, respectively. The prevalence of white line disease was 1.0 and 6.5% for parity 1 and >1, respectively. The prevalence of lameness (visual locomotion score > or = 3) was 19.8 and 48.2% for parity 1 and >1, respectively. The prevalence of sole ulcers and white line disease was significantly associated with the thickness of the digital cushion; cows in the upper quartile of digital cushion thickness had an adjusted prevalence of lameness 15 percentage points lower than those in the lower quartile. Body condition scores were positively associated with digital cushion thickness. The mean gray value of the sonographic image of the digital cushion had a negative linear association with digital cushion thickness (R2 = 0.14), indicating that the composition of the digital cushion may have changed with its thickness.
Furthermore, digital cushion thickness decreased steadily from the first month of lactation and reached a nadir 120 d after parturition. These results support the concept that sole ulcers and white line abscesses are related to contusions within the claw horn capsule, and that such contusions are a consequence of the lesser capacity of the digital cushion to dampen the pressure exerted by the third phalanx on the soft tissue beneath.
Novel approaches to analysis by flow injection gradient titration.
Wójtowicz, Marzena; Kozak, Joanna; Kościelniak, Paweł
2007-09-26
Two novel procedures for flow injection gradient titration with the use of a single stock standard solution are proposed. In the multi-point single-line (MP-SL) method the calibration graph is constructed on the basis of a set of standard solutions, which are generated in a standard reservoir and subsequently injected into the titrant. According to the single-point multi-line (SP-ML) procedure, the standard solution and a sample are injected into the titrant stream from four loops of different capacities, so that four calibration graphs can be constructed and the analytical result is calculated on the basis of a generalized slope of these graphs. Both approaches have been tested on the spectrophotometric acid-base titration of hydrochloric and acetic acids, using bromothymol blue and phenolphthalein as indicators, respectively, and sodium hydroxide as the titrant. Under optimized experimental conditions, analytical results with precision better than 1.8 and 2.5% (RSD) and accuracy better than 3.0 and 5.4% (relative error, RE) were obtained for the MP-SL and SP-ML procedures, respectively, in ranges of 0.0031-0.0631 mol L(-1) for samples of hydrochloric acid and of 0.1680-1.7600 mol L(-1) for samples of acetic acid. The feasibility of both methods was illustrated by applying them to the total acidity determination in vinegar samples, with precision better than 0.5 and 2.9% (RSD) for the MP-SL and SP-ML procedures, respectively.
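The calibration step common to both procedures amounts to fitting a straight line to standard responses and inverting it for the sample. A generic least-squares sketch with invented numbers, not data from the paper:

```python
import numpy as np

# Hypothetical standard concentrations (mol/L) and instrument responses.
conc = np.array([0.005, 0.010, 0.020, 0.040])
resp = np.array([0.11, 0.21, 0.42, 0.83])

# Fit the calibration line response = slope * concentration + intercept.
slope, intercept = np.polyfit(conc, resp, 1)

def concentration(sample_response):
    # Invert the calibration line to estimate a sample concentration.
    return (sample_response - intercept) / slope
```

The SP-ML idea of a "generalized slope" corresponds to pooling several such fitted slopes, one per injection loop, before inverting.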
Visualization of permanent marks in progressive addition lenses by digital in-line holography
NASA Astrophysics Data System (ADS)
Perucho, Beatriz; Micó, Vicente
2013-04-01
A critical issue in the production of ophthalmic lenses is to guarantee correct centering and alignment throughout the manufacturing and mounting processes. To that end, progressive addition lenses (PALs) incorporate permanent marks at standardized locations on the lens. These marks are engraved on the surface and provide the model identification and addition power of the PAL, and also serve as locator marks for re-inking the removable marks if necessary. Although the permanent marks should be visible by simple visual inspection, they are often faint and low-contrast on new lenses, obscured by scratches on older lenses, and partially occluded and difficult to recognize on tinted or anti-reflection coated lenses. In this contribution, we present an extremely simple visualization system for permanent marks in PALs based on digital in-line holography. Light emitted by a superluminescent diode (SLD) illuminates the PAL, which is placed just before a digital (CCD) sensor. Thus, the CCD records an in-line hologram of the diffracted wavefront provided by the PAL. As a result, it is possible to recover an in-focus image of the inspected region of the PAL by means of classical holographic tools applied in the digital domain. This numerical process involves digital recording of the in-line hologram, numerical back propagation to the PAL plane, and some digital processing to reduce noise and present a high-quality final image. Preliminary experimental results are provided showing the applicability of the proposed method.
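The "numerical back propagation to the PAL plane" step is typically an angular-spectrum computation. A hedged NumPy sketch of the generic method (not the authors' exact processing chain; wavelength, pixel pitch, and distance below are placeholders):

```python
import numpy as np

def angular_spectrum(field, wavelength, dx, z):
    # Propagate a sampled complex field by distance z (negative z = back
    # propagation) using the angular-spectrum transfer function.
    n = field.shape[0]
    fx = np.fft.fftfreq(n, dx)
    FX, FY = np.meshgrid(fx, fx)
    arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    kz = 2 * np.pi / wavelength * np.sqrt(np.maximum(arg, 0.0))
    H = np.exp(1j * kz * z)
    return np.fft.ifft2(np.fft.fft2(field) * H)
```

Propagating a recorded hologram by -z refocuses it at the lens plane; forward and backward propagation over the same distance cancel exactly for propagating spatial frequencies.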
Color normalization of histology slides using graph regularized sparse NMF
NASA Astrophysics Data System (ADS)
Sha, Lingdao; Schonfeld, Dan; Sethi, Amit
2017-03-01
Computer-based automatic medical image processing and quantification are becoming popular in digital pathology. However, preparation of histology slides can vary widely due to differences in staining equipment, procedures, and reagents, which can reduce the accuracy of algorithms that analyze their color and texture information. To reduce the unwanted color variations, various supervised and unsupervised color normalization methods have been proposed. Compared with supervised color normalization methods, unsupervised methods have the advantages of time and cost efficiency and universal applicability. Most of the unsupervised color normalization methods for histology are based on stain separation. Based on the fact that stain concentration cannot be negative and different parts of the tissue absorb different stains, nonnegative matrix factorization (NMF), and in particular its sparse version (SNMF), are good candidates for stain separation. However, most of the existing unsupervised color normalization methods, such as PCA, ICA, NMF, and SNMF, fail to consider important information about the sparse manifolds that their pixels occupy, which could potentially result in loss of texture information during color normalization. Manifold learning methods like the graph Laplacian have proven very effective in interpreting high-dimensional data. In this paper, we propose a novel unsupervised stain separation method called graph-regularized sparse nonnegative matrix factorization (GSNMF). By considering the sparse prior of stain concentration together with manifold information from high-dimensional image data, our method shows better performance in stain color deconvolution than existing unsupervised color deconvolution methods, especially in preserving connected texture information. To utilize the texture information, we construct a nearest-neighbor graph between pixels within a spatial area of an image based on their distances, using a heat kernel in lαβ color space. 
The representation of a pixel in the stain density space is constrained to follow the feature distance of the pixel to pixels in the neighborhood graph. Utilizing a color matrix transfer method with the stain concentrations found using our GSNMF method, the color normalization performance was also better than that of existing methods.
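The core computation can be sketched as graph-regularized NMF with an added L1 (sparsity) term on the concentration matrix. The following toy sketch uses invented sizes, a made-up heat-kernel neighbor graph, and standard multiplicative updates; it is an illustration of the technique, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy optical-density matrix: 3 channels x 200 pixels, built from 2 "stains".
W_true = rng.random((3, 2))
H_true = rng.random((2, 200))
V = W_true @ H_true

def heat_kernel_knn(X, k=5, sigma=1.0):
    # k-NN adjacency over pixel features (columns of X) with heat-kernel weights.
    d2 = ((X[:, :, None] - X[:, None, :]) ** 2).sum(axis=0)
    A = np.zeros_like(d2)
    for i in range(d2.shape[0]):
        nn = np.argsort(d2[i])[1:k + 1]
        A[i, nn] = np.exp(-d2[i, nn] / (2 * sigma ** 2))
    return np.maximum(A, A.T)  # symmetrize

A = heat_kernel_knn(V)
D = np.diag(A.sum(axis=1))

def gsnmf(V, k=2, lam=0.05, beta=0.01, iters=200):
    # Multiplicative updates for ||V - WH||^2 + lam*Tr(H L H^T) + beta*|H|_1,
    # where L = D - A is the graph Laplacian over pixels.
    m, n = V.shape
    W = rng.random((m, k)) + 1e-3
    H = rng.random((k, n)) + 1e-3
    eps = 1e-9
    for _ in range(iters):
        W *= (V @ H.T) / (W @ H @ H.T + eps)
        H *= (W.T @ V + lam * (H @ A)) / (W.T @ W @ H + lam * (H @ D) + beta + eps)
    return W, H

W, H = gsnmf(V)
err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
```

Here `lam` couples neighboring pixels through the graph and `beta` enforces sparsity; both values are arbitrary for the demo.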
Task scheduling in dataflow computer architectures
NASA Technical Reports Server (NTRS)
Katsinis, Constantine
1994-01-01
Dataflow computers provide a platform for the solution of a large class of computational problems, which includes digital signal processing and image processing. Many typical applications are represented by a set of tasks which can be repetitively executed in parallel as specified by an associated dataflow graph. Research in this area aims to model these architectures, develop scheduling procedures, and predict the transient and steady state performance. Researchers at NASA have created a model and developed associated software tools which are capable of analyzing a dataflow graph and predicting its runtime performance under various resource and timing constraints. These models and tools were extended and used in this work. Experiments using these tools revealed certain properties of such graphs that require further study. Specifically, the transient behavior at the beginning of the execution of a graph can have a significant effect on the steady state performance. Transformation and retiming of the application algorithm and its initial conditions can produce a different transient behavior and consequently different steady state performance. The effect of such transformations on the resource requirements or under resource constraints requires extensive study. Task scheduling to obtain maximum performance (based on user-defined criteria), or to satisfy a set of resource constraints, can also be significantly affected by a transformation of the application algorithm. Since task scheduling is performed by heuristic algorithms, further research is needed to determine if new scheduling heuristics can be developed that can exploit such transformations. This work has provided the initial development for further long-term research efforts. A simulation tool was completed to provide insight into the transient and steady state execution of a dataflow graph. 
A set of scheduling algorithms was completed which can operate in conjunction with the modeling and performance tools previously developed. Initial studies on the performance of these algorithms were done to examine the effects of application algorithm transformations as measured by such quantities as number of processors, time between outputs, time between input and output, communication time, and memory size.
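As an illustration of the kind of heuristic the abstract refers to, here is a minimal critical-path ("bottom-level") list scheduler for a toy dataflow graph. The task names, execution times, and two-processor setup are invented for the example and are not NASA's tools:

```python
# A toy dataflow graph: node -> (compute time, successors).
tasks = {
    "a": (2, ["c", "d"]),
    "b": (3, ["d"]),
    "c": (1, ["e"]),
    "d": (4, ["e"]),
    "e": (2, []),
}

def bottom_level(tasks):
    # Longest path from each task to a sink: its scheduling priority.
    memo = {}
    def bl(t):
        if t not in memo:
            time, succs = tasks[t]
            memo[t] = time + max((bl(s) for s in succs), default=0)
        return memo[t]
    return {t: bl(t) for t in tasks}

def list_schedule(tasks, n_proc=2):
    prio = bottom_level(tasks)
    preds = {t: 0 for t in tasks}
    for _, (_, succs) in tasks.items():
        for s in succs:
            preds[s] += 1
    ready = sorted((t for t in tasks if preds[t] == 0), key=lambda t: -prio[t])
    proc_free = [0.0] * n_proc
    finish, schedule = {}, []
    while ready:
        t = ready.pop(0)
        time, succs = tasks[t]
        p = min(range(n_proc), key=lambda i: proc_free[i])
        # Cannot start before the processor is free or any predecessor finishes.
        start = max(proc_free[p],
                    max((finish[q] for q in tasks if t in tasks[q][1]), default=0.0))
        finish[t] = start + time
        proc_free[p] = finish[t]
        schedule.append((t, p, start, finish[t]))
        for s in succs:
            preds[s] -= 1
            if preds[s] == 0:
                ready.append(s)
        ready.sort(key=lambda t: -prio[t])
    return schedule, max(finish.values())

schedule, makespan = list_schedule(tasks)
```

For this graph the chain b → d → e (3 + 4 + 2) is the binding constraint, so the two-processor makespan is 9.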
North Dakota aeromagnetic and gravity maps and data, a web site for distribution of data
Sweeney, Ronald E.; Hill, Patricia L.
2003-01-01
The North Dakota aeromagnetic grid is constructed from grids that combine information collected in 13 separate aeromagnetic surveys conducted between 1978 and 2001. The data from these surveys are of varying quality. The design and specifications (terrain clearance, sampling rates, line spacing, and reduction procedures) varied from survey to survey depending on the purpose of the project and the technology of the time. Every attempt was made to acquire the data in digital form. Most of the available digital data were obtained from aeromagnetic surveys flown by the U.S. Geological Survey (USGS), flown on contract with the USGS, or were obtained from other federal agencies and state universities. Some of the 1980 data are available only on hand-contoured maps and had to be digitized. These maps were digitized along flight-line/contour-line intersections, which is considered to be the most accurate method of recovering the original data. Digitized data are available as USGS Open File Report 99-557. All surveys have been analytically continued upward to 304.8 meters (1,000 feet) above ground and then blended or merged together.
Glacier-specific elevation changes in western Alaska
NASA Astrophysics Data System (ADS)
Paul, Frank; Le Bris, Raymond
2013-04-01
Deriving glacier-specific elevation changes from DEM differencing and digital glacier outlines is straightforward if the required datasets are available. Calculating such changes over large regions, including glaciers selected for mass balance measurements in the field, makes it possible to determine how representative the changes observed at these glaciers are for the entire region. The related comparison of DEM-derived values for these glaciers with the overall mean avoids the rather error-prone conversion of volume to mass changes (e.g. due to unknown densities) and gives unit-less correction factors for upscaling the field measurements to a larger region. However, several issues have to be carefully considered, such as proper co-registration of the two DEMs, the date and accuracy of the datasets compared, as well as the source data used for DEM creation and potential artefacts (e.g. voids). In this contribution we present an assessment of the representativeness of the two mass balance glaciers Gulkana and Wolverine for the overall changes of nearly 3200 glaciers in western Alaska over a ca. 50-year period. We use an elevation change dataset from Berthier et al. (2010) that was derived from the 1960s USGS DEM (NED) and a more recent DEM derived from SPOT5 data for the SPIRIT project. Additionally, the ASTER GDEM was used as a more recent DEM. Historic glacier outlines were taken from the USGS digital line graph (DLG) dataset and corrected with the digital raster graphic (DRG) maps from the USGS. Mean glacier-specific elevation changes were derived based on drainage divides from a recently created inventory. Land-terminating, lake-calving and tidewater glaciers were marked in the attribute table to determine their changes separately. We also investigated the impact of handling potential DEM artifacts in three different ways and compared elevation changes with altitude. 
The mean elevation changes of Gulkana and Wolverine glaciers (about -0.65 m/year) are very similar to the mean of the lake-calving and tidewater glaciers (about -0.6 m/year), but much more negative than for the land-terminating glaciers (about -0.24 m/year). The two mass balance glaciers are thus representative of the entire region, but not of their own class. The different ways of treating positive elevation changes (e.g. setting them to zero or to no data) influence the total values but otherwise have little impact on the results (e.g. the correction factors are similar). The massive elevation loss of Columbia Glacier (-2.8 m/year) is exceptional and strongly influences the statistics when area-weighting is used to determine the regional mean. For the entire region this method yields more negative values for land-terminating and tidewater glaciers than the arithmetically averaged values, but for the lake-calving glaciers both are about the same.
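The per-glacier bookkeeping behind such numbers can be sketched in a few lines of numpy. The DEMs, glacier IDs, and loss values below are synthetic stand-ins, not the Alaska data; the sketch also shows one way of skipping voids and one option for clipping positive-change artifacts:

```python
import numpy as np

# Two synthetic DEMs (meters) 50 years apart and a glacier-ID mask.
rng = np.random.default_rng(1)
dem_1960 = 1000 + rng.random((100, 100)) * 500
dem_2010 = dem_1960.copy()
glacier_ids = np.zeros((100, 100), dtype=int)
glacier_ids[10:40, 10:40] = 1          # a "land-terminating" glacier
glacier_ids[60:90, 60:90] = 2          # a "tidewater" glacier
dem_2010[glacier_ids == 1] -= 12.0     # 12 m loss over 50 years
dem_2010[glacier_ids == 2] -= 30.0     # 30 m loss over 50 years
dem_2010[0, 0] = np.nan                # a void in the newer DEM

dh = dem_2010 - dem_1960
years = 50.0

def per_glacier_rate(dh, ids, years, clip_positive=False):
    # Mean elevation-change rate (m/yr) per glacier ID, ignoring voids.
    rates = {}
    for g in np.unique(ids[ids > 0]):
        vals = dh[(ids == g) & np.isfinite(dh)]
        if clip_positive:
            vals = np.minimum(vals, 0.0)  # one way to handle artifacts
        rates[int(g)] = vals.mean() / years
    return rates

rates = per_glacier_rate(dh, glacier_ids, years)
```

With the synthetic losses above, glacier 1 comes out at -0.24 m/yr and glacier 2 at -0.6 m/yr.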
NASA Technical Reports Server (NTRS)
Monford, L. G., Jr. (Inventor)
1974-01-01
A digital communication system is reported for parallel operation of 16 or more transceiver units with the use of only four interconnecting wires. A remote synchronization circuit produces unit address control words sequentially in data frames of 16 words. Means are provided in each transceiver unit to decode calling signals and to transmit calling and data signals. The transceivers communicate with each other over one data line. The synchronization unit communicates the address control information to the transceiver units over an address line and further provides the timing information over a clock line. A reference voltage level or ground line completes the interconnecting four wire hookup.
Loop expansion around the Bethe approximation through the M-layer construction
NASA Astrophysics Data System (ADS)
Altieri, Ada; Chiara Angelini, Maria; Lucibello, Carlo; Parisi, Giorgio; Ricci-Tersenghi, Federico; Rizzo, Tommaso
2017-11-01
For every physical model defined on a generic graph or factor graph, the Bethe M-layer construction allows building a different model for which the Bethe approximation is exact in the large M limit, and coincides with the original model for M=1. The 1/M perturbative series is then expressed by a diagrammatic loop expansion in terms of so-called fat diagrams. Our motivation is to study some important second-order phase transitions that do exist on the Bethe lattice, but are either qualitatively different or absent in the corresponding fully connected case. In this case, the standard approach based on a perturbative expansion around the naive mean field theory (essentially a fully connected model) fails. On physical grounds, we expect that when the construction is applied to a lattice in finite dimension there is a small region of the external parameters, close to the Bethe critical point, where strong deviations from mean-field behavior will be observed. In this region, the 1/M expansion for the corrections diverges, and can be the starting point for determining the correct non-mean-field critical exponents using renormalization group arguments. In the end, we will show that the critical series for the generic observable can be expressed as a sum of Feynman diagrams with the same numerical prefactors of field theories. However, the contribution of a given diagram is not evaluated by associating Gaussian propagators to its lines, as in field theories: one has to consider the graph as a portion of the original lattice, replacing the internal lines with appropriate one-dimensional chains, and attaching to the internal points the appropriate number of infinite-size Bethe trees to restore the correct local connectivity of the original model. The actual contribution of each (fat) diagram is the so-called line-connected observable, which also includes contributions from sub-diagrams with appropriate prefactors. 
In order to compute the corrections near to the critical point, Feynman diagrams (with their symmetry factors) can be read directly from the appropriate field-theoretical literature; the computation of momentum integrals is also quite similar; the extra work consists of computing the line-connected observable of the associated fat diagram in the limit of all lines becoming infinitely long.
Digital communications: Microwave applications
NASA Astrophysics Data System (ADS)
Feher, K.
Transmission concepts and techniques of digital systems are presented, and practical state-of-the-art implementation of digital communications systems by line-of-sight microwaves is described. Particular consideration is given to statistical methods in digital transmission systems analysis, digital modulation methods, microwave amplifiers, system gain, m-ary and QAM microwave systems, correlative techniques and applications to digital radio systems, hybrid systems, digital microwave systems design, diversity and protection switching techniques, measurement techniques, and research and development trends and unsolved problems.
Evaluating video digitizer errors
NASA Astrophysics Data System (ADS)
Peterson, C.
2016-01-01
Analog output video cameras remain popular for recording meteor data. Although these cameras uniformly employ electronic detectors with fixed pixel arrays, the digitization process requires resampling the horizontal lines as they are output in order to reconstruct the pixel data, usually resulting in a new data array of different horizontal dimensions than the native sensor. Pixel timing is not provided by the camera, and must be reconstructed based on line sync information embedded in the analog video signal. Using a technique based on hot pixels, I present evidence that jitter, sync detection, and other timing errors introduce both position and intensity errors which are not present in cameras which internally digitize their sensors and output the digital data directly.
Reportable STDs in Young People 15-24 Years of Age, by State
Data presentations include line graphs by year; pie charts for sex; bar charts by state and country; and bar charts for age, race/ethnicity, and transmission …
Polynomial Graphs and Symmetry
ERIC Educational Resources Information Center
Goehle, Geoff; Kobayashi, Mitsuo
2013-01-01
Most quadratic functions are not even, but every parabola has symmetry with respect to some vertical line. Similarly, every cubic has rotational symmetry with respect to some point, though most cubics are not odd. We show that every polynomial has at most one point of symmetry and give conditions under which the polynomial has rotational or…
Battery Lifespan | Transportation Research | NREL
Graphs from the Battery Life Model show relative battery capacity (ranging from 0.75 to 1) over time (ranging from 0 to 15 years) for three different climates (one represented by Minneapolis …); trend lines from upper left to lower right reflect diminished capacity over time and shorter lifespan.
TEACHERS' GUIDE. SEVENTH GRADE MATHEMATICS FOR THE ACADEMICALLY TALENTED.
ERIC Educational Resources Information Center
HORN, R.A.; AND OTHERS
MATERIALS ARE INTENDED FOR USE BY BOTH ENRICHED AND ACCELERATED MATHEMATICS COURSES. TEXTBOOKS AND SUPPLEMENTARY TEXTS FOR THE COURSE ARE GIVEN. THE COURSE OF STUDY IS DIVIDED INTO BROAD UNITS FOR EACH OF THE SEMESTERS--SEMESTER I- LINES, ANGLES, NUMBER SYSTEMS, POLYGONS, COMMON AND DECIMAL FRACTIONS, PERCENTAGE, AND MEASUREMENTS, GRAPHS,…
Common Progress Monitoring Graph Omissions: Missing Goal and Goal Line. Progress Monitoring Brief #2
ERIC Educational Resources Information Center
National Center on Response to Intervention, 2013
2013-01-01
Progress monitoring assessment is one of the four essential components of Response to Intervention (RTI), as defined by the National Center on Response to Intervention (NCRTI). Progress data allow teachers to evaluate the academic performance of students over time, quantify rates of improvement or responsiveness to instruction, and evaluate…
Fuzzy Math: A Meditation on Test Scoring
ERIC Educational Resources Information Center
Jacks, Meredith
2011-01-01
As a public school English teacher, the author observes standardized testing season each year with a sort of grim fascination. "So this is it," she thinks as she paces around her silent classroom, peering over kids' shoulders at articles about parasailing. Line graphs tracking the rainfall in Tulsa. Parts of speech. Functions of "x." "These are…
The Development of the Graphics-Decoding Proficiency Instrument
ERIC Educational Resources Information Center
Lowrie, Tom; Diezmann, Carmel M.; Kay, Russell
2011-01-01
The graphics-decoding proficiency (G-DP) instrument was developed as a screening test for the purpose of measuring students' (aged 8-11 years) capacity to solve graphics-based mathematics tasks. These tasks include number lines, column graphs, maps and pie charts. The instrument was developed within a theoretical framework which highlights the…
Quantitation & Case-Study-Driven Inquiry to Enhance Yeast Fermentation Studies
ERIC Educational Resources Information Center
Grammer, Robert T.
2012-01-01
We propose a procedure for the assay of fermentation in yeast in microcentrifuge tubes that is simple and rapid, permitting assay replicates, descriptive statistics, and the preparation of line graphs that indicate reproducibility. Using regression and simple derivatives to determine initial velocities, we suggest methods to compare the effects of…
ERIC Educational Resources Information Center
Kennon, J. Tillman; Fong, Bryant; Grippo, Anne
2016-01-01
This article describes how by using three points to make a line and comparing the graphs for water and oil, students can mathematically demonstrate that Gatorade dissolves in water much more readily than in oil. Students can also use units to understand and solve a multi-step problem by observing the color of each solution, making conductivity…
The Visual Side to Numeracy: Students' Sensemaking with Graphics
ERIC Educational Resources Information Center
Diezmann, Carmel; Lowrie, Tom; Sugars, Lindy; Logan, Tracy
2009-01-01
The 21st century has placed increasing demand on individuals' proficiency with a wide array of visual representations, that is graphics. Hence, proficiency with visual tasks needs to be embedded across the curriculum. In mathematics, various graphics (e.g., maps, charts, number lines, graphs) are used as means of communication of mathematical…
Overlapping communities detection based on spectral analysis of line graphs
NASA Astrophysics Data System (ADS)
Gui, Chun; Zhang, Ruisheng; Hu, Rongjing; Huang, Guoming; Wei, Jiaxuan
2018-05-01
Communities in networks often overlap, with one vertex belonging to several clusters. Meanwhile, many networks show hierarchical structure, such that communities are recursively grouped into a hierarchical organization. In order to obtain overlapping communities from a global hierarchy of vertices, a new algorithm (named SAoLG) is proposed to build the hierarchical organization while detecting the overlap of community structure. SAoLG applies spectral analysis to line graphs to unify the overlapping and hierarchical structure of the communities. In order to avoid the limitations of absolute distances such as Euclidean distance, SAoLG employs angular distance to compute the similarity between vertices. Furthermore, we make a minor improvement to partition density to evaluate the quality of community structure and use it to obtain more reasonable and sensible community numbers. The proposed SAoLG algorithm achieves a balance between overlap and hierarchy by applying spectral analysis to edge community detection. The experimental results on one standard network and six real-world networks show that the SAoLG algorithm achieves higher modularity and more reasonable community numbers than Ahn's algorithm, the classical CPM, and GN.
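The edge-community idea behind line-graph methods can be sketched directly: build the line graph's Laplacian, split its Fiedler vector into two edge clusters, and read off overlapping nodes as those whose incident edges land in more than one cluster. The toy two-triangle network below is for illustration only and is not the SAoLG algorithm itself:

```python
import numpy as np

# Two triangles joined at node 2: the edge communities overlap at node 2.
edges = [(0, 1), (0, 2), (1, 2), (2, 3), (2, 4), (3, 4)]

# Line-graph Laplacian: vertices are edges; edges sharing an endpoint are adjacent.
n = len(edges)
L = np.zeros((n, n))
for i in range(n):
    for j in range(i + 1, n):
        if set(edges[i]) & set(edges[j]):
            L[i, j] = L[j, i] = -1.0
np.fill_diagonal(L, -L.sum(axis=1))

# The Fiedler vector (second-smallest eigenvector) splits the edges in two.
w, v = np.linalg.eigh(L)
labels = (v[:, 1] > 0).astype(int)

# A node is "overlapping" if its incident edges fall in more than one cluster.
node_clusters = {}
for (a, b), lab in zip(edges, labels):
    node_clusters.setdefault(a, set()).add(int(lab))
    node_clusters.setdefault(b, set()).add(int(lab))
overlapping = sorted(u for u, cs in node_clusters.items() if len(cs) > 1)
```

The split recovers the two triangles as edge communities, with node 2 belonging to both.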
Experimental transition probabilities for Mn II spectral lines
NASA Astrophysics Data System (ADS)
Manrique, J.; Aguilera, J. A.; Aragón, C.
2018-06-01
Transition probabilities for 46 spectral lines of Mn II with wavelengths in the range 2000-3500 Å have been measured by CSigma laser-induced breakdown spectroscopy (Cσ-LIBS). For 28 of the lines, experimental data had not been reported previously. The Cσ-LIBS method, based on the construction of generalized curves of growth called Cσ graphs, avoids the error due to self-absorption. The samples used to generate the laser-induced plasmas are fused glass disks prepared from pure MnO. The Mn concentrations in the samples and the lines included in the study are selected to ensure the validity of the homogeneous-plasma model used. The results are compared to experimental and theoretical values available in the literature.
U.S. Geological Survey spatial data access
Faundeen, John L.; Kanengieter, Ronald L.; Buswell, Michael D.
2002-01-01
The U.S. Geological Survey (USGS) has reviewed progress in improving access to its spatial data holdings over the Web. The USGS EROS Data Center has created three major Web-based interfaces to deliver spatial data to the general public; they are Earth Explorer, the Seamless Data Distribution System (SDDS), and the USGS Web Mapping Portal. Lessons were learned in developing these systems, and various resources were needed for their implementation. The USGS serves as a fact-finding agency in the U.S. Government that collects, monitors, analyzes, and provides scientific information about natural resource conditions and issues. To carry out its mission, the USGS has created and managed spatial data since its inception. Originally relying on paper maps, the USGS now uses advanced technology to produce digital representations of the Earth’s features. The spatial products of the USGS include both source and derivative data. Derivative datasets include Digital Orthophoto Quadrangles (DOQ), Digital Elevation Models, Digital Line Graphs, land cover, Digital Raster Graphics, and the seamless National Elevation Dataset. These products, created with automated processes, use aerial photographs, satellite images, or other cartographic information such as scanned paper maps as source data. With Earth Explorer, users can search multiple inventories through metadata queries and can browse satellite and DOQ imagery. They can place orders and make payment through secure credit card transactions. Some USGS spatial data can be accessed with SDDS. The SDDS uses an ArcIMS map service interface to identify the user’s areas of interest and determine the output format; it allows the user to either download the actual spatial data directly for small areas or place orders for larger areas to be delivered on media. The USGS Web Mapping Portal provides views of national and international datasets through an ArcIMS map service interface. 
In addition, the map portal posts news about new map services available from the USGS, many simultaneously published on the Environmental Systems Research Institute Geography Network. These three information systems use new software tools and expanded hardware to meet the requirements of the users. The systems are designed to handle the required workload and are relatively easy to enhance and maintain. The software tools give users a high level of functionality and help the system conform to industry standards. The hardware and software architecture is designed to handle the large amounts of spatial data and Internet traffic required by the information systems. Last, customer support was needed to answer questions, monitor e-mail, and report customer problems.
Chiu, Stephanie J; Toth, Cynthia A; Bowes Rickman, Catherine; Izatt, Joseph A; Farsiu, Sina
2012-05-01
This paper presents a generalized framework for segmenting closed-contour anatomical and pathological features using graph theory and dynamic programming (GTDP). More specifically, the GTDP method previously developed for quantifying retinal and corneal layer thicknesses is extended to segment objects such as cells and cysts. The presented technique relies on a transform that maps closed-contour features in the Cartesian domain into lines in the quasi-polar domain. The features of interest are then segmented as layers via GTDP. Application of this method to segment closed-contour features in several ophthalmic image types is shown. Quantitative validation experiments for retinal pigmented epithelium cell segmentation in confocal fluorescence microscopy images attest to the accuracy of the presented technique.
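The quasi-polar idea can be illustrated on a toy example: a closed contour (here the boundary of a bright synthetic disk) becomes a near-horizontal "layer" after polar unwrapping, which a simple column-by-column dynamic program can then trace. The image, center, and sampling parameters are invented for the demo; this is not the authors' GTDP code:

```python
import numpy as np

# Synthetic image: a bright disk of radius 30 centered at (50, 50).
n = 101
y, x = np.mgrid[:n, :n]
img = ((x - 50) ** 2 + (y - 50) ** 2 <= 30 ** 2).astype(float)

# Unwrap around the (known) center: rows = radius, columns = angle.
n_theta, n_r = 90, 45
thetas = np.linspace(0, 2 * np.pi, n_theta, endpoint=False)
radii = np.arange(n_r)
polar = np.empty((n_r, n_theta))
for j, t in enumerate(thetas):
    rr = np.clip(np.round(50 + radii * np.sin(t)).astype(int), 0, n - 1)
    cc = np.clip(np.round(50 + radii * np.cos(t)).astype(int), 0, n - 1)
    polar[:, j] = img[rr, cc]

# Layer tracing by dynamic programming: one radius per angle, minimizing
# a negative-gradient cost with a +/-1 smoothness constraint between columns.
cost = -np.abs(np.diff(polar, axis=0))   # strong intensity edges are cheap
acc = cost.copy()
for j in range(1, n_theta):
    for i in range(acc.shape[0]):
        lo, hi = max(0, i - 1), min(acc.shape[0], i + 2)
        acc[i, j] += acc[lo:hi, j - 1].min()
boundary = np.empty(n_theta, dtype=int)
boundary[-1] = acc[:, -1].argmin()
for j in range(n_theta - 2, -1, -1):
    i = boundary[j + 1]
    lo, hi = max(0, i - 1), min(acc.shape[0], i + 2)
    boundary[j] = lo + acc[lo:hi, j].argmin()
```

The traced boundary sits at radius ~30 for every angle, i.e. the closed contour has been segmented as a layer.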
Smile line assessment comparing quantitative measurement and visual estimation.
Van der Geld, Pieter; Oosterveld, Paul; Schols, Jan; Kuijpers-Jagtman, Anne Marie
2011-02-01
Esthetic analysis of dynamic functions such as spontaneous smiling is feasible by using digital videography and computer measurement for lip line height and tooth display. Because quantitative measurements are time-consuming, digital videography and semiquantitative (visual) estimation according to a standard categorization are more practical for regular diagnostics. Our objective in this study was to compare 2 semiquantitative methods with quantitative measurements for reliability and agreement. The faces of 122 male participants were individually registered by using digital videography. Spontaneous and posed smiles were captured. On the records, maxillary lip line heights and tooth display were digitally measured on each tooth and also visually estimated according to 3-grade and 4-grade scales. Two raters were involved. An error analysis was performed. Reliability was established with kappa statistics. Interexaminer and intraexaminer reliability values were high, with median kappa values from 0.79 to 0.88. Agreement of the 3-grade scale estimation with quantitative measurement showed higher median kappa values (0.76) than the 4-grade scale estimation (0.66). Differentiating high and gummy smile lines (4-grade scale) resulted in greater inaccuracies. The estimation of a high, average, or low smile line for each tooth showed high reliability close to quantitative measurements. Smile line analysis can be performed reliably with a 3-grade scale (visual) semiquantitative estimation. For a more comprehensive diagnosis, additional measuring is proposed, especially in patients with disproportional gingival display. Copyright © 2011 American Association of Orthodontists. Published by Mosby, Inc. All rights reserved.
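For reference, the kappa statistic used to quantify agreement here can be computed directly from a confusion matrix. The 3-grade ratings below are invented for illustration and are not the study's data:

```python
import numpy as np

# Hypothetical 3-grade smile-line ratings (0 = low, 1 = average, 2 = high):
# a semiquantitative visual estimate vs. categories from measurement.
estimated = np.array([0, 1, 1, 2, 2, 1, 0, 2, 1, 1, 0, 2, 2, 1, 0])
measured  = np.array([0, 1, 1, 2, 1, 1, 0, 2, 1, 2, 0, 2, 2, 1, 0])

def cohen_kappa(a, b, n_cat=3):
    # Chance-corrected agreement between two raters/methods.
    n = len(a)
    conf = np.zeros((n_cat, n_cat))
    for i, j in zip(a, b):
        conf[i, j] += 1
    po = np.trace(conf) / n                          # observed agreement
    pe = (conf.sum(0) * conf.sum(1)).sum() / n ** 2  # chance agreement
    return (po - pe) / (1 - pe)

kappa = cohen_kappa(estimated, measured)
```

With 13 of 15 ratings agreeing, this toy example lands near kappa = 0.8, in the range the abstract reports.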
Quasirandom geometric networks from low-discrepancy sequences
NASA Astrophysics Data System (ADS)
Estrada, Ernesto
2017-08-01
We define quasirandom geometric networks using low-discrepancy sequences, such as Halton, Sobol, and Niederreiter. The networks are built in d dimensions by considering the d-tuples of digits generated by these sequences as the coordinates of the vertices of the networks in a d-dimensional unit hypercube. Then, two vertices are connected by an edge if they are at a distance smaller than a connection radius. We investigate computationally 11 network-theoretic properties of two-dimensional quasirandom networks and compare them with analogous random geometric networks. We also study their degree distribution and their spectral density distributions. We conclude from this intensive computational study that, in terms of the uniformity of the distribution of the vertices in the unit square, the quasirandom networks look more regular than the random geometric networks. We include an analysis of potential strategies for generating higher-dimensional quasirandom networks, where it is known that some of the low-discrepancy sequences are highly correlated. In this respect, we conclude that up to dimension 20, the use of scrambling, skipping, and leaping strategies generates quasirandom networks with the desired uniformity properties. Finally, we consider a diffusive process taking place on the nodes and edges of the quasirandom and random geometric graphs. We show that the diffusion time is shorter in the quasirandom graphs as a consequence of their larger structural homogeneity. In the random geometric graphs the diffusion produces clusters of concentration that make the process slower. Such clusters are a direct consequence of the heterogeneous and irregular distribution of the nodes in the unit square on which the generation of random geometric graphs is based.
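The construction itself is compact: generate low-discrepancy points, then connect pairs within a radius. The sketch below uses a hand-rolled Halton sequence (bases 2 and 3) with invented n and radius, alongside a random geometric counterpart for comparison; it is an illustration, not the paper's experimental setup:

```python
import numpy as np

def halton(n, base):
    # Van der Corput sequence in the given base (the 1-D Halton component).
    seq = np.zeros(n)
    for i in range(n):
        f, r, k = 1.0, 0.0, i + 1
        while k > 0:
            f /= base
            r += f * (k % base)
            k //= base
        seq[i] = r
    return seq

n, radius = 300, 0.1
# 2-D quasirandom points: Halton with coprime bases 2 and 3.
pts = np.column_stack([halton(n, 2), halton(n, 3)])
# Random geometric counterpart.
rng = np.random.default_rng(0)
rand_pts = rng.random((n, 2))

def edge_count(P, r):
    # Number of vertex pairs closer than the connection radius.
    d = np.linalg.norm(P[:, None, :] - P[None, :, :], axis=-1)
    return int((d[np.triu_indices(len(P), 1)] < r).sum())

qr_edges = edge_count(pts, radius)
rg_edges = edge_count(rand_pts, radius)
```

Both graphs have roughly n(n-1)/2 · πr² edges (minus boundary effects); the quasirandom version distributes them far more evenly across vertices.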
A review on high-resolution CMOS delay lines: towards sub-picosecond jitter performance.
Abdulrazzaq, Bilal I; Abdul Halin, Izhal; Kawahito, Shoji; Sidek, Roslina M; Shafie, Suhaidi; Yunus, Nurul Amziah Md
2016-01-01
A review of CMOS delay lines is presented, with a focus on the most frequently used techniques for achieving a high-resolution delay step. The primary types, specifications, delay circuits, and operating principles are presented. The delay circuits reported in this paper are used for delaying digital inputs and clock signals. The most common analog and digitally controlled delay element topologies are presented, focusing on the main delay-tuning strategies. IC variables, namely process, supply voltage, temperature, and noise sources, that affect delay resolution through timing jitter are discussed. The design specifications of these delay elements are also discussed and compared for the common delay line circuits. As a result, the paper highlights and discusses the following: the most efficient high-resolution delay line techniques; the trade-off between CMOS delay lines designed with analog versus digitally controlled delay elements; the trade-off between delay resolution and delay range, and proposed solutions to it; and how CMOS technology scaling can affect the performance of CMOS delay lines. Moreover, current trends and efforts toward generating output delayed signals with low jitter in the sub-picosecond range are presented.
Strabo: An App and Database for Structural Geology and Tectonics Data
NASA Astrophysics Data System (ADS)
Newman, J.; Williams, R. T.; Tikoff, B.; Walker, J. D.; Good, J.; Michels, Z. D.; Ash, J.
2016-12-01
Strabo is a data system designed to facilitate digital storage and sharing of structural geology and tectonics data. The data system allows researchers to store and share field and laboratory data as well as construct new multi-disciplinary data sets. Strabo is built on graph database technology, as opposed to a relational database, which provides the flexibility to define relationships between objects of any type. This framework allows observations to be linked in complex and hierarchical ways that are not possible in traditional database topologies. Thus, the advantage of the Strabo data structure is the ability of graph databases to link objects in numerous and complex ways, in a manner that more accurately reflects how geological data sets are actually collected and organized. The data system is accessible via a mobile interface (iOS and Android devices) that allows these data to be stored, visualized, and shared during primary collection in the field or the laboratory. The Strabo Data System is underlain by the concept of a "Spot," which we define as any observation that characterizes a specific area. This can be anything from a strike and dip measurement of bedding to cross-cutting relationships between faults in complex dissected terrains. Each of these Spots can then contain other Spots and/or measurements (e.g., lithology, slickenlines, displacement magnitude). Hence, the Spot concept is applicable to all relationships and observation sets. Strabo is therefore capable of quantifying and digitally storing large spatial variations and complex geometries of naturally deformed rocks within hierarchically related maps and images. These approaches provide an observational fidelity comparable to a traditional field book, but with the added benefits of digital data storage, processing, and ease of sharing. This approach allows Strabo to integrate seamlessly into the workflow of most geologists. 
Future efforts will focus on extending Strabo to other sub-disciplines as well as developing a desktop system for the enhanced collection and organization of microstructural data.
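The "Spot" idea can be sketched as a typed graph: every observation is a node, and typed edges express containment or other relations that a fixed relational schema resists. All names and attributes below are hypothetical, not Strabo's actual schema:

```python
# Nodes: Spots and measurements, each with free-form attributes.
spots = {
    "outcrop_1": {"type": "spot", "area_m2": 40.0},
    "bedding_1": {"type": "measurement", "strike": 120, "dip": 35},
    "fault_a":   {"type": "spot", "kind": "fault"},
    "slicks_1":  {"type": "measurement", "trend": 140, "plunge": 20},
}
# Typed edges: containment plus an arbitrary cross-cutting relation.
edges = [
    ("outcrop_1", "bedding_1", "contains"),
    ("outcrop_1", "fault_a", "contains"),
    ("fault_a", "slicks_1", "contains"),
    ("fault_a", "bedding_1", "cross_cuts"),
]

def descendants(node, edges, rel="contains"):
    # Walk the containment hierarchy from a Spot, depth-first.
    out, stack = [], [node]
    while stack:
        cur = stack.pop()
        for a, b, r in edges:
            if a == cur and r == rel:
                out.append(b)
                stack.append(b)
    return sorted(out)

nested = descendants("outcrop_1", edges)
```

Because edges are just typed triples, new relation kinds (cross-cutting, age constraints, image links) can be added without changing any schema.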
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nakano, M; Haga, A; Hanaoka, S
2016-06-15
Purpose: The purpose of this study is to propose a new concept of four-dimensional (4D) cone-beam CT (CBCT) reconstruction for non-periodic organ motion using the Time-ordered Chain Graph Model (TCGM), and to compare the reconstructed results with those of the previously proposed methods, total variation-based compressed sensing (TVCS) and prior-image constrained compressed sensing (PICCS). Methods: The CBCT reconstruction method introduced in this study consists of maximum a posteriori (MAP) iterative reconstruction combined with a regularization term derived from the TCGM concept, which includes a constraint coming from the images of neighbouring time-phases. The time-ordered image series were concurrently reconstructed in the MAP iterative reconstruction framework. The angular range of projections for each time-phase was 90 degrees for TCGM and PICCS, and 200 degrees for TVCS. Two kinds of projection data, elliptic-cylindrical digital phantom data and two clinical patients' data, were used for reconstruction. The digital phantom contained an air sphere moving 3 cm along the longitudinal axis, and the temporal resolution of each method was evaluated by measuring the penumbral width of the reconstructed moving air sphere. The clinical feasibility of non-periodic time-ordered 4D CBCT reconstruction was also examined using projection data of prostate cancer patients. Results: The results for the reconstructed digital phantom show that TCGM yielded the narrowest penumbral width; PICCS and TCGM were 10.6% and 17.4% narrower than TVCS, respectively. This suggests that TCGM has better temporal resolution than the other methods. The patients' CBCT projection data were also reconstructed, and all three reconstructed results showed motion of rectal gas and stool. TCGM provided visually clearer and less blurred images. 
Conclusion: The present study demonstrates that the new concept for 4D CBCT reconstruction, TCGM, combined with MAP iterative reconstruction framework enables time-ordered image reconstruction with narrower time-window.« less
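The penumbral-width metric used above to compare temporal resolution can be sketched as a small routine; the interpolation scheme and the 20%/80% threshold levels here are illustrative assumptions, not the paper's exact definition:

```python
def penumbral_width(xs, profile, lo=0.2, hi=0.8):
    """Width of the rising edge between lo and hi fractions of the peak,
    found by linear interpolation between samples (illustrative sketch)."""
    peak = max(profile)

    def crossing(level):
        t = level * peak
        for i in range(1, len(profile)):
            y0, y1 = profile[i - 1], profile[i]
            if y0 < t <= y1:  # first upward crossing of the threshold
                f = (t - y0) / (y1 - y0)
                return xs[i - 1] + f * (xs[i] - xs[i - 1])
        raise ValueError("level not crossed")

    return crossing(hi) - crossing(lo)
```

A narrower penumbral width on the reconstructed edge of the moving sphere indicates less motion blur, i.e., better temporal resolution.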
Quantum Graphical Models and Belief Propagation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Leifer, M.S.; Perimeter Institute for Theoretical Physics, 31 Caroline Street North, Waterloo Ont., N2L 2Y5; Poulin, D.
Belief Propagation algorithms acting on Graphical Models of classical probability distributions, such as Markov Networks, Factor Graphs and Bayesian Networks, are amongst the most powerful known methods for deriving probabilistic inferences amongst large numbers of random variables. This paper presents a generalization of these concepts and methods to the quantum case, based on the idea that quantum theory can be thought of as a noncommutative, operator-valued, generalization of classical probability theory. Some novel characterizations of quantum conditional independence are derived, and definitions of Quantum n-Bifactor Networks, Markov Networks, Factor Graphs and Bayesian Networks are proposed. The structure of Quantum Markov Networks is investigated and some partial characterization results are obtained, along the lines of the Hammersley-Clifford theorem. A Quantum Belief Propagation algorithm is presented and is shown to converge on 1-Bifactor Networks and Markov Networks when the underlying graph is a tree. The use of Quantum Belief Propagation as a heuristic algorithm in cases where it is not known to converge is discussed. Applications to decoding quantum error correcting codes and to the simulation of many-body quantum systems are described.
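As background, the classical sum-product belief propagation that the paper generalizes can be illustrated on a small pairwise Markov chain, where message passing recovers exact marginals (a chain is a tree, so BP converges exactly); the potential values below are arbitrary illustrative numbers, not from the paper:

```python
import itertools

# Binary chain Markov network x0 -- x1 -- x2 with pairwise potentials.
# psi[i][a][b] couples x_i = a with x_{i+1} = b (illustrative values).
psi = [
    [[1.0, 0.5], [0.5, 2.0]],
    [[1.5, 0.2], [0.2, 1.0]],
]

def marginal_bp(node):
    """Exact marginal of x_node via sum-product message passing."""
    m_left = [1.0, 1.0]   # message flowing rightward into `node`
    for i in range(node):
        m_left = [sum(m_left[a] * psi[i][a][b] for a in (0, 1)) for b in (0, 1)]
    m_right = [1.0, 1.0]  # message flowing leftward into `node`
    for i in reversed(range(node, len(psi))):
        m_right = [sum(m_right[b] * psi[i][a][b] for b in (0, 1)) for a in (0, 1)]
    unnorm = [m_left[s] * m_right[s] for s in (0, 1)]
    z = sum(unnorm)
    return [u / z for u in unnorm]

def marginal_brute(node):
    """Brute-force marginal over all joint states, for verification."""
    tot = [0.0, 0.0]
    for x in itertools.product((0, 1), repeat=3):
        tot[x[node]] += psi[0][x[0]][x[1]] * psi[1][x[1]][x[2]]
    z = sum(tot)
    return [t / z for t in tot]
```

On trees the two computations agree exactly; the quantum generalization replaces these scalar potentials with operator-valued objects.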
Space Shuttle Orbiter Digital Outer Mold Line Scanning
NASA Technical Reports Server (NTRS)
Campbell, Charles H.; Wilson, Brad; Pavek, Mike; Berger, Karen
2012-01-01
The Space Shuttle Orbiters Discovery and Endeavor have been digitally scanned to produce post-flight configuration outer mold line surfaces. Very detailed scans of the windward side of these vehicles provide resolution of the detailed tile step and gap geometry, as well as the reinforced carbon carbon nose cap and leading edges. Lower resolution scans of the upper surface provide definition of the crew cabin windows, wing upper surfaces, payload bay doors, orbital maneuvering system pods and the vertical tail. The process for acquisition of these digital scans as well as post-processing of the very large data set will be described.
A National Consideration of Digital Equity
ERIC Educational Resources Information Center
Davis, T.; Fuller, M.; Jackson, S.; Pittman, J.; Sweet, J.
2007-01-01
The National Center for Education Statistics (NCES) report, "Computer and Internet Use by Students in 2003" (NCES, 2006) reveals that the digital divide continues to exist, particularly along demographic and socioeconomic lines. Though an exact definition remains elusive, the term "digital divide" generally refers to the…
A novel Iterative algorithm to text segmentation for web born-digital images
NASA Astrophysics Data System (ADS)
Xu, Zhigang; Zhu, Yuesheng; Sun, Ziqiang; Liu, Zhen
2015-07-01
Since web born-digital images have low resolution and dense text atoms, text-region over-merging and missed detection are still two open issues to be addressed. In this paper a novel iterative algorithm is proposed to locate and segment text regions. In each iteration, candidate text regions are generated by detecting Maximally Stable Extremal Regions (MSER) with diminishing thresholds and categorized into groups based on a new similarity graph, and the text region groups are identified by applying several features and rules. With our proposed overlap-checking method, the final well-segmented text regions are selected from these groups across all iterations. Experiments have been carried out on the web born-digital image datasets used for the robust reading competitions at ICDAR 2011 and 2013, and the results demonstrate that our proposed scheme significantly reduces both the number of over-merged regions and the loss rate of target atoms, and that the overall performance outperforms the best methods reported in the two competitions in terms of recall rate and f-score, at the cost of slightly higher computational complexity.
"How Long Is a Piece of String?"
ERIC Educational Resources Information Center
Aitchison, Kate
2001-01-01
Provides a lesson plan designed to form three one-hour sessions with mixed ability groups of 11-12 year olds. The activity involves students estimating road distances by counting the number of map grid lines crossed in going from A to B. The activity is designed to introduce students to graphing calculators. Screenshots from the calculator are…
NASA TLA workload analysis support. Volume 3: FFD autopilot scenario validation data
NASA Technical Reports Server (NTRS)
Sundstrom, J. L.
1980-01-01
The data used to validate a seven-time-line analysis of the forward flight deck autopilot mode for the pilot and copilot of the NASA B737 terminal configured vehicle are presented. Demand workloads are given in two forms: workload histograms and workload summaries (bar graphs). A report showing task length and task interaction is also presented.
Venus - Ishtar gravity anomaly
NASA Technical Reports Server (NTRS)
Sjogren, W. L.; Bills, B. G.; Mottinger, N. A.
1984-01-01
The gravity anomaly associated with Ishtar Terra on Venus is characterized, comparing line-of-sight acceleration profiles derived by differentiating Pioneer Venus Orbiter Doppler residual profiles with an Airy-compensated topographic model. The results are presented in graphs and maps, confirming the preliminary findings of Phillips et al. (1979). The isostatic compensation depth is found to be 150 ± 30 km.
Exploring First Responder Tactics in a Terrorist Chemical Attack
2008-12-01
[Extraction fragments: a newspaper citation ("... on the Prowl," TODAY Newspaper, Singapore: Mediacorp Publishing, January 4); a reference to Lucas, T. W., S. M. Sanchez, L. R. Sickinger, F. Martinez, and J. W. ...; a contents entry "Appendix J. Pythagoras 2.0.x Issues Encountered, A. Drawing Display Bug"; and a figure caption for the base case scenario: a bar graph shows the t ratio, with a blue line marking the 0.05 significance level.]
Zero Autocorrelation Waveforms: A Doppler Statistic and Multifunction Problems
2006-01-01
[Extraction fragments: A is referred to as the ambiguity function of u, by analogy with the usual setting on the real line R; the Doppler statistic |C_{u,ue_k}(j)| is described as excellent and provable for detecting Doppler frequency shift [11] (see Fig. 2); a further fragment concerns what happens if one graphs only ...]
Mathematics Textbooks and Their Potential Role in Supporting Misconceptions
ERIC Educational Resources Information Center
Kajander, Ann; Lovric, Miroslav
2009-01-01
As a fundamental resource, textbooks shape the way we teach and learn mathematics. Based on examination of secondary school and university textbooks, we describe to what extent, and how, the presentation of mathematics material--in our case study, the concept of the line tangent to the graph of a function--could contribute to creation and…
Graph and Table Use in Special Education: A Review and Analysis of the Communication of Data
ERIC Educational Resources Information Center
Kubina, Richard M.; Kostewicz, Douglas E.; Datchuk, Shawn M.
2010-01-01
An emerging line of research demonstrates a distinction between social and natural sciences; natural sciences devote more page space in journals to data graphics than social sciences. The present survey asked how the subdiscipline of Education, Special Education, compares to other disciplines of science. Also, how do the Individuals with…
Optical transmission modules for multi-channel superconducting quantum interference device readouts.
Kim, Jin-Mok; Kwon, Hyukchan; Yu, Kwon-kyu; Lee, Yong-Ho; Kim, Kiwoong
2013-12-01
We developed an optical transmission module consisting of a 16-channel analog-to-digital converter (ADC), a digital-noise filter, and a one-line serial transmitter, which transferred Superconducting Quantum Interference Device (SQUID) readout data to a computer over a single optical cable. The 16-channel ADC sent out SQUID readout data as 32-bit serial words (8-bit channel and 24-bit voltage data) at a sample rate of 1.5 kSample/s. The digital-noise filter suppressed digital noise generated by digital clocks so that the SQUID modulation remained as large as possible. The one-line serial transmitter re-encoded the 32-bit serial data into a modulated stream containing both data and clock, and sent it through a single optical cable. When the optical transmission modules were applied to a 152-channel SQUID magnetoencephalography system, the system maintained a field noise level of 3 fT/√Hz @ 100 Hz.
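The 32-bit word layout described (8 bits of channel plus 24 bits of voltage data) can be sketched with a pair of helper functions; the field order, channel high and voltage low, is an assumption made for illustration:

```python
def pack_word(channel, voltage_code):
    """Pack one 32-bit serial word: high 8 bits = channel number,
    low 24 bits = ADC voltage code (field order is an assumption)."""
    if not (0 <= channel < 1 << 8 and 0 <= voltage_code < 1 << 24):
        raise ValueError("field out of range")
    return (channel << 24) | voltage_code

def unpack_word(word):
    """Split a 32-bit word back into (channel, voltage_code)."""
    return (word >> 24) & 0xFF, word & 0xFFFFFF
```

At 1.5 kSample/s per channel, each such word carries one sample tagged with its channel, so a single serial line can multiplex all 16 channels.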
ERIC Educational Resources Information Center
Tan, Thomas
2011-01-01
Digital citizenship is the means by which educators, citizens, and parents can teach where the lines of cyber safety and ethics lie in the interconnected online world their students will inhabit. Aside from keeping technology users safe, digital citizenship also prepares students to survive and thrive in an environment embedded with information, communication,…
Graphical introduction to chromospheric line formation
NASA Astrophysics Data System (ADS)
Rutten, Rob
2012-03-01
The basics of chromospheric line formation theory were laid out in the 1960s and 1970s by, e.g., Thomas, Avrett, Hummer, Jefferies, Mihalas, Shine, and Milkey. Since then there has been a long silence, without much progress in understanding the chromosphere or its diagnostics. At present the situation is changing thanks to better ground-based observing, space-based monitoring, and increasingly realistic numerical simulations. There is now a strong need to revamp classical one-dimensional static modeling as the basis for chromospheric line interpretation into 3D dynamic understanding of the major diagnostics, including IRIS's Mg II h&k. In this introduction I aim to explain the old wisdom in tutorial fashion, using cartoons and graphs as means towards an intuitive grasp of fads and fallacies of chromospheric line formation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Guildenbecher, Daniel Robert; Munz, Elise Dahnke; Farias, Paul Abraham
2015-12-01
Digital in-line holography and plenoptic photography are two techniques for single-shot, volumetric measurement of 3D particle fields. Here we present a preliminary comparison of the two methods by applying plenoptic imaging to experimental configurations that have been previously investigated with digital in-line holography. These experiments include the tracking of secondary droplets from the impact of a water drop on a thin film of water and tracking of pellets from a shotgun. Both plenoptic imaging and digital in-line holography successfully quantify the 3D nature of these particle fields. This includes measurement of the 3D particle position, individual particle sizes, and three-component velocity vectors. For the initial processing methods presented here, both techniques give out-of-plane positional accuracy of approximately 1-2 particle diameters. For a fixed image sensor, digital holography achieves higher effective in-plane spatial resolutions. However, collimated and coherent illumination makes holography susceptible to image distortion through index of refraction gradients, as demonstrated in the shotgun experiments. On the other hand, plenoptic imaging allows for a simpler experimental configuration. Furthermore, due to the use of diffuse, white-light illumination, plenoptic imaging is less susceptible to image distortion in the shotgun experiments. Additional work is needed to better quantify sources of uncertainty, particularly in the plenoptic experiments, as well as develop data processing methodologies optimized for the plenoptic measurement.
Development of ATC for High Speed and High Density Commuter Line
NASA Astrophysics Data System (ADS)
Okutani, Tamio; Nakamura, Nobuyuki; Araki, Hisato; Irie, Shouji; Osa, Hiroki; Sano, Minoru; Ikeda, Keigo; Ozawa, Hiroyuki
A new ATC (Automatic Train Control) system has been developed that realizes short train headways through assured braking, using digital data transmission via the rails for the ATP (Automatic Train Protection) function, and that achieves electromagnetic compatibility (EMC) in both AC and DC sections. The DC section uses an unprecedented DC traction power supply system employing IGBT PWM converters at all DC substations. Within the AC section, train traction force is controlled by PWM converter/inverters. The carrier frequencies of the digital data signals and the chopping frequency of the on-board PWM traction power converters were decided via spectral analysis of noise, including degraded-mode cases of the equipment. The developed system was installed on the Tsukuba Express Line, a new commuter line in the Tokyo metropolitan area, which opened in August 2005.
NASA Astrophysics Data System (ADS)
Smith, A.; Siegel, Edward Carl-Ludwig
2011-03-01
Numbers: primality/indivisibility/non-factorization versus compositeness/divisibility/factorization, often in tandem but not always, provocatively close analogy to nuclear-physics: (2+1)=(fusion)=3; (3+1)=(fission)=4[=2 x 2]; (4+1)=(fusion)=5; (5+1)=(fission)=6[=2 x 3]; (6+1)=(fusion)=7; (7+1)=(fission)=8[= 2 x 4 = 2 x 2 x 2]; (8+1)=(non: fission nor fusion)=9[=3 x 3]; then ONLY composites' Islands of fusion-INstability: 8, 9, 10; then 14, 15, 16, ... Could inter-digit Feshbach-resonances exist??? Possible applications to: quantum-information/computing non-Shor factorization, millennium-problem Riemann-hypotheses proof as Goodkin BEC intersection with graph-theory "short-cut" method: Rayleigh(1870)-Polya(1922)-"Anderson"(1958)-localization, Goldbach-conjecture, financial auditing/accounting as quantum-statistical-physics; ...abound!!! Watkins [www.secamlocal.ex.ac.uk/people/staff/mrwatkin/] "Number-Theory in Physics" many interconnections: "pure"-maths number-theory to physics including Siegel [AMS Joint Mtg.(2002)-Abs.# 973-60-124] inversion of statistics on-average digits' Newcomb(1881)-Weyl(14-16)-Benford(38)-law to reveal both the quantum and BEQS (digits = bosons = digits:"spinEless-boZos"). 1881 1885 1901 1905 1925 < 1927, altering quantum-theory history!!!
NASA Astrophysics Data System (ADS)
Yuan, Yanbin; Zhou, You; Zhu, Yaqiong; Yuan, Xiaohui; Sælthun, N. R.
2007-11-01
Based on digital technology, development of a flood-routing simulation system is an important component of the "digital catchment". Taking the Qingjiang catchment as a pilot case, and building on an in-depth analysis of the informatization of Qingjiang catchment management, the study addresses the multi-source, multi-dimension, multi-element, multi-subject, multi-layer, and multi-class nature of catchment data by introducing the design concept and method of the "subject-point-source database" (SPSD) for the system structure, in order to realize unified management of large quantities of catchment data. Drawing on integrated spatial-information technology, a hierarchical development model of the integrated digital catchment is established; this model is the general framework for the analysis, design, and realization of the flood-routing simulation system. To satisfy the demands of three-dimensional flood-routing simulation, an object-oriented spatial data model is designed. We analyze the space-time self-adapting relation between flood routing and catchment topography, express the terrain grid data as an undirected graph, and apply a breadth-first search algorithm to dynamically trace the stream channel on the simulated three-dimensional terrain. A system prototype is thereby realized. Simulation results demonstrate that the proposed approach is feasible and effective in this application.
NASA Astrophysics Data System (ADS)
Siegel, Edward
2011-04-01
Numbers: primality/indivisibility/non-factorization versus compositeness/divisibility/factorization, often in tandem but not always, provocatively close analogy to nuclear-physics: (2 + 1)=(fusion)=3; (3+1)=(fission)=4[=2 x 2]; (4+1)=(fusion)=5; (5+1)=(fission)=6[=2 x 3]; (6 + 1)=(fusion)=7; (7+1)=(fission)=8[= 2 x 4 = 2 x 2 x 2]; (8 + 1) =(non: fission nor fusion)= 9[=3 x 3]; then ONLY composites' Islands of fusion-INstability: 8, 9, 10; then 14, 15, 16,... Could inter-digit Feshbach-resonances exist??? Applications to: quantum-information and computing non-Shor factorization, millennium-problem Riemann-hypotheses physics-proof as numbers/digits Goodkin Bose-Einstein Condensation intersection with graph-theory ``short-cut'' method: Rayleigh(1870)-Polya(1922)-``Anderson''(1958)-localization, Goldbach-conjecture, financial auditing/accounting as quantum-statistical-physics;... abound!!!
Metric Aspects of Digital Images and Digital Image Processing.
1984-09-01
[Extraction fragments: artifacts produced in a reconstructed digital image; synthesized aerial photographs were formed by processing a combined elevation and orthophoto data base; a region is described by two brightness values h1 and h2 and by a line equation whose two parameters are calculated, along with the borderline that separates the two intensity regions.]
ERIC Educational Resources Information Center
Acker, Stephen R.
2008-01-01
With faculty changing instructional practices to take advantage of customizable, focused content (and digital delivery of that content), many people assume that digital distribution is the answer to bringing the costs of course content delivery in line. But the picture just isn't that simple. A wide continuum of options is available to faculty and…
ERIC Educational Resources Information Center
Piedmo, Greg
1995-01-01
Integrated services digital network (ISDN) is a dial-up digital transmission service supporting transmission of audio, video, and text data over standard copper telephone wires or fiber optic cables. Advantages of ISDN over analog transmission include the ability of one phone line to support up to three simultaneous, separate conversations (phone,…
Shocks and finite-time singularities in Hele-Shaw flow
DOE Office of Scientific and Technical Information (OSTI.GOV)
Teodorescu, Razvan; Wiegmann, P; Lee, S-y
Hele-Shaw flow at vanishing surface tension is ill-defined. In finite time, the flow develops cusplike singularities. We show that the ill-defined problem admits a weak dispersive solution when singularities give rise to a graph of shock waves propagating in the viscous fluid. The graph of shocks grows and branches. Velocity and pressure jump across the shock. We formulate a few simple physical principles which single out the dispersive solution and interpret shocks as lines of decompressed fluid. We also formulate the dispersive solution in algebro-geometrical terms as an evolution of a Krichever-Boutroux complex curve. We study in detail the most generic (2,3) cusp singularity, which gives rise to an elementary branching event. This solution is self-similar and expressed in terms of elliptic functions.
The ancestral selection graph under strong directional selection.
Pokalyuk, Cornelia; Pfaffelhuber, Peter
2013-08-01
The ancestral selection graph (ASG) was introduced by Neuhauser and Krone (1997) in order to study populations of constant size which evolve under selection. Coalescence events, which occur at rate 1 for every pair of lines, lead to joint ancestry. In addition, splitting events in the ASG at rate α, the scaled selection coefficient, produce possible ancestors, such that the real ancestor depends on the ancestral alleles. Here, we use the ASG in the case without mutation in order to study fixation of a beneficial mutant. Using our main tool, a reversibility property of the ASG, we provide a new proof of the fact that a beneficial allele fixes roughly in time (2logα)/α if α is large. Copyright © 2012 Elsevier Inc. All rights reserved.
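The quoted fixation-time result, roughly (2 log α)/α for large scaled selection coefficient α, is easy to evaluate as a leading-order approximation; this sketch only tabulates the formula, it does not simulate the ASG:

```python
import math

def fixation_time(alpha):
    """Leading-order approximation (2 log alpha) / alpha to the fixation
    time of a beneficial allele, valid for large alpha."""
    if alpha <= 1:
        raise ValueError("approximation assumes large alpha > 1")
    return 2.0 * math.log(alpha) / alpha
```

As expected for strong selection, the approximate fixation time shrinks as α grows.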
Telesca, Luciano; Lovallo, Michele; Ramirez-Rojas, Alejandro; Flores-Marquez, Leticia
2014-01-01
By using the visibility graph (VG) method, the synthetic seismicity generated by a simple stick-slip system with asperities is analysed. The stick-slip system mimics the interaction between tectonic plates, whose asperities are given by sandpapers of different granularity degrees. The VG properties of the seismic sequences have been related to the typical seismological parameter, the b-value of the Gutenberg-Richter law. Between the b-value of the synthetic seismicity and the slope of the least-squares line fitting the k-M plot (the relationship between the magnitude M of each synthetic event and its connectivity degree k), a close linear relationship is found, which is also verified for real seismicity.
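The natural visibility graph underlying the k-M analysis can be sketched directly from its geometric criterion (this is the standard VG construction, not code from the paper): two samples are connected when every intermediate sample lies strictly below the straight line joining them.

```python
def visibility_edges(y):
    """Natural visibility graph of a series sampled at t = 0, 1, 2, ...:
    edge (a, b) exists when all intermediate samples lie strictly below
    the line connecting (a, y[a]) and (b, y[b])."""
    n = len(y)
    edges = set()
    for a in range(n):
        for b in range(a + 1, n):
            visible = all(
                y[c] < y[b] + (y[a] - y[b]) * (b - c) / (b - a)
                for c in range(a + 1, b)
            )
            if visible:
                edges.add((a, b))
    return edges

def degrees(edges, n):
    """Connectivity degree k of each node, as used in the k-M plot."""
    k = [0] * n
    for a, b in edges:
        k[a] += 1
        k[b] += 1
    return k
```

In the paper's setting each node carries an event magnitude M, and the slope of the least-squares fit of k against M is the quantity compared with the b-value.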
Unsalan, Cem; Boyer, Kim L
2005-04-01
Today's commercial satellite images enable experts to classify region types in great detail. In previous work, we considered discriminating rural and urban regions [23]. However, a more detailed classification is required for many purposes. These fine classifications assist government agencies in many ways including urban planning, transportation management, and rescue operations. In a step toward the automation of the fine classification process, this paper explores graph theoretical measures over grayscale images. The graphs are constructed by assigning photometric straight line segments to vertices, while graph edges encode their spatial relationships. We then introduce a set of measures based on various properties of the graph. These measures are nearly monotonic (positively correlated) with increasing structure (organization) in the image. Thus, increased cultural activity and land development are indicated by increases in these measures-without explicit extraction of road networks, buildings, residences, etc. These latter, time consuming (and still only partially automated) tasks can be restricted only to "promising" image regions, according to our measures. In some applications our measures may suffice. We present a theoretical basis for the measures followed by extensive experimental results in which the measures are first compared to manual evaluations of land development. We then present and test a method to focus on, and (pre)extract, suburban-style residential areas. These are of particular importance in many applications, and are especially difficult to extract. In this work, we consider commercial IKONOS data. These images are orthorectified to provide a fixed resolution of 1 meter per pixel on the ground. They are, therefore, metric in the sense that ground distance is fixed in scale to pixel distance. Our data set is large and diverse, including sea and coastline, rural, forest, residential, industrial, and urban areas.
Spectral stability of shifted states on star graphs
NASA Astrophysics Data System (ADS)
Kairzhan, Adilbek; Pelinovsky, Dmitry E.
2018-03-01
We consider the nonlinear Schrödinger (NLS) equation with the subcritical power nonlinearity on a star graph consisting of N edges and a single vertex under generalized Kirchhoff boundary conditions. The stationary NLS equation may admit a family of solitary waves parameterized by a translational parameter, which we call the shifted states. The two main examples include (i) the star graph with even N under the classical Kirchhoff boundary conditions and (ii) the star graph with one incoming edge and N - 1 outgoing edges under a single constraint on coefficients of the generalized Kirchhoff boundary conditions. We obtain the general counting results on the Morse index of the shifted states and apply them to the two examples. In the case of (i), we prove that the shifted states with even N ≥ 4 are saddle points of the action functional which are spectrally unstable under the NLS flow. In the case of (ii), we prove that the shifted states with monotone profiles in the N - 1 edges are spectrally stable, whereas the shifted states with non-monotone profiles in the N - 1 edges are spectrally unstable; the two families intersect at the half-soliton states, which are spectrally stable but nonlinearly unstable under the NLS flow. Since the NLS equation on a star graph with shifted states can be reduced to the homogeneous NLS equation on an infinite line, the spectral instability of shifted states is due to the perturbations breaking this reduction. We give a simple argument suggesting that the spectrally stable shifted states in the case of (ii) are nonlinearly unstable under the NLS flow due to the perturbations breaking the reduction to the homogeneous NLS equation.
An application of cluster detection to scene analysis
NASA Technical Reports Server (NTRS)
Rosenfeld, A. H.; Lee, Y. H.
1971-01-01
Certain arrangements of local features in a scene tend to group together and to be seen as units. It is suggested that in some instances, this phenomenon might be interpretable as a process of cluster detection in a graph-structured space derived from the scene. This idea is illustrated using a class of scenes that contain only horizontal and vertical line segments.
The Copernicus ultraviolet spectral atlas of Sirius
NASA Technical Reports Server (NTRS)
Rogerson, John B., Jr.
1987-01-01
A near-ultraviolet spectral atlas for the A1 V star Alpha CMa (Sirius) has been prepared from data taken by the Princeton spectrometer aboard the Copernicus satellite. The spectral region from 1649 to 3170 A has been scanned with a resolution of 0.1 A. The atlas is presented in graphs, and line identifications for the absorption features have been tabulated.
ERIC Educational Resources Information Center
Kabaca, Tolga
2013-01-01
The solution set of any one-variable inequality or compound inequality lies in the real line, which is one-dimensional. So a difficulty appears when computer-assisted graphical representation is intended for use in teaching these topics. Sketching a one-dimensional graph by using computer software is not a straightforward task. In this…
ERIC Educational Resources Information Center
Samson, Frank L.
2013-01-01
This study identifies a theoretical mechanism that could potentially affect public university admissions standards in a context of demographic change. I explore how demographic changes at a prestigious public university in the United States affect individuals' evaluations of college applications. Responding to a line graph that randomly displays a…
Kirchhoff's rule for quantum wires
NASA Astrophysics Data System (ADS)
Kostrykin, V.; Schrader, R.
1999-01-01
We formulate and discuss one-particle quantum scattering theory on an arbitrary finite graph with n open ends and where we define the Hamiltonian to be (minus) the Laplace operator with general boundary conditions at the vertices. This results in a scattering theory with n channels. The corresponding on-shell S-matrix formed by the reflection and transmission amplitudes for incoming plane waves of energy E>0 is given explicitly in terms of the boundary conditions and the lengths of the internal lines. It is shown to be unitary, which may be viewed as the quantum version of Kirchhoff's law. We exhibit covariance and symmetry properties. It is symmetric if the boundary conditions are real. Also there is a duality transformation on the set of boundary conditions and the lengths of the internal lines such that the low-energy behaviour of one theory gives the high-energy behaviour of the transformed theory. Finally, we provide a composition rule by which the on-shell S-matrix of a graph is factorizable in terms of the S-matrices of its subgraphs. All proofs use only known facts from the theory of self-adjoint extensions, standard linear algebra, complex function theory and elementary arguments from the theory of Hermitian symplectic forms.
Review of Designs for Haptic Data Visualization.
Paneels, Sabrina; Roberts, Jonathan C
2010-01-01
There are many different uses for haptics, such as training medical practitioners, teleoperation, or navigation of virtual environments. This review focuses on haptic methods that display data. The hypothesis is that haptic devices can be used to present information, and consequently, the user gains quantitative, qualitative, or holistic knowledge about the presented data. Not only is this useful for users who are blind or partially sighted (who can feel line graphs, for instance), but also the haptic modality can be used alongside other modalities, to increase the amount of variables being presented, or to duplicate some variables to reinforce the presentation. Over the last 20 years, a significant amount of research has been done in haptic data presentation; e.g., researchers have developed force feedback line graphs, bar charts, and other forms of haptic representations. However, previous research is published in different conferences and journals, with different application emphases. This paper gathers and collates these various designs to provide a comprehensive review of designs for haptic data visualization. The designs are classified by their representation: Charts, Maps, Signs, Networks, Diagrams, Images, and Tables. This review provides a comprehensive reference for researchers and learners, and highlights areas for further research.
Research of Characteristics of the Low Voltage Power Line in Underground Coal Mine
NASA Astrophysics Data System (ADS)
Wei, Shaoliang; Qin, Shiqun; Gao, Wenchang; Cheng, Fengyu; Cao, Zhongyue
Power line communication (PLC) can exploit existing electrical connections reaching every corner of the locations where such applications are required, so signal transmission over power lines is nowadays gaining more and more interest for applications such as Internet access. Characterizing the low-voltage power line is a fundamental and important task. This work presents a device to test the characteristics of the low-voltage power line. Using this device, the channel characteristics of low-voltage power lines both above ground and in an underground coal mine were tested. Experiments show that the PLC channel in an underground coal mine differs from the PLC channel above ground, so different techniques should be adopted to model the PLC channel underground and to transmit high-speed digital signals. How best to apply the technology to high-speed digital communication in coal mines merits further study.
Method to prevent sulfur accumulation in membrane electrode assembly
Steimke, John L; Steeper, Timothy J; Herman, David T
2014-04-29
A method of operating a hybrid sulfur electrolyzer to generate hydrogen is provided that includes the steps of providing an anolyte with a concentration of sulfur dioxide and applying a current. During steady-state generation of hydrogen, a plot of applied current density versus concentration of sulfur dioxide lies below a boundary line. The boundary line may be linear and extend through the origin of the graph with a slope of 0.001, where the current density is measured in mA/cm2 and the concentration of sulfur dioxide is measured in moles of sulfur dioxide per liter of anolyte.
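Taking the abstract's description at face value, the operating condition reduces to a single inequality. The sketch below (function name and the reading of "current density versus concentration" as vertical-versus-horizontal axis are assumptions, not from the patent text) checks whether an operating point lies below the boundary line:

```python
# Illustrative check of the patent abstract's boundary condition.
# Assumption: current density j (mA/cm^2) is on the vertical axis and SO2
# concentration c (mol per liter of anolyte) on the horizontal axis, so the
# boundary line through the origin is j = slope * c with slope = 0.001.

def below_boundary(current_density_ma_cm2: float,
                   so2_mol_per_l: float,
                   slope: float = 0.001) -> bool:
    """Return True when the operating point lies below the boundary line."""
    return current_density_ma_cm2 < slope * so2_mol_per_l
```
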
Vulnerability of dynamic systems
NASA Technical Reports Server (NTRS)
Siljak, D. D.
1976-01-01
Directed graphs are associated with dynamic systems in order to determine, for any given system, whether each state can be reached by at least one input (input reachability) and whether each state can reach at least one output (output reachability). Structural perturbations of a dynamic system are then identified with removals of lines or points from the corresponding digraph, and a system is considered vulnerable at those lines or points of the digraph whose removal destroys its input or output reachability. A suitable framework is formulated for resolving the problems of reachability and vulnerability that applies to linear and nonlinear systems alike.
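The reachability tests described above amount to graph searches on the digraph and its reverse. A minimal sketch (function names and the dict-of-successors representation are illustrative, not from the paper):

```python
# Input/output reachability on a digraph. graph: node -> list of successors.
from collections import deque

def reachable_from(graph, sources):
    """All nodes reachable from any node in `sources` (including themselves)."""
    seen, queue = set(sources), deque(sources)
    while queue:
        node = queue.popleft()
        for nxt in graph.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

def input_reachable(graph, inputs, states):
    """True if every state can be reached from at least one input."""
    return set(states) <= reachable_from(graph, inputs)

def output_reachable(graph, outputs, states):
    """True if every state can reach at least one output (search the reverse graph)."""
    reverse = {}
    for node, succs in graph.items():
        for nxt in succs:
            reverse.setdefault(nxt, []).append(node)
    return set(states) <= reachable_from(reverse, outputs)
```

Removing a line (edge) or point (node) and re-running these tests flags exactly the vulnerable elements in the sense defined above.
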
Full range line-field parallel swept source imaging utilizing digital refocusing
NASA Astrophysics Data System (ADS)
Fechtig, Daniel J.; Kumar, Abhishek; Drexler, Wolfgang; Leitgeb, Rainer A.
2015-12-01
We present geometric optics-based refocusing applied to a novel off-axis line-field parallel swept source imaging (LPSI) system. LPSI is an imaging modality based on line-field swept source optical coherence tomography, which permits 3-D imaging at acquisition speeds of up to 1 MHz. The digital refocusing algorithm applies a defocus-correcting phase term to the Fourier representation of complex-valued interferometric image data, which is based on the geometrical optics information of the LPSI system. We introduce the off-axis LPSI system configuration, the digital refocusing algorithm and demonstrate the effectiveness of our method for refocusing volumetric images of technical and biological samples. An increase of effective in-focus depth range from 255 μm to 4.7 mm is achieved. The recovery of the full in-focus depth range might be especially valuable for future high-speed and high-resolution diagnostic applications of LPSI in ophthalmology.
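Digital refocusing of this kind typically multiplies the transverse Fourier transform of each complex-valued line by a quadratic phase filter. A generic form (an illustration of the idea, not necessarily the exact LPSI filter) is

\[
H(k_x; z) = \exp\!\left(i\,\alpha\, z\, k_x^{2}\right),
\]

where \(k_x\) is the spatial frequency along the line illumination, \(z\) is the defocus distance, and \(\alpha\) is a constant determined by the system's geometrical-optics parameters (wavelength and imaging geometry). Applying \(H\) in the Fourier domain and transforming back cancels the defocus at depth \(z\), which is how the in-focus depth range is extended numerically rather than optically.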
A high-resolution time-to-digital converter using a three-level resolution
NASA Astrophysics Data System (ADS)
Dehghani, Asma; Saneei, Mohsen; Mahani, Ali
2016-08-01
In this article, a three-level resolution Vernier delay line time-to-digital converter (TDC) was proposed. The proposed TDC core was based on a pseudo-differential digital architecture that made it insensitive to nMOS and pMOS transistor mismatches. It also employed a Vernier delay line (VDL) in conjunction with asynchronous read-out circuitry. The time interval resolution was equal to the difference in delay between the buffers of the upper and lower chains. Then, via the extra chain included in the lower delay line, the resolution was controlled and power consumption was reduced. This method led to high resolution and low power consumption. The measurement results of the TDC showed a resolution of 4.5 ps, a 12-bit output dynamic range, and an integral nonlinearity of 1.5 least significant bits. This TDC consumed 68.43 µW from a 1.1-V supply.
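The Vernier principle above can be sketched with a toy model: the start edge propagates through the slow chain and the stop edge through the fast chain, and the stage at which the stop edge catches up encodes the interval in units of the delay difference. The buffer delays below (54.5 ps and 50.0 ps) are illustrative values chosen only so that their difference equals the reported 4.5 ps resolution:

```python
# Toy model of a Vernier delay line: not the paper's circuit, just the idea.
def vernier_stages(interval_ps: float, t_slow_ps: float = 54.5,
                   t_fast_ps: float = 50.0, max_stages: int = 4096) -> int:
    """Count stages until the fast (stop) edge catches the slow (start) edge."""
    resolution = t_slow_ps - t_fast_ps   # 4.5 ps, the effective LSB
    lead = interval_ps                   # start edge leads stop edge by the interval
    stages = 0
    while lead > 0 and stages < max_stages:
        lead -= resolution               # each stage, the stop edge gains one LSB
        stages += 1
    return stages                        # measured interval ~ stages * resolution
```
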
NASA Astrophysics Data System (ADS)
McCarthy, K.
2017-12-01
NASA's Operation IceBridge (OIB), the largest airborne survey of Earth's polar ice, uses remote sensing methods to collect data on changing sea and land ice. PolarTREC teacher Kelly McCarthy joined the team during the 2016 Spring Arctic Campaign. This presentation explores ways in which K-12 students were engaged in the work being done by OIB through classroom learning experiences, digital communications, and independent research. Initially, digital communication, including chats via NASA's Mission Tools Suite for Education (MTSE) platform, was leveraged to engage students in the daily work of OIB. Two lessons were piloted with student groups during the 2016-2017 academic year, both for students who actively engaged in communications with the team during the expedition and for those who had no prior connections to the field. All of the data collected on OIB missions is stored for public use in a digital portal on the National Snow and Ice Data Center (NSIDC) website. In one lesson, 10th-12th grade students were guided through a tutorial to learn how to access data and begin to develop a story about Greenland's Jakobshavn Glacier using pre-selected data sets, Google's MyMaps app, and independent research methods. In the second lesson, 8th grade students were introduced to remote sensing, first through a discussion on vocabulary using productive talk moves and then via a demonstration using Vernier motion detectors and a graph matching simulation. Students worked in groups to develop procedures to map a hidden surface region (a boxed assortment of miscellaneous objects) using a Vernier motion sensor to simulate sonar. Students translated data points collected from the motion sensor into a vertical profile of the simulated surface region. Both lessons gave students a way to engage in two of the most important components of OIB.
The ability to work with real data collected by the OIB team provided a unique context through which students gained skill and overcame challenges in Excel, Google Apps, construction of graphs, and data analysis. The remote sensing simulation allowed students to practice and gain hands-on knowledge of the components of OIB discussed in the digital communications that may have felt unclear to students who have had limited or no exposure to remote sensing technologies or the science behind them.
NASA Technical Reports Server (NTRS)
Giddings, L.; Boston, S.
1976-01-01
A method for digitizing zone maps is presented, starting with colored images and producing a final one-channel digitized tape. This method automates the work previously done interactively on the Image-100 and Data Analysis System computers of the Johnson Space Center (JSC) Earth Observations Division (EOD). A color-coded map was digitized through color filters on a scanner to form a digital tape in LARSYS-2 or JSC Universal format. The taped image was classified by the EOD LARSYS program on the basis of training fields included in the image. Numerical values were assigned to all pixels in a given class, and the resulting coded zone map was written on a LARSYS or Universal tape. A unique spatial filter option permitted zones to be made homogeneous and edges of zones to be abrupt transitions from one zone to the next. A zoom option allowed the output image to have arbitrary dimensions in terms of number of lines and number of samples on a line. Printouts of the computer program are given and the images that were digitized are shown.
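The zoom option described above, producing an output image of arbitrary dimensions while keeping zone edges as abrupt transitions, corresponds to nearest-neighbor resampling of the coded zone map. A minimal sketch (the function name is illustrative; the original program predates this code):

```python
# Nearest-neighbor "zoom" of a coded zone map to arbitrary output dimensions.
# Nearest-neighbor is appropriate here because zone codes are categorical:
# interpolation would invent zone values that do not exist.

def zoom(image, out_lines, out_samples):
    """Resample a 2-D list of zone codes to out_lines x out_samples."""
    in_lines, in_samples = len(image), len(image[0])
    return [
        [image[(i * in_lines) // out_lines][(j * in_samples) // out_samples]
         for j in range(out_samples)]
        for i in range(out_lines)
    ]
```
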
Low-Loss, High-Isolation Microwave Microelectromechanical Systems (MEMS) Switches Being Developed
NASA Technical Reports Server (NTRS)
Ponchak, George E.
2002-01-01
Switches, electrical components that either permit or prevent the flow of electricity, are the most important and widely used electrical devices in integrated circuits. In microwave systems, switches are required for switching between the transmitter and receiver, for phase shifters in phased-array antennas for radar and communication systems, and for the new class of digital or software-definable radios. Ideally, switches would be lossless devices that did not depend on the electrical signal's frequency or power, and they would not consume electrical power to change from OFF to ON or to maintain one of these two states. Reality is quite different, especially at microwave frequencies. Typical switches in microwave integrated circuits are pin diodes or gallium arsenide (GaAs) field-effect transistors that are nonlinear, with characteristics that depend on the power of the signal. In addition, they are frequency-dependent, lossy, and require electrical power to maintain a certain state. A new type of component has been developed that overcomes most of these technical difficulties. Microelectromechanical systems (MEMS) switches rely on mechanical movement as a response to an applied electrical force to either transmit or reflect electrical signal power. The NASA Glenn Research Center has been actively developing MEMS for microwave applications for over 5 years. Complete fabrication procedures have been developed so that the moving parts of the switch can be released with near 100-percent yield. Moreover, the switches fabricated at Glenn have demonstrated state-of-the-art performance. A typical MEMS switch is shown. The switch extends over the signal and ground lines of a finite ground coplanar waveguide, a commonly used microwave transmission line. In the state shown, the switch is in the UP state and all the microwave power traveling along the transmission line proceeds unimpeded.
When a potential difference is applied between the cantilever and the transmission line, the cantilever is pulled downward until it connects the signal line to the ground planes, creating a short circuit. In this state, all the microwave power is reflected. The graph shows the measured performance of the switch, which has less than 0.1 dB of insertion loss and greater than 30 dB of isolation. These switches consume negligible electrical power and are extremely linear. Additional research is required to address reliability and to increase the switching speed.
Multiframe digitization of x-ray (TV) images (abstract)
NASA Astrophysics Data System (ADS)
Karpenko, V. A.; Khil'chenko, A. D.; Lysenko, A. P.; Panchenko, V. E.
1989-07-01
The work in progress deals with the experimental search for a technique of digitizing x-ray TV images. The small volume of the buffer memory of the analog-to-digital (A/D) converter (ADC) we previously used to detect TV signals made it necessary to digitize only one line of the television raster at a time and to use gating to obtain the video information contained in the whole frame. This paper is devoted to multiframe digitizing. The recorder of video signals comprises a broadband 8-bit A/D converter, a buffer memory of 128K words, and a control circuit which forms the necessary sequence of advance pulses for the A/D converter and the memory relative to the input frame and line sync pulses (FSP and LSP). The device provides recording of video signals corresponding to one or a few frames following one after another, or to their fragments. The control circuit is responsible for the separation of the required fragment of the TV image. When loading the limit registers, the following input parameters of the control circuit are set: the skipping of a definite number of lines after the next FSP, the number of lines of recording inside a fragment, the frequency of the information lines inside a fragment, the delay in the start of the ADC conversion relative to the arrival of the LSP, the length of the information section of a line, and the frequency of taking the readouts in a line. In addition, the number of frames of recording and the frequency of their sequence are specified. Thus, the A/D converter operates only inside a given fragment of the TV image. The information is introduced into the memory in sequence, fragment by fragment, without skipping, and is then extracted as samples according to the addresses needed for representation in the required form and for processing. The video signal recorder supports a minimum ADC conversion time of 250 ns per point.
As before, among the apparatus used were an image vidicon with luminophor conversion of x-radiation to light, and a single-crystal x-ray diffraction scheme necessary to form dynamic test objects from x-ray lines dispersed in space (the projections of the linear focus of an x-ray tube).
WIDELink: A Bootstrapping Approach to Identifying, Modeling and Linking On-Line Data Sources
2005-07-01
[Extraction from the original report is garbled here.] The recoverable fragments describe collecting on the order of 250 records from automotive sites (e.g., Mercedes Benz of Laguna Niguel), normalizing all data by lowercasing, and learned field patterns pairing makes (Mercedes-Benz, Ford, Mercury, Lincoln) with engine descriptions (e.g., 5.0L V8, 4.3L V8, 3.2L 6cyl) and mileage digit patterns.
NASA Astrophysics Data System (ADS)
Chidananda, H.; Reddy, T. Hanumantha
2017-06-01
This paper presents a natural representation of numerical digits using hand activity analysis, based on the number of fingers outstretched for each digit in a sequence extracted from a video. The analysis is based on determining a set of six features from a hand image. The most important features used from each frame are the first fingertip from the top, the palm line, the palm center, and the valley points between the fingers that lie above the palm line. With this approach, a user can convey any number of numerical digits naturally in a video using the right hand, the left hand, or both. Each numerical digit ranges from 0 to 9. The hands (right/left/both) used to convey digits can be recognized accurately using the valley points, and from this recognition it can be determined whether the user is right- or left-handed in practice. In this work, the hand(s) and face are first detected using the YCbCr color space, and the face is removed using an ellipse-based method. The hand(s) are then analyzed to recognize the activity that represents a series of numerical digits in a video. This work uses a pixel-continuity algorithm based on 2-D coordinate geometry and avoids the usual reliance on calculus, contours, convex hulls, and training datasets.
A graph-based watershed merging using fuzzy C-means and simulated annealing for image segmentation
NASA Astrophysics Data System (ADS)
Vadiveloo, Mogana; Abdullah, Rosni; Rajeswari, Mandava
2015-12-01
In this paper, we have addressed the issue of over-segmented regions produced by watershed by merging the regions using a global feature. The global feature information is obtained by clustering the image in its feature space using Fuzzy C-Means (FCM) clustering. The over-segmented regions produced by performing watershed on the gradient of the image are then mapped to this global information in the feature space. Further, the global feature information is optimized using Simulated Annealing (SA). The optimal global feature information is used to derive the similarity criterion for merging the over-segmented watershed regions, which are represented by a region adjacency graph (RAG). The proposed method has been tested on a digital brain phantom simulated dataset to segment white matter (WM), gray matter (GM), and cerebrospinal fluid (CSF) soft tissue regions. The experiments showed that the proposed method performs statistically better than immersion watershed, with an average of 95.242% of regions merged, and yields an average accuracy improvement of 8.850% in comparison with RAG-based immersion watershed merging using global and local features.
Optimal atlas construction through hierarchical image registration
NASA Astrophysics Data System (ADS)
Grevera, George J.; Udupa, Jayaram K.; Odhner, Dewey; Torigian, Drew A.
2016-03-01
Atlases (digital or otherwise) are common in medicine. However, there is no standard framework for creating them from medical images. One traditional approach is to pick a representative subject and then proceed to label structures/regions of interest in this image. Another is to create a "mean" or average subject. Atlases may also contain more than a single representative (e.g., the Visible Human contains both a male and a female data set). Other criteria besides gender may be used as well, and the atlas may contain many examples for a given criterion. In this work, we propose that atlases be created in an optimal manner using a well-established graph-theoretic approach based on a minimum spanning tree (or, more generally, a collection of them). The resulting atlases may contain many examples for a given criterion. In fact, our framework allows for the addition of new subjects to the atlas, allowing it to evolve over time. Furthermore, one can apply segmentation methods to the graph (e.g., graph cut, fuzzy connectedness, or cluster analysis) that allow it to be separated into "sub-atlases" as it evolves. We demonstrate our method by applying it to 50 3D CT data sets of the chest region and by comparing it to a number of traditional methods of construction under rigid registration, using measures such as Mean Squared Difference, Mattes Mutual Information, and Correlation. Our results demonstrate that optimal atlases can be constructed in this manner and outperform other methods of construction using freely available software.
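The graph-theoretic idea above can be sketched in a few lines: build a minimum spanning tree over pairwise image dissimilarities (e.g., mean squared difference after registration) and pick central subjects as atlas representatives. The function names, the precomputed distance matrix, and the centrality choice are illustrative assumptions, not the paper's exact pipeline:

```python
# Sketch: MST over a symmetric pairwise-dissimilarity matrix (Prim's algorithm)
# and a simple "most central subject" pick as an atlas representative.

def mst_edges(dist):
    """Return the edge list of a minimum spanning tree of the distance matrix."""
    n = len(dist)
    in_tree = {0}
    edges = []
    while len(in_tree) < n:
        # cheapest edge crossing from the tree to a node not yet in it
        u, v = min(((i, j) for i in in_tree for j in range(n) if j not in in_tree),
                   key=lambda e: dist[e[0]][e[1]])
        edges.append((u, v))
        in_tree.add(v)
    return edges

def central_subject(dist):
    """Subject with the smallest total distance to all others."""
    return min(range(len(dist)), key=lambda i: sum(dist[i]))
```

Adding a new subject only requires one new row/column of distances and a rebuild, which is how the atlas can evolve over time; partitioning the tree by cutting heavy edges yields the "sub-atlases" mentioned above.
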
From Online to Ubiquitous Cities: The Technical Transformation of Virtual Communities
NASA Astrophysics Data System (ADS)
Anthopoulos, Leonidas; Fitsilis, Panos
Various digital city projects, from the early online cases (e.g., America Online) to the ubiquitous cities of South Korea, have succeeded in creating technically 'physical' areas for virtual communities that share knowledge of common interest. Moreover, digital cities can simplify citizen access to public information and services. Early digital cities deliver 'smart' and social services even to citizens with no digital skills, closing the digital divide and establishing digital areas of trust in local communities. This paper presents the evolution of digital cities from the web to the ubiquitous architecture. It uses the latest digital city architecture and the current conditions of the digital city of Trikala (Greece) to present the evolution of a digital city.
An online ID identification system for liquefied-gas cylinder plant
NASA Astrophysics Data System (ADS)
He, Jin; Ding, Zhenwen; Han, Lei; Zhang, Hao
2017-11-01
An automatic ID identification system for the online production of gas cylinders was developed based on the production conditions and requirements of the Technical Committee for Standardization of Gas Cylinders. A cylinder ID image acquisition system was designed to improve the image contrast of ID regions on gas cylinders against the background. The ID digit region was then located by a CNN template matching algorithm. Following that, an adaptive threshold method based on the analysis of local average grey value and standard deviation was proposed to overcome defects caused by the non-uniform background in the segmentation results. To improve single-digit identification accuracy, two BP neural networks were trained: one for the identification of all digits and one for the easily confusable digits. If a single digit was classified as one of the confusable digits by the first network, it was further tested by the second, and the latter's result was taken as the final identification of that digit. Finally, majority voting was adopted to decide the final identification result for the 6-digit cylinder ID. The developed system was installed on a production line of a liquefied-petroleum-gas cylinder plant and worked in parallel with the existing weighing step on the line. In the field test, the correct identification rate for single ID digits was 94.73%, and none of the 2,000 tested cylinder IDs was misclassified after the majority voting.
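The per-digit majority vote can be sketched as follows, assuming several recognition passes (e.g., readings of the same cylinder from different frames) each yield a 6-digit string; the function name and this multi-pass framing are illustrative, since the plant system's internals are not spelled out in the abstract:

```python
# Majority vote per digit position over several equal-length ID readings.
from collections import Counter

def vote_id(readings):
    """Return the ID whose each digit is the most common at that position."""
    return ''.join(
        Counter(chars).most_common(1)[0][0]   # winning digit at this position
        for chars in zip(*readings)           # iterate position by position
    )
```

Voting per position rather than per whole string means a single-digit error in one pass (like the 3/8 confusion handled by the second network) is outvoted by the other passes.
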
X-1A in flight with flight data superimposed
1953-12-12
This photo of the X-1A includes graphs of the flight data from Maj. Charles E. Yeager's Mach 2.44 flight on December 12, 1953. (This was only a few days short of the 50th anniversary of the Wright brothers' first powered flight.) After reaching Mach 2.44, then the highest speed ever reached by a piloted aircraft, the X-1A tumbled completely out of control. The motions were so violent that Yeager cracked the plastic canopy with his helmet. He finally recovered from an inverted spin and landed on Rogers Dry Lakebed. Among the data shown are Mach number and altitude (the two top graphs). The speed and altitude changes due to the tumble are visible as jagged lines. The third graph from the bottom shows the G-forces on the airplane. During the tumble, these twice reached 8 Gs, or 8 times the normal pull of gravity at sea level. (At these G-forces, a 200-pound human would, in effect, weigh 1,600 pounds if a scale were placed under him in the direction of the force vector.) Producing these graphs was a slow, difficult process. The raw data from on-board instrumentation were recorded on oscillograph film. Human computers then reduced the data and recorded it on data sheets, correcting for such factors as temperature and instrument errors. They used adding machines or slide rules for their calculations, pocket calculators being 20 years in the future.
Analysis of TIMS performance subjected to simulated wind blast
NASA Technical Reports Server (NTRS)
Jaggi, S.; Kuo, S.
1992-01-01
The results of the performance of the Thermal Infrared Multispectral Scanner (TIMS) when it is subjected to various wind conditions in the laboratory are described. Various wind conditions were simulated using a 24-inch fan or combinations of air jet streams blowing toward either or both of the blackbody surfaces. The fan was used to simulate a large volume of air flow at moderate speeds (up to 30 mph). The small-diameter air jets were used to probe the TIMS system response to localized wind perturbations. The maximum nozzle speed of the air jet was 60 mph. A range of wind directions and speeds was set up in the laboratory during the test. The majority of the wind tests were conducted under ambient conditions with the room temperature fluctuating no more than 2 C. The temperature of the high-speed air jet was determined to be within 1 C of the room temperature. The TIMS response was recorded on analog tape. Additional thermistor readouts of the blackbody temperatures and thermocouple readouts of the ambient temperature were recorded manually to be compared with the housekeeping data recorded on the tape. Additional tests were conducted under conditions of elevated and cooled room temperatures. The room temperature was varied between 19.5 and 25.5 C in these tests. The calibration parameters needed for quantitative analysis of TIMS data were first plotted on a scanline-by-scanline basis. These parameters are the low and high blackbody temperature readings as recorded by the TIMS and their corresponding digitized count values. Using these values, the system transfer equation was calculated. This equation allows the flux for any video count to be computed from the slope and intercept of the straight line that relates flux to digital count. The actual video of the target (the lab floor in this case) was then compared with a simulated target.
This simulated target was assumed to be a blackbody with an emissivity of 0.95 at the ambient temperature recorded by the TIMS for each scanline. Using the slope and the intercept, the flux corresponding to this target was converted into digital counts. The counts were observed to have a strong correlation with the actual video as recorded by the TIMS. The attached graphs describe the performance of the TIMS when compressed air is blown at each of the blackbodies at different speeds. The effect of blowing a fan and changing the room temperature was also analyzed. Results indicate that the TIMS system responds to variation in wind speed in real time and maintains the capability to produce accurate temperatures on a scanline basis.
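The two-point transfer equation described above can be sketched directly: the low and high blackbody references give two (count, flux) pairs per scanline, which fix the straight line used to convert counts to flux and back. Function names and the numeric values in the test are illustrative; the real pipeline would first convert blackbody temperatures to flux via the Planck function, a step omitted here:

```python
# Per-scanline two-point calibration: flux = slope * count + intercept.

def transfer(count_lo, flux_lo, count_hi, flux_hi):
    """Fit the line through the low and high blackbody (count, flux) pairs."""
    slope = (flux_hi - flux_lo) / (count_hi - count_lo)
    intercept = flux_lo - slope * count_lo
    return slope, intercept

def count_to_flux(count, slope, intercept):
    """Radiant flux corresponding to a video count."""
    return slope * count + intercept

def flux_to_count(flux, slope, intercept):
    """Inverse mapping, used to simulate counts for an assumed target flux."""
    return (flux - intercept) / slope
```
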
Higley, Debra K.
2014-01-01
The 13 chapters included in U.S. Geological Survey Digital Data Series DDS–69–EE cover topics that range from the oil and gas resource assessment results (chapters 1 and 5–7) to geological, geochemical, and geophysical research across the province (chapters 3–11), tabular data and graphs in support of the assessment (chapter 12), and data releases of zmap-format grid files that were used to build petroleum system models and a standalone three-dimensional geologic model (chapter 13).
John Herschel's Graphical Method
NASA Astrophysics Data System (ADS)
Hankins, Thomas L.
2011-01-01
In 1833 John Herschel published an account of his graphical method for determining the orbits of double stars. He had hoped to be the first to determine such orbits, but Felix Savary in France and Johann Franz Encke in Germany beat him to the punch using analytical methods. Herschel was convinced, however, that his graphical method was much superior to analytical methods, because it used the judgment of the hand and eye to correct the inevitable errors of observation. Line graphs of the kind used by Herschel became common only in the 1830s, so Herschel was introducing a new method. He also found computation fatiguing and devised a "wheeled machine" to help him out. Encke was skeptical of Herschel's methods. He said that he lived for calculation and that the English would be better astronomers if they calculated more. It is difficult to believe that the entire Scientific Revolution of the 17th century took place without graphs and that only a few examples appeared in the 18th century. Herschel promoted the use of graphs, not only in astronomy, but also in the study of meteorology and terrestrial magnetism. Because he was the most prominent scientist in England, Herschel's advocacy greatly advanced graphical methods.
TopoMS: Comprehensive topological exploration for molecular and condensed-matter systems.
Bhatia, Harsh; Gyulassy, Attila G; Lordi, Vincenzo; Pask, John E; Pascucci, Valerio; Bremer, Peer-Timo
2018-06-15
We introduce TopoMS, a computational tool enabling detailed topological analysis of molecular and condensed-matter systems, including the computation of atomic volumes and charges through the quantum theory of atoms in molecules, as well as the complete molecular graph. With roots in techniques from computational topology, and using a shared-memory parallel approach, TopoMS provides scalable, numerically robust, and topologically consistent analysis. TopoMS can be used as a command-line tool or with a GUI (graphical user interface), where the latter also enables an interactive exploration of the molecular graph. This paper presents algorithmic details of TopoMS and compares it with state-of-the-art tools: Bader charge analysis v1.0 (Arnaldsson et al., 01/11/17) and molecular graph extraction using Critic2 (Otero-de-la-Roza et al., Comput. Phys. Commun. 2014, 185, 1007). TopoMS not only combines the functionality of these individual codes but also demonstrates up to 4× performance gain on a standard laptop, faster convergence to fine-grid solution, robustness against lattice bias, and topological consistency. TopoMS is released publicly under BSD License. © 2018 Wiley Periodicals, Inc.
Weighted link graphs: a distributed IDS for secondary intrusion detection and defense
NASA Astrophysics Data System (ADS)
Zhou, Mian; Lang, Sheau-Dong
2005-03-01
While a firewall installed at the perimeter of a local network provides the first line of defense against hackers, many intrusion incidents are the result of successful penetration of the firewalls. One computer's compromise often puts the entire network at risk. In this paper, we propose an IDS that provides finer control over the internal network. The system focuses on the variations in connection-based behavior of each single computer, and uses a weighted link graph to visualize the overall traffic abnormalities. The functionality of our system is that of a distributed personal IDS that also provides centralized traffic analysis through graphical visualization. We use a novel weight assignment schema for local detection within each end agent. The local abnormalities are quantified by the node weight and link weight and further sent to the central analyzer to build the weighted link graph. Thus, we distribute the burden of traffic processing and visualization to each agent and make the overall intrusion detection more efficient. As LANs are more vulnerable to inside attacks, our system is designed as a reinforcement to prevent corruption from the inside.
A Synthetic Teammate for UAV Applications: A Prospective Look
2006-08-01
[Extraction from the original report is garbled here.] The recoverable fragments describe a synthetic teammate whose use of the synthetic task environment (STE) was facilitated by digital readouts for the flight instruments (other than the horizon line and reticle), such that ACT-R returns a digital value for pitch and bank to the model (as reflected in the orientation of the horizon line with respect to the reticle); they also mention the development of a situation model (Zwaan & Radvansky, 1998) and a Historically Black Colleges and Universities (HBCU) research contract.
Guy, Kristy K.
2015-11-09
This Data Series Report includes open-ocean shorelines, back-island shorelines, back-island shoreline points, sand polygons, and sand lines for the undeveloped areas of New Jersey barrier islands. These data were extracted from orthoimagery (aerial photography) taken between March 9, 1991, and July 30, 2013. The images used were 0.3–1-meter (m)-resolution U.S. Geological Survey Digital Orthophoto Quarter Quads (DOQQ), U.S. Department of Agriculture National Agriculture Imagery Program (NAIP) images, National Oceanic and Atmospheric Administration images, and New Jersey Geographic Information Network images. The back-island shorelines were hand-digitized at the intersections of the apparent back-island shoreline and transects spaced at 20-m intervals. The open-ocean shorelines were hand-digitized at the approximate still-water level, such as tide level, which was fit through the average position of waves and swash apparent on the beach. Hand-digitizing was done at a scale of approximately 1:2,000. The sand polygons were derived by an image-processing unsupervised classification technique that separates images into classes. The classes were then visually categorized as either sand or not sand. Sand lines were taken from the sand polygons. Also included in this report are 20-m-spaced transect lines and the transect base lines.
Zhong, Hai-ying; Wei, Cong; Zhang, Ya-lin
2013-02-01
Salivary glands of the cicada Karenia caelatata Distant were investigated using light microscopy and transmission electron microscopy. The salivary glands are paired structures consisting of principal glands and accessory glands. The principal gland is subdivided into an anterior lobe and a posterior lobe; the former contains about 34-39 long digitate lobules, while the latter contains approximately 30-33 long digitate lobules and 13-22 short digitate lobules. The short digitate lobules, about one-fifth to one-sixth as long as the long digitate lobules, are located at the base of the long digitate lobules of the posterior lobe. All of these digitate lobules vary in size, disposition, length, and shape. The anterior lobe and the posterior lobe are connected by an anterior-posterior duct. Two efferent salivary ducts, which connect with the posterior lobe, fuse to form a common duct. The accessory gland is composed of three parts: a greatly tortuous and folded accessory salivary tube, a circlet of gular gland consisting of several acini of the same size, and a non-collapsible accessory salivary duct. The digitate lobules and gular glands possess secretory cells containing abundant secretory granules that vary in size, shape, and electron density, which may indicate that different materials are synthesized in different secretory regions. The anterior-posterior duct is lined with a layer of cuticular lining, and the cells beneath it lack basal infoldings, suggesting that the duct serves only to transport secretions. The accessory salivary duct is also lined with cuticle; its cells have well-developed basal infoldings associated with abundant mitochondria, which probably indicates that the duct is a reabsorptive region for ions. The cells of the accessory salivary tube possess deep basal infoldings and well-developed apical dense microvilli, indicating that they are secretory in function.
Concentric lamellar structures and a peculiar structure with abundant membrane-bound vesicles and secretory granules are observed for the first time, but their derivation and function remain unclear. The morphology and ultrastructure differences observed in the principal glands and accessory gland of the salivary glands of K. caelatata indicate that the sheath saliva was secreted by the principal glands, and the watery saliva was secreted by the accessory salivary glands. Rod-shaped microorganisms are found in the salivary glands (i.e., accessory salivary duct, gular gland, and long digitate lobule of salivary glands) for the first time, and their identity, function, and relationship to microorganisms residing in the salivary glands and/or other parts of alimentary canal of other cicadas need to be investigated further. Copyright © 2012 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Kahn, Jason
This dissertation concerns kindergarteners' and second graders' invented representations of motion, their interactions with conventional representations of motion built from the child's movement in front of a motion detector and using real-time graphing tools, and any changes in the invented representations that this interaction brings about. We have known for several decades that advanced learners (high school aged and beyond) struggle with physics concepts of motion and sometimes with Cartesian graph-based representations of motion. Little has been known about how younger students approach the same concepts. In this study, eighteen children (ten kindergarteners and eight second graders) completed a three-hour clinical interview spread out evenly over three weeks. In the first and last interviews, the child was asked to produce external representations of movement and interpret conventional distance and time graphs of motion. In the second interview the children interacted with a motion detector and real-time graphing tools in a semi-self-directed format. Qualitative and quantitative results are presented and discussed. Qualitative data show that children are adroit at representing motion and that their productions are systematic and purposeful. Children produce drawings that both give context to the physical environment around them and also redescribe the drawn environment, meaning that they provide a potential audience with information otherwise imperceptible, by making certain implicit aspects more explicit. Second graders quickly appropriate the Cartesian graph during the intervention, though at times misinterpret the meaning associated with slope. Children correctly associate slope with direction, but at times misattribute the sign of slope (positive or negative) and its corresponding direction (i.e., some children associate positive slope not with motion away from a point of reference, but with motion toward it). 
Kindergarteners showed a range of experiences during the intervention: one student showed near mastery in interpreting a Cartesian graph as a representation of motion, while another vehemently resisted the graph as a representation of motion. Quantitative data give a mechanism for comparing pre- and post-assessment productions. Both kindergarten and second-grade students provided richer post-assessment representations, with kindergarteners more likely to include a figurative point of reference in the post-assessment and second graders including more explicit information about speed. The implications of this study are that invented representations of motion are a powerful tool for providing insights into children's thinking. The motion detector and real-time graphing tool can be used as early as kindergarten to help children build resources in their representations of motion; second-grade students could find the same benefit and potentially begin to build conventional ideas about graphing and movement.
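The slope convention at issue can be made concrete with a short sketch: on a distance-versus-time record from a motion detector, walking away from the detector yields a positive slope and walking toward it a negative one. The data values and function below are illustrative, not from the study.

```python
# Illustrative only: least-squares slope of a distance-vs-time record.
# Positive slope = motion away from the reference point (the detector),
# negative slope = motion toward it. Data values are invented.

def slope(times, distances):
    """Least-squares slope of distance over time."""
    n = len(times)
    mt = sum(times) / n
    md = sum(distances) / n
    num = sum((t - mt) * (d - md) for t, d in zip(times, distances))
    den = sum((t - mt) ** 2 for t in times)
    return num / den

away = slope([0, 1, 2, 3], [1.0, 1.5, 2.0, 2.5])    # walking away
toward = slope([0, 1, 2, 3], [2.5, 2.0, 1.5, 1.0])  # walking toward
print(away, toward)  # 0.5 -0.5
```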
Park, Jong Kang; Rowlands, Christopher J; So, Peter T C
2017-01-01
Temporal focusing multiphoton microscopy is a technique for performing highly parallelized multiphoton microscopy while still maintaining depth discrimination. While the conventional wide-field configuration for temporal focusing suffers from sub-optimal axial resolution, line scanning temporal focusing, implemented here using a digital micromirror device (DMD), can provide substantial improvement. The DMD-based line scanning temporal focusing technique dynamically trades off the degree of parallelization, and hence imaging speed, for axial resolution, allowing performance parameters to be adapted to the experimental requirements. We demonstrate this new instrument in calibration specimens and in biological specimens, including a mouse kidney slice.
Accounting Data to Web Interface Using PERL
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hargeaves, C
2001-08-13
This document will explain the process to create a web interface for the accounting information generated by the High Performance Storage Systems (HPSS) accounting report feature. The accounting report contains useful data but it is not easily accessed in a meaningful way. The accounting report is the only way to see summarized storage usage information. The first step is to take the accounting data, make it meaningful and store the modified data in persistent databases. The second step is to generate the various user interfaces, HTML pages, that will be used to access the data. The third step is to transfer all required files to the web server. The web pages pass parameters to Common Gateway Interface (CGI) scripts that generate dynamic web pages and graphs. The end result is a web page with specific information presented in text with or without graphs. The accounting report has a specific format that allows the use of regular expressions to verify if a line is storage data. Each storage data line is stored in a detailed database file with a name that includes the run date. The detailed database is used to create a summarized database file that also uses run date in its name. The summarized database is used to create the group.html web page that includes a list of all storage users. Scripts that query the database folder to build a list of available databases generate two additional web pages. A master script that is run monthly as part of a cron job, after the accounting report has completed, manages all of these individual scripts. All scripts are written in the PERL programming language. Whenever possible data manipulation scripts are written as filters. All scripts are written to be single source, which means they will function properly on both the open and closed networks at LLNL. The master script handles the command line inputs for all scripts, file transfers to the web server and records run information in a log file. 
The rest of the scripts manipulate the accounting data or use the files created to generate HTML pages. Each script will be described in detail herein. The following is a brief description of HPSS taken directly from an HPSS web site. "HPSS is a major development project, which began in 1993 as a Cooperative Research and Development Agreement (CRADA) between government and industry. The primary objective of HPSS is to move very large data objects between high performance computers, workstation clusters, and storage libraries at speeds many times faster than is possible with today's software systems. For example, HPSS can manage parallel data transfers from multiple network-connected disk arrays at rates greater than 1 Gbyte per second, making it possible to access high definition digitized video in real time." The HPSS accounting report is a canned report whose format is controlled by the HPSS developers.
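The filter-and-summarize pattern described above (regular expressions select storage-data lines, which are then totaled into a summarized database) can be sketched as follows. The actual scripts are PERL; this illustration uses Python, and the line format, field names, and grouping are assumptions, since the real HPSS accounting report format is not given here.

```python
import re
from collections import defaultdict

# Hypothetical line format: "user  group  files  bytes". The real HPSS
# accounting report format is controlled by the HPSS developers and is
# not specified in this abstract.
STORAGE_LINE = re.compile(r"^(\w+)\s+(\w+)\s+(\d+)\s+(\d+)\s*$")

def summarize(report_lines):
    """Keep only lines matching the storage-data pattern; total bytes per group."""
    totals = defaultdict(int)
    for line in report_lines:
        m = STORAGE_LINE.match(line)
        if m:  # filter step: headers and other noise fail the regex
            user, group, files, nbytes = m.groups()
            totals[group] += int(nbytes)
    return dict(totals)

report = [
    "HPSS Accounting Report  2001-08-13",   # header, skipped by the regex
    "alice   physics   120   5000",
    "bob     physics    30   1500",
    "carol   chem       10    700",
]
print(summarize(report))  # {'physics': 6500, 'chem': 700}
```

The summarized dictionary plays the role of the summarized database file from which the group-level HTML pages would be generated.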
NASA Technical Reports Server (NTRS)
Marshall, Paul; Carts, Marty; Campbell, Art; Reed, Robert; Ladbury, Ray; Seidleck, Christina; Currie, Steve; Riggs, Pam; Fritz, Karl; Randall, Barb
2004-01-01
A viewgraph presentation that reviews recent SiGe bit error test data for different commercially available high speed SiGe BiCMOS chips that were subjected to various levels of heavy ion and proton radiation. Results for the tested chips at different operating speeds are displayed in line graphs.
ERIC Educational Resources Information Center
Dissemination and Assessment Center for Bilingual Education, Austin, TX.
This is one of a series of student booklets designed for use in a bilingual mathematics program in grades 6-8. The general format is to present each page in both Spanish and English. The mathematical topics in this booklet include graphing on a number line, place value, using exponents, flow charts, and Roman numerals. (MK)
The Copernicus ultraviolet spectral atlas of Gamma Pegasi
NASA Technical Reports Server (NTRS)
Rogerson, J. B., Jr.
1985-01-01
An ultraviolet spectral atlas is presented for the B2 IV star Gamma Pegasi, which has been scanned from 970 to 1501 A by the Princeton spectrometer aboard the Copernicus satellite. From 970 to 1430 A the observations have a nominal resolution of 0.05 A. At the longer wavelengths the resolution is 0.1 A. The atlas is presented in graphs. Line identifications are also listed.
Distributed operating system for NASA ground stations
NASA Technical Reports Server (NTRS)
Doyle, John F.
1987-01-01
NASA ground stations are characterized by ever changing support requirements, so application software is developed and modified on a continuing basis. A distributed operating system was designed to optimize the generation and maintenance of those applications. Unusual features include automatic program generation from detailed design graphs, on-line software modification in the testing phase, and the incorporation of a relational database within a real-time, distributed system.
Smith, Bruce D.; Abraham, Jared D.; Cannia, James C.; Hill, Patricia
2009-01-01
This report is a release of digital data from a helicopter electromagnetic and magnetic survey that was conducted during June 2008 in areas of western Nebraska as part of a joint hydrologic study by the North Platte Natural Resource District, South Platte Natural Resource District, and U.S. Geological Survey. The objective of the contracted survey, conducted by Fugro Airborne, Ltd., was to improve the understanding of the relationship between surface water and groundwater systems critical to developing groundwater models used in management programs for water resources. The survey covered 1,375 line km (854 line mi). A unique aspect of this survey is the flight line layout. One set of flight lines was flown paralleling each side of the east-west trending North Platte River and Lodgepole Creek. The survey also included widely separated (10 km) perpendicular north-south lines. The success of this survey design depended on a well-understood regional hydrogeologic framework and model developed by the Cooperative Hydrologic Study of the Platte River Basin. Resistivity variations along lines could be related to this framework. In addition to these lines, more traditional surveys consisting of parallel flight lines separated by about 270 m were carried out for one block in each of the drainages. These surveys helped to establish the spatial variations of the resistivity of hydrostratigraphic units. The electromagnetic equipment consisted of six different coil-pair orientations that measured resistivity at separate frequencies from about 400 Hz to about 140,000 Hz. The electromagnetic data along flight lines were converted to electrical resistivity. The resulting line data were converted to geo-referenced grids and maps which are included with this report. In addition to the electromagnetic data, total field magnetic data and digital elevation data were collected. 
Data released in this report consist of data along flight lines, digital grids, and digital maps of the apparent resistivity and total magnetic field. The depth range of the subsurface investigation for the electromagnetic survey (estimated as deep as 60 m) is comparable to the depth of shallow aquifers. The geophysical data and hydrologic information from U.S. Geological Survey and cooperator studies are being used by resource managers to develop groundwater resource plans for the area. In addition, data will be used to refine hydrologic models in western Nebraska.
New ergonomic and functional design of digital conferencing rooms in a clinical environment
NASA Astrophysics Data System (ADS)
Ratib, Osman M.; Amato, Carlos L.; McGill, D. Ric; Liu, Brent J.; Balbona, Joseph A.; McCoy, J. Michael
2003-05-01
Clinical conferences and multidisciplinary medical rounds play a major role in patient management and decision-making, relying on the presentation of a variety of documents: films, charts, videotapes, graphs, etc. These conferences and clinical rounds are often carried out in conferencing rooms or department libraries that are usually not suitable for presentation of the data in electronic format. In most instances digital projection equipment is added to existing rooms without proper consideration of functional, ergonomic, acoustical, spatial, and environmental requirements. Also, in large academic institutions, the conference rooms serve multiple purposes, including use as classrooms for teaching and education of students and for administrative meetings among managers and staff. In the migration toward a fully digital hospital we elected to analyze the functional requirements and optimize the ergonomic design of conferencing rooms that can accommodate clinical rounds, multidisciplinary reviews, seminars, formal lectures, and department meetings. 3D computer simulation was used for better evaluation and analysis of spatial and ergonomic parameters and for gathering opinions and input from users on different design options. A critical component of the design is the understanding of the different workflows and requirements of the different types of conferences and presentations that can be carried out in these conference rooms.
Sweeney, Ronald E.; Hill, Patricia L.
2005-01-01
The Nebraska, Kansas, and Oklahoma aeromagnetic grid is constructed from grids that combine information collected in 28 separate aeromagnetic surveys conducted between 1954 and 1985. The data from these surveys are of varying quality. The design and specifications (terrain clearance, sampling rates, line spacing, and reduction procedures) varied from survey to survey depending on the purpose of the project and the technology of that time. Every attempt was made to acquire the data in digital form. Most of the available digital data were obtained from aeromagnetic surveys flown by the U.S. Geological Survey (USGS), flown on contract with the USGS, or were obtained from other Federal agencies and State universities. The Kansas data were flown by and acquired from the Kansas Geological Survey. Some of the 1954, 1963, and 1964 data are available only on hand-contoured maps and had to be digitized. These maps were digitized along flight-line/contour-line intersections, which is considered to be the most accurate method of recovering the original data. All surveys have been continued to 304.8 m (1,000 ft) above ground and then blended or merged together.
Dupree, Jean A.; Crowfoot, Richard M.
2012-01-01
This geodatabase and its component datasets are part of U.S. Geological Survey Digital Data Series 650 and were generated to store basin boundaries for U.S. Geological Survey streamgages and other sites in Colorado. The geodatabase and its components were created by the U.S. Geological Survey, Colorado Water Science Center, and are used to derive the numeric drainage areas for Colorado that are input into the U.S. Geological Survey's National Water Information System (NWIS) database and also published in the Annual Water Data Report and on NWISWeb. The foundational dataset used to create the basin boundaries in this geodatabase was the National Watershed Boundary Dataset. This geodatabase accompanies a U.S. Geological Survey Techniques and Methods report (Book 11, Section C, Chapter 6) entitled "Digital Database Architecture and Delineation Methodology for Deriving Drainage Basins, and Comparison of Digitally and Non-Digitally Derived Numeric Drainage Areas." The Techniques and Methods report details the geodatabase architecture, describes the delineation methodology and workflows used to develop these basin boundaries, and compares digitally derived numeric drainage areas in this geodatabase to non-digitally derived areas. 1. COBasins.gdb: This geodatabase contains site locations and basin boundaries for Colorado. It includes a single feature dataset, called BasinsFD, which groups the component feature classes and topology rules. 2. BasinsFD: This feature dataset in the "COBasins.gdb" geodatabase is a digital container that holds the feature classes used to archive site locations and basin boundaries as well as the topology rules that govern spatial relations within and among component feature classes. This feature dataset includes three feature classes: the sites for which basins have been delineated (the "Sites" feature class), basin bounding lines (the "BasinLines" feature class), and polygonal basin areas (the "BasinPolys" feature class). 
The feature dataset also stores the topology rules (the "BasinsFD_Topology") that constrain the relations within and among component feature classes. The feature dataset also forces any feature classes inside it to have a consistent projection system, which is, in this case, an Albers-Equal-Area projection system. 3. BasinsFD_Topology: This topology contains four persistent topology rules that constrain the spatial relations within the "BasinLines" feature class and between the "BasinLines" feature class and the "BasinPolys" feature classes. 4. Sites: This point feature class contains the digital representations of the site locations for which Colorado Water Science Center basin boundaries have been delineated. This feature class includes point locations for Colorado Water Science Center active (as of September 30, 2009) gages and for other sites. 5. BasinLines: This line feature class contains the perimeters of basins delineated for features in the "Sites" feature class, and it also contains information regarding the sources of lines used for the basin boundaries. 6. BasinPolys: This polygon feature class contains the polygonal basin areas delineated for features in the "Sites" feature class, and it is used to derive the numeric drainage areas published by the Colorado Water Science Center.
Design features of on-line anatomy information resources: a comparison with the Digital Anatomist.
Kim, S; Brinkley, J F; Rosse, C
1999-01-01
In order to update the design of the next generation of the Digital Anatomist, we have surveyed teaching assistants who have used the Digital Anatomist for learning and teaching anatomy as medical students, and have also examined available anatomy web sites with sufficient content to support learning. The majority of web sites function in an atlas mode and provide for the identification of structures. These atlases incorporate a variety of features for interactivity with 2D images, some of which are not available in the Digital Anatomist. The surveys suggest that the greatest need is for on-line access to comprehensive and detailed anatomical information and for the development of knowledge-based methods that allow the direct manipulation of segmented 3D graphical models by the user. The requirement for such interactivity is a comprehensive symbolic model of the physical organization of the body that can support inference.
Holographic line field en-face OCT with digital adaptive optics in the retina in vivo.
Ginner, Laurin; Schmoll, Tilman; Kumar, Abhishek; Salas, Matthias; Pricoupenko, Nastassia; Wurster, Lara M; Leitgeb, Rainer A
2018-02-01
We demonstrate a high-resolution line field en-face time domain optical coherence tomography (OCT) system using an off-axis holography configuration. Line field en-face OCT produces high speed en-face images at rates of up to 100 Hz. The high frame rate favors good phase stability across the lateral field-of-view which is indispensable for digital adaptive optics (DAO). Human retinal structures are acquired in-vivo with a broadband light source at 840 nm, and line rates of 10 kHz to 100 kHz. Structures of different retinal layers, such as photoreceptors, capillaries, and nerve fibers are visualized with high resolution of 2.8 µm and 5.5 µm in lateral directions. Subaperture based DAO is successfully applied to increase the visibility of cone-photoreceptors and nerve fibers. Furthermore, en-face Doppler OCT maps are generated based on calculating the differential phase shifts between recorded lines.
Interface For Fault-Tolerant Control System
NASA Technical Reports Server (NTRS)
Shaver, Charles; Williamson, Michael
1989-01-01
Interface unit and controller emulator developed for research on electronic helicopter-flight-control systems equipped with artificial intelligence. Interface unit interrupt-driven system designed to link microprocessor-based, quadruply-redundant, asynchronous, ultra-reliable, fault-tolerant control system (controller) with electronic servocontrol unit that controls set of hydraulic actuators. Receives digital feedforward messages from, and transmits digital feedback messages to, controller through differential signal lines or fiber-optic cables (thus far only differential signal lines have been used). Analog signals transmitted to and from servocontrol unit via coaxial cables.
DIGITAL CARTOGRAPHY AIDS IN THE SOLUTION OF BOUNDARY DISPUTE.
Beck, Francis J.
1983-01-01
The boundaries between the States of Ohio and Kentucky and between Indiana and Kentucky have been in dispute for many years. A major breakthrough in this continuing dispute has been a recent agreement between the States to accept the boundary line as depicted on U.S. Geological Survey 7.5-minute quadrangle maps. A new segment of the boundary line was established utilizing the shoreline depicted on the 1966 U.S. Army Corps of Engineers charts. Segments of the boundary were then digitized from the quadrangle maps.
The use of U.S. Geological Survey digital geospatial data products for science research
Varanka, Dalia E.; Deering, Carol; Caro, Holly
2012-01-01
The development of geographic information systems (GIS) transformed the practice of geographic science research. The availability of low-cost, reliable data from the U.S. Geological Survey (USGS) supported the advance of GIS in the early stages of the transition to digital technology. To estimate the extent of the scientific use of USGS digital geospatial data products, a search of science literature databases yielded numbers of articles citing USGS products. Though this method requires careful consideration to avoid false positives, these citation numbers for three types of products (vector, land-use/land-cover, and elevation data) were graphed, and the frequency trends were examined. Trends indicated that the use of several, but not all, products increased with time. The use of some products declined, and reasons for these declines are offered. To better understand how these data affected the design and outcomes of research projects, the study begins to build a context for the data by discussing digital cartographic research preceding the production of mass-produced products. The data distribution methods used various media for different system types and were supported by instructional material. The findings are an initial assessment of the effect of USGS products on GIS-enabled science research. A brief examination of the specific papers indicates that USGS data were used for science and GIS conceptual research, advanced education, and problem analysis and solution applications.
Smith, B.D.; Abraham, J.D.; Cannia, J.C.; Minsley, B.J.; Ball, L.B.; Steele, G.V.; Deszcz-Pan, M.
2011-01-01
This report is a release of digital data from a helicopter electromagnetic and magnetic survey conducted by Fugro Airborne Surveys in areas of eastern Nebraska as part of a joint hydrologic study by the Lower Platte North and Lower Platte South Natural Resources Districts, and the U.S. Geological Survey. The survey flight lines covered 1,418.6 line km (882 line mile). The survey was flown from April 22 to May 2, 2009. The objective of the contracted survey was to improve the understanding of the relation between surface water and groundwater systems critical to developing groundwater models used in management programs for water resources. The electromagnetic equipment consisted of six different coil-pair orientations that measured resistivity at separate frequencies from about 400 hertz to about 140,000 hertz. The electromagnetic data were converted to georeferenced electrical resistivity grids and maps for each frequency that represent different approximate depths of investigation for each survey area. The electrical resistivity data were input into a numerical inversion to estimate resistivity variations with depth. In addition to the electromagnetic data, total field magnetic data and digital elevation data were collected. Data released in this report consist of flight line data, digital grids, digital databases of the inverted electrical resistivity with depth, and digital maps of the apparent resistivity and total magnetic field. The range of subsurface investigation is comparable to the depth of shallow aquifers. The survey areas, Swedeburg and Sprague, were chosen based on results from test flights in 2007 in eastern Nebraska and needs of local water managers. The geophysical and hydrologic information from U.S. Geological Survey studies are being used by resource managers to develop groundwater resource plans for the area.
ERIC Educational Resources Information Center
Selwyn, Neil; Henderson, Michael; Chao, Shu-Hua
2015-01-01
The generation, processing and circulation of data in digital form is now an integral aspect of contemporary schooling. Based upon empirical study of two secondary school settings in Australia, this paper considers the different forms of digitally-based "data work" engaged in by school leaders, managers, administrators and teachers. In…
High-Tech, Hard Work: An Investigation of Teachers' Work in the Digital Age
ERIC Educational Resources Information Center
Selwyn, Neil; Nemorin, Selena; Johnson, Nicola
2017-01-01
This paper explores the ways in which digital technologies are now implicated in the work--and specifically the labour--of school teachers. Drawing upon qualitative studies in two Australian high schools, the paper examines the variety of ways in which teachers' work is now enacted and experienced along digital lines. In particular, the paper…
A Bowl of Hematite-Rich 'Berries'
NASA Technical Reports Server (NTRS)
2004-01-01
This graph shows two spectra of outcrop regions near the Mars Exploration Rover Opportunity's landing site. The blue line shows data for a region dubbed 'Berry Bowl,' which contains a handful of the sphere-like grains dubbed 'blueberries.' The yellow line represents an area called 'Empty' next to Berry Bowl that is devoid of berries. Berry Bowl's spectrum still shows typical outcrop characteristics, but also exhibits an intense hematite signature, seen as a 'magnetic sextet.' Hematite is an iron-bearing mineral often formed in water. These spectra were taken by the rover's Moessbauer spectrometer on the 46th (Empty) and 48th (Berry Bowl) martian days, or sols, of its mission.
AgRISTARS. Supporting research: Algorithms for scene modelling
NASA Technical Reports Server (NTRS)
Rassbach, M. E. (Principal Investigator)
1982-01-01
The requirements for a comprehensive analysis of LANDSAT or other visual data scenes are defined. The development of a general model of a scene and a computer algorithm for finding the particular model for a given scene is discussed. The modelling system includes a boundary analysis subsystem, which detects all the boundaries and lines in the image and builds a boundary graph; a continuous variation analysis subsystem, which finds gradual variations not well approximated by a boundary structure; and a miscellaneous features analysis, which includes texture, line parallelism, etc. The noise reduction capabilities of this method and its use in image rectification and registration are discussed.
Recognizing simple polyhedron from a perspective drawing
NASA Astrophysics Data System (ADS)
Zhang, Guimei; Chu, Jun; Miao, Jun
2009-10-01
Existing methods cannot be used to recognize simple polyhedra. In this paper, three problems are researched. First, a method for recognizing triangles and quadrilaterals is introduced based on geometry and angle constraints. Then an Attribute Relation Graph (ARG) is employed to describe the simple polyhedron and the line drawing. Finally, a new method is presented to recognize a simple polyhedron from a line drawing. The method filters the candidate database before matching the line drawing against models, thus greatly improving recognition efficiency. We introduce geometrical and topological characteristics to describe each node of the ARG, so the algorithm can not only recognize polyhedrons of different shapes but also distinguish between polyhedrons of the same shape but with different sizes and proportions. Computer simulations preliminarily demonstrate the effectiveness of the method.
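The candidate-filtering idea can be sketched as follows: before any expensive ARG matching, models whose coarse invariants cannot match the line drawing are discarded. The record fields below are hypothetical stand-ins for the paper's geometric and topological node attributes.

```python
# Sketch of candidate filtering before graph matching. The "vertices",
# "edges", and "faces" fields are invented invariants for illustration;
# the paper's actual ARG attributes are richer.

def filter_candidates(drawing, models):
    """Keep only models whose vertex/edge/face counts match the drawing."""
    keys = ("vertices", "edges", "faces")
    return [m for m in models if all(m[k] == drawing[k] for k in keys)]

drawing = {"vertices": 8, "edges": 12, "faces": 6}  # cube-like line drawing
models = [
    {"name": "cube",        "vertices": 8, "edges": 12, "faces": 6},
    {"name": "tetrahedron", "vertices": 4, "edges": 6,  "faces": 4},
    {"name": "box",         "vertices": 8, "edges": 12, "faces": 6},
]
survivors = [m["name"] for m in filter_candidates(drawing, models)]
print(survivors)  # ['cube', 'box'] -- only these go on to full ARG matching
```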
Multiple-function multi-input/multi-output digital control and on-line analysis
NASA Technical Reports Server (NTRS)
Hoadley, Sherwood T.; Wieseman, Carol D.; Mcgraw, Sandra M.
1992-01-01
The design and capabilities of two digital controller systems for aeroelastic wind-tunnel models are described. The first allowed control of flutter while performing roll maneuvers with wing load control as well as coordinating the acquisition, storage, and transfer of data for on-line analysis. This system, which employs several digital signal multi-processor (DSP) boards programmed in high-level software languages, is housed in a SUN Workstation environment. A second DCS provides a measure of wind-tunnel safety by functioning as a trip system during testing in the case of high model dynamic response or in case the first DCS fails. The second DCS uses National Instruments LabVIEW Software and Hardware within a Macintosh environment.
Spectral analysis and filtering techniques in digital spatial data processing
Pan, Jeng-Jong
1989-01-01
A filter toolbox has been developed at the EROS Data Center, US Geological Survey, for retrieving or removing specified frequency information from two-dimensional digital spatial data. This filter toolbox provides capabilities to compute the power spectrum of a given data set and to design various filters in the frequency domain. Three types of filters are available in the toolbox: point filter, line filter, and area filter. Both the point and line filters employ Gaussian-type notch filters, and the area filter includes the capabilities to perform high-pass, band-pass, low-pass, and wedge filtering techniques. These filters are applied to analyze satellite multispectral scanner data, airborne visible and infrared imaging spectrometer (AVIRIS) data, gravity data, and digital elevation model (DEM) data. -from Author
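A Gaussian-type notch filter of the kind the point and line filters employ can be sketched as a frequency-domain transfer function: gain near zero at the notch center (and its conjugate, for real-valued images) and near one elsewhere. The parameterization below is a common textbook form, not necessarily the toolbox's exact design.

```python
import math

# Illustrative Gaussian notch transfer function. Gain is ~0 at the notch
# center (u0, v0) and its conjugate (-u0, -v0), and ~1 far from both,
# so multiplying an image's spectrum by it removes one spatial frequency.

def gaussian_notch(u, v, u0, v0, sigma):
    """Gain of a Gaussian-type notch filter at frequency (u, v)."""
    def dip(du, dv):
        return math.exp(-(du * du + dv * dv) / (2.0 * sigma * sigma))
    return 1.0 - dip(u - u0, v - v0) - dip(u + u0, v + v0)

print(gaussian_notch(10, 0, 10, 0, 2.0))   # ~0: targeted frequency removed
print(gaussian_notch(50, 50, 10, 0, 2.0))  # ~1: distant frequencies pass
```

In practice this gain would be evaluated over the full frequency grid and multiplied into the data's Fourier transform before inverting back to the spatial domain.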
Association between basic numerical abilities and mathematics achievement.
Sasanguie, Delphine; De Smedt, Bert; Defever, Emmy; Reynvoet, Bert
2012-06-01
Various measures have been used to investigate number processing in children, including number comparison and number line estimation tasks. The present study aimed to examine whether and to what extent these different measures of number representation are related to performance on a curriculum-based standardized mathematics achievement test in kindergarteners, first, second, and sixth graders. Children completed a number comparison task and a number line estimation task with a balanced set of symbolic (Arabic digits) and non-symbolic (dot patterns) stimuli. Associations with mathematics achievement were observed for the symbolic measures. Although the association with number line estimation was consistent across grades, the association with number comparison was much stronger in kindergarten than in the other grades. The current data indicate that a good knowledge of the numerical meaning of Arabic digits is important for children's mathematical development, and that access to the numerical meaning of symbolic digits, rather than the representation of number per se, is particularly important. © 2011 The British Psychological Society.
NASA Astrophysics Data System (ADS)
Hanzalová, K.; Pavelka, K.
2013-07-01
The Czech Technical University in Prague, in cooperation with the University of Applied Sciences in Dresden (Germany), works on the Nasca Project. The cooperation started in 2004 and much work has been done since then. All of the work concerns the Nasca lines in southern Peru. The Nasca Project started in 1995 and its main goal is the documentation and conservation of the Nasca lines. Most of the project results are presented as a WebGIS application via the Internet. In the face of the impending destruction of the soil drawings, it is possible to preserve this world cultural heritage for posterity at least in digital form. The resulting map of the Nasca lines is available in both digital and paper form. The map contains a planimetric component, map lettering, and altimetry. The thematic content of the map is a vector layer of the geoglyphs in Nasca, Peru. The planimetric basis is georeferenced satellite imagery; the altimetry is derived from a digital elevation model. The map was created in ArcGIS software.
Digital image transformation and rectification of spacecraft and radar images
Wu, S.S.C.
1985-01-01
Digital image transformation and rectification can be described in three categories: (1) digital rectification of spacecraft pictures on workable stereoplotters; (2) digital correction of radar image geometry; and (3) digital reconstruction of shaded relief maps and perspective views including stereograms. Digital rectification can make high-oblique pictures workable on stereoplotters that would otherwise not accommodate such extreme tilt angles. It also enables panoramic line-scan geometry to be used to compile contour maps with photogrammetric plotters. Rectifications were digitally processed on both Viking Orbiter and Lander pictures of Mars as well as radar images taken by various radar systems. By merging digital terrain data with image data, perspective and three-dimensional views of Olympus Mons and Tithonium Chasma, also of Mars, are reconstructed through digital image processing. © 1985.
Processing, mosaicking and management of the Monterey Bay digital sidescan-sonar images
Chavez, P.S.; Isbrecht, J.; Galanis, P.; Gabel, G.L.; Sides, S.C.; Soltesz, D.L.; Ross, Stephanie L.; Velasco, M.G.
2002-01-01
Sidescan-sonar imaging systems with digital capabilities have now been available for approximately 20 years. In this paper we present several of the digital image processing techniques developed by the U.S. Geological Survey (USGS) and used to apply intensity/radiometric and geometric corrections to, as well as to enhance and digitally mosaic, sidescan-sonar images of the Monterey Bay region. New software run by a WWW server was designed and implemented to allow very large image data sets, such as the digital mosaic, to be easily viewed interactively, including the ability to roam throughout the mosaic at the web site in either compressed or full 1-m resolution. The processing is separated into two stages: preprocessing and information extraction. In the preprocessing stage, sensor-specific algorithms are applied to correct for both geometric and intensity/radiometric distortions introduced by the sensor. This is followed by digital mosaicking of the track-line strips into quadrangle format, which can be used as input to either visual or digital image analysis and interpretation. An automatic seam-removal procedure was used in combination with an interactive digital feathering/stenciling procedure to help minimize tone- or seam-matching problems between image strips from adjacent track-lines. The sidescan-sonar image processing package is part of the USGS Mini Image Processing System (MIPS) and has been designed to process data collected by any 'generic' digital sidescan-sonar imaging system. The USGS MIPS software, developed over the last 20 years as a public-domain package, is available on the WWW at: http://terraweb.wr.usgs.gov/trs/software.html.
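The feathering idea mentioned above can be illustrated with a linear cross-fade in the overlap zone between two adjacent strips; this is a generic sketch, not the MIPS implementation, and the strip layout (a fixed number of shared columns) is an assumed simplification:

```python
import numpy as np

def feather_blend(left, right, overlap):
    """Join two image strips that share `overlap` columns, ramping the
    weight linearly from the left strip to the right one so that no hard
    seam appears at the join."""
    w = np.linspace(1.0, 0.0, overlap)                 # weight for the left strip
    blended = left[:, -overlap:] * w + right[:, :overlap] * (1 - w)
    return np.hstack([left[:, :-overlap], blended, right[:, overlap:]])

# Two synthetic strips with different overall tone (a common seam cause).
left = np.full((4, 6), 100.0)    # brighter strip
right = np.full((4, 6), 60.0)    # darker strip
mosaic = feather_blend(left, right, overlap=3)
print(mosaic.shape)  # (4, 9)
```

Inside the overlap the tone steps smoothly from 100 through 80 to 60 instead of jumping, which is the visual effect feathering is meant to achieve.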
A Coastal Hazards Data Base for the U.S. Gulf Coast (1993) (NDP-043B)
Gornitz, Vivien M. [National Aeronautics and Space Administration, Goddard Institute for Space Studies, New York, NY (USA)]; White, Tammy W. [CDIAC, Oak Ridge National Laboratory, Oak Ridge, TN (USA)]
2008-01-01
This document describes the contents of a digital data base that may be used to identify coastlines along the U.S. Gulf Coast at risk to sea-level rise. The data base integrates point, line, and polygon data for the U.S. Gulf Coast into 0.25° latitude by 0.25° longitude grid cells and into 1:2,000,000 digitized line segments that can be used by raster or vector geographic information systems (GIS) as well as by non-GIS data base systems. Each coastal grid cell and line segment contains data on elevations, geology, geomorphology, sea-level trends, shoreline displacement (erosion/accretion), tidal ranges, and wave heights.
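The 0.25° by 0.25° gridding described above can be sketched as simple cell indexing; this illustrates only the binning arithmetic, not the data base's actual processing, and the sample coordinates are hypothetical:

```python
def cell_index(lat, lon, cell=0.25):
    """Return the (row, col) of the grid cell containing a point, with
    rows counted from the equator and columns from the prime meridian
    (western longitudes are negative)."""
    return int(lat // cell), int(lon // cell)

# Hypothetical Gulf Coast points (decimal degrees; lon negative for west).
points = [(29.30, -94.80), (29.40, -94.90), (30.10, -90.10)]
cells = {}
for lat, lon in points:
    cells.setdefault(cell_index(lat, lon), []).append((lat, lon))

# The first two points fall in the same 0.25-degree cell.
print(len(cells))  # 2
```

Attributes such as elevation, geology, or tidal range would then be summarized per cell key rather than per point.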
Perucho, Beatriz; Micó, Vicente
2014-01-01
Progressive addition lenses (PALs) are engraved with permanent marks at standardized locations in order to guarantee correct centering and alignment throughout the manufacturing and mounting processes. Out of the production line, the engraved marks provide useful information about the PAL and act as locator marks for re-inking the removable marks. Even though these marks should be visible by simple inspection with the naked eye, they are often faint and weak, obscured by scratches, partially occluded, and difficult to recognize on tinted or antireflection-coated lenses. Here, we present an extremely simple optical device (named the wavefront holoscope) for visualization and characterization of permanent marks in PALs based on digital in-line holography. Essentially, a point source of coherent light illuminates the engraved mark placed just before a CCD camera that records a classical Gabor in-line hologram. The recorded hologram is then digitally processed to provide a set of high-contrast images of the engraved marks. Experimental results are presented showing the applicability of the proposed method as a new ophthalmic instrument for visualization and characterization of engraved marks in PALs.
Context and Domain Knowledge Enhanced Entity Spotting in Informal Text
NASA Astrophysics Data System (ADS)
Gruhl, Daniel; Nagarajan, Meena; Pieper, Jan; Robson, Christine; Sheth, Amit
This paper explores the application of restricted relationship graphs (RDF) and statistical NLP techniques to improve named entity annotation in challenging Informal English domains. We validate our approach using on-line forums discussing popular music. Named entity annotation is particularly difficult in this domain because it is characterized by a large number of ambiguous entities, such as the Madonna album "Music" or Lily Allen's pop hit "Smile".
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jarocki, John Charles; Zage, David John; Fisher, Andrew N.
LinkShop is a software tool for applying the method of Linkography to the analysis of time-sequence data. LinkShop provides command-line, web, and application programming interfaces (APIs) for the input and processing of time-sequence data, abstraction models, and ontologies. The software creates graph representations of the abstraction model, the ontology, and the derived linkograph. Finally, the tool allows the user to perform statistical measurements of the linkograph and refine the ontology through direct manipulation of the linkograph.
ERIC Educational Resources Information Center
New South Wales Dept. of Education, Sydney (Australia).
As part of a series of tests to measure mastery of specific skills in the natural sciences, copies of the first 13 tests are provided. Skills to be tested include: (1) reading a table; (2) using a biological key; (3) identifying chemical symbols; (4) identifying parts of a human body; (5) reading a line graph; (6) identifying electronic and…
The Copernicus ultraviolet spectral atlas of Beta Orionis
NASA Technical Reports Server (NTRS)
Rogerson, J. B., Jr.; Upson, W. L., II
1982-01-01
An ultraviolet spectral atlas is presented for the B8 Ia star Beta Orionis, which has been scanned from 999 to 1561 A by the Princeton spectrometer aboard the Copernicus satellite. From 999 to 1420 A the observations have a nominal resolution of 0.05 A. At the longer wavelengths the resolution is 0.1 A. The atlas is presented in graphs. Lines identified in the spectrum are also listed.
Attitude dynamic of spin-stabilized satellites with flexible appendages
NASA Technical Reports Server (NTRS)
Renard, M. L.
1973-01-01
Equations of motion and computer programs have been developed for analyzing the motion of a spin-stabilized spacecraft having long, flexible appendages. Stability charts were derived, or can be redrawn with the desired accuracy, for any particular set of design parameters. Simulation graphs of variables of interest are readily obtainable on line using the program FLEXAT. Finally, applications to actual satellites, such as UK-4 and IMP-1, have been considered.
The Copernicus ultraviolet spectral atlas of Iota Herculis
NASA Technical Reports Server (NTRS)
Upson, W. L., II; Rogerson, J. B., Jr.
1980-01-01
An ultraviolet spectral atlas is presented for the B3 IV star Iota Herculis, which has been scanned from 999 to 1467 A by the Princeton spectrometer aboard the Copernicus satellite. From 999 to 1422 A the observations have a nominal resolution of 0.05 A. At the longer wavelengths the resolution is 0.1 A. The atlas is presented in graphs. Lines identified in the spectrum are also listed.
Nomogram Method as Means for Resource Potential Efficiency Predicative Aid of Petrothermal Energy
NASA Astrophysics Data System (ADS)
Gabdrakhmanova, K. F.; Izmailova, G. R.; Larin, P. A.; Vasilyeva, E. R.; Madjidov, M. A.; Marupov, S. R.
2018-05-01
The article describes the innovative approach when predicting the resource potential efficiency of petrothermal energy. Various geothermal gradients representative of Bashkortostan and Tatarstan republics regions were considered. With the help of nomograms, the authors analysed fluid temperature dependency graphs at the outlet and the thermal power versus fluid velocity along the wellbore. From the family of graphs plotted by us, velocities corresponding to specific temperature were found. Then, according to thermal power versus velocity curve, power levels corresponding to these velocities relative to the selected fluid temperature were found. On the basis of two dependencies obtained, nomograms were plotted. The result of determining the petrothermal energy production efficiency is a family of isocline lines that enables one to select the optimum temperature and injection rate to obtain the required amount of heat for a particular depth and geothermal gradient.
Quantify spatial relations to discover handwritten graphical symbols
NASA Astrophysics Data System (ADS)
Li, Jinpeng; Mouchère, Harold; Viard-Gaudin, Christian
2012-01-01
To model a handwritten graphical language, spatial relations describe how the strokes are positioned in the 2-dimensional space. Most of existing handwriting recognition systems make use of some predefined spatial relations. However, considering a complex graphical language, it is hard to express manually all the spatial relations. Another possibility would be to use a clustering technique to discover the spatial relations. In this paper, we discuss how to create a relational graph between strokes (nodes) labeled with graphemes in a graphical language. Then we vectorize spatial relations (edges) for clustering and quantization. As the targeted application, we extract the repetitive sub-graphs (graphical symbols) composed of graphemes and learned spatial relations. On two handwriting databases, a simple mathematical expression database and a complex flowchart database, the unsupervised spatial relations outperform the predefined spatial relations. In addition, we visualize the frequent patterns on two text-lines containing Chinese characters.
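The vectorize-then-cluster idea can be sketched as below; the relation features (bounding-box offsets and size ratios) and the two synthetic relation types are hypothetical simplifications, not the authors' descriptors, and the k-means here uses a simple farthest-point initialization:

```python
import numpy as np

def relation_vector(a, b):
    """Spatial relation between two stroke bounding boxes (cx, cy, w, h):
    offset of b's center from a's, normalized by a's size, plus size ratios."""
    dx = (b[0] - a[0]) / a[2]
    dy = (b[1] - a[1]) / a[3]
    return np.array([dx, dy, b[2] / a[2], b[3] / a[3]])

def kmeans(X, k, iters=20):
    """Plain k-means; farthest-point initialization avoids degenerate starts."""
    centers = [X[0]]
    for _ in range(k - 1):
        d = np.min([((X - c) ** 2).sum(-1) for c in centers], axis=0)
        centers.append(X[np.argmax(d)])
    centers = np.array(centers)
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        centers = np.array([X[labels == j].mean(0) for j in range(k)])
    return labels, centers

print(relation_vector((0.0, 0.0, 2.0, 2.0), (2.0, 0.0, 2.0, 2.0)))  # [1. 0. 1. 1.]

# Two synthetic relation types: "right-of" and "below", with small jitter.
rng = np.random.default_rng(1)
right_of = np.array([[1.0, 0.0, 1.0, 1.0]]) + 0.05 * rng.standard_normal((10, 4))
below = np.array([[0.0, 1.0, 1.0, 1.0]]) + 0.05 * rng.standard_normal((10, 4))
X = np.vstack([right_of, below])
labels, _ = kmeans(X, k=2)
```

The discovered cluster labels play the role of quantized, unsupervised spatial relations that label the edges of the stroke graph.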
NASA Astrophysics Data System (ADS)
Luna Acosta, German Aurelio
The masses of observed hadrons are fitted according to the kinematic predictions of Conformal Relativity. The hypothesis gives a remarkably good fit. The isospin SU(2) gauge-invariant Lagrangian L_πNN(x, λ) is used in the calculation of dσ/dΩ to 2nd-order Feynman graphs for simplified models of πN → πN. The resulting infinite mass sums over the nucleon (Conformal) families are done via the Generalized Sommerfeld-Watson Transform Theorem. Even though the models are too simple to be realistic, they indicate that if Δ internal lines were to be included, 2nd-order Feynman graphs may reproduce the experimental data qualitatively. The energy-dependence of the propagator and couplings in Conformal QFT is different from that of ordinary QFT. Suggestions for further work are made in the areas of ultra-violet divergences and OPEC calculations.
Using Tutte polynomials to analyze the structure of the benzodiazepines
NASA Astrophysics Data System (ADS)
Cadavid Muñoz, Juan José
2014-05-01
Graph theory in general, and Tutte polynomials in particular, are used to analyze the chemical structure of the benzodiazepines. Similarity analyses based on the Tutte polynomials are used to find other molecules that are similar to the benzodiazepines and therefore might show similar psycho-active actions for medical purposes, in order to avoid the drawbacks associated with benzodiazepine-based medicines. For each type of benzodiazepine, the Tutte polynomial is computed and some numeric characteristics are obtained, such as the number of spanning trees and the number of spanning forests. Computations are done using the GraphTheory package of the Maple computer algebra system. The analytical results obtained are of great importance in pharmaceutical engineering. As a future research line, the computational chemistry program Spartan will be used to extend the results and compare them with those obtained from the Tutte polynomials of the benzodiazepines.
Digital Museum of Retinal Ganglion Cells with Dense Anatomy and Physiology.
Bae, J Alexander; Mu, Shang; Kim, Jinseop S; Turner, Nicholas L; Tartavull, Ignacio; Kemnitz, Nico; Jordan, Chris S; Norton, Alex D; Silversmith, William M; Prentki, Rachel; Sorek, Marissa; David, Celia; Jones, Devon L; Bland, Doug; Sterling, Amy L R; Park, Jungman; Briggman, Kevin L; Seung, H Sebastian
2018-05-17
When 3D electron microscopy and calcium imaging are used to investigate the structure and function of neural circuits, the resulting datasets pose new challenges of visualization and interpretation. Here, we present a new kind of digital resource that encompasses almost 400 ganglion cells from a single patch of mouse retina. An online "museum" provides a 3D interactive view of each cell's anatomy, as well as graphs of its visual responses. The resource reveals two aspects of the retina's inner plexiform layer: an arbor segregation principle governing structure along the light axis and a density conservation principle governing structure in the tangential plane. Structure is related to visual function; ganglion cells with arbors near the layer of ganglion cell somas are more sustained in their visual responses on average. Our methods are potentially applicable to dense maps of neuronal anatomy and physiology in other parts of the nervous system. Copyright © 2018 Elsevier Inc. All rights reserved.
Single-shot dual-wavelength in-line and off-axis hybrid digital holography
NASA Astrophysics Data System (ADS)
Wang, Fengpeng; Wang, Dayong; Rong, Lu; Wang, Yunxin; Zhao, Jie
2018-02-01
We propose an in-line and off-axis hybrid holographic real-time imaging technique. The in-line and off-axis digital holograms are generated simultaneously by two lasers with different wavelengths, and they are recorded by a color camera in a single shot. The reconstruction is carried out using an iterative algorithm in which the initial input is designed to include the intensity of the in-line hologram and the approximate phase distribution obtained from the off-axis hologram. In this way, the complex field in the object plane output by the iterative procedure produces higher-quality amplitude and phase images than traditional iterative phase retrieval. The performance of the technique has been demonstrated by acquiring amplitude and phase images of a green lacewing's wing and a living moon jellyfish.
Digital Bedrock Compilation: A Geodatabase Covering Forest Service Lands in California
NASA Astrophysics Data System (ADS)
Elder, D.; de La Fuente, J. A.; Reichert, M.
2010-12-01
This digital database contains bedrock geologic mapping for Forest Service lands within California. This compilation began in 2004 and the first version was completed in 2005. The second publication of this geodatabase was completed in 2010 and filled major gaps in the southern Sierra Nevada and Modoc/Medicine Lake/Warner Mountains areas. This digital map database was compiled from previously published and unpublished geologic mapping, with source mapping and review from the California Geological Survey, the U.S. Geological Survey, and others. Much of the source data was itself compilation mapping. This geodatabase is huge, containing ~107,000 polygons and ~280,000 arcs. Mapping was compiled from more than one thousand individual sources and covers over 41,000,000 acres (~166,000 km2). It was compiled from source maps at various scales, from ~1:4,000 to 1:250,000, and represents the best available geologic mapping at the largest scale possible. An estimated 70-80% of the source information was digitized from geologic mapping at 1:62,500 scale or better. The Forest Service ACT2 Enterprise Team compiled the bedrock mapping and developed a geodatabase to store this information. This geodatabase supports feature classes for polygons (e.g., map units), lines (e.g., contacts, boundaries, faults, and structural lines), and points (e.g., orientation data, structural symbology). Lookup tables provide detailed information for feature class items. Lookup/type tables contain legal values and hierarchical groupings for geologic ages and lithologies. Type tables link coded values with descriptions for line and point attributes, such as line type, line location, and point type. This digital mapping is at the core of many quantitative analyses and derivative map products. Queries of the database are used to produce maps and to quantify rock types of interest.
These include the following: (1) ultramafic rocks - where hazards from naturally occurring asbestos are high, (2) granitic rocks - increased erosion hazards, (3) limestone, chert, sedimentary rocks - paleontological resources (Potential Fossil Yield Classification maps), (4) calcareous rocks (cave resources, water chemistry), and (5) lava flows - lava tubes (more caves). Map unit groupings (e.g., belts, terranes, tectonic & geomorphic provinces) can also be derived from the geodatabase. Digital geologic mapping was used in ground water modeling to predict effects of tunneling through the San Bernardino Mountains. Bedrock mapping is used in models that characterize watershed sediment regimes and quantify anthropogenic influences. When combined with digital geomorphology mapping, this geodatabase helps to assess landslide hazards.
An Ada/SQL (Structured Query Language) Application Scanner.
1988-03-01
The use of LANDSAT digital data and computer-implemented techniques for an agricultural application
NASA Technical Reports Server (NTRS)
Joyce, A. T.; Griffin, R. H., II
1978-01-01
Agricultural application procedures are described for the use of LANDSAT digital data and other digitized data (e.g., soils). The results of following these procedures are shown in production estimates for cotton and soybeans in Washington County, Mississippi. Examples of output products in both line-printer and map formats are included, and an assessment of product adequacy is made.
Approximation methods for stochastic petri nets
NASA Technical Reports Server (NTRS)
Jungnitz, Hauke Joerg
1992-01-01
Stochastic Marked Graphs are a concurrent, decision-free formalism provided with a powerful synchronization mechanism generalizing conventional Fork Join Queueing Networks. In some particular cases the analysis of the throughput can be done analytically. Otherwise the analysis suffers from the classical state explosion problem. Embedded in the divide-and-conquer paradigm, approximation techniques are introduced for the analysis of stochastic marked graphs and Macroplace/Macrotransition-nets (MPMT-nets), a new subclass introduced herein. MPMT-nets are a subclass of Petri nets that allow limited choice, concurrency, and sharing of resources. The modeling power of MPMT-nets is much larger than that of marked graphs; e.g., MPMT-nets can model manufacturing flow lines with unreliable machines and dataflow graphs where choice and synchronization occur. The basic idea leads to the notion of a cut to split the original net system into two subnets. The cuts lead to two aggregated net systems, in each of which one of the subnets is reduced to a single transition. A further reduction leads to a basic skeleton. The generalization of the idea leads to multiple cuts, where single cuts can be applied recursively, leading to a hierarchical decomposition. Based on the decomposition, a response time approximation technique for the performance analysis is introduced. Also, delay equivalence, which had previously been introduced in the context of marked graphs by Woodside et al., Marie's method, and flow equivalent aggregation are applied to the aggregated net systems. The experimental results show that response time approximation converges quickly and shows reasonable accuracy in most cases. The convergence of Marie's method is slower, but the accuracy is generally better.
Delay equivalence often fails to converge, while flow equivalent aggregation can lead to potentially bad results if a strong dependence of the mean completion time on the interarrival process exists.
NASA Technical Reports Server (NTRS)
Klutz, Glenn
1989-01-01
A facility was established that takes collected data and feeds it into mathematical models that generate improved data arrays by correcting for various losses and base-line drift and converting to unity scaling. These data arrays have headers and other identifying information affixed and are subsequently stored in a Laser Materials and Characteristics data base which is accessible to various users. The two-part data base, comprising absorption/emission spectra and tabulated data, is developed around twelve laser models. The tabulated section of the data base is divided into several parts: crystalline, optical, mechanical, and thermal properties; absorption and emission spectra information; chemical names and formulas; and miscellaneous. A menu-driven, language-free graphing program will reduce or remove the requirement that users become competent FORTRAN programmers, along with the concomitant requirement that they spend several days to a few weeks becoming conversant with the GEOGRAF library and its sequence of calls, with continual refreshers of both. The work included becoming thoroughly conversant, or at least very familiar, with GEOGRAF by GEOCOMP Corp. The development of the graphing program involved trial runs of the various callable library routines on dummy data in order to become familiar with their actual implementation and sequencing. This was followed by trial runs with actual data base files and some additional data from current research that was not in the data base but for which graphs were currently needed. After successful runs with dummy and real data using actual FORTRAN instructions, steps were undertaken to develop the menu-driven, language-free implementation of a program that requires the user only to know how to use a microcomputer; the user simply responds to items displayed on the video screen. To assist the user in arriving at the optimum values needed for a specific graph, a paper-and-pencil checklist was made available for use on the trial runs.
Magnified reconstruction of digitally recorded holograms by Fresnel-Bluestein transform.
Restrepo, John F; Garcia-Sucerquia, Jorge
2010-11-20
A method for numerical reconstruction of digitally recorded holograms with variable magnification is presented. The proposed strategy allows for smaller, equal, or larger magnification than that achieved with Fresnel transform by introducing the Bluestein substitution into the Fresnel kernel. The magnification is obtained independent of distance, wavelength, and number of pixels, which enables the method to be applied in color digital holography and metrological applications. The approach is supported by experimental and simulation results in digital holography of objects of comparable dimensions with the recording device and in the reconstruction of holograms from digital in-line holographic microscopy.
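For context, the plain single-FFT discrete Fresnel reconstruction that the Fresnel-Bluestein method generalizes can be sketched as below; this is the standard textbook transform under assumed sampling conventions, not the authors' Bluestein variant, and the hologram and parameter values are arbitrary placeholders:

```python
import numpy as np

def fresnel_reconstruct(hologram, wavelength, z, dx):
    """Single-FFT Fresnel reconstruction of a square hologram at distance z,
    with pixel pitch dx (meters). The output pixel pitch is fixed at
    wavelength*z/(N*dx) -- the magnification constraint that the
    Fresnel-Bluestein substitution removes."""
    n = hologram.shape[0]
    k = 2 * np.pi / wavelength
    x = (np.arange(n) - n // 2) * dx
    X, Y = np.meshgrid(x, x)
    chirp = np.exp(1j * k / (2 * z) * (X ** 2 + Y ** 2))
    return np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(hologram * chirp)))

holo = np.ones((128, 128))          # trivial placeholder hologram
img = fresnel_reconstruct(holo, wavelength=633e-9, z=0.1, dx=10e-6)
print(img.shape)  # (128, 128)
```

Replacing the final FFT with a Bluestein (chirp-z) evaluation is what lets the output sampling interval, and hence the magnification, be chosen freely.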
Transformations: Technology and the Music Industry.
ERIC Educational Resources Information Center
Peters, G. David
2001-01-01
Focuses on the companies and organizations of the Music Industry Conference (MIC). Addresses topics such as changes in companies due to technology, audio compact discs, the musical instrument digital interface (MIDI), digital sound recording, and the on-line music instruction programs offered by the MIC. (CMK)
Functional test generation for digital circuits described with a declarative language: LUSTRE
NASA Astrophysics Data System (ADS)
Almahrous, Mazen
1990-08-01
A functional approach to the test generation problem, starting from a high-level description, is proposed. The circuit under test is modeled using LUSTRE, a high-level data-flow description language. The LUSTRE primitives are translated into a SATAN-format graph in order to evaluate the testability of the circuit and to generate test sequences. A second method, for testing complex circuits comprising an operative part and a control part, is also defined: it consists of checking experiments for the control part observed through the operative part. It was applied to the automata generated from a LUSTRE description of the circuit.
Guy, Kristy K.
2015-01-01
This Data Series Report includes several open-ocean shorelines, back-island shorelines, back-island shoreline points, sand area polygons, and sand lines for Assateague Island that were extracted from natural-color orthoimagery (aerial photography) dated from April 12, 1989, to September 5, 2013. The images used were 0.3–2-meter (m)-resolution U.S. Geological Survey Digital Orthophoto Quarter Quads (DOQQ), U.S. Department of Agriculture National Agriculture Imagery Program (NAIP) images, and Virginia Geographic Information Network Virginia Base Map Program (VBMP) images courtesy of the Commonwealth of Virginia. The back-island shorelines were hand-digitized at the intersect of the apparent back-island shoreline and transects spaced at 20-m intervals. The open-ocean shorelines were hand-digitized at the approximate still water level, such as tide level, which was fit through the average position of waves and swash apparent on the beach. Hand-digitizing was done at a scale of approximately 1:2,000. The sand polygons were derived by using an image-processing unsupervised classification technique that separates images into classes. The classes were then visually categorized as either sand or not sand. Also included in this report are 20-m-spaced transect lines and the transect base lines.
High-speed line-scan camera with digital time delay integration
NASA Astrophysics Data System (ADS)
Bodenstorfer, Ernst; Fürtler, Johannes; Brodersen, Jörg; Mayer, Konrad J.; Eckel, Christian; Gravogl, Klaus; Nachtnebel, Herbert
2007-02-01
In high-speed image acquisition and processing systems, the speed of operation is often limited by the amount of available light, due to short exposure times. Therefore, high-speed applications often use line-scan cameras based on charge-coupled device (CCD) sensors with time delay integration (TDI). Synchronous shift and accumulation of photoelectric charges on the CCD chip, according to the objects' movement, result in a longer effective exposure time without introducing additional motion blur. This paper presents a high-speed color line-scan camera based on a commercial complementary metal oxide semiconductor (CMOS) area image sensor with a Bayer filter matrix and a field programmable gate array (FPGA). The camera implements a digital equivalent of the TDI effect exploited with CCD cameras. The proposed design benefits from the high frame rates of CMOS sensors and from the possibility of arbitrarily addressing the rows of the sensor's pixel array. For digital TDI, only a small number of rows is read out from the area sensor; these rows are then shifted and accumulated according to the movement of the inspected objects. This paper gives a detailed description of the digital TDI algorithm implemented on the FPGA. Relevant aspects for practical application are discussed and key features of the camera are listed.
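The digital TDI principle can be sketched as a shift-and-accumulate over successive row readouts; this is an illustrative software model with an assumed one-row-per-frame motion, not the camera's FPGA implementation:

```python
import numpy as np

def digital_tdi(frames, stages):
    """`frames[t]` is the block of `stages` sensor rows read at time t.
    Under the assumed motion model, row s at time t images scene line
    (t - s), so summing frames[t + s][s] over s accumulates `stages`
    exposures of the same scene line -- longer effective exposure,
    no added motion blur."""
    out = []
    for t in range(len(frames) - stages + 1):
        out.append(sum(frames[t + s][s] for s in range(stages)))
    return np.array(out)

# Synthetic sensor readouts: scene line (t - s) is bright when even.
stages, n_frames, width = 4, 8, 5
frames = np.zeros((n_frames, stages, width))
for t in range(n_frames):
    for s in range(stages):
        frames[t, s, :] = 10 if (t - s) % 2 == 0 else 0

lines = digital_tdi(frames, stages)
print(lines[0, 0], lines[1, 0])  # 40.0 0.0
```

Each output line carries `stages` times the single-readout signal (40 vs. 10 here), which is exactly the light-gathering gain TDI is used for.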
Stabley, Deborah L; Holbrook, Jennifer; Harris, Ashlee W; Swoboda, Kathryn J; Crawford, Thomas O; Sol-Church, Katia; Butchbach, Matthew E R
2017-05-01
Fibroblasts and lymphoblastoid cell lines (LCLs) derived from individuals with spinal muscular atrophy (SMA) have been and continue to be essential for translational SMA research. Authentication of cell lines helps ensure reproducibility and rigor in biomedical research. This quality control measure identifies mislabeling or cross-contamination of cell lines and prevents misinterpretation of data. Unfortunately, authentication of SMA cell lines used in various studies has not been possible because of a lack of a reference. In this study, we provide said reference so that SMA cell lines can be subsequently authenticated. We use short tandem repeat (STR) profiling and digital PCR (dPCR), which quantifies SMN1 and SMN2 copy numbers, to generate molecular identity codes for fibroblasts and LCLs that are commonly used in SMA research. Using these molecular identity codes, we clarify the familial relationships within a set of fibroblasts commonly used in SMA research. This study presents the first cell line reference set for the SMA research community and demonstrates its usefulness for re-identification and authentication of lines commonly used as in vitro models for future studies. Copyright © 2017 Elsevier B.V. All rights reserved.
Aeromagnetic and Gravity Surveys in Afghanistan: A Web Site for Distribution of Data
Sweeney, Ronald E.; Kucks, Robert P.; Hill, Patricia L.; Finn, Carol A.
2006-01-01
Aeromagnetic data were digitized from aeromagnetic maps created from aeromagnetic surveys flown in southeastern and southern Afghanistan in 1966 by PRAKLA, Gesellschaft fur praktische Lagerstattenforschung GmbH, Hannover, Germany, on behalf of the 'Bundesanstalt fur Bodenforschung', Hannover, Germany. The digitization was done along contour lines, followed by interpolation of the data along the original survey flight-lines. Survey and map specifications can be found in two project reports, 'prakla_report_1967.pdf' and 'bgr_report_1968.pdf', made available in this open-file report.
Novel presentational approaches were developed for reporting network meta-analysis.
Tan, Sze Huey; Cooper, Nicola J; Bujkiewicz, Sylwia; Welton, Nicky J; Caldwell, Deborah M; Sutton, Alexander J
2014-06-01
To present graphical tools for reporting network meta-analysis (NMA) results aiming to increase the accessibility, transparency, interpretability, and acceptability of NMA analyses. The key components of NMA results were identified based on recommendations by agencies such as the National Institute for Health and Care Excellence (United Kingdom). Three novel graphs were designed to amalgamate the identified components using familiar graphical tools such as the bar, line, or pie charts and adhering to good graphical design principles. Three key components for presentation of NMA results were identified, namely relative effects and their uncertainty, probability of an intervention being best, and between-study heterogeneity. Two of the three graphs developed present results (for each pairwise comparison of interventions in the network) obtained from both NMA and standard pairwise meta-analysis for easy comparison. They also include options to display the probability best, ranking statistics, heterogeneity, and prediction intervals. The third graph presents rankings of interventions in terms of their effectiveness to enable clinicians to easily identify "top-ranking" interventions. The graphical tools presented can display results tailored to the research question of interest, and targeted at a whole spectrum of users from the technical analyst to the nontechnical clinician. Copyright © 2014 Elsevier Inc. All rights reserved.
All-Digital Baseband 65nm PLL/FPLL Clock Multiplier using 10-cell Library
NASA Technical Reports Server (NTRS)
Shuler, Robert L., Jr.; Wu, Qiong; Liu, Rui; Chen, Li
2014-01-01
PLLs for clock generation are essential for modern circuits, to generate specialized frequencies for many interfaces and high frequencies for chip internal operation. These circuits depend on analog circuits and careful tailoring for each new process, and making them fault tolerant is an incompletely solved problem. Until now, all digital PLLs have been restricted to sampled data DSP techniques and not available for the highest frequency baseband applications. This paper presents the design and preliminary evaluation of an all-digital baseband technique built entirely with an easily portable 10-cell digital library. The library is also described, as it aids in research and low volume design porting to new processes. The advantages of the digital approach are the wide variety of techniques available to give varying degrees of fault tolerance, and the simplicity of porting the design to new processes, even to exotic processes that may not have analog capability. The only tuning parameter is digital gate delay. An all-digital approach presents unique problems and standard analog loop stability design criteria cannot be directly used. Because of the quantization of frequency, there is effectively infinite gain for very small loop error feedback. The numerically controlled oscillator (NCO) based on a tapped delay line cannot be reliably updated while a pulse is active in the delay line, and ordinarily does not have enough frequency resolution for a low-jitter output.
ALL-Digital Baseband 65nm PLL/FPLL Clock Multiplier Using 10-Cell Library
NASA Technical Reports Server (NTRS)
Schuler, Robert L., Jr.; Wu, Qiong; Liu, Rui; Chen, Li; Madala, Shridhar
2014-01-01
Vecchio, F; Miraglia, F; Quaranta, D; Granata, G; Romanello, R; Marra, C; Bramanti, P; Rossini, P M
2016-03-01
Functional brain abnormalities, including memory loss, are found to be associated with pathological changes in connectivity and network neural structures. Alzheimer's disease (AD) interferes with memory formation from the molecular level to synaptic functions and neural network organization. Here, we determined whether brain connectivity of resting-state networks correlates with memory in patients affected by AD and in subjects with mild cognitive impairment (MCI). One hundred and forty-four subjects were recruited: 70 AD (Mini Mental State Examination, MMSE, 21.4), 50 MCI (MMSE 25.2) and 24 healthy subjects (MMSE 29.8). An undirected and weighted cortical brain network was built to evaluate graph core measures and obtain Small World parameters. eLORETA lagged linear connectivity extracted from electroencephalogram (EEG) signals was used to weight the network. A high statistical correlation between Small World and memory performance was found: namely, the higher the Small World characteristic in the EEG gamma frequency band during the resting state, the better the performance in short-term memory as evaluated by the digit span tests. Such a Small World pattern might represent a biomarker of working memory impairment in older people, both in physiological and pathological conditions. Copyright © 2015 IBRO. Published by Elsevier Ltd. All rights reserved.
Simulation Comparisons of Three Different Meander Line Dipoles
2015-01-01
1. Paez CI. Design formulas for a meandered dipole. IEEE Xplore Digital Library, 2014. 2. Nguyen VH, Phan HP, Hoang MH. Improving radiation characteristics of UHF RFID antennas by zigzag structures. IEEE Xplore Digital Library, 2014. ... geometry-based, frequency-independent lumped element model. IEEE Xplore Digital Library, 2014. 5. Olaode OO, Palmer WD ...
The use of the digital smile design concept as an auxiliary tool in periodontal plastic surgery.
Santos, Felipe Rychuv; Kamarowski, Stephanie Felice; Lopez, Camilo Andres Villabona; Storrer, Carmen Lucia Mueller; Neto, Alexandre Teixeira; Deliberador, Tatiana Miranda
2017-01-01
Periodontal surgery associated with prior waxing, mock-up, and the use of digital tools to design the smile is the current trend of reverse planning in periodontal plastic surgery. The objective of this study is to report a surgical resolution of the gummy smile using a prior esthetic design made with digital tools. A digital smile design and mock-up were used for performing gingival recontouring surgery. The relationships between the facial and dental measures and between the incisal plane and the horizontal facial plane of reference were evaluated. The relative dental height and width were measured, and the dental contour drawing was inserted. Complementary lines were drawn, such as the gingival zenith and the lines joining the gingival and incisal embrasures. The periodontal esthetics were improved according to the established digital smile design pattern. These results demonstrate that the surgical techniques are well accepted by patients and easy for the professional to perform; when properly planned, they meet the desired expectations. Periodontal surgical procedures associated with the digital smile design facilitate communication between the patient and the professional. It is, therefore, essential to demonstrate the reverse planning of the smile and the periodontal parameters, with approval by the patient, to solve the esthetic problem.
Understanding Charts and Graphs.
1987-07-28
English, then, is obviously not a notational system because ambiguous words or sentences are possible, whereas musical notation is notational ... how lines and regions are detected and organized; these principles grow out of discoveries about human visual information processing. A syntactic ... themselves name other colors (e.g., the word "red" is printed in blue ink; this is known as the "Stroop effect"). Similarly, if "left" and "right" are ...
Graph Learning for Anomaly Detection using Psychological Context (GLAD-PC)
2015-08-03
... comparison study of user behavior on Facebook and Gmail. arXiv:1305.6082 (2013). doi: 10.1016/j.chb.2013.06.043 ... Fournelle, Steve Gaffigan, Oliver Brdiczka, Jianqiang Shen, Juan Liu, Kendra E. Moore. Characterizing user behavior and information propagation on a ... media data; and c) detecting unusual and anomalous behavior from on-line activities. (5) Summary of the most important results: With regard to ...
SVEN: Informative Visual Representation of Complex Dynamic Structure
2014-12-23
... nodes in the diagram can be chosen to minimize crossings, but this is the Traveling Salesman Problem, and even if an optimal solution were found, there ... the visualization problem inherits the challenges of optimizing the aesthetic properties of the static views of the graphs; it also introduces a new problem of how to ... the inevitable problem of having an overwhelming number of edge crossings for larger datasets is addressed by reducing the opacity of the lines drawn ...
NASA Technical Reports Server (NTRS)
Cardelli, Jason A.; Clayton, Geoffrey C.
1991-01-01
The range of validity of the average absolute extinction law (AAEL) proposed by Cardelli et al. (1988 and 1989) is investigated, combining published visible and NIR data with IUE UV observations for three lines of sight through dense dark cloud environments with high values of total-to-selective extinction. The characteristics of the data sets and the reduction and parameterization methods applied are described in detail, and the results are presented in extensive tables and graphs. Good agreement with the AAEL is demonstrated for wavelengths from 3.4 microns to 250 nm, but significant deviations are found at shorter wavelengths (where previous studies of lines of sight through bright nebulosity found good agreement with the AAEL). These differences are attributed to the effects of coatings on small-bump and FUV grains.
NASA Astrophysics Data System (ADS)
Gupta, S. R. D.; Gupta, Santanu D.
1991-10-01
The flow of laser radiation in a plane-parallel cylindrical slab of active amplifying medium with axial symmetry is treated as a problem in radiative transfer. The appropriate one-dimensional transfer equation describing the transfer of laser radiation has been derived by an appeal to Einstein's A, B coefficients (describing the processes of stimulated line absorption, spontaneous line emission, and stimulated line emission sustained by population inversion in the medium) and by considering the 'rate equations' to completely establish the rationale of the transfer equation obtained. The equation is then exactly solved and the angular distribution of the emergent laser beam intensity is obtained; its numerically computed values are given in tables and plotted in graphs showing the nature of the peaks of the emerging laser beam intensity about the axis of the laser cylinder.
Crackscope : automatic pavement cracking inspection system.
DOT National Transportation Integrated Search
2008-08-01
The CrackScope system is an automated pavement crack rating system consisting of a digital line scan camera, laser-line illuminator, and proprietary crack detection and classification software. CrackScope is able to perform real-time pavement ins...
Orwoll, Benjamin; Diane, Shelley; Henry, Duncan; Tsang, Lisa; Chu, Kristin; Meer, Carrie; Hartman, Kevin; Roy-Burman, Arup
Central line-associated bloodstream infections (CLABSIs) cause major patient harm, preventable through attention to line care best practice standards. The objective was to determine if a digital self-assessment application (CLABSI App), bundling line care best practices with social gamification and in-context microlearning, could engage nurses in CLABSI prevention. Nurses caring for children with indwelling central venous catheters in 3 high-risk units were eligible to participate. All other units served as controls. The intervention was a 12-month nonrandomized quality improvement study of CLABSI App implementation with interunit competitions. Compared to the preceding year, the intervention group (9886 line days) CLABSI rate decreased by 48% ( P = .03). Controls (7879 line days) did not change significantly. In all, 105 unique intervention group nurses completed 673 self-assessments. Competitions were associated with increased engagement as measured by self-assessments and unique participants. This model could be extended to other health care-associated infections, and more broadly to process improvement within and across health care systems.
How Digital Image Processing Became Really Easy
NASA Astrophysics Data System (ADS)
Cannon, Michael
1988-02-01
In the early and mid-1970s, digital image processing was the subject of intense university and corporate research. The research lay along two lines: (1) developing mathematical techniques for improving the appearance of, or analyzing the contents of, images represented in digital form, and (2) creating cost-effective hardware to carry out these techniques. The research has been very effective, as evidenced by the continued decline of image processing as a research topic and the rapid growth of commercial companies marketing digital image processing software and hardware.
Ortiz-Ruiz, Alejandra; Postigo, María; Gil-Casanova, Sara; Cuadrado, Daniel; Bautista, José M; Rubio, José Miguel; Luengo-Oroz, Miguel; Linares, María
2018-01-30
Routine field diagnosis of malaria is a considerable challenge in rural and low-resources endemic areas, mainly due to lack of personnel, training and sample processing capacity. In addition, differential diagnosis of Plasmodium species has a high level of misdiagnosis. Real-time remote microscopical diagnosis through on-line crowdsourcing platforms could be converted into an agile network to support diagnosis-based treatment and malaria control in low-resources areas. This study explores whether accurate Plasmodium species identification-a critical step during the diagnosis protocol in order to choose the appropriate medication-is possible through the information provided by non-trained on-line volunteers. 88 volunteers completed a series of questionnaires on 110 images to differentiate species (Plasmodium falciparum, Plasmodium ovale, Plasmodium vivax, Plasmodium malariae, Plasmodium knowlesi) and parasite staging from thin blood smear images digitalized with a smartphone camera adapted to the ocular of a conventional light microscope. Visual cues evaluated in the surveys include texture and colour, parasite shape and red blood cell size. On-line volunteers are able to discriminate Plasmodium species (P. falciparum, P. malariae, P. vivax, P. ovale, P. knowlesi) and stages in thin blood smears according to visual cues observed on digitalized images of parasitized red blood cells. Friendly textual descriptions of the visual cues and of specialized malaria terminology are key to volunteers' learning and efficiency. On-line volunteers with short training are able to differentiate malaria parasite species and parasite stages from digitalized thin smears based on simple visual cues (shape, size, texture and colour).
While the accuracy of a single on-line expert is far from perfect, a single parasite classification obtained by combining the opinions of multiple on-line volunteers over the same smear, could improve accuracy and reliability of Plasmodium species identification in remote malaria diagnosis.
47 CFR 36.125 - Local switching equipment-Category 3.
Code of Federal Regulations, 2010 CFR
2010-10-01
... electronic analog or digital remote line locations. Equipment used for the identification, recording and... which has a common intermediate distributing frame, market group or other separately identifiable... composed of an electronic analog or digital host office and all of its remote locations. A host/remote...
From CAD to Digital Modeling: the Necessary Hybridization of Processes
NASA Astrophysics Data System (ADS)
Massari, G. A.; Bernardi, F.; Cristofolini, A.
2011-09-01
The essay deals with the themes of digital representation of architecture, drawing on several years of teaching activity within the course of Automatic Design of the degree course in Engineering/Architecture at the University of Trento. With the development of CAD systems, architectural representation consists less in the tracing of a simple drawing and more in a series of acts of building a complex digital model, which can be used as a database on which to record all the stages of project and interpretation work, and from which to derive final drawings and documents. The advent of digital technology has made it increasingly difficult to find explicit connections between one type of operation and the subsequent outcome, thereby increasing the need for guidelines, the need to understand in order to anticipate the changes, and the desire not to be overwhelmed by uncontrollable influences brought by hardware and software systems used only in accordance with the principle of maximum productivity. Education occupies a crucial role because it has the ability to direct the profession toward a thoughtful and selective use of specific applications; teaching must build logical routes in the fluid world of info-graphics, and the only way to do so is to describe its contours through methodological indications: these will consist in understanding, studying and divulging what in its mobility does not change (procedural issues), rather than what is transitory in its fixity (manual questions).
Real-time volume rendering of digital medical images on an iOS device
NASA Astrophysics Data System (ADS)
Noon, Christian; Holub, Joseph; Winer, Eliot
2013-03-01
Performing high quality 3D visualizations on mobile devices, while tantalizingly close in many areas, is still a quite difficult task. This is especially true for 3D volume rendering of digital medical images. Achieving it would give medical personnel a powerful tool to diagnose and treat patients and to train the next generation of physicians. This research focuses on performing real-time volume rendering of digital medical images on iOS devices using custom-developed GPU shaders for orthogonal texture slicing. An interactive volume renderer was designed and developed with several new features, including dynamic modification of render resolutions, an incremental render loop, a shader-based clipping algorithm to support OpenGL ES 2.0, and an internal backface culling algorithm for properly sorting rendered geometry with alpha blending. The application was developed using several application programming interfaces (APIs), such as OpenSceneGraph (OSG) as the primary graphics renderer, coupled with iOS Cocoa Touch for user interaction and DCMTK for DICOM I/O. The developed application rendered volume datasets of over 450 slices at up to 50-60 frames per second, depending on the specific model of the iOS device. All rendering is done locally on the device, so no Internet connection is required.
A submersible digital in-line holographic microscope
NASA Astrophysics Data System (ADS)
Jericho, Manfred; Jericho, Stefan; Kreuzer, Hans Juergen; Garcia, Jeorge; Klages, Peter
Few instruments exist that can image microscopic marine organisms in their natural environment so that their locomotion mechanisms, feeding habits and interactions with surfaces, such as bio-fouling, can be investigated in situ. In conventional optical microscopy under conditions of high magnification, only objects confined to the narrow focal plane can be imaged, and processes that involve translation of the object perpendicular to this plane are not accessible. To overcome this severe limitation of optical microscopy, we developed digital in-line holographic microscopy (DIHM) as a high-resolution tool for the tracking of organisms in three dimensions. We describe here the design and performance of a very simple submersible digital in-line holographic microscope (SDIHM) that can image organisms and their motion with micron resolution and that can be deployed from small vessels. Holograms and reconstructed images of several microscopic marine organisms were successfully obtained down to a depth of 20 m. The maximum depth was limited by the length of data transmission cables available at the time, and operating depths in excess of 100 m are easily possible for the instrument.
Pasion, Editha; Good, Levell; Tizon, Jisebelle; Krieger, Staci; O'Kier, Catherine; Taylor, Nicole; Johnson, Jennifer; Horton, Carrie M; Peterson, Mary
2010-11-01
To determine if the monitor cursor-line feature on bedside monitors is accurate for measuring central venous and pulmonary artery pressures in cardiac surgery patients. Central venous and pulmonary artery pressures were measured via 3 methods (end-expiratory graphic recording, monitor cursor-line display, and monitor digital display) in a convenience sample of postoperative cardiac surgery patients. Pressures were measured twice during both mechanical ventilation and spontaneous breathing. Analysis of variance was used to determine differences between measurement methods and the percentage of monitor pressures that differed by 4 mm Hg or more from the measurement obtained from the graphic recording. Significance level was set at P less than .05. Twenty-five patients were studied during mechanical ventilation (50 measurements) and 21 patients during spontaneous breathing (42 measurements). Measurements obtained via the 3 methods did not differ significantly for either type of pressure (P > .05). Graphically recorded pressures and measurements obtained via the monitor cursor-line or digital display methods differed by 4 mm Hg or more in 4% and 6% of measurements, respectively, during mechanical ventilation and 4% and 11%, respectively, during spontaneous breathing. The monitor cursor-line method for measuring central venous and pulmonary artery pressures may be a reasonable alternative to the end-expiratory graphic recording method in hemodynamically stable, postoperative cardiac surgery patients. Use of the digital display on the bedside monitor may result in larger discrepancies from the graphically recorded pressures than when the cursor-line method is used, particularly in spontaneously breathing patients.
A Low Power Digital Accumulation Technique for Digital-Domain CMOS TDI Image Sensor.
Yu, Changwei; Nie, Kaiming; Xu, Jiangtao; Gao, Jing
2016-09-23
In this paper, an accumulation technique suitable for digital-domain CMOS time delay integration (TDI) image sensors is proposed to reduce power consumption without degrading the rate of imaging. Because the quantization codes vary only slightly among different pixel exposures of the same object, the pixel array is divided into two groups: one for coarse quantization of the high bits only, and the other for fine quantization of the low bits. The complete quantization codes are then composed from both the coarse and the fine quantization results. This equivalent operation comparably reduces the total number of bits that must be quantized. In the 0.18 µm CMOS process, two versions of 16-stage digital-domain CMOS TDI image sensor chains based on a 10-bit successive approximation register (SAR) analog-to-digital converter (ADC), with and without the proposed technique, are designed. The simulation results show that the average power consumption of slices of the two versions is 6.47 × 10⁻⁸ J/line and 7.4 × 10⁻⁸ J/line, respectively, while the linearity of the two versions is 99.74% and 99.99%, respectively.
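The coarse-plus-fine code composition described above can be illustrated with a toy numeric sketch (the 6/4 bit split, the example value, and the function name are assumptions for illustration, not the paper's circuit):

```python
def compose_code(coarse_code, fine_code, low_bits):
    """Merge the high bits resolved by the coarse-quantizing pixel group
    with the low bits resolved by the fine-quantizing group."""
    return (coarse_code << low_bits) | fine_code

full = 321                  # an arbitrary 10-bit reference code
coarse = full >> 4          # high 6 bits, as the coarse group would report
fine = full & 0b1111        # low 4 bits, as the fine group would report
code = compose_code(coarse, fine, 4)
```

Because codes from repeated exposures of the same object differ only in the low bits, each group needs to resolve only part of the 10-bit range, which is what reduces the total quantization work.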
NASA Astrophysics Data System (ADS)
Schroeder, Mubina Khan
In science education, the use of digital technology-based learning can help students struggling with difficult concepts such as the movement of molecules. While digital learning tools hold much promise for science education, the question arises as to whether or not such technology can serve as an adequate surrogate for the teacher-student interactions that theorists like Lev Vygotsky (1978) underscored as being critical to learning. In response to such concerns, designers of digital curricula often utilize scaffolds to help students as they learn from such programs. Using a simulation designed to teach students about the concept of diffusion as an example, I examine the effect of including prompting language in the learning sequence of the simulation. The use of prompting language in digital curriculum appears to be successful because it prompts science students to reflect and metacognise about their learning, lending support to Vygotsky's (1978) ideas of teaching and learning involving outer and inner dialog. However, findings from think aloud data continue to underscore the importance of human linguistic exchange as a preferable learning paradigm.
NASA Astrophysics Data System (ADS)
Maslov, L. A.; Choi, D. R.
2014-12-01
Earthquake epicenters in the Eastern Pacific Tectonic Belt (the tectonic margin between the Pacific and the North and South American continents) are distributed symmetrically about latitude with the following three minima: around the equator, at 35° N latitude, and at 35° S latitude (Figure 1a). In analysing the data, we looked at two characteristics: occurrence dates and epicenter latitudes. We calculated the power spectrum Sd(f) for occurrence dates and found that this spectrum can be approximated by the function Cf^α, where α < 0 (Figure 1b). To interpret the data, we have also shown a graph of Ln(f^α) (Figure 1c). This graph shows that the exponent α is not a constant, but varies with the frequency. In addition, we calculated the power spectrum for epicenter latitudes Sl(f) (Figure 1d) and found that this spectrum can be similarly approximated by the function Cf^β, where β < 0. As with the occurrence dates, we show a graph of Ln(f^β) (Figure 1e), which indicates that β also varies with the frequency. This result is quite different from the well-known Gutenberg-Richter "frequency-magnitude" relation, represented in bilogarithmic coordinates by a straight line. Coefficients α and β vary approximately from -2.5 to -1.5, depending on the "length" of the calculated spectrum subset used to plot the trend line. Based on the fact that the power spectrum has the form Cf^α, -2.5 < α < -1.5, we conclude that a long-time and long-distance correlation exists between earthquakes in the Eastern Pacific Tectonic Belt. In this work, we present an interpretation of the regularities in the spatial and temporal distribution of earthquakes in the Eastern Pacific Tectonic Belt. Earthquake data were taken from http://www.iris.edu/ieb/index.html.
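Estimating an exponent of a spectrum of the form Cf^α is a standard log-log regression. A minimal sketch on synthetic data (not the authors' earthquake catalogue; all names are illustrative):

```python
import numpy as np

def powerlaw_exponent(f, S):
    """Least-squares slope of log S versus log f, i.e. the alpha
    in S(f) = C * f**alpha."""
    alpha, _log_c = np.polyfit(np.log(f), np.log(S), 1)
    return alpha

f = np.linspace(0.1, 10.0, 200)      # synthetic frequencies
S = 2.0 * f ** -2.0                  # synthetic spectrum with alpha = -2
alpha = powerlaw_exponent(f, S)
```

Fitting the slope over different frequency sub-ranges, as the authors do, reveals whether the exponent is constant or varies with frequency.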
Couple Graph Based Label Propagation Method for Hyperspectral Remote Sensing Data Classification
NASA Astrophysics Data System (ADS)
Wang, X. P.; Hu, Y.; Chen, J.
2018-04-01
Graph based semi-supervised classification methods are widely used for hyperspectral image classification. We present a couple graph based label propagation method, which contains both the adjacency graph and the similar graph. We propose to construct the similar graph by using the similar probability, which exploits the label similarity among examples. The adjacency graph was utilized by a common manifold learning method, which effectively improves the classification accuracy of hyperspectral data. The experiments indicate that the couple graph Laplacian, which unites both the adjacency graph and the similar graph, produces superior classification results compared with other manifold learning based graph Laplacians and sparse representation based graph Laplacians in the label propagation framework.
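As a rough illustration of label propagation over a combined affinity matrix, the sketch below applies a Zhou-style closed-form propagation to a toy four-node graph. The weighting, the toy graphs, and the propagation variant are assumptions for illustration; the paper's couple-graph construction differs in detail:

```python
import numpy as np

def propagate_labels(W, Y, alpha=0.9):
    """Closed-form label propagation on an affinity matrix W:
    solve (I - alpha * S) F = Y with S the symmetrically normalized W,
    then assign each node the class with the largest score."""
    d = W.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    S = D_inv_sqrt @ W @ D_inv_sqrt
    F = np.linalg.solve(np.eye(len(W)) - alpha * S, Y)
    return F.argmax(axis=1)

# "couple" affinity: weighted sum of an adjacency graph and a similarity graph
W_adj = np.array([[0, 1, 0, 0], [1, 0, 0, 0],
                  [0, 0, 0, 1], [0, 0, 1, 0]], dtype=float)
W_sim = W_adj.copy()              # kept identical here for brevity
W = 0.5 * W_adj + 0.5 * W_sim
Y = np.array([[1, 0], [0, 0], [0, 1], [0, 0]], dtype=float)  # nodes 0, 2 labeled
labels = propagate_labels(W, Y)
```

The two labeled nodes spread their classes to their neighbours through the normalized affinity, so on this toy graph the predicted labels are [0, 0, 1, 1].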
Multi-Centrality Graph Spectral Decompositions and Their Application to Cyber Intrusion Detection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Pin-Yu; Choudhury, Sutanay; Hero, Alfred
Many modern datasets can be represented as graphs and hence spectral decompositions such as graph principal component analysis (PCA) can be useful. Distinct from previous graph decomposition approaches based on subspace projection of a single topological feature, e.g., the centered graph adjacency matrix (graph Laplacian), we propose spectral decomposition approaches to graph PCA and graph dictionary learning that integrate multiple features, including graph walk statistics, centrality measures and graph distances to reference nodes. In this paper we propose a new PCA method for single graph analysis, called multi-centrality graph PCA (MC-GPCA), and a new dictionary learning method for ensembles of graphs, called multi-centrality graph dictionary learning (MC-GDL), both based on spectral decomposition of multi-centrality matrices. As an application to cyber intrusion detection, MC-GPCA can be an effective indicator of anomalous connectivity patterns and MC-GDL can provide a discriminative basis for attack classification.
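A toy sketch of the multi-centrality idea: stack several per-node graph features into one matrix and take its principal components via SVD. The particular features chosen here (degree, 2-walk counts, distances to reference nodes) and all names are simplifications of the paper's multi-centrality matrices:

```python
import numpy as np
from collections import deque

def bfs_distances(adj, src):
    """Hop distances from src on an adjacency-list graph."""
    dist = [-1] * len(adj)
    dist[src] = 0
    queue = deque([src])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if dist[v] < 0:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

def multi_centrality_pca(A, ref_nodes, k=2):
    """PCA of a node-by-feature matrix stacking several graph features:
    degree, 2-walk counts, and distances to reference nodes."""
    n = A.shape[0]
    adj = [np.flatnonzero(A[i]) for i in range(n)]
    feats = [A.sum(axis=1), (A @ A).sum(axis=1)]
    feats += [np.array(bfs_distances(adj, r)) for r in ref_nodes]
    X = np.column_stack(feats).astype(float)
    X -= X.mean(axis=0)                     # center each feature
    U, s, _ = np.linalg.svd(X, full_matrices=False)
    return U[:, :k] * s[:k]                 # node scores on the top-k PCs

# path graph 0-1-2-3 with node 0 as the reference node
A = np.array([[0, 1, 0, 0], [1, 0, 1, 0],
              [0, 1, 0, 1], [0, 0, 1, 0]])
scores = multi_centrality_pca(A, ref_nodes=[0])
```

Anomalous nodes would then appear as outliers in this low-dimensional score space, which is the intuition behind using MC-GPCA as an intrusion indicator.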
Graphs, matrices, and the GraphBLAS: Seven good reasons
Kepner, Jeremy; Bader, David; Buluç, Aydın; ...
2015-01-01
The analysis of graphs has become increasingly important to a wide range of applications. Graph analysis presents a number of unique challenges in the areas of (1) software complexity, (2) data complexity, (3) security, (4) mathematical complexity, (5) theoretical analysis, (6) serial performance, and (7) parallel performance. Implementing graph algorithms using matrix-based approaches provides a number of promising solutions to these challenges. The GraphBLAS standard (istcbigdata.org/GraphBlas) is being developed to bring the potential of matrix-based graph algorithms to the broadest possible audience. The GraphBLAS mathematically defines a core set of matrix-based graph operations that can be used to implement a wide class of graph algorithms in a wide range of programming environments. This paper provides an introduction to the GraphBLAS and describes how the GraphBLAS can be used to address many of the challenges associated with the analysis of graphs.
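The matrix view of graph algorithms that the GraphBLAS standardizes can be illustrated with breadth-first search expressed as repeated matrix-vector products. This is a plain-NumPy sketch of the idea, not the GraphBLAS API itself:

```python
import numpy as np

def bfs_levels(A, source):
    """BFS written as adjacency-matrix/vector products: each product
    expands the current frontier by one hop (the linear-algebraic view
    of graph traversal that GraphBLAS formalizes with semirings)."""
    n = A.shape[0]
    levels = np.full(n, -1)
    frontier = np.zeros(n, dtype=bool)
    frontier[source] = True
    level = 0
    while frontier.any():
        levels[frontier] = level
        # neighbors of the frontier, masked by "not yet visited"
        frontier = (A.T @ frontier).astype(bool) & (levels == -1)
        level += 1
    return levels

# path graph 0-1-2-3
A = np.array([[0, 1, 0, 0], [1, 0, 1, 0],
              [0, 1, 0, 1], [0, 0, 1, 0]])
levels = bfs_levels(A, 0)
```

Casting traversal as linear algebra lets the same code exploit optimized sparse matrix kernels, which is the performance argument the paper makes for matrix-based graph algorithms.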
Rapid Prototyping of High Performance Signal Processing Applications
NASA Astrophysics Data System (ADS)
Sane, Nimish
Advances in embedded systems for digital signal processing (DSP) are enabling many scientific projects and commercial applications. At the same time, these applications are key to driving advances in many important kinds of computing platforms. In this region of high performance DSP, rapid prototyping is critical for faster time-to-market (e.g., in the wireless communications industry) or time-to-science (e.g., in radio astronomy). DSP system architectures have evolved from being based on application specific integrated circuits (ASICs) to incorporate reconfigurable off-the-shelf field programmable gate arrays (FPGAs), the latest multiprocessors such as graphics processing units (GPUs), or heterogeneous combinations of such devices. We, thus, have a vast design space to explore based on performance trade-offs, and expanded by the multitude of possibilities for target platforms. In order to allow systematic design space exploration, and develop scalable and portable prototypes, model based design tools are increasingly used in design and implementation of embedded systems. These tools allow scalable high-level representations, model based semantics for analysis and optimization, and portable implementations that can be verified at higher levels of abstractions and targeted toward multiple platforms for implementation. The designer can experiment using such tools at an early stage in the design cycle, and employ the latest hardware at later stages. In this thesis, we have focused on dataflow-based approaches for rapid DSP system prototyping. This thesis contributes to various aspects of dataflow-based design flows and tools as follows: 1. We have introduced the concept of topological patterns, which exploits commonly found repetitive patterns in DSP algorithms to allow scalable, concise, and parameterizable representations of large scale dataflow graphs in high-level languages. 
We have shown how an underlying design tool can systematically exploit a high-level application specification consisting of topological patterns in various aspects of the design flow. 2. We have formulated the core functional dataflow (CFDF) model of computation, which can be used to model a wide variety of deterministic dynamic dataflow behaviors. We have also presented key features of the CFDF model and tools based on these features. These tools provide support for heterogeneous dataflow behaviors, an intuitive and common framework for functional specification, support for functional simulation, portability from several existing dataflow models to CFDF, integrated emphasis on minimally-restricted specification of actor functionality, and support for efficient static, quasi-static, and dynamic scheduling techniques. 3. We have developed a generalized scheduling technique for CFDF graphs based on decomposition of a CFDF graph into static graphs that interact at run-time. Furthermore, we have refined this generalized scheduling technique using a new notion of "mode grouping," which better exposes the underlying static behavior. We have also developed a scheduling technique for a class of dynamic applications that generates parameterized looped schedules (PLSs), which can handle dynamic dataflow behavior without major limitations on compile-time predictability. 4. We have demonstrated the use of dataflow-based approaches for design and implementation of radio astronomy DSP systems using an application example of a tunable digital downconverter (TDD) for spectrometers. Design and implementation of this module has been an integral part of this thesis work. This thesis demonstrates a design flow that consists of a high-level software prototype, analysis, and simulation using the dataflow interchange format (DIF) tool, and integration of this design with the existing tool flow for the target implementation on an FPGA platform, called interconnect break-out board (IBOB). 
We have also explored the trade-off between low hardware cost for fixed configurations of digital downconverters and flexibility offered by TDD designs. 5. This thesis has contributed significantly to the development and release of the latest version of a graph package oriented toward models of computation (MoCGraph). Our enhancements to this package include support for tree data structures, and generalized schedule trees (GSTs), which provide a useful data structure for a wide variety of schedule representations. Our extensions to the MoCGraph package provided key support for the CFDF model, and functional simulation capabilities in the DIF package.
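The CFDF idea above — actors whose firings occur in "modes", each mode with a fixed token consumption rate and a dynamically chosen successor mode — can be illustrated with a minimal sketch. All class and function names here are our own, not the DIF tool's actual API:

```python
from collections import deque

class Actor:
    """Toy CFDF-style actor: each mode has a fixed consumption rate and a
    firing function; the next mode is chosen dynamically after each firing."""
    def __init__(self, name, modes, start):
        self.name, self.modes, self.mode = name, modes, start

    def can_fire(self, inputs):
        # Enabled iff every input port holds enough tokens for the current mode.
        need = self.modes[self.mode]["consume"]
        return all(len(inputs[p]) >= n for p, n in need.items())

    def fire(self, inputs, outputs):
        spec = self.modes[self.mode]
        taken = {p: [inputs[p].popleft() for _ in range(n)]
                 for p, n in spec["consume"].items()}
        produced, next_mode = spec["func"](taken)
        for p, toks in produced.items():
            outputs[p].extend(toks)
        self.mode = next_mode

# A downsample-by-2 actor: consumes 2 tokens, emits 1 (their average).
def down2(taken):
    a, b = taken["in"]
    return {"out": [(a + b) / 2]}, "run"

ds = Actor("down2", {"run": {"consume": {"in": 2}, "func": down2}}, "run")
q_in, q_out = deque([1, 3, 5, 7]), deque()
while ds.can_fire({"in": q_in}):
    ds.fire({"in": q_in}, {"out": q_out})
print(list(q_out))  # [2.0, 6.0]
```

A scheduler in this style repeatedly polls `can_fire` across actors; CFDF's value is that the per-mode rates are static, so much of the schedule can still be derived at compile time.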
Adjusting protein graphs based on graph entropy.
Peng, Sheng-Lung; Tsay, Yu-Wei
2014-01-01
Measuring protein structural similarity attempts to establish a relationship of equivalence between polymer structures based on their conformations. In several recent studies, researchers have explored protein-graph remodeling instead of looking for a minimum superimposition of pairwise proteins. When graphs are used to represent structured objects, the problem of measuring object similarity becomes one of computing the similarity between graphs. Graph theory provides an alternative perspective as well as efficiency. Once a protein graph has been created, its structural stability must be verified. Therefore, a criterion is needed to determine whether a protein graph can be used for structural comparison. In this paper, we propose a measurement for protein graph remodeling based on graph entropy. We extend the concept of graph entropy to determine whether a graph is suitable for representing a protein. The experimental results suggest that graph entropy helps validate the conformation captured in protein graph modeling. Furthermore, it indirectly contributes to protein structural comparison when a protein graph is sound.
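The abstract does not give the paper's exact entropy definition; one common notion of graph entropy, the Shannon entropy of the degree distribution, can be sketched as follows (an illustration only, not necessarily the measure used in the paper):

```python
import math
from collections import Counter

def degree_entropy(edges):
    """Shannon entropy (bits) of a graph's degree distribution — one common
    way to define 'graph entropy'; the paper's definition may differ."""
    deg = Counter()
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    total = sum(deg.values())
    probs = [d / total for d in deg.values()]
    return -sum(p * math.log2(p) for p in probs)

# A 4-cycle: every vertex has degree 2, so the distribution is uniform
# over 4 vertices and the entropy is log2(4) = 2 bits.
cycle = [(0, 1), (1, 2), (2, 3), (3, 0)]
print(degree_entropy(cycle))  # 2.0
```

A highly regular graph (uniform degrees) maximizes this entropy for its vertex count, so thresholds on such a measure can flag graphs whose structure is too degenerate to represent a protein fold faithfully.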
NASA Astrophysics Data System (ADS)
Zhang, K.; Sheng, Y. H.; Li, Y. Q.; Han, B.; Liang, Ch.; Sha, W.
2006-10-01
In the field of digital photogrammetry and computer vision, the determination of conjugate points in a stereo image pair, referred to as "image matching," is the critical step in realizing automatic surveying and recognition. Traditional matching methods encounter problems in digital close-range stereo photogrammetry because changes in gray scale or texture are not obvious in close-range stereo images. Their main shortcoming is that the geometric information of matching points is not fully used, which leads to wrong matches in regions with poor texture. To make full use of both geometric and gray-scale information, a new stereo image matching algorithm is proposed in this paper that accounts for the characteristics of digital close-range photogrammetry. Compared with traditional matching methods, the new algorithm makes three improvements. First, a shape factor, fuzzy mathematics, and gray-scale projection are introduced into the design of a composite matching measure. Second, the topological connectivity of matching points in a Delaunay triangulated network, together with the epipolar line, is used to decide the matching order and to narrow the search scope for the conjugate point of each matching point. Last, the theory of constrained parameter adjustment is introduced into least-squares image matching to achieve subpixel-level matching under the epipolar-line constraint. The new algorithm is applied to actual stereo images of a building taken by a digital close-range photogrammetric system. The experimental results show that the algorithm achieves higher matching speed and accuracy than a pyramid image matching algorithm based on gray-scale correlation.
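The epipolar-constrained search described above can be sketched for a rectified stereo pair, where the epipolar line of a point is simply its image row. This is a toy normalized-cross-correlation matcher on synthetic data; the paper's actual measure also includes shape and fuzzy-mathematics terms:

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation of two equal-size patches."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom else 0.0

def match_along_epipolar(left, right, pt, half=2, search=10):
    """Find the conjugate of `pt` (row, col) by sliding a window along the
    same image row — the epipolar line of a rectified pair."""
    r, c = pt
    tpl = left[r - half:r + half + 1, c - half:c + half + 1]
    best, best_c = -2.0, c
    lo = max(half, c - search)
    hi = min(right.shape[1] - half, c + search + 1)
    for cc in range(lo, hi):
        cand = right[r - half:r + half + 1, cc - half:cc + half + 1]
        s = ncc(tpl, cand)
        if s > best:
            best, best_c = s, cc
    return (r, best_c), best

rng = np.random.default_rng(0)
left = rng.random((40, 40))
right = np.roll(left, 3, axis=1)          # simulate a 3-pixel disparity
match, score = match_along_epipolar(left, right, (20, 20))
print(match, round(score, 3))             # (20, 23) 1.0
```

Restricting the search to one row is what makes the epipolar constraint so effective: the 2-D correspondence problem collapses to a short 1-D scan, and a subsequent least-squares refinement can then recover subpixel position.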
Overview of a FPGA-based nuclear instrumentation dedicated to primary activity measurements.
Bobin, C; Bouchard, J; Pierre, S; Thiam, C
2012-09-01
In National Metrology Institutes such as LNE-LNHB, renewal and improvement of the instrumentation is an important task. Nowadays, the current trend is to adopt digital boards, which present numerous advantages over standard electronics. The feasibility of an on-line fulfillment of nuclear-instrumentation functionalities using a commercial FPGA-based (Field-Programmable Gate Array) board has been validated in the case of TDCR primary measurements (the Triple to Double Coincidence Ratio method based on liquid scintillation). The new applications presented in this paper allow either on-line processing of the information or raw-data acquisition for off-line treatment. Developed as a complementary tool for TDCR counting, a time-to-digital converter specifically designed for this technique has been added. In addition, a description is given of a spectrometry channel based on the connection between conventional shaping amplifiers and the analog-to-digital converter (ADC) input available on the same digital board. First results are presented for α- and γ-counting related to, respectively, the defined-solid-angle and well-type NaI(Tl) primary activity techniques. The combination of two different channels (liquid scintillation and γ-spectrometry) implementing live-time anticoincidence processing is also described for the application of the 4πβ-γ coincidence method. The need for optimized coupling between the analog chain and the ADC stage is emphasized. The straight processing of the signals delivered by the preamplifier connected to an HPGe detector is also presented along with a first development of digital filtering. Copyright © 2012 Elsevier Ltd. All rights reserved.
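A minimal sketch of the TDCR statistic itself, assuming per-event hit flags for the three photomultipliers (a real instrument, like the board described here, timestamps pulses and applies a hardware coincidence window):

```python
def tdcr_ratio(events):
    """Triple-to-Double Coincidence Ratio from per-event PMT hit flags.
    Each event is a tuple (A, B, C) of booleans: did that PMT fire?
    Toy sketch — real systems resolve coincidences in hardware."""
    triples = sum(1 for a, b, c in events if a and b and c)
    # 'Doubles' here counts events with at least one coincident pair
    # (logical sum of the pairwise coincidences).
    doubles = sum(1 for a, b, c in events
                  if (a and b) or (b and c) or (a and c))
    return triples / doubles if doubles else 0.0

events = [(1, 1, 1), (1, 1, 0), (0, 1, 1), (1, 1, 1), (1, 0, 0)]
print(tdcr_ratio(events))  # 2 triples / 4 doubles = 0.5
```

The measured ratio is then compared against a statistical model of scintillation light emission to deduce the detection efficiency, which is what makes TDCR a primary activity measurement technique.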
Assessment of gingival symmetry with digital measuring tools and its reproducibility.
Wilson, David; Soileau, Kristi; Esquivel, Jonathan; Cordero, Adriana; Buchman, Wes; Maney, Pooja; Archontia Palaiologou, A
The aim of this study was to investigate the accuracy of digital measuring tools in measuring the position of gingival zeniths and to assess the reproducibility of these measurements between different examiners. A total of 108 subjects were photographed at the Louisiana State University School of Dentistry. The settings, positioning of the digital camera, and subjects' Frankfurt levels were standardized. A photograph was taken of the six anterior maxillary teeth of each subject and their corresponding free gingival margins. Digital caliper measurements were taken intraorally from the zenith to the incisal edge of the right maxillary central incisor. A reference line was drawn across the screen on each image at the level of the zenith of tooth 8. Three calibrated examiners then measured the distance from the reference line to the zeniths of the other five anterior maxillary teeth. There was no statistically significant difference between the examiners for any of the measurements. Central incisor zeniths were at the same level in 84.24% of the subjects, and lateral incisor zeniths were within 0.5 mm of the central incisors' in only 58% of the subjects. Canine zeniths were within 0.5 mm of each other in 43% of the subjects. Only 28% of the subjects presented with the zeniths of tooth 6 to tooth 11 all within 0.5 mm of each other. Lateral incisor zeniths were at or beneath the line drawn from central incisors to cuspids in 90.8% of the subjects. Standardized digital photography, taken with the aid of a stadiometer and used to evaluate esthetic parameters, allowed for reproducible measurements.
DMA (Defense Mapping Agency): The Digital Revolution,
1984-06-21
DEFENSE MAPPING AGENCY: THE DIGITAL REVOLUTION. COVER FIGURES: Top... orthophoto bases. Development of the Continental Control Network and the procurement of an Off-Line Ortho-Photo System has expanded our ability to obtain
Do "Digital Certificates" Hold the Key to Colleges' On-Line Activities?
ERIC Educational Resources Information Center
Olsen, Florence
1999-01-01
Examines the increasing use of "digital certificates" to validate computer user identity in various applications on college and university campuses, including letting students register for courses, monitoring access to Internet2, and monitoring access to databases and electronic journals. The methodology has been developed by the…
Analysis of Return and Forward Links from STARS' Flight Demonstration 1
NASA Technical Reports Server (NTRS)
Gering, James A.
2003-01-01
Space-based Telemetry And Range Safety (STARS) is a Kennedy Space Center (KSC) led proof-of-concept demonstration, which utilizes NASA's space network of Tracking and Data Relay Satellites (TDRS) as a pathway for launch and mission related information streams. Flight Demonstration 1 concluded on July 15, 2003 with the seventh flight of a Low Power Transmitter (LPT), a Command and Data Handler (C&DH), a twelve-channel GPS receiver, and associated power supplies and amplifiers. The equipment flew on NASA's F-15 aircraft at the Dryden Flight Research Center located at Edwards Air Force Base in California. During this NASA-ASEE Faculty Fellowship, the author participated in the collection and analysis of data from the seven flights comprising Flight Demonstration 1. Specifically, the author examined the forward and return links' bit energy E(sub b) (in Watt-seconds) divided by the ambient radio frequency noise N(sub 0) (in Watts / Hertz). E(sub b)/N(sub 0) is commonly thought of as a signal-to-noise parameter that characterizes a particular received radio frequency (RF) link. Outputs from the data analysis include the construction of time lines for all flights, graphs of range safety values for all seven flights, histograms of range safety E(sub b)/N(sub 0) values in five dB increments, calculation of associated averages and standard deviations, graphs of range user E(sub b)/N(sub 0) values for all flights, and graphs of AGC values and E(sub b)/N(sub 0) estimates for flight 1 as recorded onboard, transmitted directly to the launch head, and transmitted through TDRS. The data and graphs are being used to draw conclusions related to a lower than expected signal strength seen in the range safety return link.
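The 5 dB histogram and summary statistics described above can be sketched as follows (toy Eb/N0 samples, not flight data):

```python
import statistics

def five_db_histogram(ebn0_db):
    """Bin non-negative Eb/N0 samples (in dB) into 5 dB bins and report the
    mean and population standard deviation — mirroring the histogram and
    summary statistics described above (toy samples, not flight data)."""
    bins = {}
    for v in ebn0_db:
        lo = 5 * int(v // 5)          # lower edge of the sample's 5 dB bin
        bins[lo] = bins.get(lo, 0) + 1
    return bins, statistics.mean(ebn0_db), statistics.pstdev(ebn0_db)

samples = [12.3, 14.8, 17.1, 22.5, 9.9, 13.0]
hist, avg, sd = five_db_histogram(samples)
print(hist)   # {10: 3, 15: 1, 20: 1, 5: 1}
print(round(avg, 2), round(sd, 2))
```

Binning in dB rather than linear power keeps each bin a constant signal-quality ratio, which is why link analyses conventionally histogram Eb/N0 this way.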
Probabilistic Survivability Versus Time Modeling
NASA Technical Reports Server (NTRS)
Joyner, James J., Sr.
2015-01-01
This technical paper documents Kennedy Space Center's Independent Assessment team's work on three assessments for the Ground Systems Development and Operations (GSDO) Program, completed to assist the Chief Safety and Mission Assurance Officer (CSO) and GSDO management during key programmatic reviews. The assessments provided the GSDO Program with an analysis of how egress time affects the likelihood of astronaut and worker survival during an emergency. For each assessment, the team developed probability distributions for hazard scenarios to address statistical uncertainty, resulting in survivability plots over time. The first assessment developed a mathematical model of probabilistic survivability versus time to reach a safe location using an ideal Emergency Egress System at Launch Complex 39B (LC-39B); the second used the first model to evaluate and compare various egress systems under consideration at LC-39B. The third used a modified LC-39B model to determine if a specific hazard decreased survivability more rapidly than other events during flight hardware processing in Kennedy's Vehicle Assembly Building (VAB). Based on the composite survivability versus time graphs from the first two assessments, there was a soft knee in the Figure of Merit graphs at eight minutes (ten minutes after egress was ordered). Thus, the graphs illustrated to the decision makers that the final emergency egress design selected should be capable of transporting the flight crew from the top of LC-39B to a safe location in eight minutes or less. Results for the third assessment were dominated by hazards that were classified as instantaneous in nature (e.g., stacking mishaps) and therefore had no effect on survivability versus time to egress the VAB. VAB emergency scenarios that degraded over time (e.g., fire) produced survivability versus time graphs that were in line with aerospace industry norms.
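A probabilistic survivability-versus-time curve of the kind described can be sketched with a small Monte Carlo model. The triangular time-to-lethality distribution below is purely illustrative, not taken from the assessments:

```python
import random

def survivability_curve(egress_minutes, trials=10_000, seed=1):
    """Monte Carlo sketch: probability the crew reaches safety before a
    hazard becomes lethal, as a function of total egress time.
    Time-to-lethality is drawn from an illustrative triangular
    distribution (min 2 min, mode 10 min, max 30 min)."""
    rng = random.Random(seed)
    curve = []
    for t in egress_minutes:
        survived = sum(rng.triangular(2, 30, 10) > t for _ in range(trials))
        curve.append((t, survived / trials))
    return curve

for t, p in survivability_curve([2, 5, 8, 12, 20]):
    print(f"{t:>2} min egress -> survival {p:.2f}")
```

Plotting such a curve makes the "soft knee" visible: survival stays near 1.0 for short egress times, then falls steeply past the knee, which is the kind of evidence the assessments used to justify the eight-minute requirement.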
Pedrini, Giancarlo; Alexeenko, Igor; Osten, Wolfgang; Schnars, Ulf
2006-02-10
A method based on pulsed digital holographic interferometry for the measurement of dynamic deformations of a surface by using a moving system is presented. The measuring system may move with a speed of several meters per minute and can measure deformation of the surface with an accuracy of better than 50 nm. The deformation is obtained by comparison of the wavefronts recorded at different times with different laser pulses produced by a Nd:YAG laser. The effect due to the movement of the measuring system is compensated for by digital processing of the different holograms. The system is well suited for on-line surveillance of a dynamic process such as laser welding and friction stir welding. Experimental results are presented, and the advantages of the method are discussed.
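For normal illumination and observation, the out-of-plane deformation follows from the interferometric phase difference as d = λ·Δφ/(4π). A minimal sketch with synthetic phase maps (the real phases would come from numerically reconstructed holograms of the two laser pulses):

```python
import numpy as np

# Out-of-plane deformation from the phase difference between two digital
# holograms, d = lambda * dphi / (4*pi), assuming normal illumination and
# observation. The phase maps below are synthetic stand-ins.
wavelength = 532e-9                      # frequency-doubled Nd:YAG, metres

phase1 = np.zeros((4, 4))                # phase from the first pulse
phase2 = np.full((4, 4), np.pi / 2)      # uniform quarter-cycle shift
dphi = np.angle(np.exp(1j * (phase2 - phase1)))  # wrap to (-pi, pi]
deformation = wavelength * dphi / (4 * np.pi)

print(deformation[0, 0])                 # 532e-9 * (pi/2) / (4*pi) = 66.5 nm
```

The wrapping step matters in practice: reconstructed phase differences are only known modulo 2π, so real deformation maps require phase unwrapping before conversion to displacement.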
Don't Break the Memory Line: Social Memory, Digital Storytelling and Local Communities
NASA Astrophysics Data System (ADS)
Ferri, Paolo; Mangiatordi, Andrea; Pozzali, Andrea
In this paper we present and analyze some of the main results obtained by the empirical research carried out within the scope of the Socrates Grundtvig Project "Memory Line", which aimed at developing instruments and methodologies to help overcome the intergenerational divide. The project aimed at training groups of elderly and young citizens, resident in the project partner countries, to collect records (stories, songs, poems, experiences, etc.) and to save them in digital form, mainly by using the methodology of digital storytelling. Focus groups and interviews with people involved in the intergenerational ateliers were carried out in order to gather first-hand evidence directly from the people who were actively involved in the project, and to enable ongoing monitoring and self-evaluation of the project itself.
NASA Astrophysics Data System (ADS)
Scarano, Grace Hotchkiss
2000-10-01
Current reform documents in science and mathematics call for teachers to include inquiry and data analysis in their teaching. This interpretive quasi-ethnographic case study examined two middle school science teachers as they planned and implemented inquiry and graphing in their science curricula. The focus question for this research was: What are middle school science teachers' experiences as they include graphing and inquiry-based student research projects in their curricula? How is teaching these areas different from usual teaching? The research examined two teachers teaching their favorite unit, parts of other familiar units, graphing, and student inquiry. Four main types of data were gathered: (1) observations of teachers' instruction, (2) interviews and meetings with the teachers, (3) curricular artifacts, and (4) questionnaires and other written material. The study took place over a seven-month period. The findings revealed that these two teachers had different ideologies of schooling and that these ideologies shaped the teachers' planning and implementation of their usual content as well as graphing and inquiry. One teacher's ideology was technical, and the other's was constructive. Six themes emerged as salient features of their teaching: (1) the role of developing a vision for curricular implementation, (2) curricular decisions: internal and external authority, (3) views of knowing and learning, (4) perceptions of the nature of science, (5) attending to a personal concern in teaching, and (6) reflection. The textures of these themes varied between the two teachers, and formed a coherent yet dynamic system within which each teacher maneuvered. This study found that both teachers found it challenging to include inquiry in their curricula, even though both had attended workshops designed to help teachers use student inquiry. 
The constructive teacher's implementation was more in line with the notions that are central to constructivism and current non-traditional views of the nature of science than was that of the technical teacher. The teacher with a technical ideology relied on the scientific method to organize student projects and implemented inquiry as a technique to increase student-centered work. It is proposed that teachers with technical ideologies need to undergo an ideological shift toward constructive ideologies of schooling in order to teach graphing and inquiry in ways that are aligned with current reform efforts.
A digital protection system incorporating knowledge based learning
NASA Astrophysics Data System (ADS)
Watson, Karan; Russell, B. Don; McCall, Kurt
A digital system architecture used to diagnose the operating state and health of electric distribution lines and to generate actions for line protection is presented. The architecture is described functionally and, to a limited extent, at the hardware level. This architecture incorporates multiple analysis and fault-detection techniques utilizing a variety of parameters. In addition, a knowledge-based decision maker, a long-term memory retention and recall scheme, and a learning environment are described. Preliminary laboratory implementations of the system elements have been completed. Enhanced protection for electric distribution feeders is provided by this system. Advantages of the system are enumerated.
An alternative database approach for management of SNOMED CT and improved patient data queries.
Campbell, W Scott; Pedersen, Jay; McClay, James C; Rao, Praveen; Bastola, Dhundy; Campbell, James R
2015-10-01
SNOMED CT is the international lingua franca of terminologies for human health. Grounded in Description Logics (DL), the terminology enables data queries that incorporate inferences between data elements as well as those relationships that are explicitly stated. However, the ontologic and polyhierarchical nature of the SNOMED CT concept model makes it difficult to implement in its entirety within electronic health record systems that largely employ object-oriented or relational database architectures. The result is a reduction of data richness, limitation of query capability, and increased systems overhead. The hypothesis of this research was that a graph database (graph DB) architecture using SNOMED CT as the basis for the data model, and subsequently modeling patient data upon the semantic core of SNOMED CT, could exploit the full value of the terminology to enrich and support advanced querying of patient data sets. The hypothesis was tested by instantiating a graph DB with the fully classified SNOMED CT concept model. The graph DB instance was tested for integrity by calculating the transitive closure table for the SNOMED CT hierarchy and comparing the results with transitive closure tables created using current, validated methods. The graph DB was then populated with 461,171 anonymized patient record fragments and over 2.1 million associated SNOMED CT clinical findings. Queries, including concept negation and disjunction, were then run against the graph database and an enterprise Oracle relational database (RDBMS) holding the same patient data sets. The graph DB was then populated with laboratory data encoded using LOINC and medication data encoded with RxNorm, and complex queries were performed using LOINC, RxNorm, and SNOMED CT to identify uniquely described patient populations. A graph database instance was successfully created for two international releases of SNOMED CT and two US SNOMED CT editions.
Transitive closure tables and descriptive statistics generated using the graph database were identical to those produced using validated methods. Patient queries produced patient counts identical to the Oracle RDBMS in comparable times. Database queries involving the defining attributes of SNOMED CT concepts were possible with the graph DB. The same queries could not be performed directly against the Oracle RDBMS representation of the patient data and required the creation and use of external terminology services. Further, queries of undefined depth were successful in identifying unknown relationships between patient cohorts. The results of this study supported the hypothesis that a patient database built upon and around the semantic model of SNOMED CT was possible. The model supported queries that leveraged all aspects of the SNOMED CT logical model to produce clinically relevant query results. Logical disjunction and negation queries were possible using the data model, as were queries that extended beyond the structural IS_A hierarchy of SNOMED CT to employ the defining attribute-values of SNOMED CT concepts as search parameters. As medical terminologies such as SNOMED CT continue to expand, they will become more complex, and model consistency will be more difficult to assure. Simultaneously, consumers of data will increasingly demand improvements to query functionality that accommodate additional granularity of clinical concepts without sacrificing speed. This new line of research provides an alternative approach to instantiating and querying patient data represented using advanced computable clinical terminologies. Copyright © 2015 Elsevier Inc. All rights reserved.
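The transitive closure computation used above to validate the database load can be sketched over a toy IS_A hierarchy (illustrative concept names, not real SNOMED CT identifiers):

```python
def transitive_closure(is_a):
    """Compute the full ancestor set for every concept from its direct
    IS_A links — the 'transitive closure table' used to validate the
    graph DB load. Assumes the hierarchy is a DAG, as SNOMED CT's is."""
    closure = {}

    def ancestors(c):
        if c not in closure:
            closure[c] = set()
            for parent in is_a.get(c, ()):
                closure[c] |= {parent} | ancestors(parent)
        return closure[c]

    for c in is_a:
        ancestors(c)
    return closure

# Toy polyhierarchy fragment: bacterial pneumonia IS_A pneumonia IS_A lung disease.
is_a = {"bacterial_pneumonia": ["pneumonia"],
        "pneumonia": ["lung_disease"],
        "lung_disease": []}
print(sorted(transitive_closure(is_a)["bacterial_pneumonia"]))
# ['lung_disease', 'pneumonia']
```

Precomputing ancestor sets like this is what lets a query for "all patients with a lung disease" match records coded at any depth of the hierarchy, which is the subsumption behavior the relational baseline needed external terminology services to replicate.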
Transient digitizer with displacement current samplers
McEwan, T.E.
1996-05-21
A low-component-count, high-speed sample gate, and a digitizer architecture using the sample gates, are based on use of a signal transmission line, a strobe transmission line, and a plurality of sample gates connected to the signal transmission line at a plurality of positions. The sample gates include a strobe pickoff structure near the strobe transmission line which generates a charge displacement current, in response to propagation of the strobe signal on the strobe transmission line, sufficient to trigger the sample gate. Each sample gate comprises a two-diode sampling bridge and is connected to a meandered signal transmission line at one end and to a charge-holding capacitor at the other. The common cathodes are reverse biased. A voltage step is propagated down the strobe transmission line. As the step propagates past a capacitive pickoff, displacement current i = C(dV/dt) flows into the cathodes, driving the bridge into conduction and thereby charging the charge-holding capacitor to a value related to the signal. A charge amplifier converts the charge on the charge-holding capacitor to an output voltage. The sampler is mounted on a printed circuit board, and the signal transmission line and strobe transmission line comprise coplanar microstrips formed on a surface of the substrate. Also, the strobe pickoff structure may comprise a planar pad adjacent the strobe transmission line on the printed circuit board. 16 figs.
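The displacement-current relation i = C(dV/dt) that triggers each sampling bridge can be checked with representative numbers (the component values below are illustrative, not taken from the patent):

```python
# Displacement current through the capacitive strobe pickoff, i = C * dV/dt,
# for the fast voltage step that drives each sampling bridge into conduction.
# All values are illustrative assumptions, not the patent's.
C = 0.1e-12          # pickoff capacitance: 0.1 pF
dV = 5.0             # strobe step amplitude: 5 V
dT = 100e-12         # step rise time: 100 ps

i = C * dV / dT      # amperes
print(f"{i * 1e3:.1f} mA")  # 5.0 mA
```

Even a sub-picofarad pad yields milliamps when the step edge is fast enough, which is why a simple planar pad next to the strobe line suffices to switch the diode bridge.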