Wireless Wide Area Networks for School Districts.
ERIC Educational Resources Information Center
Nair, Prakash
This paper considers a basic question that many school districts face in attempting to develop affordable, expandable district-wide computer networks that are resistant to obsolescence: Should these wide area networks (WANs) employ wireless technology, stick to venerable hard-wired solutions, or combine both? This publication explores the…
The Use of Microcomputers in Distance Teaching Systems. ZIFF Papiere 70.
ERIC Educational Resources Information Center
Rumble, Greville
Microcomputers have revolutionized distance education in virtually every area. Used alone, personal computers provide students with a wide range of utilities, including word processing, graphics packages, and spreadsheets. When linked to a mainframe computer or connected to other personal computers in local area networks, microcomputers can…
NASA Astrophysics Data System (ADS)
Cheok, Adrian David
This chapter details the Human Pacman system to illuminate entertainment computing that seeks to embed the natural physical world seamlessly within a fantasy virtual playground by capitalizing on the infrastructure provided by mobile computing, wireless LAN, and ubiquitous computing. With Human Pacman, we have a physical role-playing computer fantasy together with real human-social and mobile gaming that emphasizes collaboration and competition between players in a wide outdoor physical area that allows natural wide-area human movement. Pacmen and Ghosts are now real human players in the real world experiencing a mixed computer-graphics fantasy-reality provided by the wearable computers they carry. Virtual cookies and actual tangible physical objects are incorporated into the game play to provide novel experiences of seamless transitions between the real and virtual worlds. This is an example of a new form of gaming anchored in physicality, mobility, social interaction, and ubiquitous computing.
ERIC Educational Resources Information Center
Peelle, Howard A.
Computers have undoubtedly entered the educational arena, mainly in the areas of computer-assisted instruction (CAI) and artificial intelligence, but whether educators should embrace computers and exactly how they should use them are matters of great debate. The use of computers in support of educational administration is widely accepted.…
Wide-area-distributed storage system for a multimedia database
NASA Astrophysics Data System (ADS)
Ueno, Masahiro; Kinoshita, Shigechika; Kuriki, Makato; Murata, Setsuko; Iwatsu, Shigetaro
1998-12-01
We have developed a wide-area-distributed storage system for multimedia databases, which minimizes the possibility of simultaneous failure of multiple disks in the event of a major disaster. It features a RAID system whose member disks are spatially distributed over a wide area. Each node has a device that includes the controller of the RAID and the controller of the member disks controlled by other nodes. The devices in the node are connected to a computer using fiber optic cables and communicate using fiber-channel technology. Any computer at a node can utilize multiple devices connected by optical fibers as a single 'virtual disk.' The advantage of this system structure is that devices and fiber optic cables are shared by the computers. In this report, we first describe the proposed system and the prototype used for testing. We then discuss its performance, i.e., how read and write throughputs are affected by data-access delay, the RAID level, and queuing.
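To make the 'virtual disk' idea concrete, the following is a minimal sketch of how logical blocks on such a virtual disk could be mapped onto RAID-style member disks hosted at remote nodes. The node names, block size, and RAID-0-style striping policy are assumptions for illustration, not details of the system described in the abstract.

```python
# Minimal sketch of a "virtual disk" mapping logical blocks onto member disks
# at spatially distributed nodes. Names, block size and the striping policy are
# illustrative assumptions, not the system described above.

BLOCK_SIZE = 4096                         # bytes per logical block (assumed)
NODES = ["tokyo", "osaka", "sendai"]      # hypothetical member-disk locations


def locate_block(logical_block: int, n_members: int = len(NODES)):
    """Map a logical block to (node, local block index) with RAID-0 striping."""
    node = NODES[logical_block % n_members]
    local_index = logical_block // n_members
    return node, local_index


def read_virtual(offset: int, length: int):
    """Translate a byte-range read on the virtual disk into per-node requests."""
    first = offset // BLOCK_SIZE
    last = (offset + length - 1) // BLOCK_SIZE
    requests = {}
    for blk in range(first, last + 1):
        node, local = locate_block(blk)
        requests.setdefault(node, []).append(local)
    return requests  # each node's list would be fetched over its fibre-channel link


if __name__ == "__main__":
    print(read_virtual(offset=10_000, length=20_000))
```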
Studies in Mathematics, Volume 22. Studies in Computer Science.
ERIC Educational Resources Information Center
Pollack, Seymour V., Ed.
The nine articles in this collection were selected because they represent concerns central to computer science, emphasize topics of particular interest to mathematicians, and underscore the wide range of areas deeply and continually affected by computer science. The contents consist of: "Introduction" (S. V. Pollack), "The…
Device 2F112 (F-14A WST (Weapon System Trainers)) Instructor Console Review.
1983-12-01
The trainer comprises: a. Cockpit Section-Trainee Station, b. Instructor Operator Station (IOS), c. Computer System, d. Wide-Angle Visual System (WAVS), e. Auxiliary Systems. The relationship of the three stations can be seen in Figure 1 (Device 2F112 general layout: trainee area, hydraulic power room, electric power/air compressors, and computer/peripheral area). The stations will be reviewed in greater detail in the following sections.
Defense strategies for cloud computing multi-site server infrastructures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rao, Nageswara S.; Ma, Chris Y. T.; He, Fei
We consider cloud computing server infrastructures for big data applications, which consist of multiple server sites connected over a wide-area network. The sites house a number of servers, network elements, and local-area connections, and the wide-area network plays a critical, asymmetric role of providing vital connectivity between them. We model this infrastructure as a system of systems, wherein the sites and wide-area network are represented by their cyber and physical components. These components can be disabled by cyber and physical attacks, and can also be protected against them using component reinforcements. The effects of attacks propagate within the systems, and also beyond them via the wide-area network. We characterize these effects using correlations at two levels: (a) an aggregate failure correlation function that specifies the infrastructure failure probability given the failure of an individual site or network, and (b) first-order differential conditions on system survival probabilities that characterize the component-level correlations within individual systems. We formulate a game between an attacker and a provider using utility functions composed of survival probability and cost terms. At Nash Equilibrium, we derive expressions for the expected capacity of the infrastructure, given by the number of operational servers connected to the network, for sum-form, product-form, and composite utility functions.
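As a rough illustration of the attacker-provider game described above, the toy model below searches a small strategy space for pure-strategy Nash equilibria, with the provider utility built from an expected-capacity term minus a reinforcement cost. All payoff numbers, strategy labels, and cost terms are invented; the paper's sum-form, product-form, and composite utility functions are not reproduced.

```python
# A toy attacker-provider game in the spirit of the analysis above. Numbers and
# labels are invented for illustration only.
import itertools

ATTACKER = ["attack_site", "attack_wan"]          # hypothetical attacker strategies
PROVIDER = ["reinforce_site", "reinforce_wan"]    # hypothetical provider strategies

EXPECTED_CAPACITY = {  # capacity[(attack, defend)]: assumed operational-server counts
    ("attack_site", "reinforce_site"): 80,
    ("attack_site", "reinforce_wan"): 50,
    ("attack_wan", "reinforce_site"): 30,
    ("attack_wan", "reinforce_wan"): 70,
}
ATTACK_COST = {"attack_site": 10, "attack_wan": 15}
DEFEND_COST = {"reinforce_site": 12, "reinforce_wan": 8}


def payoffs(a, d):
    """Return (attacker, provider) utilities for a strategy pair."""
    cap = EXPECTED_CAPACITY[(a, d)]
    return -cap - ATTACK_COST[a], cap - DEFEND_COST[d]


def pure_nash_equilibria():
    """Strategy pairs where neither player gains by deviating unilaterally."""
    eqs = []
    for a, d in itertools.product(ATTACKER, PROVIDER):
        ua, ud = payoffs(a, d)
        if all(payoffs(a2, d)[0] <= ua for a2 in ATTACKER) and \
           all(payoffs(a, d2)[1] <= ud for d2 in PROVIDER):
            eqs.append((a, d, ua, ud))
    return eqs


if __name__ == "__main__":
    for eq in pure_nash_equilibria():
        print(eq)
```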
Towards a real-time wide area motion imagery system
NASA Astrophysics Data System (ADS)
Young, R. I.; Foulkes, S. B.
2015-10-01
It is becoming increasingly important in both the defence and security domains to conduct persistent wide area surveillance (PWAS) of large populations of targets. Wide Area Motion Imagery (WAMI) is a key technique for achieving this wide area surveillance. The recent development of multi-million-pixel sensors provides wide fields of view with sufficient resolution for detection and tracking of objects of interest across these extended areas of interest. WAMI sensors simultaneously provide high spatial and temporal resolutions, giving extreme pixel counts over large geographical areas. The high temporal resolution is required to enable effective tracking of targets. The provision of wide area coverage with high frame rates generates data-deluge issues; these are especially profound if the sensor is mounted on an airborne platform, with finite data-link bandwidth and processing power that is constrained by size, weight and power (SWAP) limitations. These issues manifest themselves either as bottlenecks in the transmission of the imagery off-board or as latency in the time taken to analyse the data due to limited computational processing power.
Breaking Free with Wireless Networks.
ERIC Educational Resources Information Center
Fleischman, John
2002-01-01
Discusses wireless local area networks (LANs) which typically consist of laptop computers that connect to fixed access points via infrared or radio signals. Topics include wide area networks; personal area networks; problems, including limitations of available bandwidth, interference, and security concerns; use in education; interoperability;…
Sum and mean. Standard programs for activation analysis.
Lindstrom, R M
1994-01-01
Two computer programs in use for over a decade in the Nuclear Methods Group at NIST illustrate the utility of standard software: programs widely available and widely used, in which (ideally) well-tested public algorithms produce results that are well understood, and thereby capable of comparison, within the community of users. Sum interactively computes the position, net area, and uncertainty of the area of spectral peaks, and can give better results than automatic peak search programs when peaks are very small, very large, or unusually shaped. Mean combines unequal measurements of a single quantity, tests for consistency, and obtains the weighted mean and six measures of its uncertainty.
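The combination of unequal measurements performed by the Mean program is, in outline, the textbook inverse-variance weighted mean with a chi-square consistency test. A minimal sketch of that generic procedure (not the NIST code) follows.

```python
# Generic weighted-mean combination of unequal measurements: inverse-variance
# weights, internal and external uncertainties, and a chi-square consistency
# test. This illustrates the standard procedure, not the NIST "Mean" program.
import math


def weighted_mean(values, sigmas):
    weights = [1.0 / s**2 for s in sigmas]
    wsum = sum(weights)
    mean = sum(w * x for w, x in zip(weights, values)) / wsum

    # Internal uncertainty: propagated from the quoted sigmas alone.
    internal = math.sqrt(1.0 / wsum)

    # Reduced chi-square tests the mutual consistency of the inputs.
    chi2 = sum(w * (x - mean) ** 2 for w, x in zip(weights, values))
    dof = len(values) - 1
    reduced_chi2 = chi2 / dof if dof else float("nan")

    # External uncertainty: scales the internal one by the observed scatter.
    external = internal * math.sqrt(reduced_chi2) if dof else float("nan")
    return mean, internal, external, reduced_chi2


if __name__ == "__main__":
    print(weighted_mean([10.2, 10.5, 9.9], [0.2, 0.3, 0.25]))
```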
Wide-area, real-time monitoring and visualization system
Budhraja, Vikram S.; Dyer, James D.; Martinez Morales, Carlos A.
2013-03-19
A real-time performance monitoring system for monitoring an electric power grid. The electric power grid has a plurality of grid portions, each grid portion corresponding to one of a plurality of control areas. The real-time performance monitoring system includes a monitor computer for monitoring at least one of reliability metrics, generation metrics, transmission metrics, suppliers metrics, grid infrastructure security metrics, and markets metrics for the electric power grid. The data for metrics being monitored by the monitor computer are stored in a data base, and a visualization of the metrics is displayed on at least one display computer having a monitor. The at least one display computer in one said control area enables an operator to monitor the grid portion corresponding to a different said control area.
Wide-area, real-time monitoring and visualization system
Budhraja, Vikram S [Los Angeles, CA; Dyer, James D [La Mirada, CA; Martinez Morales, Carlos A [Upland, CA
2011-11-15
A real-time performance monitoring system for monitoring an electric power grid. The electric power grid has a plurality of grid portions, each grid portion corresponding to one of a plurality of control areas. The real-time performance monitoring system includes a monitor computer for monitoring at least one of reliability metrics, generation metrics, transmission metrics, suppliers metrics, grid infrastructure security metrics, and markets metrics for the electric power grid. The data for metrics being monitored by the monitor computer are stored in a data base, and a visualization of the metrics is displayed on at least one display computer having a monitor. The at least one display computer in one said control area enables an operator to monitor the grid portion corresponding to a different said control area.
Additional Security Considerations for Grid Management
NASA Technical Reports Server (NTRS)
Eidson, Thomas M.
2003-01-01
The use of Grid computing environments is growing in popularity. A Grid computing environment is primarily a wide area network that encompasses multiple local area networks, where some of the local area networks are managed by different organizations. A Grid computing environment also includes common interfaces for distributed computing software so that the heterogeneous set of machines that make up the Grid can be used more easily. The other key feature of a Grid is that the distributed computing software includes appropriate security technology. The focus of most Grid software is on the security involved with application execution, file transfers, and other remote computing procedures. However, there are other important security issues related to the management of a Grid and the users who use that Grid. This note discusses these additional security issues and makes several suggestions as to how they can be managed.
An Ethernet Java Applet for a Course for Non-Majors.
ERIC Educational Resources Information Center
Holliday, Mark A.
1997-01-01
Details the topics of a new course that introduces computing and communication technology to students not majoring in computer science. Discusses the process of developing a Java applet (a program that can be invoked through a World Wide Web browser) that illustrates the protocol used by ethernet local area networks to determine which computer can…
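The Ethernet behaviour such an applet typically animates, carrier-sensing stations colliding and backing off with truncated binary exponential backoff, can be sketched in a few lines. The slot-based simulation below is a simplified illustration and is not the applet described in the entry.

```python
# Rough slot-based simulation of Ethernet-style contention: stations transmit
# when their backoff expires, collide when more than one transmits in a slot,
# and back off for a random number of slots (truncated binary exponential
# backoff). Saturated stations and simplified timing are assumed.
import random


def simulate_csma_cd(n_stations=4, n_slots=60, seed=1):
    random.seed(seed)
    backoff = [0] * n_stations        # slots each station must still wait
    attempts = [0] * n_stations       # consecutive collisions per station
    delivered = 0

    for _ in range(n_slots):
        ready = [i for i in range(n_stations) if backoff[i] == 0]
        if len(ready) == 1:           # exactly one transmitter: success
            delivered += 1
            attempts[ready[0]] = 0
        elif len(ready) > 1:          # collision: everyone backs off
            for i in ready:
                attempts[i] = min(attempts[i] + 1, 10)
                backoff[i] = random.randint(0, 2 ** attempts[i] - 1)
        backoff = [max(0, b - 1) for b in backoff]

    return delivered


if __name__ == "__main__":
    print("frames delivered:", simulate_csma_cd())
```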
Liu, Kui; Wei, Sixiao; Chen, Zhijiang; Jia, Bin; Chen, Genshe; Ling, Haibin; Sheaff, Carolyn; Blasch, Erik
2017-01-01
This paper presents the first attempt at combining Cloud with Graphic Processing Units (GPUs) in a complementary manner within the framework of a real-time high performance computation architecture for the application of detecting and tracking multiple moving targets based on Wide Area Motion Imagery (WAMI). More specifically, the GPU and Cloud Moving Target Tracking (GC-MTT) system applied a front-end web based server to perform the interaction with Hadoop and highly parallelized computation functions based on the Compute Unified Device Architecture (CUDA©). The introduced multiple moving target detection and tracking method can be extended to other applications such as pedestrian tracking, group tracking, and Patterns of Life (PoL) analysis. The cloud and GPUs based computing provides an efficient real-time target recognition and tracking approach as compared to methods when the work flow is applied using only central processing units (CPUs). The simultaneous tracking and recognition results demonstrate that a GC-MTT based approach provides drastically improved tracking with low frame rates over realistic conditions. PMID:28208684
Liu, Kui; Wei, Sixiao; Chen, Zhijiang; Jia, Bin; Chen, Genshe; Ling, Haibin; Sheaff, Carolyn; Blasch, Erik
2017-02-12
This paper presents the first attempt at combining Cloud with Graphic Processing Units (GPUs) in a complementary manner within the framework of a real-time high performance computation architecture for the application of detecting and tracking multiple moving targets based on Wide Area Motion Imagery (WAMI). More specifically, the GPU and Cloud Moving Target Tracking (GC-MTT) system applied a front-end web-based server to perform the interaction with Hadoop and highly parallelized computation functions based on the Compute Unified Device Architecture (CUDA©). The introduced multiple moving target detection and tracking method can be extended to other applications such as pedestrian tracking, group tracking, and Patterns of Life (PoL) analysis. The cloud- and GPU-based computing provides an efficient real-time target recognition and tracking approach compared to workflows that use only central processing units (CPUs). The simultaneous tracking and recognition results demonstrate that a GC-MTT based approach provides drastically improved tracking at low frame rates under realistic conditions.
2015-06-01
...system accuracy. The AnRAD system was also generalized for the additional application of network intrusion detection. A self-structuring technique... ("...to Host-based Intrusion Detection Systems using Contiguous and Discontiguous System Call Patterns," IEEE Transactions on Computers, 63(4), pp. 807...) ...square-kilometer areas. The anomaly recognition and detection (AnRAD) system was built as a cogent confabulation network. It represented road...
ERIC Educational Resources Information Center
Pierre, J. W.; Tuffner, F. K.; Anderson, J. R.; Whitman, D. L.; Ula, A. H. M. S.; Kubichek, R. F.; Wright, C. H. G.; Barrett, S. F.; Cupal, J. J.; Hamann, J. C.
2009-01-01
This paper describes a one-credit laboratory course for freshmen majoring in electrical and computer engineering (ECE). The course is motivational in nature and exposes the students to a wide range of areas of electrical and computer engineering. The authors believe it is important to give freshmen a broad perspective of what ECE is all about, and…
Entanglement in a Quantum Annealing Processor
2016-09-07
...that QA is a viable technology for large-scale quantum computing. DOI: 10.1103/PhysRevX.4.021041. Subject Areas: Quantum Physics, Quantum Information, Superconductivity. I. INTRODUCTION: The past decade has been exciting for the field of quantum computation. A wide range of physical implementations... measurements used in studying prototype universal quantum computers [9-14]. These constraints make it challenging to experimentally determine whether a scalable...
Overview of the NASA Dryden Flight Research Facility aeronautical flight projects
NASA Technical Reports Server (NTRS)
Meyer, Robert R., Jr.
1992-01-01
Several principal aerodynamics flight projects of the NASA Dryden Flight Research Facility are discussed. Key vehicle technology areas from a wide range of flight vehicles are highlighted. These areas include flight research data obtained for ground facility and computation correlation, applied research in areas not well suited to ground facilities (wind tunnels), and concept demonstration.
2015-12-04
...simulations executing on mobile computing platforms, an area not widely studied to date in the distributed simulation research community. ... These initial studies focused on two conservative synchronization algorithms widely used in the distributed simulation field.
Campus-Wide Computing: Early Results Using Legion at the University of Virginia
2006-01-01
LMFBR system-wide transient analysis: the state of the art and US validation needs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Khatib-Rahbar, M.; Guppy, J.G.; Cerbone, R.J.
1982-01-01
This paper summarizes the computational capabilities in the area of liquid metal fast breeder reactor (LMFBR) system-wide transient analysis in the United States, identifies various numerical and physical approximations, the degree of empiricism, range of applicability, model verification and experimental needs for a wide class of protected transients, in particular, natural circulation shutdown heat removal for both loop- and pool-type plants.
Sound propagation through a variable area duct - Experiment and theory
NASA Technical Reports Server (NTRS)
Silcox, R. J.; Lester, H. C.
1981-01-01
A comparison of experiment and theory has been made for the propagation of sound through a variable area axisymmetric duct with zero mean flow. Measurement of the acoustic pressure field on both sides of the constricted test section was resolved on a modal basis for various spinning mode sources. Transmitted and reflected modal amplitudes and phase angles were compared with finite element computations. Good agreement between experiment and computation was obtained over a wide range of frequencies and modal transmission variations. The study suggests that modal transmission through a variable area duct is governed by the throat modal cut-off ratio.
NASA Technical Reports Server (NTRS)
Nelson, Robert L.; Welsh, Clement J.
1960-01-01
The experimental wave drags of bodies and wing-body combinations over a wide range of Mach numbers are compared with the computed drags utilizing a 24-term Fourier series application of the supersonic area rule and with the results of equivalent-body tests. The results indicate that the equivalent-body technique provides a good method for predicting the wave drag of certain wing-body combinations at and below a Mach number of 1. At Mach numbers greater than 1, the equivalent-body wave drags can be misleading. The wave drags computed using the supersonic area rule are shown to be in best agreement with the experimental results for configurations employing the thinnest wings. The wave drags for the bodies of revolution presented in this report are predicted to a greater degree of accuracy by using the frontal projections of oblique areas than by using normal areas. A rapid method of computing wing area distributions and area-distribution slopes is given in an appendix.
Applications of computer-aided text analysis in natural resources.
David N. Bengston
2000-01-01
Ten contributed papers describe the use of a variety of approaches to computer-aided text analysis and their application to a wide range of research questions related to natural resources and the environment. Taken together, these papers paint a picture of a growing and vital area of research on the human dimensions of natural resource management.
Evolution of Embedded Processing for Wide Area Surveillance
2014-01-01
...future vision. Subject terms: embedded processing; high-performance computing; general-purpose graphical processing units (GPGPUs). ...intelligence, surveillance, and reconnaissance (ISR) mission capabilities. The capabilities these advancements are achieving include the ability to provide persistent all-... ...fighters to support and positively affect their mission. Significant improvements in high-performance computing (HPC) technology make it possible to...
Chacón, Enrique; Tarazona, Pedro; Bresme, Fernando
2015-07-21
We present a new computational approach to quantify the area per lipid and the area compressibility modulus of biological membranes. Our method relies on the analysis of the membrane fluctuations using our recently introduced coupled undulatory (CU) mode [Tarazona et al., J. Chem. Phys. 139, 094902 (2013)], which provides excellent estimates of the bending modulus of model membranes. Unlike the projected area, widely used in computer simulations of membranes, the CU area is thermodynamically consistent. This new area definition makes it possible to accurately estimate the area of the undulating bilayer, and the area per lipid, by excluding any contributions related to the phospholipid protrusions. We find that the area per phospholipid and the area compressibility modulus feature a negligible dependence on system size, making it possible to compute them using truly small bilayers involving a few hundred lipids. The area compressibility modulus obtained from the analysis of the CU area fluctuations is fully consistent with the Hooke's law route. Unlike existing methods, our approach relies on a single simulation, and no a priori knowledge of the bending modulus is required. We illustrate our method by analyzing 1-palmitoyl-2-oleoyl-sn-glycero-3-phosphocholine bilayers using the coarse-grained MARTINI force-field. The area per lipid and area compressibility modulus obtained with our method and the MARTINI force-field are consistent with previous studies of these bilayers.
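For orientation, the fluctuation route to the area compressibility modulus has the generic form K_A = kB*T*<A>/var(A) applied to an area time series; the paper applies it to the coupled-undulatory (CU) area rather than the projected area. The sketch below implements that generic estimator under the assumption that an area series (in m^2) has already been extracted from a simulation.

```python
# Generic fluctuation estimator of the area compressibility modulus,
# K_A = kB*T*<A>/var(A), applied to a time series of membrane areas. The input
# series is whatever area definition the analysis produced (CU or projected).
import numpy as np

KB = 1.380649e-23  # Boltzmann constant, J/K


def area_compressibility(area_series_m2, temperature_k=300.0):
    a = np.asarray(area_series_m2, dtype=float)
    mean_a = a.mean()
    var_a = a.var(ddof=1)                        # sample variance of the area
    ka = KB * temperature_k * mean_a / var_a     # in N/m (J/m^2)
    return mean_a, ka


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    fake_areas = rng.normal(4.0e-17, 2.0e-19, size=5000)   # synthetic data, m^2
    print(area_compressibility(fake_areas))
```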
NASA Astrophysics Data System (ADS)
Chacón, Enrique; Tarazona, Pedro; Bresme, Fernando
2015-07-01
We present a new computational approach to quantify the area per lipid and the area compressibility modulus of biological membranes. Our method relies on the analysis of the membrane fluctuations using our recently introduced coupled undulatory (CU) mode [Tarazona et al., J. Chem. Phys. 139, 094902 (2013)], which provides excellent estimates of the bending modulus of model membranes. Unlike the projected area, widely used in computer simulations of membranes, the CU area is thermodynamically consistent. This new area definition makes it possible to accurately estimate the area of the undulating bilayer, and the area per lipid, by excluding any contributions related to the phospholipid protrusions. We find that the area per phospholipid and the area compressibility modulus feature a negligible dependence on system size, making it possible to compute them using truly small bilayers involving a few hundred lipids. The area compressibility modulus obtained from the analysis of the CU area fluctuations is fully consistent with the Hooke's law route. Unlike existing methods, our approach relies on a single simulation, and no a priori knowledge of the bending modulus is required. We illustrate our method by analyzing 1-palmitoyl-2-oleoyl-sn-glycero-3-phosphocholine bilayers using the coarse-grained MARTINI force-field. The area per lipid and area compressibility modulus obtained with our method and the MARTINI force-field are consistent with previous studies of these bilayers.
TeleMed: Wide-area, secure, collaborative object computing with Java and CORBA for healthcare
DOE Office of Scientific and Technical Information (OSTI.GOV)
Forslund, D.W.; George, J.E.; Gavrilov, E.M.
1998-12-31
Distributed computing is becoming commonplace in a variety of industries, with healthcare being a particularly important one for society. The authors describe the development and deployment of TeleMed in a few healthcare domains. TeleMed is a 100% Java distributed application built on CORBA and OMG standards, enabling collaboration on the treatment of chronically ill patients in a secure manner over the Internet. These standards enable other systems to work interoperably with TeleMed and provide transparent access to high performance distributed computing to the healthcare domain. The goal of wide scale integration of electronic medical records is a grand-challenge scale problem of global proportions with far-reaching social benefits.
Use of computers in dysmorphology.
Diliberti, J H
1988-01-01
As a consequence of the increasing power and decreasing cost of digital computers, dysmorphologists have begun to explore a wide variety of computerised applications in clinical genetics. Of considerable interest are developments in the areas of syndrome databases, expert systems, literature searches, image processing, and pattern recognition. Each of these areas is reviewed from the perspective of the underlying computer principles, existing applications, and the potential for future developments. Particular emphasis is placed on the analysis of the tasks performed by the dysmorphologist and the design of appropriate tools to facilitate these tasks. In this context the computer and associated software are considered paradigmatically as tools for the dysmorphologist and should be designed accordingly. Continuing improvements in the ability of computers to manipulate vast amounts of data rapidly make the development of increasingly powerful tools for the dysmorphologist highly probable. PMID:3050092
ERIC Educational Resources Information Center
Association for Educational Data Systems, Washington, DC.
The 98 papers in this collection examine a wide variety of topics related to the latest technological developments as they apply to the educational process. Papers are grouped to reflect common, broad areas of interest, representing the instructional, administrative, and computer science divisions of the Association for Educational Data Systems…
ERIC Educational Resources Information Center
Tuttle, Francis
Twenty-three instructors participated in an 8-week summer institute to develop their technical competency to teach the second year of a 2-year Technical Education Computer Science Program. Instructional material covered the following areas: (1) compiler languages and systems design, (2) cost studies, (3) business organization, (4) advanced…
Increasing Access and Usability of Remote Sensing Data: The NASA Protected Area Archive
NASA Technical Reports Server (NTRS)
Geller, Gary N.
2004-01-01
Although remote sensing data are now widely available, much of it at low or no cost, many managers of protected conservation areas do not have the expertise or tools to view or analyze it. Thus access to it by the protected area management community is effectively blocked. The Protected Area Archive will increase access to remote sensing data by creating collections of satellite images of protected areas and packaging them with simple-to-use visualization and analytical tools. The user can easily locate the area and image of interest on a map, then display, roam, and zoom the image. A set of simple tools will be provided so the user can explore the data and employ it to assist in management and monitoring of their area. The 'Phase 1' version requires only a Windows-based computer and basic computer skills, and may be of particular help to protected area managers in developing countries.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chacón, Enrique, E-mail: echacon@icmm.csic.es; Tarazona, Pedro, E-mail: pedro.tarazona@uam.es; Bresme, Fernando, E-mail: f.bresme@imperial.ac.uk
We present a new computational approach to quantify the area per lipid and the area compressibility modulus of biological membranes. Our method relies on the analysis of the membrane fluctuations using our recently introduced coupled undulatory (CU) mode [Tarazona et al., J. Chem. Phys. 139, 094902 (2013)], which provides excellent estimates of the bending modulus of model membranes. Unlike the projected area, widely used in computer simulations of membranes, the CU area is thermodynamically consistent. This new area definition makes it possible to accurately estimate the area of the undulating bilayer, and the area per lipid, by excluding any contributions related to the phospholipid protrusions. We find that the area per phospholipid and the area compressibility modulus feature a negligible dependence on system size, making it possible to compute them using truly small bilayers involving a few hundred lipids. The area compressibility modulus obtained from the analysis of the CU area fluctuations is fully consistent with the Hooke's law route. Unlike existing methods, our approach relies on a single simulation, and no a priori knowledge of the bending modulus is required. We illustrate our method by analyzing 1-palmitoyl-2-oleoyl-sn-glycero-3-phosphocholine bilayers using the coarse-grained MARTINI force-field. The area per lipid and area compressibility modulus obtained with our method and the MARTINI force-field are consistent with previous studies of these bilayers.
Computational predictions of the new Gallium nitride nanoporous structures
NASA Astrophysics Data System (ADS)
Lien, Le Thi Hong; Tuoc, Vu Ngoc; Duong, Do Thi; Thu Huyen, Nguyen
2018-05-01
Nanoporous structure prediction is an emerging area of research because such structures offer advantages for a wide range of materials science and technology applications in opto-electronics, environment, sensors, shape-selective catalysis, and bio-catalysis, to name just a few. We propose a computationally and technically feasible approach for predicting Gallium nitride nanoporous structures with hollows at the nano scale. The designed porous structures are studied with computations using the density functional tight binding (DFTB) and conventional density functional theory methods, revealing a variety of promising mechanical and electronic properties that can potentially find realistic applications in the future. Their stability is discussed by means of the free energy computed within the lattice-dynamics approach. Our calculations also indicate that all the reported hollow structures are wide band gap semiconductors, in the same fashion as their parent bulk stable phase. The electronic band structures of these nanoporous structures are finally examined in detail.
Chiral phosphoric acid catalysis: from numbers to insights.
Maji, Rajat; Mallojjala, Sharath Chandra; Wheeler, Steven E
2018-02-19
Chiral phosphoric acids (CPAs) have emerged as powerful organocatalysts for asymmetric reactions, and applications of computational quantum chemistry have revealed important insights into the activity and selectivity of these catalysts. In this tutorial review, we provide an overview of computational tools at the disposal of computational organic chemists and demonstrate their application to a wide array of CPA catalysed reactions. Predictive models of the stereochemical outcome of these reactions are discussed along with specific examples of representative reactions and an outlook on remaining challenges in this area.
ESTCP Pilot Project Wide Area Assessment for Munitions Response
2008-07-01
Data A broadband normalized difference vegetation index ( NDVI ) was computed from the high- resolution spectral data to provide a detection of canopy...chlorophyll content. The NDVI strongly correlates with the green yucca, cactus, juniper, and other SAR-responsive vegetation species on the site...Vegetation Index. NDVI is broadband normalized difference vegetation index computed from high resolution spectral data using (RED-NIR) / (RED +NIR) to
Broca's area: a supramodal hierarchical processor?
Tettamanti, Marco; Weniger, Dorothea
2006-05-01
Despite the presence of shared characteristics across the different domains modulating Broca's area activity (e.g., structural analogies, as between language and music, or representational homologies, as between action execution and action observation), the question of what exactly the common denominator of such diverse brain functions is, with respect to the function of Broca's area, remains largely a debated issue. Here, we suggest that an important computational role of Broca's area may be to process hierarchical structures in a wide range of functional domains.
Can Music and Animation Improve the Flow and Attainment in Online Learning?
ERIC Educational Resources Information Center
Grice, Sue; Hughes, Janet
2009-01-01
Despite the wide use of music in various areas of society to influence listeners in different ways, one area often neglected is the use of music within online learning environments. This paper describes a study of the effects of music and animation upon learners in a computer mediated environment. A test was developed in which each learner was…
Slonecker, E.T.; Tilley, J.S.
2004-01-01
The percentage of impervious surface area in a watershed has been widely recognized as a key indicator of terrestrial and aquatic ecosystem condition. Although the use of the impervious indicator is widespread, there is currently no consistent or mutually accepted method of computing impervious area, and the approaches of various commonly used techniques vary widely. Further, we do not have reliable information on the components of impervious surfaces, which would be critical in any future planning attempts to remediate problems associated with impervious surface coverage. In cooperation with the USGS Geographic Analysis and Monitoring Program (GAM) and The National Map, and the EPA Landscape Ecology Program, this collaborative research project utilized very high resolution imagery and GIS techniques to map and quantify the individual components of total impervious area in six urban/suburban watersheds in different parts of the United States. These data served as ground reference, or "truth," for the evaluation of four techniques used to compute impervious area. The results show some important aspects of the component make-up of impervious cover and the variability of methods commonly used to compile this critical emerging indicator of ecosystem condition. © 2004 by V. H. Winston and Sons, Inc. All rights reserved.
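One simple way to tally total and component impervious area from a classified high-resolution raster is sketched below. The class codes and cell size are assumptions for illustration; the project's own classification scheme and the four evaluated techniques are not reproduced.

```python
# Tally total impervious percentage and its components from a classified
# raster. Class codes and cell area are illustrative assumptions.
import numpy as np

IMPERVIOUS_CLASSES = {1: "rooftop", 2: "road", 3: "parking", 4: "sidewalk"}  # assumed codes


def impervious_summary(classified: np.ndarray, cell_area_m2: float = 1.0):
    total_cells = classified.size
    summary = {}
    impervious_cells = 0
    for code, label in IMPERVIOUS_CLASSES.items():
        n = int((classified == code).sum())
        impervious_cells += n
        summary[label] = n * cell_area_m2           # area of each component
    summary["percent_impervious"] = 100.0 * impervious_cells / total_cells
    return summary


if __name__ == "__main__":
    demo = np.random.default_rng(0).integers(0, 6, size=(100, 100))
    print(impervious_summary(demo, cell_area_m2=0.25))
```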
Ogawa, Takaya; Iyoki, Kenta; Fukushima, Tomohiro; Kajikawa, Yuya
2017-12-14
The field of porous materials is expanding rapidly, and researchers need to read tremendous numbers of papers to obtain a "bird's eye" view of a given research area. However, it is difficult for researchers to obtain an objective database based on statistical data without any relation to subjective knowledge tied to individual research interests. Here, citation network analysis was applied for a comparative analysis of the research areas for zeolites and metal-organic frameworks as examples of porous materials. The statistical and objective data contributed to the analysis of: (1) the computational screening of research areas; (2) classification of research stages within a certain domain; (3) "well-cited" research areas; and (4) research area preferences of specific countries. Moreover, we proposed a methodology to assist researchers in gaining potential research ideas by reviewing related research areas, based on the bibliometric detection of ideas that are unfocused in one area but focused in the other.
Ogawa, Takaya; Fukushima, Tomohiro; Kajikawa, Yuya
2017-01-01
The field of porous materials is expanding rapidly, and researchers need to read tremendous numbers of papers to obtain a "bird's eye" view of a given research area. However, it is difficult for researchers to obtain an objective database based on statistical data without any relation to subjective knowledge tied to individual research interests. Here, citation network analysis was applied for a comparative analysis of the research areas for zeolites and metal-organic frameworks as examples of porous materials. The statistical and objective data contributed to the analysis of: (1) the computational screening of research areas; (2) classification of research stages within a certain domain; (3) "well-cited" research areas; and (4) research area preferences of specific countries. Moreover, we proposed a methodology to assist researchers in gaining potential research ideas by reviewing related research areas, based on the bibliometric detection of ideas that are unfocused in one area but focused in the other. PMID:29240708
ERIC Educational Resources Information Center
1983
Delegates to this training computer conference agreed that the scope of economic change is both accelerating and profound and, therefore, will require a wide variety of approaches to human resource development. Training is only a small part of this development. To meet future needs, the conferees discussed and made recommendations in four areas:…
Robust Real-Time Wide-Area Differential GPS Navigation
NASA Technical Reports Server (NTRS)
Yunck, Thomas P. (Inventor); Bertiger, William I. (Inventor); Lichten, Stephen M. (Inventor); Mannucci, Anthony J. (Inventor); Muellerschoen, Ronald J. (Inventor); Wu, Sien-Chong (Inventor)
1998-01-01
The present invention provides a method and a device for providing superior differential GPS positioning data. The system includes a group of GPS receiving ground stations covering a wide area of the Earth's surface. Unlike other differential GPS systems, wherein the known position of each ground station is used to geometrically compute an ephemeris for each GPS satellite, the present system utilizes real-time computation of satellite orbits based on GPS data received from fixed ground stations through a Kalman-type filter/smoother whose output adjusts a real-time orbital model. The orbital model produces and outputs orbital corrections, allowing satellite ephemerides to be known with considerably greater accuracy than from the GPS system broadcasts. The modeled orbits are propagated ahead in time and differenced with actual pseudorange data to compute clock offsets at rapid intervals to compensate for SA clock dither. The orbital and clock calculations are based on dual-frequency GPS data, which allow computation of estimated signal delay at each ionospheric point. These delay data are used in real-time to construct and update an ionospheric shell map of total electron content, which is output as part of the orbital correction data, thereby allowing single-frequency users to estimate ionospheric delay with an accuracy approaching that of dual-frequency users.
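A heavily reduced sketch of the clock-offset step described above, differencing measured pseudoranges against geometric ranges predicted from a modelled orbit and known station positions, is given below. Receiver clocks, atmospheric delays, and the Kalman-type filter/smoother are all omitted, and the numbers are synthetic.

```python
# Estimate a satellite clock offset by averaging (measured - geometric) range
# residuals over fixed ground stations. A deliberately simplified illustration.
import numpy as np

C = 299_792_458.0  # speed of light, m/s


def clock_offset(measured_pseudoranges, sat_position, station_positions):
    """Average (measured - geometric) range over stations, in seconds."""
    sat = np.asarray(sat_position, dtype=float)
    residuals = []
    for rho, station in zip(measured_pseudoranges, station_positions):
        geometric = np.linalg.norm(sat - np.asarray(station, dtype=float))
        residuals.append(rho - geometric)
    return float(np.mean(residuals)) / C


if __name__ == "__main__":
    sat = [15e6, 10e6, 20e6]
    stations = [[6.37e6, 0, 0], [0, 6.37e6, 0]]
    true_ranges = [np.linalg.norm(np.array(sat) - np.array(s)) for s in stations]
    measured = [r + C * 2.5e-7 for r in true_ranges]   # inject a 250 ns offset
    print(clock_offset(measured, sat, stations))       # ~2.5e-07 s
```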
Synthetic Analog and Digital Circuits for Cellular Computation and Memory
Purcell, Oliver; Lu, Timothy K.
2014-01-01
Biological computation is a major area of focus in synthetic biology because it has the potential to enable a wide range of applications. Synthetic biologists have applied engineering concepts to biological systems in order to construct progressively more complex gene circuits capable of processing information in living cells. Here, we review the current state of computational genetic circuits and describe artificial gene circuits that perform digital and analog computation. We then discuss recent progress in designing gene circuits that exhibit memory, and how memory and computation have been integrated to yield more complex systems that can both process and record information. Finally, we suggest new directions for engineering biological circuits capable of computation. PMID:24794536
A computer program for analyzing channel geometry
Regan, R.S.; Schaffranek, R.W.
1985-01-01
The Channel Geometry Analysis Program (CGAP) provides the capability to process, analyze, and format cross-sectional data for input to flow/transport simulation models or other computational programs. CGAP allows for a variety of cross-sectional data input formats through use of variable format specification. The program accepts data from various computer media and provides for modification of machine-stored parameter values. CGAP has been devised to provide a rapid and efficient means of computing and analyzing the physical properties of an open-channel reach defined by a sequence of cross sections. CGAP's 16 options provide a wide range of methods by which to analyze and depict a channel reach and its individual cross-sectional properties. The primary function of the program is to compute the area, width, wetted perimeter, and hydraulic radius of cross sections at successive increments of water surface elevation (stage) from data that consist of coordinate pairs of cross-channel distances and land surface or channel bottom elevations. Longitudinal rates-of-change of cross-sectional properties are also computed, as are the mean properties of a channel reach. Output products include tabular lists of cross-sectional area, channel width, wetted perimeter, hydraulic radius, average depth, and cross-sectional symmetry computed as functions of stage; plots of cross sections; plots of cross-sectional area and (or) channel width as functions of stage; tabular lists of cross-sectional area and channel width computed as functions of stage for subdivisions of a cross section; plots of cross sections in isometric projection; and plots of cross-sectional area at a fixed stage as a function of longitudinal distance along an open-channel reach. A Command Procedure Language program and Job Control Language procedure exist to facilitate program execution on the U.S. Geological Survey Prime and Amdahl computer systems, respectively. (Lantz-PTT)
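The core cross-sectional computation that CGAP performs, area, top width, wetted perimeter, and hydraulic radius at a given stage from (distance, elevation) pairs, can be sketched as follows. This is a generic re-implementation for illustration, not the CGAP source; subdivision, reach averaging, and plotting are omitted.

```python
# Flow area, top width, wetted perimeter and hydraulic radius of a channel
# cross section at a given stage, from (cross-channel distance, bed elevation)
# pairs. Waterline crossings are interpolated linearly.
import math


def section_properties(x, z, stage):
    area = width = perimeter = 0.0
    for (x1, z1), (x2, z2) in zip(zip(x, z), zip(x[1:], z[1:])):
        d1, d2 = stage - z1, stage - z2          # depths at the two endpoints
        if d1 <= 0 and d2 <= 0:
            continue                             # segment entirely above water
        if d1 > 0 and d2 > 0:                    # fully submerged segment
            dx = x2 - x1
            area += 0.5 * (d1 + d2) * dx
            width += dx
            perimeter += math.hypot(dx, z2 - z1)
        else:                                    # waterline crosses this segment
            frac = d1 / (d1 - d2)                # where depth goes to zero
            xw = x1 + frac * (x2 - x1)
            if d1 > 0:
                dx = xw - x1
                area += 0.5 * d1 * dx
                width += dx
                perimeter += math.hypot(dx, d1)
            else:
                dx = x2 - xw
                area += 0.5 * d2 * dx
                width += dx
                perimeter += math.hypot(dx, d2)
    radius = area / perimeter if perimeter else 0.0
    return area, width, perimeter, radius


if __name__ == "__main__":
    x = [0, 2, 4, 6, 8]          # cross-channel distance
    z = [10, 7, 6, 7.5, 10]      # bed elevation
    print(section_properties(x, z, stage=9.0))
```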
Data Handling and Communication
NASA Astrophysics Data System (ADS)
Hemmer, Frédéric; Innocenti, Pier Giorgio
The following sections are included: * Introduction * Computing Clusters and Data Storage: The New Factory and Warehouse * Local Area Networks: Organizing Interconnection * High-Speed Worldwide Networking: Accelerating Protocols * Detector Simulation: Events Before the Event * Data Analysis and Programming Environment: Distilling Information * World Wide Web: Global Networking * References
Synthetic analog and digital circuits for cellular computation and memory.
Purcell, Oliver; Lu, Timothy K
2014-10-01
Biological computation is a major area of focus in synthetic biology because it has the potential to enable a wide range of applications. Synthetic biologists have applied engineering concepts to biological systems in order to construct progressively more complex gene circuits capable of processing information in living cells. Here, we review the current state of computational genetic circuits and describe artificial gene circuits that perform digital and analog computation. We then discuss recent progress in designing gene networks that exhibit memory, and how memory and computation have been integrated to yield more complex systems that can both process and record information. Finally, we suggest new directions for engineering biological circuits capable of computation. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.
Computational predictions of zinc oxide hollow structures
NASA Astrophysics Data System (ADS)
Tuoc, Vu Ngoc; Huan, Tran Doan; Thao, Nguyen Thi
2018-03-01
Nanoporous materials are emerging as potential candidates for a wide range of technological applications in environment, electronics, and optoelectronics, to name just a few. Within this active research area, experimental works are predominant, while theoretical/computational prediction and study of these materials face some intrinsic challenges, one of which is how to predict porous structures. We propose a computationally and technically feasible approach for predicting zinc oxide structures with hollows at the nano scale. The designed zinc oxide hollow structures are studied with computations using the density functional tight binding and conventional density functional theory methods, revealing a variety of promising mechanical and electronic properties, which can potentially find future realistic applications.
A review and a framework of handheld computer adoption in healthcare.
Lu, Yen-Chiao; Xiao, Yan; Sears, Andrew; Jacko, Julie A
2005-06-01
Wide adoption of mobile computing technology can potentially improve information access, enhance workflow, and promote evidence-based practice to make informed and effective decisions at the point of care. Handheld computers or personal digital assistants (PDAs) offer portable and unobtrusive access to clinical data and relevant information at the point of care. This article reviews the literature on issues related to adoption of PDAs in health care and barriers to PDA adoption. Studies showed that PDAs were used widely in health care providers' practice, and the level of use is expected to rise rapidly. Most care providers found PDAs to be functional and useful in areas of documentation, medical reference, and access to patient data. Major barriers to adoption were identified as usability, security concerns, and lack of technical and organizational support. PDAs offer health care practitioners advantages to enhance their clinical practice. However, better designed PDA hardware and software applications, more institutional support, seamless integration of PDA technology with hospital information systems, and satisfactory security measures are necessary to increase acceptance and wide use of PDAs in healthcare.
NASA Astrophysics Data System (ADS)
Liu, Jiping; Kang, Xiaochen; Dong, Chun; Xu, Shenghua
2017-12-01
Surface area estimation is a widely used tool for resource evaluation in the physical world. When processing large-scale spatial data, the input/output (I/O) can easily become the bottleneck in parallelizing the algorithm due to limited physical memory resources and the very slow disk transfer rate. In this paper, we propose a stream tiling approach to surface area estimation that first decomposes a spatial data set into tiles with topological expansions. With these tiles, the one-to-one mapping relationship between the input and the computing process is broken. We then realize a streaming framework for scheduling the I/O processes and computing units. Each computing unit encapsulates the same copy of the estimation algorithm, and multiple asynchronous computing units can work individually in parallel. Finally, experiments demonstrate that our stream tiling estimation efficiently alleviates the heavy pressure from I/O-bound work, and the measured speedups after optimization greatly outperform the directly parallel versions on shared-memory systems with multi-core processors.
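A minimal producer/consumer sketch of the stream-tiling idea follows: an I/O thread streams tiles into a bounded queue while a pool of workers runs the same per-tile estimator, so reading and computing overlap. Tile generation and the estimator itself are stand-ins, not the paper's algorithm.

```python
# Producer/consumer streaming of tiles: one I/O producer feeds a bounded queue,
# several workers each run the same per-tile estimator in parallel.
import queue
import threading

TILE_QUEUE: "queue.Queue" = queue.Queue(maxsize=8)   # bounds memory pressure
SENTINEL = None


def io_producer(n_tiles=32):
    for tile_id in range(n_tiles):
        tile = list(range(tile_id, tile_id + 100))   # pretend this was read from disk
        TILE_QUEUE.put(tile)
    TILE_QUEUE.put(SENTINEL)


def estimate_tile_area(tile):
    return float(sum(tile)) * 1e-3                   # placeholder estimator


def worker(results, lock):
    while True:
        tile = TILE_QUEUE.get()
        if tile is SENTINEL:
            TILE_QUEUE.put(SENTINEL)                 # let the other workers stop too
            break
        partial = estimate_tile_area(tile)
        with lock:
            results.append(partial)


if __name__ == "__main__":
    results, lock = [], threading.Lock()
    threads = [threading.Thread(target=worker, args=(results, lock)) for _ in range(4)]
    for t in threads:
        t.start()
    io_producer()
    for t in threads:
        t.join()
    print("total estimated area:", sum(results))
```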
Quaranta, Alessandro; DʼIsidoro, Orlando; Bambini, Fabrizio; Putignano, Angelo
2016-02-01
To compare the available potential bone-implant contact (PBIC) area of standard and short dental implants by micro-computed tomography (μCT) assessment. Three short implants with different diameters (4.5 × 6 mm, 4.1 × 7 mm, and 4.1 × 6 mm) and 2 standard implants (3.5 × 10 mm and 3.3 × 9 mm) with diverse design and surface features were scanned with μCT. Cross-sectional images were obtained. Image data were manually processed to find the plane that corresponds to the most coronal contact point between the crestal bone and implant. The available PBIC was calculated for each sample. Later on, the cross-sectional slices were processed by a 3-dimensional (3D) software, and 3D images of each sample were used for descriptive analysis and to display the microtopography and macrotopography. The wide-diameter short implant (4.5 × 6 mm) showed the highest PBIC value (210.89 mm²), followed by the standard implants (178.07 mm² and 185.37 mm²) and the other short implants (130.70 mm² and 110.70 mm²). Wide-diameter short implants show a surface area comparable with standard implants. Micro-CT analysis is a promising technique to evaluate surface area in dental implants with different macrodesign, microdesign, and surface features.
Distance Learning and the Future of Kamehameha Schools Bishop Estate.
ERIC Educational Resources Information Center
Meyer, Henry E.
1995-01-01
This article details some of the ways that the Kamehameha Schools Bishop Estate (Hawaii) is dealing with the challenge of education in the computer age, including distance learning, Internet linkups, the Hawaii Educational Wide Area Network, and campus closed-circuit and cable television. (SM)
COMPUTER PROGRAM DOCUMENTATION FOR THE ENHANCED STREAM WATER QUALITY MODEL QUAL2E
Presented in the manual are recent modifications and improvements to the widely used stream water quality model QUAL-II. Called QUAL2E, the enhanced model incorporates improvements in eight areas: (1) algal, nitrogen, phosphorus, and dissolved oxygen interactions; (2) algal growt...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ameme, Dan Selorm Kwami; Guttromson, Ross
This report characterizes communications network latency under various network topologies and qualities of service (QoS). The characterizations are probabilistic in nature, allowing deeper analysis of stability for Internet Protocol (IP) based feedback control systems used in grid applications. The work involves the use of Raspberry Pi computers as a proxy for a controlled resource, and an ns-3 network simulator on a Linux server to create an experimental platform (testbed) that can be used to model wide-area grid control network communications in the smart grid. Modbus protocol is used for information transport, and Routing Information Protocol is used for dynamic route selection within the simulated network.
Chinellato, Eris; Del Pobil, Angel P
2009-06-01
The topic of vision-based grasping is being widely studied in humans and in other primates using various techniques and with different goals. The fundamental related findings are reviewed in this paper, with the aim of providing researchers from different fields, including intelligent robotics and neural computation, a comprehensive but accessible view on the subject. A detailed description of the principal sensorimotor processes and the brain areas involved is provided following a functional perspective, in order to make this survey especially useful for computational modeling and bio-inspired robotic applications.
DepositScan, a Scanning Program to Measure Spray Deposition Distributions
USDA-ARS?s Scientific Manuscript database
DepositScan, a scanning program, was developed to quickly measure spray deposit distributions on water-sensitive papers or Kromekote cards, which are widely used for determining pesticide spray deposition quality on target areas. The program is installed on a portable computer and works with a ...
Texas Agricultural Science Teachers' Attitudes toward Information Technology
ERIC Educational Resources Information Center
Anderson, Ryan; Williams, Robert
2012-01-01
The researchers sought to determine Agricultural Science teachers' attitudes toward five innovations (Computer-Aided Design, Record Books, E-Mail, Career Development Event Registration, and World Wide Web) of information technology. The population for this study consisted of all 333 secondary Agricultural Science teachers from Texas FFA Areas V and…
Rush Health Systems and Meridian Community College: People Serving People
ERIC Educational Resources Information Center
Willis, Jean H.
2007-01-01
Meridian Community College and Rush Health Systems are partners in delivering training focused on Rush's mission statement of hospital-wide commitment to "excellence in service management." Rush and MCC have delivered customized classes in the following areas: medical billing, leadership management, computer training, admissions clerk,…
A review of computer-aided oral and maxillofacial surgery: planning, simulation and navigation.
Chen, Xiaojun; Xu, Lu; Sun, Yi; Politis, Constantinus
2016-11-01
Currently, oral and maxillofacial surgery (OMFS) still poses a significant challenge for surgeons due to the anatomic complexity and limited field of view of the oral cavity. With the great development of computer technologies, computer-aided surgery has been widely used for minimizing the risks and improving the precision of surgery. Areas covered: The major goal of this paper is to provide a comprehensive reference source on current and future developments in computer-aided OMFS, including surgical planning, simulation and navigation, for relevant researchers. Expert commentary: Compared with traditional OMFS, computer-aided OMFS overcomes the disadvantage that treatment of the anatomically complex maxillofacial region depends almost exclusively on the experience of the surgeon.
Computational and mathematical methods in brain atlasing.
Nowinski, Wieslaw L
2017-12-01
Brain atlases have a wide range of use from education to research to clinical applications. Mathematical methods as well as computational methods and tools play a major role in the process of brain atlas building and developing atlas-based applications. Computational methods and tools cover three areas: dedicated editors for brain model creation, brain navigators supporting multiple platforms, and atlas-assisted specific applications. Mathematical methods in atlas building and developing atlas-aided applications deal with problems in image segmentation, geometric body modelling, physical modelling, atlas-to-scan registration, visualisation, interaction and virtual reality. Here I overview computational and mathematical methods in atlas building and developing atlas-assisted applications, and share my contribution to and experience in this field.
Modeling the pharyngeal pressure during adult nasal high flow therapy.
Kumar, Haribalan; Spence, Callum J T; Tawhai, Merryn H
2015-12-01
Subjects receiving nasal high flow (NHF) via wide-bore nasal cannula may experience different levels of positive pressure depending on the individual response to NHF. In this study, airflow in the nasal airway during NHF-assisted breathing is simulated and nasopharyngeal airway pressure numerically computed, to determine whether the relationship between NHF and pressure can be described by a simple equation. Two geometric models are used for analysis. In the first, 3D airway geometry is reconstructed from computed tomography images of an adult nasal airway. For the second, a simplified geometric model is derived that has the same cross-sectional area as the complex model, but is more readily amenable to analysis. Peak airway pressure is correlated as a function of nasal valve area, nostril area and cannula flow rate, for NHF rates of 20, 40 and 60 L/min. Results show that airway pressure is related by a power law to NHF rate, valve area, and nostril area. Copyright © 2015 Elsevier B.V. All rights reserved.
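The reported power-law relationship can be explored with a multiplicative form such as P = c·Q^a·Av^b·An^d, fitted by linear least squares in log space, as sketched below. That multiplicative form and the sample data are assumptions consistent with the abstract's statement; the paper's fitted coefficients are not reproduced.

```python
# Fit a multiplicative power law P = c * Q^a * Av^b * An^d by least squares in
# log space. Data below are synthetic; exponents/coefficients are not the paper's.
import numpy as np


def fit_power_law(pressure, flow, valve_area, nostril_area):
    y = np.log(pressure)
    X = np.column_stack([
        np.ones_like(y),
        np.log(flow),
        np.log(valve_area),
        np.log(nostril_area),
    ])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    log_c, a, b, d = coef
    return np.exp(log_c), a, b, d


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    q = rng.uniform(20, 60, 50)            # NHF rate, L/min
    av = rng.uniform(0.5, 1.5, 50)         # valve area (synthetic units)
    an = rng.uniform(0.4, 1.2, 50)         # nostril area (synthetic units)
    p = 0.02 * q**2.0 * av**-1.0 * an**-0.5 * rng.lognormal(0, 0.05, 50)
    print(fit_power_law(p, q, av, an))     # recovers roughly (0.02, 2, -1, -0.5)
```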
Integration of the White Sands Complex into a Wide Area Network
NASA Technical Reports Server (NTRS)
Boucher, Phillip Larry; Horan, Sheila, B.
1996-01-01
The NASA White Sands Complex (WSC) satellite communications facility consists of two main ground stations, an auxiliary ground station, a technical support facility, and a power plant building located on White Sands Missile Range. When constructed, terrestrial communication access to these facilities was limited to copper telephone circuits. There was no local or wide area communications network capability. This project incorporated a baseband local area network (LAN) topology at WSC and connected it to NASA's wide area network using the Program Support Communications Network-Internet (PSCN-I). A campus-style LAN is configured in conformance with the International Standards Organization (ISO) Open Systems Interconnection (OSI) model. Ethernet provides the physical and data link layers. Transmission Control Protocol and Internet Protocol (TCP/IP) are used for the network and transport layers. The session, presentation, and application layers employ commercial software packages. Copper-based Ethernet collision domains are constructed in each of the primary facilities and these are interconnected by routers over optical fiber links. The network and each of its collision domains are shown to meet IEEE technical configuration guidelines. The optical fiber links are analyzed for the optical power budget and bandwidth allocation and are found to provide sufficient margin for this application. Personal computers and workstations attached to the LAN communicate with and apply a wide variety of local and remote administrative software tools. The Internet connection provides wide area network (WAN) electronic access to other NASA centers and the World Wide Web (WWW). The WSC network reduces and simplifies the administrative workload while providing enhanced and advanced inter-communications capabilities among White Sands Complex departments and with other NASA centers.
The development of a multi-target compiler-writing system for flight software development
NASA Technical Reports Server (NTRS)
Feyock, S.; Donegan, M. K.
1977-01-01
A wide variety of systems designed to assist the user in the task of writing compilers has been developed. A survey of these systems reveals that none is entirely appropriate to the purposes of the MUST project, which involves the compilation of one or at most a small set of higher-order languages to a wide variety of target machines offering little or no software support. This requirement dictates that any compiler writing system employed must provide maximal support in the areas of semantics specification and code generation, the areas in which existing compiler writing systems as well as theoretical underpinnings are weakest. This paper describes an ongoing research and development effort to create a compiler writing system which will overcome these difficulties, thus providing a software system which makes possible the fast, trouble-free creation of reliable compilers for a wide variety of target computers.
LensFlow: A Convolutional Neural Network in Search of Strong Gravitational Lenses
NASA Astrophysics Data System (ADS)
Pourrahmani, Milad; Nayyeri, Hooshang; Cooray, Asantha
2018-03-01
In this work, we present our machine learning classification algorithm for identifying strong gravitational lenses from wide-area surveys using convolutional neural networks: LENSFLOW. We train and test the algorithm using a wide variety of strong gravitational lens configurations from simulations of lensing events. Images are processed through multiple convolutional layers that extract feature maps necessary to assign a lens probability to each image. LENSFLOW provides a ranking scheme for all sources that could be used to identify potential gravitational lens candidates by significantly reducing the number of images that have to be visually inspected. We apply our algorithm to the HST/ACS i-band observations of the COSMOS field and present our sample of identified lensing candidates. The developed machine learning algorithm is more computationally efficient and complementary to classical lens identification algorithms and is ideal for discovering such events across wide areas from current and future surveys such as LSST and WFIRST.
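The paper's network architecture is not reproduced here; as a loose illustration of the general approach (image cutouts passed through convolutional layers that emit a lens probability), a minimal sketch in PyTorch, with all layer sizes assumed, might look like this:

```python
import torch
import torch.nn as nn

# Minimal CNN that assigns a lens probability to a 64x64 single-band cutout.
# Layer sizes are illustrative and are not taken from the LensFlow paper.
model = nn.Sequential(
    nn.Conv2d(1, 16, kernel_size=3, padding=1),   # low-level feature maps
    nn.ReLU(),
    nn.MaxPool2d(2),                              # 64x64 -> 32x32
    nn.Conv2d(16, 32, kernel_size=3, padding=1),  # higher-level features
    nn.ReLU(),
    nn.MaxPool2d(2),                              # 32x32 -> 16x16
    nn.Flatten(),
    nn.Linear(32 * 16 * 16, 64),
    nn.ReLU(),
    nn.Linear(64, 1),
    nn.Sigmoid(),                                 # lens probability in [0, 1]
)

cutouts = torch.randn(8, 1, 64, 64)   # a batch of simulated image cutouts
probabilities = model(cutouts)        # ranking scores for visual inspection
print(probabilities.squeeze())
```

In practice such scores would be sorted so that only the highest-ranked candidates need visual inspection, which is the ranking idea described in the abstract.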
Interoperating Cloud-based Virtual Farms
NASA Astrophysics Data System (ADS)
Bagnasco, S.; Colamaria, F.; Colella, D.; Casula, E.; Elia, D.; Franco, A.; Lusso, S.; Luparello, G.; Masera, M.; Miniello, G.; Mura, D.; Piano, S.; Vallero, S.; Venaruzzo, M.; Vino, G.
2015-12-01
The present work aims at optimizing the use of computing resources available at the Italian Tier-2 grid sites of the ALICE experiment at CERN LHC by making them accessible to interactive distributed analysis, thanks to modern solutions based on cloud computing. The scalability and elasticity of the computing resources via dynamic ("on-demand") provisioning is essentially limited by the size of the computing site, reaching the theoretical optimum only in the asymptotic case of infinite resources. The main challenge of the project is to overcome this limitation by federating different sites through a distributed cloud facility. Storage capacities of the participating sites are seen as a single federated storage area, avoiding the need to mirror data across them: high data access efficiency is guaranteed by location-aware analysis software and storage interfaces, in a way that is transparent to the end user. Moreover, the interactive analysis on the federated cloud reduces the execution time with respect to grid batch jobs. Tests of the investigated solutions for both cloud computing and distributed storage on wide area networks will be presented.
A novel algorithm for determining contact area between a respirator and a headform.
Lei, Zhipeng; Yang, James; Zhuang, Ziqing
2014-01-01
The contact area, as well as the contact pressure, is created when a respiratory protection device (a respirator or surgical mask) contacts a human face. A computer-based algorithm for determining the contact area between a headform and N95 filtering facepiece respirator (FFR) was proposed. Six N95 FFRs were applied to five sizes of standard headforms (large, medium, small, long/narrow, and short/wide) to simulate respirator donning. After the contact simulation between a headform and an N95 FFR was conducted, a contact area was determined by extracting the intersection surfaces of the headform and the N95 FFR. Using computer-aided design tools, a superimposed contact area and an average contact area, which are non-uniform rational basis spline (NURBS) surfaces, were developed for each headform. Experiments that directly measured dimensions of the contact areas between headform prototypes and N95 FFRs were used to validate the simulation results. Headform sizes influenced all contact area dimensions (P < 0.0001), and N95 FFR sizing systems influenced all contact area dimensions (P < 0.05) except the left and right chin regions. The medium headform produced the largest contact area, while the large and small headforms produced the smallest.
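A simplified, hedged sketch of the basic idea (extracting the part of the headform surface that touches the respirator and summing its triangle areas). This is not the authors' NURBS-based pipeline; the proximity tolerance, mesh layout and random stand-in data below are assumptions for illustration only.

```python
import numpy as np
from scipy.spatial import cKDTree

def contact_area(head_vertices, head_triangles, ffr_vertices, tol=1.0):
    """Approximate contact area between a headform mesh and a respirator surface:
    sum the areas of headform triangles whose three vertices all lie within
    `tol` of the respirator point cloud (units follow the input meshes)."""
    tree = cKDTree(ffr_vertices)
    dist, _ = tree.query(head_vertices)            # nearest-FFR distance per vertex
    in_contact = dist < tol                        # vertices touching the FFR

    v = head_vertices[head_triangles]              # (n_tri, 3, 3) triangle corners
    tri_touching = in_contact[head_triangles].all(axis=1)
    # Triangle area = 0.5 * |(B - A) x (C - A)|
    cross = np.cross(v[:, 1] - v[:, 0], v[:, 2] - v[:, 0])
    areas = 0.5 * np.linalg.norm(cross, axis=1)
    return areas[tri_touching].sum()

# Illustrative call with random data standing in for real headform/FFR meshes.
rng = np.random.default_rng(0)
head_v = rng.random((500, 3)) * 10.0
head_t = rng.integers(0, 500, size=(900, 3))
ffr_v = rng.random((400, 3)) * 10.0
print(contact_area(head_v, head_t, ffr_v, tol=1.5))
```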
Current status and future prospects for enabling chemistry technology in the drug discovery process.
Djuric, Stevan W; Hutchins, Charles W; Talaty, Nari N
2016-01-01
This review covers recent advances in the implementation of enabling chemistry technologies into the drug discovery process. Areas covered include parallel synthesis chemistry, high-throughput experimentation, automated synthesis and purification methods, flow chemistry methodology including photochemistry, electrochemistry, and the handling of "dangerous" reagents. Also featured are advances in the "computer-assisted drug design" area and the expanding application of novel mass spectrometry-based techniques to a wide range of drug discovery activities.
The Caltech Concurrent Computation Program - Project description
NASA Technical Reports Server (NTRS)
Fox, G.; Otto, S.; Lyzenga, G.; Rogstad, D.
1985-01-01
The Caltech Concurrent Computation Program, which studies basic issues in computational science, is described. The research builds on initial work in which novel concurrent hardware, the necessary systems software to use it, and twenty significant scientific implementations running on the initial 32-, 64-, and 128-node hypercube machines were constructed. A major goal of the program will be to extend this work into new disciplines and more complex algorithms, including general packages that decompose arbitrary problems in major application areas. New high-performance concurrent processors with up to 1024 nodes, over a gigabyte of memory, and multigigaflop performance are being constructed. The implementations cover a wide range of problems in areas such as high energy and astrophysics, condensed matter, chemical reactions, plasma physics, applied mathematics, geophysics, simulation, CAD for VLSI, graphics and image processing. The products of the research program include the concurrent algorithms, hardware, systems software, and complete program implementations.
Language Origin from an Emergentist Perspective
ERIC Educational Resources Information Center
Ke, Jinyun; Holland, John H.
2006-01-01
In recent decades, there has been a surge of interest in the origin of language across a wide range of disciplines. Emergentism provides a new perspective to integrate investigations from different areas of study. This paper discusses how the study of language acquisition can contribute to the inquiry, in particular when computer modeling is…
George A. James
1971-01-01
Part I is a general discussion about the estimation of recreation use, with descriptions of selected sampling techniques for estimating recreation use on a wide variety of different sites and areas. Part II is a brief discussion of an operational computer-oriented information system designed and developed by the USDA Forest Service to fully utilize the inventories of...
Night Attack Workload Steering Group. Volume 3. Simulation and Human Factors Subgroup
1982-06-01
information interpretation. The second is the use of pictorial formats or computer-generated displays that combine many present-day displays into a small number...base exists in any form (digital, film, or model) which supports the wide-area, long-track, low-level requirements levied by night attack training
Network Design: Best Practices for Alberta School Jurisdictions.
ERIC Educational Resources Information Center
Schienbein, Ralph
This report examines subsections of the computer network topology that relate to end-to-end performance and capacity planning in schools. Active star topology, Category 5 wiring, Ethernet, and intelligent devices are assumed. The report describes a model that can be used to project WAN (wide area network) connection speeds based on user traffic,…
The impact of computer science in molecular medicine: enabling high-throughput research.
de la Iglesia, Diana; García-Remesal, Miguel; de la Calle, Guillermo; Kulikowski, Casimir; Sanz, Ferran; Maojo, Víctor
2013-01-01
The Human Genome Project and the explosion of high-throughput data have transformed the areas of molecular and personalized medicine, which are producing a wide range of studies and experimental results and providing new insights for developing medical applications. Research in many interdisciplinary fields is resulting in data repositories and computational tools that support a wide diversity of tasks: genome sequencing, genome-wide association studies, analysis of genotype-phenotype interactions, drug toxicity and side effects assessment, prediction of protein interactions and diseases, development of computational models, biomarker discovery, and many others. The authors of the present paper have developed several inventories covering tools, initiatives and studies in different computational fields related to molecular medicine: medical informatics, bioinformatics, clinical informatics and nanoinformatics. With these inventories, created by mining the scientific literature, we have carried out several reviews of these fields, providing researchers with a useful framework to locate, discover, search and integrate resources. In this paper we present an analysis of the state of the art in computational resources for molecular medicine, based on results compiled in our inventories as well as on a systematic review of the literature and other scientific media. The review considers the impact of the related publications and the data and software resources available for molecular medicine, and aims to provide information that can support ongoing research and efforts to improve diagnostics and therapeutics based on molecular-level insights.
NASA Astrophysics Data System (ADS)
Kumar, Manoj; Srivastava, Akanksha
2013-01-01
This paper presents a survey of innovative approaches among the most effective computational techniques for solving singularly perturbed partial differential equations, techniques that are valuable because they admit practical numerical and computer implementations. Many applied problems arising in semiconductor theory, biochemistry, kinetics, the theory of electrical chains, economics, solid mechanics, fluid dynamics, quantum mechanics, and many other fields can be modelled as singularly perturbed systems. Here, we summarize a wide range of research articles published by numerous researchers during the last ten years to give a better view of the present scenario in this area of research.
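For readers unfamiliar with the class of problems surveyed, a standard textbook example of a singularly perturbed boundary-value problem (not taken from the paper) is

$$ -\varepsilon\, u''(x) + b(x)\, u'(x) + c(x)\, u(x) = f(x), \quad x \in (0,1), \qquad u(0)=u(1)=0, \qquad 0 < \varepsilon \ll 1, $$

where the small parameter $\varepsilon$ multiplies the highest derivative, producing boundary or interior layers as $\varepsilon \to 0$ that defeat uniform standard discretizations and motivate the specialized numerical techniques the survey reviews.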
Structural biology computing: Lessons for the biomedical research sciences.
Morin, Andrew; Sliz, Piotr
2013-11-01
The field of structural biology, whose aim is to elucidate the molecular and atomic structures of biological macromolecules, has long been at the forefront of biomedical sciences in adopting and developing computational research methods. Operating at the intersection between biophysics, biochemistry, and molecular biology, structural biology's growth into a foundational framework on which many concepts and findings of molecular biology are interpreted has depended largely on parallel advancements in computational tools and techniques. Without these computing advances, modern structural biology would likely have remained an exclusive pursuit practiced by few, and not become the widely practiced, foundational field it is today. As other areas of biomedical research increasingly embrace research computing techniques, the successes, failures and lessons of structural biology computing can serve as a useful guide to progress in other biomedically related research fields. Copyright © 2013 Wiley Periodicals, Inc.
Using Mosix for Wide-Area Computational Resources
Maddox, Brian G.
2004-01-01
One of the problems with using traditional Beowulf-type distributed processing clusters is that they require an investment in dedicated computer resources. These resources are usually needed in addition to pre-existing ones such as desktop computers and file servers. Mosix is a series of modifications to the Linux kernel that creates a virtual computer, featuring automatic load balancing by migrating processes from heavily loaded nodes to less used ones. An extension of the Beowulf concept is to run a Mosix-enabled Linux kernel on a large number of computer resources in an organization. This configuration would provide a very large amount of computational resources based on pre-existing equipment. The advantage of this method is that it provides much more processing power than a traditional Beowulf cluster without the added costs of dedicating resources.
The UK Human Genome Mapping Project online computing service.
Rysavy, F R; Bishop, M J; Gibbs, G P; Williams, G W
1992-04-01
This paper presents an overview of computing and networking facilities developed by the Medical Research Council to provide online computing support to the Human Genome Mapping Project (HGMP) in the UK. The facility is connected to a number of other computing facilities in various centres of genetics and molecular biology research excellence, either directly via high-speed links or through national and international wide-area networks. The paper describes the design and implementation of the current system, a 'client/server' network of Sun, IBM, DEC and Apple servers, gateways and workstations. A short outline of online computing services currently delivered by this system to the UK human genetics research community is also provided. More information about the services and their availability could be obtained by a direct approach to the UK HGMP-RC.
A rapid local singularity analysis algorithm with applications
NASA Astrophysics Data System (ADS)
Chen, Zhijun; Cheng, Qiuming; Agterberg, Frits
2015-04-01
The local singularity model developed by Cheng is fast gaining popularity for characterizing mineralization and detecting anomalies in geochemical, geophysical and remote sensing data. However, one of the conventional algorithms, which involves moving-average values at different scales, is time-consuming, especially when analyzing a large dataset. The summed area table (SAT), also called an integral image, is a fast algorithm used within the Viola-Jones object detection framework in the computer vision area. Historically, the principle of the SAT is well known from the study of multi-dimensional probability distribution functions, namely in computing 2D (or ND) probabilities (the area under the probability distribution) from the respective cumulative distribution functions. In this study we introduce the SAT and its variant, the rotated summed area table, into isotropic, anisotropic and directional local singularity mapping. Once the SAT has been computed, any rectangular sum can be evaluated at any scale or location in constant time. The sum over any rectangular region in the image can be computed using only four array accesses, independently of the size of the region, effectively reducing the time complexity from O(n) to O(1). New programs in Python, Julia, MATLAB and C++ are implemented to serve different applications, especially big-data analysis. Several large geochemical and remote sensing datasets are tested. A wide variety of scale changes (linear spacing or log spacing) for non-iterative and iterative approaches are adopted to calculate the singularity index values and compare the results. The results indicate that local singularity analysis with the SAT is more robust than, and superior to, the traditional approach in identifying anomalies.
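A minimal sketch of the summed area table mechanics described above (illustrative only; not the authors' Python, Julia, MATLAB or C++ programs): one cumulative-sum pass builds the table, after which any rectangular sum costs exactly four array accesses.

```python
import numpy as np

def summed_area_table(img):
    """SAT with a zero-padded first row/column so rectangle sums need no branching."""
    sat = np.zeros((img.shape[0] + 1, img.shape[1] + 1), dtype=np.float64)
    sat[1:, 1:] = img.cumsum(axis=0).cumsum(axis=1)
    return sat

def rect_sum(sat, r0, c0, r1, c1):
    """Sum of img[r0:r1, c0:c1] using exactly four array accesses (O(1) per query)."""
    return sat[r1, c1] - sat[r0, c1] - sat[r1, c0] + sat[r0, c0]

img = np.arange(16, dtype=float).reshape(4, 4)
sat = summed_area_table(img)
print(rect_sum(sat, 1, 1, 3, 3))   # 5 + 6 + 9 + 10 = 30.0
print(img[1:3, 1:3].sum())         # direct check: 30.0
```

Because the query cost no longer depends on the window size, moving averages at many scales can be computed over large grids without rescanning the data, which is the speed-up exploited in the singularity mapping described above.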
Intelligent tutoring systems research in the training systems division: Space applications
NASA Technical Reports Server (NTRS)
Regian, J. Wesley
1988-01-01
Computer-Aided Instruction (CAI) is a mature technology used to teach students in a wide variety of domains. The introduction of Artificial Intelligence (AI) technology to the field of CAI has prompted research and development efforts in an area known as Intelligent Computer-Aided Instruction (ICAI). In some cases, ICAI has been touted as a revolutionary alternative to traditional CAI. With the advent of powerful, inexpensive school computers, ICAI is emerging as a potential rival to CAI. In contrast, one may conceive of Computer-Based Training (CBT) systems as lying along a continuum that runs from CAI to ICAI. Although the key difference between the two is intelligence, there is no commonly accepted definition of what constitutes an intelligent instructional system.
Current status and future prospects for enabling chemistry technology in the drug discovery process
Djuric, Stevan W.; Hutchins, Charles W.; Talaty, Nari N.
2016-01-01
This review covers recent advances in the implementation of enabling chemistry technologies into the drug discovery process. Areas covered include parallel synthesis chemistry, high-throughput experimentation, automated synthesis and purification methods, flow chemistry methodology including photochemistry, electrochemistry, and the handling of “dangerous” reagents. Also featured are advances in the “computer-assisted drug design” area and the expanding application of novel mass spectrometry-based techniques to a wide range of drug discovery activities. PMID:27781094
Fractional Order and Dynamic Simulation of a System Involving an Elastic Wide Plate
NASA Astrophysics Data System (ADS)
David, S. A.; Balthazar, J. M.; Julio, B. H. S.; Oliveira, C.
2011-09-01
Numerous researchers have studied nonlinear dynamics in several areas of science and engineering. However, in most cases these concepts have been explored mainly from the standpoint of analytical and computational methods involving integer order calculus (IOC). In this paper we examine the dynamic behavior of an elastic wide plate excited by two electromagnets from the point of view of fractional order calculus (FOC). The primary focus of this study is to help gain a better understanding of nonlinear dynamics in fractional order systems.
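For context, a standard definition (not specific to this paper) that typically underlies FOC models is the Caputo fractional derivative of order $\alpha$ with $n-1 < \alpha < n$:

$$ {}^{C}D^{\alpha} f(t) \;=\; \frac{1}{\Gamma(n-\alpha)} \int_{0}^{t} \frac{f^{(n)}(\tau)}{(t-\tau)^{\alpha+1-n}}\, d\tau, $$

which reduces to the ordinary $n$-th derivative as $\alpha \to n$ and introduces the memory effects that distinguish fractional-order dynamics from their integer-order counterparts.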
Information infrastructure for emergency medical services.
Orthner, Helmuth; Mishra, Ninad; Terndrup, Thomas; Acker, Joseph; Grimes, Gary; Gemmill, Jill; Battles, Marcie
2005-01-01
The pre-hospital emergency medical and public safety information environment is nearing a threshold of significant change. The change is driven in part by several emerging technologies such as secure, high-speed wireless communication in the local and wide area networks (wLAN, 3G), Geographic Information Systems (GIS), Global Positioning Systems (GPS), and powerful handheld computing and communication services, that are of sufficient utility to be more widely adopted. We propose a conceptual model to enable improved clinical decision making in the pre-hospital environment using these change agents.
Molecular dynamics simulations through GPU video games technologies
Loukatou, Styliani; Papageorgiou, Louis; Fakourelis, Paraskevas; Filntisi, Arianna; Polychronidou, Eleftheria; Bassis, Ioannis; Megalooikonomou, Vasileios; Makałowski, Wojciech; Vlachakis, Dimitrios; Kossida, Sophia
2016-01-01
Bioinformatics is the scientific field that focuses on the application of computer technology to the management of biological information. Over the years, bioinformatics applications have been used to store, process and integrate biological and genetic information, using a wide range of methodologies. One of the principal techniques used to understand the physical movements of atoms and molecules is molecular dynamics (MD). MD is an in silico method for simulating the physical motions of atoms and molecules under certain conditions. It has become a strategically important technique and now plays a key role in many areas of the exact sciences, such as chemistry, biology, physics and medicine. Because of their complexity, MD calculations can require enormous amounts of computer memory and time, so their execution has been a major problem. Despite the huge computational cost, molecular dynamics has been implemented using traditional computers with a central processing unit (CPU). Graphics processing unit (GPU) computing technology was first designed to improve video games, by rapidly creating and displaying images in a frame buffer such as a screen. The hybrid GPU-CPU implementation, combined with parallel computing, is a novel technology for performing a wide range of calculations. GPUs have been proposed and used to accelerate many scientific computations, including MD simulations. Herein, we describe the methodologies developed initially for video games and how they are now applied in MD simulations. PMID:27525251
The Role of Theory and Technology in Learning Video Production: The Challenge of Change
ERIC Educational Resources Information Center
Shewbridge, William; Berge, Zane L.
2004-01-01
The video production field has evolved beyond being exclusively relevant to broadcast television. The convergence of low-cost consumer cameras and desktop computer editing has led to new applications of video in a wide range of areas, including the classroom. This presents educators with an opportunity to rethink how students learn video…
Wide-Area Network Resources for Teacher Education.
ERIC Educational Resources Information Center
Aust, Ronald
A central feature of the High Performance Computing Act of 1991 is the establishment of a National Research and Education Network (NREN). The level of access that teachers and teacher educators will need to benefit from the NREN and the types of network resources that are most useful for educators are explored, along with design issues that are…
Language Maintenance on the Internet
ERIC Educational Resources Information Center
Ward, Judit Hajnal; Agocs, Laszlo
2004-01-01
Due to the expanding use of computer networks in Hungary, the Hungarian language has become a grown-up member of the World Wide Web and the Internet. In the past few years, the number of web pages written in Hungarian has significantly increased, since all areas of business, science, education, culture, etc., are eager to make use of the evolving…
Finding Waves: Techniques for a Successful Wireless Site Survey
ERIC Educational Resources Information Center
Shanafelt, Michael
2004-01-01
Wireless Local Area Networks are the most widely adopted networking technology to hit the market in the last three years. They have the potential to make network applications and the Internet available anywhere on a campus so that students and faculty are no longer tethered to their offices or shared computer laboratories in order to connect to a…
Stop the World--West Georgia Is Getting On.
ERIC Educational Resources Information Center
Mitchell, Phyllis R.
1996-01-01
In 5 years, the schools and community of Carrollton, Georgia, created a school systemwide network of 1,400 computers and 70 CD-ROMs connected by a fiber wide-area network to other city institutions and the Internet with grants from local, state, and national industry. After incorporating the new technologies into the curriculum, the dropout rate…
Constructing probabilistic scenarios for wide-area solar power generation
Woodruff, David L.; Deride, Julio; Staid, Andrea; ...
2017-12-22
Optimizing thermal generation commitments and dispatch in the presence of high penetrations of renewable resources such as solar energy requires a characterization of their stochastic properties. In this study, we describe novel methods designed to create day-ahead, wide-area probabilistic solar power scenarios based only on historical forecasts and associated observations of solar power production. Each scenario represents a possible trajectory for solar power in next-day operations with an associated probability computed by algorithms that use historical forecast errors. Scenarios are created by segmentation of historic data, fitting non-parametric error distributions using epi-splines, and then computing specific quantiles from these distributions. Additionally, we address the challenge of establishing an upper bound on solar power output. Our specific application driver is for use in stochastic variants of core power systems operations optimization problems, e.g., unit commitment and economic dispatch. These problems require as input a range of possible future realizations of renewables production. However, the utility of such probabilistic scenarios extends to other contexts, e.g., operator and trader situational awareness. Finally, we compare the performance of our approach to a recently proposed method based on quantile regression, and demonstrate that our method performs comparably to this approach in terms of two widely used methods for assessing the quality of probabilistic scenarios: the Energy score and the Variogram score.
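A much-simplified sketch of the scenario-construction idea (the paper segments the history and fits non-parametric error distributions with epi-splines; here plain empirical quantiles of synthetic forecast errors stand in, purely for illustration, and all arrays are assumptions):

```python
import numpy as np

def solar_scenarios(day_ahead_forecast, historical_errors, quantiles=(0.1, 0.5, 0.9)):
    """Build scenario trajectories by shifting the forecast with per-hour error quantiles.
    Each scenario carries a nominal probability weight (uniform here for simplicity)."""
    scenarios = []
    for q in quantiles:
        shift = np.quantile(historical_errors, q, axis=0)           # per-hour error quantile
        scenario = np.clip(day_ahead_forecast + shift, 0.0, None)   # power cannot be negative
        scenarios.append(scenario)
    weights = np.full(len(quantiles), 1.0 / len(quantiles))
    return np.array(scenarios), weights

hours = 24
rng = np.random.default_rng(1)
forecast = np.maximum(0.0, 50 * np.sin(np.linspace(0, np.pi, hours)))  # synthetic day-ahead MW
errors = rng.normal(0.0, 5.0, size=(365, hours))                       # synthetic historic errors
scen, w = solar_scenarios(forecast, errors)
print(scen.shape, w)   # (3, 24) trajectories with associated probability weights
```

A real implementation would also impose a physical upper bound on output (clear-sky or plant capacity), the issue the abstract highlights.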
Practical applications of hand-held computers in dermatology.
Goldblum, Orin M
2002-09-01
For physicians, hand-held computers are gaining popularity as point-of-care reference tools. The convergence of hand-held computers, the Internet, and wireless networks will enable these devices to assume more essential roles as mobile transmitters and receivers of digital medical information. In addition to serving as portable medical reference sources, these devices can be Internet-enabled, allowing them to communicate over wireless wide and local area networks. With enhanced wireless connectivity, hand-held computers can be used at the point of patient care for charge capture, electronic prescribing, laboratory test ordering, laboratory result retrieval, web access, e-mail communication, and other clinical and administrative tasks. Physicians in virtually every medical specialty have begun using these devices in various ways. This review of hand-held computer use in dermatology illustrates practical examples of the many different ways hand-held computers can be effectively used by the practicing dermatologist.
Karanovic, Marinko; Muffels, Christopher T.; Tonkin, Matthew J.; Hunt, Randall J.
2012-01-01
Models of environmental systems have become increasingly complex, incorporating increasingly large numbers of parameters in an effort to represent physical processes on a scale approaching that at which they occur in nature. Consequently, the inverse problem of parameter estimation (specifically, model calibration) and subsequent uncertainty analysis have become increasingly computation-intensive endeavors. Fortunately, advances in computing have made computational power equivalent to that of dozens to hundreds of desktop computers accessible through a variety of alternate means: modelers have various possibilities, ranging from traditional Local Area Networks (LANs) to cloud computing. Commonly used parameter estimation software is well suited to take advantage of the availability of such increased computing power. Unfortunately, logistical issues become increasingly important as an increasing number and variety of computers are brought to bear on the inverse problem. To facilitate efficient access to disparate computer resources, the PESTCommander program documented herein has been developed to provide a Graphical User Interface (GUI) that facilitates the management of model files ("file management") and remote launching and termination of "slave" computers across a distributed network of computers ("run management"). In version 1.0 described here, PESTCommander can access and ascertain resources across traditional Windows LANs; however, the architecture of PESTCommander has been developed with the intent that future releases will be able to access computing resources (1) via trusted domains established in Wide Area Networks (WANs) in multiple remote locations and (2) via heterogeneous networks of Windows- and Unix-based operating systems. The design of PESTCommander also makes it suitable for extension to other computational resources, such as those that are available via cloud computing. Version 1.0 of PESTCommander was developed primarily to work with the parameter estimation software PEST; the discussion presented in this report focuses on the use of PESTCommander together with Parallel PEST. However, PESTCommander can be used with a wide variety of programs and models that require management, distribution, and cleanup of files before or after model execution. In addition to its use with the Parallel PEST program suite, discussion is also included in this report regarding the use of PESTCommander with the Global Run Manager GENIE, which was developed simultaneously with PESTCommander.
A Novel Algorithm for Determining Contact Area Between a Respirator and a Headform
Lei, Zhipeng; Yang, James; Zhuang, Ziqing
2016-01-01
The contact area, as well as the contact pressure, is created when a respiratory protection device (a respirator or surgical mask) contacts a human face. A computer-based algorithm for determining the contact area between a headform and N95 filtering facepiece respirator (FFR) was proposed. Six N95 FFRs were applied to five sizes of standard headforms (large, medium, small, long/narrow, and short/wide) to simulate respirator donning. After the contact simulation between a headform and an N95 FFR was conducted, a contact area was determined by extracting the intersection surfaces of the headform and the N95 FFR. Using computer-aided design tools, a superimposed contact area and an average contact area, which are non-uniform rational basis spline (NURBS) surfaces, were developed for each headform. Experiments that directly measured dimensions of the contact areas between headform prototypes and N95 FFRs were used to validate the simulation results. Headform sizes influenced all contact area dimensions (P < 0.0001), and N95 FFR sizing systems influenced all contact area dimensions (P < 0.05) except the left and right chin regions. The medium headform produced the largest contact area, while the large and small headforms produced the smallest. PMID:24579752
Consolidation and development roadmap of the EMI middleware
NASA Astrophysics Data System (ADS)
Kónya, B.; Aiftimiei, C.; Cecchi, M.; Field, L.; Fuhrmann, P.; Nilsen, J. K.; White, J.
2012-12-01
Scientific research communities have benefited recently from the increasing availability of computing and data infrastructures with unprecedented capabilities for large scale distributed initiatives. These infrastructures are largely defined and enabled by the middleware they deploy. One of the major issues in the current usage of research infrastructures is the need to use similar but often incompatible middleware solutions. The European Middleware Initiative (EMI) is a collaboration of the major European middleware providers ARC, dCache, gLite and UNICORE. EMI aims to: deliver a consolidated set of middleware components for deployment in EGI, PRACE and other Distributed Computing Infrastructures; extend the interoperability between grids and other computing infrastructures; strengthen the reliability of the services; establish a sustainable model to maintain and evolve the middleware; fulfil the requirements of the user communities. This paper presents the consolidation and development objectives of the EMI software stack covering the last two years. The EMI development roadmap is introduced along the four technical areas of compute, data, security and infrastructure. The compute area plan focuses on consolidation of standards and agreements through a unified interface for job submission and management, a common format for accounting, the wide adoption of GLUE schema version 2.0 and the provision of a common framework for the execution of parallel jobs. The security area is working towards a unified security model and lowering the barriers to Grid usage by allowing users to gain access with their own credentials. The data area is focusing on implementing standards to ensure interoperability with other grids and industry components and to reuse already existing clients in operating systems and open source distributions. One of the highlights of the infrastructure area is the consolidation of the information system services via the creation of a common information backbone.
How medical students use the computer and Internet at a Turkish military medical school.
Kir, Tayfun; Ogur, Recai; Kilic, Selim; Tekbas, Omer Faruk; Hasde, Metin
2004-12-01
The aim of this study was to determine how medical students use the computer and World Wide Web at a Turkish military medical school and to discuss characteristics related to this computer use. The study was conducted in 2003 in the Department of Public Health at the Gulhane Military Medical School in Ankara, Turkey. A survey developed by the authors was distributed to 508 students after a pretest. Responses were analyzed statistically using a computer. Most of the students (86.4%) could access a computer and the Internet, all of the computers used by students had Internet connections, and a small group (8.9%) owned their own computers. One-half of the students use notes provided by attending staff and textbooks as supplementary resources for their studies. The most common use of computers was connecting to the Internet (91.9%), and the most common use of the Internet was e-mail communication (81.6%). The most preferred site category for daily visits was newspaper sites (62.8%). Approximately 44.1% of students visited medical sites when they were surfing. Also, there was a negative correlation between school performance and the time spent on computer and Internet use (-0.056 and -0.034, respectively). It was observed that medical students used the computer and Internet essentially for nonmedical purposes. To encourage students to use the computer and Internet for medical purposes, tutors should use the computer and Internet during their teaching activities, and software companies should produce assistant applications for medical students. Also, medical schools should build interactive World Wide Web sites, e-mail groups, discussion boards, and study areas for medical students.
Multi-GPGPU Tsunami simulation at Toyama-bay
NASA Astrophysics Data System (ADS)
Furuyama, Shoichi; Ueda, Yuki
2017-07-01
Accelerated multi-General Purpose Graphics Processing Unit (GPGPU) calculation for Tsunami run-up simulation was achieved over a wide area (the whole of Toyama-bay in Japan) through faster computation techniques. Toyama-bay has active faults at the sea bed, so there is a high possibility of earthquakes and, in the case of a huge earthquake, Tsunami waves; predicting the Tsunami run-up area is therefore important for reducing damage to residents from such a disaster. However, the simulation is a very hard task because of the computer resources it demands. A resolution on the order of several meters is required for the run-up Tsunami simulation because artificial structures on the ground, such as roads, buildings, and houses, are very small. On the other hand, a huge simulation area is also required; in the Toyama-bay case the area is 42 [km] × 15 [km]. When 5 [m] × 5 [m] computational cells are used for the simulation, over 26,000,000 computational cells are generated. A normal desktop CPU computer took about 10 hours for this calculation. Reducing this calculation time is an important problem for an immediate Tsunami run-up prediction system, and would in turn help protect the many residents of the coastal region. This study reduced the calculation time by using a multi-GPGPU system equipped with six NVIDIA TESLA K20xs, with InfiniBand network connections between the computer nodes via the MVAPICH library. As a result, a 5.16-times faster calculation was achieved on six GPUs than in the one-GPU case, corresponding to 86% parallel efficiency relative to linear speedup.
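As a quick check of the reported figures (simple arithmetic, not from the paper itself): parallel efficiency is the speedup divided by the number of GPUs,

$$ E = \frac{S}{N_{\mathrm{GPU}}} = \frac{5.16}{6} \approx 0.86, $$

i.e. the quoted 86% of the ideal linear speedup.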
Computing, Information and Communications Technology (CICT) Website
NASA Technical Reports Server (NTRS)
Hardman, John; Tu, Eugene (Technical Monitor)
2002-01-01
The Computing, Information and Communications Technology Program (CICT) was established in 2001 to ensure NASA's continuing leadership in emerging technologies. It is a coordinated, Agency-wide effort to develop and deploy key enabling technologies for a broad range of mission-critical tasks. The NASA CICT program is designed to address Agency-specific computing, information, and communications technology requirements beyond the projected capabilities of commercially available solutions. The areas of technical focus have been chosen for their impact on NASA's missions, their national importance, and the technical challenge they provide to the Program. In order to meet its objectives, the CICT Program is organized into the following four technology focused projects: 1) Computing, Networking and Information Systems (CNIS); 2) Intelligent Systems (IS); 3) Space Communications (SC); 4) Information Technology Strategic Research (ITSR).
ERIC Educational Resources Information Center
California Postsecondary Education Commission, 2005
2005-01-01
California has close to 2,000 privately owned schools that offer vocational programs not leading to a degree. These include schools providing training and certification for a wide variety of careers in such areas as computer technology, cosmetology, health care, and other business and technical occupations. In addition, there are over 300…
Role Play in Blended Learning: A Case Study Exploring the Impact of Story and Other Elements
ERIC Educational Resources Information Center
Dracup, Mary
2008-01-01
Role play is an increasingly popular technique in tertiary education, being student centred, constructivist and suitable for a range of subject areas. The choice of formats is wide open, with options ranging from the traditional face to face performance through to multi-user online computer games. Some teachers prefer to take advantage of features…
Wayne Tlusty
1979-01-01
The concept of Visual Absorption Capability (VAC) is widely used by Forest Service Landscape Architects. Computer-generated graphics can aid in combining the number of times an area is seen, the distance from the observer, and the land aspect relative to the viewer in order to determine visual magnitude. Perspective Plot allows both fast and inexpensive graphic analysis of VAC allocations, for...
A Direct Position-Determination Approach for Multiple Sources Based on Neural Network Computation.
Chen, Xin; Wang, Ding; Yin, Jiexin; Wu, Ying
2018-06-13
The most widely used localization technology is the two-step method that localizes transmitters by measuring one or more specified positioning parameters. Direct position determination (DPD) is a promising technique that directly localizes transmitters from sensor outputs and can offer superior localization performance. However, existing DPD algorithms such as maximum likelihood (ML)-based and multiple signal classification (MUSIC)-based estimations are computationally expensive, making it difficult to satisfy real-time demands. To solve this problem, we propose the use of a modular neural network for multiple-source DPD. In this method, the area of interest is divided into multiple sub-areas. Multilayer perceptron (MLP) neural networks are employed to detect the presence of a source in a sub-area and filter sources in other sub-areas, and radial basis function (RBF) neural networks are utilized for position estimation. Simulation results show that a number of appropriately trained neural networks can be successfully used for DPD. The performance of the proposed MLP-MLP-RBF method is comparable to the performance of the conventional MUSIC-based DPD algorithm for various signal-to-noise ratios and signal power ratios. Furthermore, the MLP-MLP-RBF network is less computationally intensive than the classical DPD algorithm and is therefore an attractive choice for real-time applications.
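A loose, hedged analogue of the modular scheme (per-sub-area detection followed by position regression). This is not the authors' network: scikit-learn's MLPClassifier stands in for the detection MLPs and an RBF-kernel ridge regressor stands in for the RBF position-estimation network, and all data shapes are synthetic assumptions.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(0)
n_train, n_features = 500, 32           # sensor-output features per snapshot (assumed)
X = rng.normal(size=(n_train, n_features))
present = rng.integers(0, 2, size=n_train)            # is a source in this sub-area?
positions = rng.uniform(0, 100, size=(n_train, 2))    # (x, y) source position in the sub-area

# Stage 1: MLP decides whether a source lies in this sub-area
# (one such detector would be trained per sub-area).
detector = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500).fit(X, present)

# Stage 2: RBF-based regressor estimates the position, trained only on positive snapshots.
mask = present == 1
locator = KernelRidge(kernel="rbf", alpha=1.0, gamma=0.1).fit(X[mask], positions[mask])

X_new = rng.normal(size=(5, n_features))
in_area = detector.predict(X_new)                     # filter out sources from other sub-areas
estimates = locator.predict(X_new[in_area == 1])      # position estimates for detected sources
print(in_area, estimates.shape)
```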
NASA Technical Reports Server (NTRS)
Allen, L. H., Jr. (Principal Investigator); Chen, E.; Martsolf, J. D.; Jones, P. H.
1981-01-01
Transparencies, prints, and computer compatible tapes of temperature differentials and thermal inertia for the winter of 1978 to 1979 were obtained. Thermal inertia differences in South Florida depicted include: drained organic soils of the Everglades agricultural area, undrained organic soils of the managed water conservation areas of the South Florida water management district, the urbanized area around Miami, Lake Okeechobee, and the mineral soils west of the Everglades agricultural area. The range of wetland and upland conditions within the Suwanee River basin was also identified. It is shown that the combined wetland and upland surface features of Florida yield a wide range of surface temperatures related to the wetness of the surface features.
Photogrammetry and Its Potential Application in Medical Science on the Basis of Selected Literature.
Ey-Chmielewska, Halina; Chruściel-Nogalska, Małgorzata; Frączak, Bogumiła
2015-01-01
Photogrammetry is a science and technology which allows quantitative traits to be determined, i.e. the reproduction of object shapes, sizes and positions on the basis of their photographs. Images can be recorded in a wide range of wavelengths of electromagnetic radiation. The most common is the visible range, but near- and medium-infrared, thermal infrared, microwaves and X-rays are also used. The importance of photogrammetry has increased with the development of computer software. Digital image processing and real-time measurement have allowed the automation of many complex manufacturing processes. Photogrammetry has been widely used in many areas, especially in geodesy and cartography. In medicine, this method is used for measuring the widely understood human body for the planning and monitoring of therapeutic treatment and its results. Digital images obtained from optical-electronic sensors combined with computer technology have the potential of objective measurement thanks to the remote nature of the data acquisition, with no contact with the measured object and with high accuracy. Photogrammetry also allows the adoption of common standards for archiving and processing patient data.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miyayama, Shiro, E-mail: s-miyayama@fukui.saiseikai.or.jp; Yamashiro, Masashi; Okuda, Miho
2009-03-15
This study evaluated the usefulness of cone-beam computed tomography (CBCT) during ultraselective transcatheter arterial chemoembolization (TACE) for hepatocellular carcinomas (HCC) that could not be demonstrated on angiography. Twenty-eight patients with 33 angiographically occult tumors (mean diameter 1.3 ± 0.3 cm) were enrolled in the study. The ability of CBCT during arterial portography (CBCTAP), during hepatic arteriography (CBCTHA), and after iodized oil injection (LipCBCT) to detect HCC lesions was retrospectively analyzed. The technical success of TACE was divided into three grades: complete (the embolized area included the entire tumor with at least a 5-mm wide margin), adequate (the embolized area included the entire tumor but without a 5-mm wide margin in parts), and incomplete (the embolized area did not include the entire tumor), according to computed axial tomographic (CAT) images obtained 1 week after TACE. Local tumor progression was also evaluated. CBCTAP, CBCTHA, and LipCBCT detected HCC lesions in 93.9% (31 of 33), 96.7% (29 of 30), and 100% (29 of 29) of patients, respectively. A single branch was embolized in 28 tumors, and 2 branches were embolized in five tumors. Twenty-seven tumors (81.8%) were classed as complete, and 6 (18.2%) were classed as adequate. None of the tumors were classed as incomplete. Twenty-five tumors (75.8%) had not recurred during 12.0 ± 6.2 months. Eight tumors (24.2%), 5 (18.5%) of 27 complete successes and 3 (50%) of 6 adequate successes, recurred during 10.1 ± 6.2 months. CBCT during TACE is useful in detecting and treating small HCC lesions that cannot be demonstrated on angiography.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Song
CFD (Computational Fluid Dynamics) is a widely used technique in the engineering design field. It uses mathematical methods to simulate and predict flow characteristics in a certain physical space. Since the numerical results of CFD computation are very hard to understand, VR (virtual reality) and data visualization techniques are introduced into CFD post-processing to improve the understandability and functionality of CFD computation. In many cases CFD datasets are very large (multi-gigabyte), and more and more interaction between the user and the datasets is required. For traditional VR applications, limited computing power is a major factor preventing large datasets from being visualized effectively. This thesis presents a new system designed to speed up the traditional VR application by using parallel computing and distributed computing, together with the idea of using hand-held devices to enhance the interaction between a user and a VR CFD application. Techniques from different research areas, including scientific visualization, parallel computing, distributed computing and graphical user interface design, are used in the development of the final system. As a result, the new system can be built flexibly on a heterogeneous computing environment and dramatically shortens the computation time.
The evolution of the ISOLDE control system
NASA Astrophysics Data System (ADS)
Jonsson, O. C.; Catherall, R.; Deloose, I.; Evensen, A. H. M.; Gase, K.; Focker, G. J.; Fowler, A.; Kugler, E.; Lettry, J.; Olesen, G.; Ravn, H. L.; Drumm, P.
1996-04-01
The ISOLDE on-line mass separator facility has been operating on a Personal Computer-based control system since spring 1992. Front End Computers accessing the hardware are controlled from consoles running Microsoft Windows® through a Novell NetWare4® local area network. The control system is transparently integrated in the CERN-wide office network and makes heavy use of the CERN standard office application programs to control and to document the running of the ISOLDE isotope separators. This paper recalls the architecture of the control system, shows its recent developments and gives some examples of its graphical user interface.
RAPPORT: running scientific high-performance computing applications on the cloud.
Cohen, Jeremy; Filippis, Ioannis; Woodbridge, Mark; Bauer, Daniela; Hong, Neil Chue; Jackson, Mike; Butcher, Sarah; Colling, David; Darlington, John; Fuchs, Brian; Harvey, Matt
2013-01-28
Cloud computing infrastructure is now widely used in many domains, but one area where there has been more limited adoption is research computing, in particular for running scientific high-performance computing (HPC) software. The Robust Application Porting for HPC in the Cloud (RAPPORT) project took advantage of existing links between computing researchers and application scientists in the fields of bioinformatics, high-energy physics (HEP) and digital humanities, to investigate running a set of scientific HPC applications from these domains on cloud infrastructure. In this paper, we focus on the bioinformatics and HEP domains, describing the applications and target cloud platforms. We conclude that, while there are many factors that need consideration, there is no fundamental impediment to the use of cloud infrastructure for running many types of HPC applications and, in some cases, there is potential for researchers to benefit significantly from the flexibility offered by cloud platforms.
Assessing the use of computers in industrial occupational health departments.
Owen, J P
1995-04-01
Computers are widely used in business and industry and the benefits of computerizing occupational health (OH) departments have been advocated by several authors. The requirements for successful computerization of an OH department are reviewed. Having identified the theoretical benefits, the real picture in industry is assessed by surveying 52 firms with over 1000 employees in a large urban area. Only 15 (29%) of the companies reported having any OH service, of which six used computers in the OH department, reflecting the business priorities of most of the companies. The types of software systems used and their main use are examined, along with perceived benefits or disadvantages. With the decreasing costs of computers and increasingly 'user-friendly' software, there is a real cost benefit to be gained from using computers in OH departments, although the concept may have to be 'sold' to management.
Computational Tools for Metabolic Engineering
Copeland, Wilbert B.; Bartley, Bryan A.; Chandran, Deepak; Galdzicki, Michal; Kim, Kyung H.; Sleight, Sean C.; Maranas, Costas D.; Sauro, Herbert M.
2012-01-01
A great variety of software applications are now employed in the metabolic engineering field. These applications have been created to support a wide range of experimental and analysis techniques. Computational tools are utilized throughout the metabolic engineering workflow to extract and interpret relevant information from large data sets, to present complex models in a more manageable form, and to propose efficient network design strategies. In this review, we present a number of tools that can assist in modifying and understanding cellular metabolic networks. The review covers seven areas of relevance to metabolic engineers. These include metabolic reconstruction efforts, network visualization, nucleic acid and protein engineering, metabolic flux analysis, pathway prospecting, post-structural network analysis and culture optimization. The list of available tools is extensive and we can only highlight a small, representative portion of the tools from each area. PMID:22629572
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
The Computing and Communications (C) Division is responsible for the Laboratory's Integrated Computing Network (ICN) as well as Laboratory-wide communications. Our computing network, used by 8,000 people distributed throughout the nation, constitutes one of the most powerful scientific computing facilities in the world. In addition to the stable production environment of the ICN, we have taken a leadership role in high-performance computing and have established the Advanced Computing Laboratory (ACL), the site of research on experimental, massively parallel computers; high-speed communication networks; distributed computing; and a broad variety of advanced applications. The computational resources available in the ACL are of the type needed to solve problems critical to national needs, the so-called "Grand Challenge" problems. The purpose of this publication is to inform our clients of our strategic and operating plans in these important areas. We review major accomplishments since late 1990 and describe our strategic planning goals and specific projects that will guide our operations over the next few years. Our mission statement, planning considerations, and management policies and practices are also included.
Community Seismic Network (CSN)
NASA Astrophysics Data System (ADS)
Clayton, R. W.; Heaton, T. H.; Kohler, M. D.; Chandy, M.; Krause, A.
2010-12-01
In collaboration with computer science and earthquake engineering, we are developing a dense network of low-cost accelerometers that send their data via the Internet to a cloud-based center. The goal is to make block-by-block measurements of ground shaking in urban areas, which will provide emergency response information in the case of large earthquakes, and an unprecedented high-frequency seismic array to study structure and the earthquake process with moderate shaking. When deployed in high-rise buildings they can be used to monitor the state of health of the structure. The sensors are capable of a resolution of approximately 80 micro-g, connect via USB ports to desktop computers, and cost about $100 each. The network will adapt to its environment by using network-wide machine learning to adjust the picking sensitivity. We are also looking into using other motion sensing devices such as cell phones. For a pilot project, we plan to deploy more than 1000 sensors in the greater Pasadena area. The system is easily adaptable to other seismically vulnerable urban areas.
Temporal and spatial organization of doctors' computer usage in a UK hospital department.
Martins, H M G; Nightingale, P; Jones, M R
2005-06-01
This paper describes the use of an application accessible via distributed desktop computing and wireless mobile devices in a specialist department of a UK acute hospital. Data (application logs, in-depth interviews, and ethnographic observation) were simultaneously collected to study doctors' work via this application, when and where they accessed different areas of it, and from what computing devices. These show that the application is widely used, but in significantly different ways over time and space. For example, physicians and surgeons differ in how they use the application and in their choice of mobile or desktop computing. Consultants and junior doctors in the same teams also seem to access different sources of patient information, at different times, and from different locations. Mobile technology was used almost exclusively during the morning by groups of clinicians, predominantly for ward rounds.
Linking and integrating computers for maternity care.
Lumb, M; Fawdry, R
1990-12-01
Functionally separate computer systems have been developed for many different areas relevant to maternity care, e.g. maternity data collection, pathology and imaging reports, staff rostering, personnel, accounting, audit, primary care, etc. Using land lines, modems and network gateways, many such quite distinct computer programs or databases can be made accessible from a single terminal. If computer systems are to attain their full potential for the improvement of maternity care, there will be a need not only for terminal emulation but also for more complex integration. Major obstacles must be overcome before such integration is widely achieved. Technical and conceptual progress towards overcoming these problems is discussed, with particular reference to the OSI (open systems interconnection) initiative, to the Read clinical classification and to the MUMMIES CBS (Common Basic Specification) Maternity Care Project. The issue of confidentiality is also briefly explored.
1995 Joseph E. Whitley, MD, Award. A World Wide Web gateway to the radiologic learning file.
Channin, D S
1995-12-01
Computer networks in general, and the Internet specifically, are changing the way information is manipulated in the world at large and in radiology. The goal of this project was to develop a computer system in which images from the Radiologic Learning File, available previously only via a single-user laser disc, are made available over a generic, high-availability computer network to many potential users simultaneously. Using a networked workstation in our laboratory and freely available distributed hypertext software, we established a World Wide Web (WWW) information server for radiology. Images from the Radiologic Learning File are requested through the WWW client software, digitized from a single laser disc containing the entire teaching file and then transmitted over the network to the client. The text accompanying each image is incorporated into the transmitted document. The Radiologic Learning File is now on-line, and requests to view the cases result in the delivery of the text and images. Image digitization via a frame grabber takes 1/30th of a second. Conversion of the image to a standard computer graphic format takes 45-60 sec. Text and image transmission speed on a local area network varies between 200 and 400 kilobytes (KB) per second depending on the network load. We have made images from a laser disc of the Radiologic Learning File available through an Internet-based hypertext server. The images previously available through a single-user system located in a remote section of our department are now ubiquitously available throughout our department via the department's computer network. We have thus converted a single-user, limited functionality system into a multiuser, widely available resource.
Computer-Based and Paper-Based Measurement of Recognition Performance
1989-03-01
domains (e.g., ship silhouettes, electronic schemata, human anatomy) to ascertain the universality of the validity and reliability results... specific graphic database (e.g., ship silhouettes, human anatomy, electronic circuits, topography), contributes to its wide applicability. The game, then... seek implementation of FLASH and PICTURE in other content areas or subject-matter domains (e.g., ship silhouettes, electronic schemata, human anatomy) to...
Approaches and possible improvements in the area of multibody dynamics modeling
NASA Technical Reports Server (NTRS)
Lips, K. W.; Singh, R.
1987-01-01
A wide ranging look is taken at issues involved in the dynamic modeling of complex, multibodied orbiting space systems. Capabilities and limitations of two major codes (DISCOS, TREETOPS) are assessed and possible extensions to the CONTOPS software are outlined. In addition, recommendations are made concerning the direction future development should take in order to achieve higher fidelity, more computationally efficient multibody software solutions.
NASA Astrophysics Data System (ADS)
Marotta, G. S.
2017-12-01
Currently, there are several methods to determine geoid models. They can be based on terrestrial gravity data, geopotential coefficients, astro-geodetic data or a combination of them. Among the techniques to compute a precise geoid model, the Remove Compute Restore (RCR) technique has been widely applied. It considers short, medium and long wavelengths derived from altitude data provided by Digital Terrain Models (DTM), terrestrial gravity data and Global Geopotential Model (GGM), respectively. In order to apply this technique, it is necessary to create procedures that compute gravity anomalies and geoid models, by the integration of different wavelengths, and adjust these models to one local vertical datum. This research presents the advances in the package called GRAVTool, which computes geoid models by the RCR technique following Helmert's condensation method, and its application in a study area. The studied area comprises the federal district of Brazil, with 6000 km², wavy relief, heights varying from 600 m to 1340 m, located between the coordinates 48.25ºW, 15.45ºS and 47.33ºW, 16.06ºS. The results of the numerical example on the studied area show that, after analysis of the density, DTM and GGM values, the geoid model computed by the GRAVTool package is more adequate to the reference values used in the study area. The accuracy of the computed model (σ = ± 0.058 m, RMS = 0.067 m, maximum = 0.124 m and minimum = -0.155 m), using a density value of 2.702 g/cm³ ± 0.024 g/cm³, the DTM SRTM Void Filled 3 arc-second and the GGM EIGEN-6C4 up to degree and order 250, matches the uncertainty (σ = ± 0.073 m) of 26 points randomly spaced where the geoid was computed by the geometrical leveling technique supported by GNSS positioning. The results were also better than those achieved by the Brazilian official regional geoid model (σ = ± 0.076 m, RMS = 0.098 m, maximum = 0.320 m and minimum = -0.061 m).
A Study of Quality of Service Communication for High-Speed Packet-Switching Computer Sub-Networks
NASA Technical Reports Server (NTRS)
Cui, Zhenqian
1999-01-01
With the development of high-speed networking technology, computer networks, including local-area networks (LANs), wide-area networks (WANs) and the Internet, are extending their traditional roles of carrying computer data. They are being used for Internet telephony, multimedia applications such as conferencing and video on demand, distributed simulations, and other real-time applications. LANs are even used for distributed real-time process control and computing as a cost-effective approach. Differing from traditional data transfer, these new classes of high-speed network applications (video, audio, real-time process control, and others) are delay sensitive. The usefulness of data depends not only on the correctness of received data, but also the time that data are received. In other words, these new classes of applications require networks to provide guaranteed services or quality of service (QoS). Quality of service can be defined by a set of parameters and reflects a user's expectation about the underlying network's behavior. Traditionally, distinct services are provided by different kinds of networks. Voice services are provided by telephone networks, video services are provided by cable networks, and data transfer services are provided by computer networks. A single network providing different services is called an integrated-services network.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lyonnais, Marc; Smith, Matt; Mace, Kate P.
SCinet is the purpose-built network that operates during the International Conference for High Performance Computing, Networking, Storage and Analysis (Super Computing or SC). Created each year for the conference, SCinet brings to life a high-capacity network that supports applications and experiments that are a hallmark of the SC conference. The network links the convention center to research and commercial networks around the world. This resource serves as a platform for exhibitors to demonstrate the advanced computing resources of their home institutions and elsewhere by supporting a wide variety of applications. Volunteers from academia, government and industry work together to design and deliver the SCinet infrastructure. Industry vendors and carriers donate millions of dollars in equipment and services needed to build and support the local and wide area networks. Planning begins more than a year in advance of each SC conference and culminates in a high intensity installation in the days leading up to the conference. The SCinet architecture for SC16 illustrates a dramatic increase in participation from the vendor community, particularly those that focus on network equipment. Software-Defined Networking (SDN) and Data Center Networking (DCN) are present in nearly all aspects of the design.
A framework for activity detection in wide-area motion imagery
DOE Office of Scientific and Technical Information (OSTI.GOV)
Porter, Reid B; Ruggiero, Christy E; Morrison, Jack D
2009-01-01
Wide-area persistent imaging systems are becoming increasingly cost effective and now large areas of the earth can be imaged at relatively high frame rates (1-2 fps). The efficient exploitation of the large geo-spatial-temporal datasets produced by these systems poses significant technical challenges for image and video analysis and data mining. In recent years there has been significant progress made on stabilization, moving object detection and tracking, and automated systems now generate hundreds to thousands of vehicle tracks from raw data, with little human intervention. However, the tracking performance at this scale is unreliable and average track length is much smaller than the average vehicle route. This is a limiting factor for applications which depend heavily on track identity, i.e. tracking vehicles from their points of origin to their final destination. In this paper we propose and investigate a framework for wide-area motion imagery (WAMI) exploitation that minimizes the dependence on track identity. In its current form this framework takes noisy, incomplete moving object detection tracks as input, and produces a small set of activities (e.g. multi-vehicle meetings) as output. The framework can be used to focus and direct human users and additional computation, and suggests a path towards high-level content extraction by learning from the human-in-the-loop.
GRAVTool, a Package to Compute Geoid Model by Remove-Compute-Restore Technique
NASA Astrophysics Data System (ADS)
Marotta, G. S.; Blitzkow, D.; Vidotti, R. M.
2015-12-01
Currently, there are several methods to determine geoid models. They can be based on terrestrial gravity data, geopotential coefficients, astro-geodetic data or a combination of them. Among the techniques to compute a precise geoid model, the Remove-Compute-Restore (RCR) technique has been widely applied. It considers short, medium and long wavelengths derived from altitude data provided by Digital Terrain Models (DTM), terrestrial gravity data and global geopotential coefficients, respectively. In order to apply this technique, it is necessary to create procedures that compute gravity anomalies and geoid models, by the integration of different wavelengths, and that adjust these models to one local vertical datum. This research presents a developed package called GRAVTool, based on MATLAB software, to compute local geoid models by the RCR technique and its application in a study area. The studied area comprises the federal district of Brazil, with ~6000 km², wavy relief, heights varying from 600 m to 1340 m, located between the coordinates 48.25ºW, 15.45ºS and 47.33ºW, 16.06ºS. The results of the numerical example on the studied area show the local geoid model computed by the GRAVTool package, using 1377 terrestrial gravity data, SRTM data with 3 arc second of resolution, and geopotential coefficients of the EIGEN-6C4 model to degree 360. The accuracy of the computed model (σ = ± 0.071 m, RMS = 0.069 m, maximum = 0.178 m and minimum = -0.123 m) matches the uncertainty (σ = ± 0.073 m) of 21 points randomly spaced where the geoid was computed by the geometrical leveling technique supported by GNSS positioning. The results were also better than those achieved by the Brazilian official regional geoid model (σ = ± 0.099 m, RMS = 0.208 m, maximum = 0.419 m and minimum = -0.040 m).
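As a rough sketch of the Remove-Compute-Restore bookkeeping described above (this is not GRAVTool's actual interface; the helper names and the Stokes-integration step are placeholders, and NumPy arrays on a common grid are assumed), the residual anomaly is formed by removing the long-wavelength GGM part and the short-wavelength terrain part, the residual geoid is computed from it, and the removed contributions are then restored:

    def rcr_geoid(dg_obs, dg_ggm, dg_terrain, n_ggm, n_indirect, stokes_integrate):
        # Remove: subtract the long-wavelength (GGM) and short-wavelength (terrain)
        # gravity signals from the observed anomalies.
        dg_res = dg_obs - dg_ggm - dg_terrain
        # Compute: residual co-geoid from the residual anomalies, using a
        # Stokes-integration routine supplied by the caller.
        n_res = stokes_integrate(dg_res)
        # Restore: add back the GGM geoid and the terrain's indirect effect.
        return n_ggm + n_res + n_indirect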
Probabilistic evaluation of SSME structural components
NASA Astrophysics Data System (ADS)
Rajagopal, K. R.; Newell, J. F.; Ho, H.
1991-05-01
The application is described of Composite Load Spectra (CLS) and Numerical Evaluation of Stochastic Structures Under Stress (NESSUS) family of computer codes to the probabilistic structural analysis of four Space Shuttle Main Engine (SSME) space propulsion system components. These components are subjected to environments that are influenced by many random variables. The applications consider a wide breadth of uncertainties encountered in practice, while simultaneously covering a wide area of structural mechanics. This has been done consistent with the primary design requirement for each component. The probabilistic application studies are discussed using finite element models that have been typically used in the past in deterministic analysis studies.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Isozaki, Toshikuni; Shibata, Katsuyuki
1997-04-01
Experimental and computed results applicable to Leak Before Break analysis are presented. The specific area of investigation is the effect of the temperature distribution changes due to wetting of the test pipe near the crack on the increase in the crack opening area and leak rate. Two 12-inch straight pipes subjected to both internal pressure and thermal load, but not to bending load, are modelled. The leak rate was found to be very susceptible to the metal temperature of the piping. In leak rate tests, therefore, it is recommended that temperature distribution be measured precisely for a wide area.
The impact of the Internet on cancer outcomes.
Eysenbach, Gunther
2003-01-01
Each day, more than 12.5 million health-related computer searches are conducted on the World Wide Web. Based on a meta-analysis of 24 published surveys, the author estimates that in the developed world, about 39% of persons with cancer are using the Internet, and approximately 2.3 million persons living with cancer worldwide are online. In addition, 15% to 20% of persons with cancer use the Internet "indirectly" through family and friends. Based on a comprehensive review of the literature, the available evidence on how persons with cancer are using the Internet and the effect of Internet use on persons with cancer is summarized. The author distinguishes four areas of Internet use: communication (electronic mail), community (virtual support groups), content (health information on the World Wide Web), and e-commerce. A conceptual framework summarizing the factors involved in a possible link between Internet use and cancer outcomes is presented, and future areas for research are highlighted.
Gadkowski, L. Beth; Stout, Jason E.
2008-01-01
Summary: A pulmonary cavity is a gas-filled area of the lung in the center of a nodule or area of consolidation and may be clinically observed by use of plain chest radiography or computed tomography. Cavities are present in a wide variety of infectious and noninfectious processes. This review discusses the differential diagnosis of pathological processes associated with lung cavities, focusing on infections associated with lung cavities. The goal is to provide the clinician and clinical microbiologist with an overview of the diseases most commonly associated with lung cavities, with attention to the epidemiology and clinical characteristics of the host. PMID:18400799
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sanfilippo, Antonio P.
2005-12-27
Graph theory is a branch of discrete combinatorial mathematics that studies the properties of graphs. The theory was pioneered by the Swiss mathematician Leonhard Euler in the 18th century, commenced its formal development during the second half of the 19th century, and has witnessed substantial growth during the last seventy years, with applications in areas as diverse as engineering, computer science, physics, sociology, chemistry and biology. Graph theory has also had a strong impact in computational linguistics by providing the foundations for the theory of feature structures, which has emerged as one of the most widely used frameworks for the representation of grammar formalisms.
Amen, Daniel G; Hanks, Chris; Prunella, Jill R; Green, Aisa
2007-01-01
The authors explored differences in regional cerebral blood flow in 11 impulsive murderers and 11 healthy comparison subjects using single photon emission computed tomography. The authors assessed subjects at rest and during a computerized go/no-go concentration task. Using statistical parametric mapping software, the authors performed voxel-by-voxel t tests to assess significant differences, making family-wise error corrections for multiple comparisons. Murderers were found to have significantly lower relative rCBF during concentration, particularly in areas associated with concentration and impulse control. These results indicate that nonemotionally laden stimuli may result in frontotemporal dysregulation in people predisposed to impulsive violence.
Sand deposition in the Colorado River in the Grand Canyon from flooding of the Little Colorado River
Wiele, S.M.; Graf, J.B.; Smith, J.D.
1996-01-01
Methods for computing the volume of sand deposited in the Colorado River in Grand Canyon National Park by floods in major tributaries and for determining redistribution of that sand by main-channel flows are required for successful management of sand-dependent riparian resources. We have derived flow, sediment transport, and bed evolution models based on a gridded topography developed from measured channel topography and used these models to compute deposition in a short reach of the river just downstream from the Little Colorado River, the largest tributary in the park. Model computations of deposition from a Little Colorado River flood in January 1993 were compared to bed changes measured at 15 cross sections. The total difference between changes in cross-sectional area due to deposition computed by the model and the measured changes was 6%. A wide reach with large areas of recirculating flow and large depressions in the main channel accumulated the most sand, whereas a reach with similar planimetric area but a long, narrow shape and relatively small areas of recirculating flow and small depressions in the main channel accumulated only about a seventh as much sand. About 32% of the total deposition was in recirculation zones, 65% was in the main channel, and 3% was deposited along the channel margin away from the recirculation zone. Overall, about 15% of the total input of sand from this Little Colorado River flood was deposited in the first 3 km below the confluence, suggesting that deposition of the flood-derived material extended for only several tens of kilometers downstream from the confluence.
Tanaka, Yoshihisa; Nakamura, Shinichiro; Kuriyama, Shinichi; Ito, Hiromu; Furu, Moritoshi; Komistek, Richard D; Matsuda, Shuichi
2016-11-01
It is unknown whether a computer simulation with simple models can estimate individual in vivo knee kinematics, although some complex models have predicted the knee kinematics. The purposes of this study are first, to validate the accuracy of the computer simulation with our developed model during a squatting activity in a weight-bearing deep knee bend and then, to analyze the contact area and the contact stress of the tri-condylar implants for individual patients. We compared the anteroposterior (AP) contact positions of the medial and lateral condyles calculated by the computer simulation program with the positions measured from the fluoroscopic analysis for three implanted knees. Then the contact area and the stress including the third condyle were calculated individually using finite element (FE) analysis. The motion patterns were similar in the simulation program and the fluoroscopic surveillance. Our developed model could nearly estimate the individual in vivo knee kinematics. The mean and maximum differences of the AP contact positions were 1.0 mm and 2.5 mm, respectively. At 120° of knee flexion, the contact area at the third condyle was wider than at both of the other condyles. The mean maximum contact stress at the third condyle was lower than at both of the other condyles at 90° and 120° of knee flexion. Individual bone models are required to estimate in vivo knee kinematics in our simple model. The tri-condylar implant seems to be safe for deep flexion activities due to the wide contact area and low contact stress. Copyright © 2016 Elsevier Ltd. All rights reserved.
Methodology, status and plans for development and assessment of TUF and CATHENA codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Luxat, J.C.; Liu, W.S.; Leung, R.K.
1997-07-01
An overview is presented of the Canadian two-fluid computer codes TUF and CATHENA with specific focus on the constraints imposed during development of these codes and the areas of application for which they are intended. Additionally a process for systematic assessment of these codes is described which is part of a broader, industry based initiative for validation of computer codes used in all major disciplines of safety analysis. This is intended to provide both the licensee and the regulator in Canada with an objective basis for assessing the adequacy of codes for use in specific applications. Although focused specifically on CANDU reactors, Canadian experience in developing advanced two-fluid codes to meet wide-ranging application needs while maintaining past investment in plant modelling provides a useful contribution to international efforts in this area.
Mathematical modelling of Bit-Level Architecture using Reciprocal Quantum Logic
NASA Astrophysics Data System (ADS)
Narendran, S.; Selvakumar, J.
2018-04-01
High-performance computing is in high demand for both speed and energy efficiency. Reciprocal Quantum Logic (RQL) is one technology that offers high speed and zero static power dissipation. RQL uses an AC power supply as input rather than a DC input and has three sets of basic gates. Series of reciprocal transmission lines are placed between the gates to avoid power loss and to achieve high speed. An analytical model of a bit-level architecture is developed using RQL. A major drawback of Reciprocal Quantum Logic is area: achieving a proper power supply requires splitters, which occupy a large area. Distributed arithmetic uses vector-vector multiplication in which one vector is constant and the other is a signed variable; each word acts as a binary number, and the words are rearranged and combined to form a distributed system. Distributed arithmetic is widely used in convolution and in high-performance computational devices.
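Since the distributed-arithmetic description above is compact, here is a minimal software sketch of the idea for unsigned inputs: the constant coefficients are folded into a lookup table of partial sums, and the variable inputs are processed bit-serially with shift-and-accumulate. This illustrates the arithmetic only, not any RQL circuit; the function and helper names are hypothetical.

    def _all_bit_combos(n):
        # All 2**n combinations of n bits, indexed the same way as combo_index below.
        return [[(i >> k) & 1 for k in range(n)] for i in range(2 ** n)]

    def da_inner_product(coeffs, xs, n_bits=8):
        # Compute sum(c*x) for fixed integer coeffs and non-negative n_bit integers xs,
        # using a lookup table of partial coefficient sums (distributed arithmetic).
        lut = [sum(c for c, bit in zip(coeffs, combo) if bit)
               for combo in _all_bit_combos(len(coeffs))]
        acc = 0
        for b in range(n_bits - 1, -1, -1):          # process bits from MSB to LSB
            combo_index = 0
            for i, x in enumerate(xs):               # gather bit b of every input
                combo_index |= ((x >> b) & 1) << i
            acc = (acc << 1) + lut[combo_index]      # shift-and-accumulate the LUT entry
        return acc

For example, da_inner_product([3, 5], [2, 4]) returns 26, matching 3*2 + 5*4; the multipliers never appear explicitly, only table lookups and additions.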
CIMOSA process classification for business process mapping in non-manufacturing firms: A case study
NASA Astrophysics Data System (ADS)
Latiffianti, Effi; Siswanto, Nurhadi; Wiratno, Stefanus Eko; Saputra, Yudha Andrian
2017-11-01
A business process mapping is one important means to enable an enterprise to effectively manage the value chain. One of the widely used approaches to classifying business processes for mapping purposes is the Computer Integrated Manufacturing System Open Architecture (CIMOSA). CIMOSA was initially designed for Computer Integrated Manufacturing (CIM) system based enterprises. This paper aims to analyze the use of the CIMOSA process classification for business process mapping in firms that do not fall within the area of CIM. Three firms from different business areas that have used the CIMOSA process classification were observed: an airline firm, a marketing and trading firm for oil and gas products, and an industrial estate management firm. The result of the research has shown that CIMOSA can be used in non-manufacturing firms with some adjustment. The adjustment includes addition, reduction, or modification of some processes suggested by the CIMOSA process classification, as evidenced by the case studies.
Kaliniene, Gintare; Ustinaviciene, Ruta; Skemiene, Lina; Vaiciulis, Vidmantas; Vasilavicius, Paulius
2016-10-07
Information technologies in occupational activities have been developing very rapidly. Epidemiological studies have shown that musculoskeletal disorders are widely prevalent among employees working with a computer. The aim of this study was to evaluate the prevalence of musculoskeletal pain in various anatomical areas and its associations with individual, ergonomic, and psychosocial factors among computer workers of the public sector in Kaunas County, Lithuania. The investigation, consisting of two parts - a questionnaire study (Nordic Musculoskeletal Questionnaire and Copenhagen Psychosocial Questionnaire) and direct observation (evaluation of work ergonomics using the Rapid Upper Limb Assessment [RULA]) - was carried out in three randomly selected public sector companies of Kaunas County. The representative study sample comprised 513 public service office workers. The prevalence of musculoskeletal pain in five anatomical areas of the body (shoulders, elbows, wrists/hands, as well as upper and low back) was evaluated. The prevalence rates of shoulder, elbow, wrist/hand, upper and low back pain were 50.5 %, 20.3 %, 26.3 %, 44.8 %, and 56.1 %, respectively. Individual factors such as gender, age, computer work experience, and body mass index were found to be significant for musculoskeletal pain in various musculoskeletal regions. The respondents reporting pain in the shoulder, wrist/hand, upper back, and low back areas had a statistically significantly higher mean RULA score. The duration of working with a computer was found to be a significant factor for shoulder pain. High quantitative demands were related to musculoskeletal pain in all investigated anatomical areas except for the low back; weak social support was a significant predictor for complaints in the upper and low back areas. This study confirmed associations between musculoskeletal pain and work ergonomics; therefore, preventive measures at the workplace should be directed to the improvement of the ergonomic work environment, education, and workload optimization.
Massively parallel sparse matrix function calculations with NTPoly
NASA Astrophysics Data System (ADS)
Dawson, William; Nakajima, Takahito
2018-04-01
We present NTPoly, a massively parallel library for computing the functions of sparse, symmetric matrices. The theory of matrix functions is a well developed framework with a wide range of applications including differential equations, graph theory, and electronic structure calculations. One particularly important application area is diagonalization free methods in quantum chemistry. When the input and output of the matrix function are sparse, methods based on polynomial expansions can be used to compute matrix functions in linear time. We present a library based on these methods that can compute a variety of matrix functions. Distributed memory parallelization is based on a communication avoiding sparse matrix multiplication algorithm. OpenMP task parallellization is utilized to implement hybrid parallelization. We describe NTPoly's interface and show how it can be integrated with programs written in many different programming languages. We demonstrate the merits of NTPoly by performing large scale calculations on the K computer.
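As a single-node illustration of the polynomial-expansion idea (this is not NTPoly's API; it uses SciPy sparse matrices and a fixed McWeeny purification polynomial, and the spectral-scaling factor beta is an assumption the caller must supply so that the initial spectrum maps into [0, 1]):

    import scipy.sparse as sp
    from scipy.sparse.linalg import norm

    def mcweeny_purify(h, mu, beta, n_iter=50, tol=1e-8):
        # Polynomial approximation of the matrix step function theta(mu*I - H),
        # i.e. a zero-temperature density matrix, by iterating p(X) = 3X^2 - 2X^3.
        n = h.shape[0]
        eye = sp.identity(n, format="csr")
        x = 0.5 * eye - beta * (h - mu * eye)      # linear map of the spectrum into [0, 1]
        for _ in range(n_iter):
            x2 = x @ x
            x_new = 3 * x2 - 2 * (x2 @ x)          # sparse-sparse products keep sparsity
            if norm(x_new - x) < tol:
                return x_new
            x = x_new
        return x

When the Hamiltonian and density matrix stay sparse, each iteration costs only sparse matrix multiplications, which is the linear-scaling, diagonalization-free behavior the abstract refers to.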
NASA Astrophysics Data System (ADS)
Strzałka, Dominik; Dymora, Paweł; Mazurek, Mirosław
2018-02-01
In this paper we present some preliminary results in the field of computer systems management in relation to Tsallis thermostatistics and the ubiquitous problem of hardware-limited resources. In the case of systems with non-deterministic behaviour, management of their resources is a key point that guarantees their acceptable performance and proper working. This is a very broad problem that poses many challenges in areas such as finance, transport, water and food, and health. We focus on computer systems, with attention paid to cache memory, and propose to use an analytical model that is able to connect the non-extensive entropy formalism, long-range dependencies, management of system resources and queuing theory. The obtained analytical results are related to a practical experiment showing interesting and valuable results.
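For readers unfamiliar with the non-extensive formalism mentioned above, a minimal sketch of the Tsallis entropy of an empirical distribution is shown below (for example, hypothetical cache-line access counts); the entropic index q and the example counts are illustrative only.

    import numpy as np

    def tsallis_entropy(counts, q):
        # Non-extensive (Tsallis) entropy S_q = (1 - sum_i p_i^q) / (q - 1);
        # it approaches the Shannon entropy as q -> 1.
        p = np.asarray(counts, dtype=float)
        p = p[p > 0] / p.sum()
        if abs(q - 1.0) < 1e-12:
            return float(-(p * np.log(p)).sum())    # Shannon limit
        return float((1.0 - (p ** q).sum()) / (q - 1.0))

    # e.g. entropy of hypothetical cache-line access counts:
    # tsallis_entropy([120, 40, 5, 1], q=1.3)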
Stride, E.; Cheema, U.
2017-01-01
The growth of bubbles within the body is widely believed to be the cause of decompression sickness (DCS). Dive computer algorithms that aim to prevent DCS by mathematically modelling bubble dynamics and tissue gas kinetics are challenging to validate. This is due to lack of understanding regarding the mechanism(s) leading from bubble formation to DCS. In this work, a biomimetic in vitro tissue phantom and a three-dimensional computational model, comprising a hyperelastic strain-energy density function to model tissue elasticity, were combined to investigate key areas of bubble dynamics. A sensitivity analysis indicated that the diffusion coefficient was the most influential material parameter. Comparison of computational and experimental data revealed the bubble surface's diffusion coefficient to be 30 times smaller than that in the bulk tissue and dependent on the bubble's surface area. The initial size, size distribution and proximity of bubbles within the tissue phantom were also shown to influence their subsequent dynamics highlighting the importance of modelling bubble nucleation and bubble–bubble interactions in order to develop more accurate dive algorithms. PMID:29263127
New Parallel Algorithms for Landscape Evolution Model
NASA Astrophysics Data System (ADS)
Jin, Y.; Zhang, H.; Shi, Y.
2017-12-01
Most landscape evolution models (LEM) developed in the last two decades solve the diffusion equation to simulate the transportation of surface sediments. This numerical approach is difficult to parallelize due to the computation of the drainage area for each node, which requires a huge amount of communication if run in parallel. In order to overcome this difficulty, we developed two parallel algorithms for LEM with a stream net. One algorithm handles the partition of the grid with traditional methods and applies an efficient global reduction algorithm to compute drainage areas and transport rates for the stream net; the other algorithm is based on a new partition algorithm, which partitions the nodes in catchments between processes first, and then partitions the cells according to the partition of nodes. Both methods focus on decreasing communication between processes and take advantage of massive computing techniques, and numerical experiments show that they are both adequate to handle large-scale problems with millions of cells. We implemented the two algorithms in our program based on the widely used finite element library deal.II, so that it can be easily coupled with ASPECT.
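The communication-heavy quantity mentioned above is the drainage area. A minimal serial sketch of its computation on a single-flow-direction grid is given below; the paper's parallel reduction and catchment partitioning are not reproduced, and the receiver array and uniform cell area are assumptions of this illustration.

    import numpy as np

    def drainage_area(receiver, elevation, cell_area=1.0):
        # Accumulate upstream drainage area when receiver[i] is the index of the
        # cell that cell i drains to (outlets point to themselves).
        n = len(receiver)
        area = np.full(n, cell_area, dtype=float)
        # Visit cells from highest to lowest elevation so that every donor is
        # processed before the cell it drains into.
        for i in np.argsort(elevation)[::-1]:
            r = receiver[i]
            if r != i:
                area[r] += area[i]
        return area

Because every cell's area depends on all cells upstream of it, a naive domain decomposition must exchange partial sums across partition boundaries, which is exactly the communication cost the two proposed algorithms try to reduce.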
Group implicit concurrent algorithms in nonlinear structural dynamics
NASA Technical Reports Server (NTRS)
Ortiz, M.; Sotelino, E. D.
1989-01-01
During the 70's and 80's, considerable effort was devoted to developing efficient and reliable time stepping procedures for transient structural analysis. Mathematically, the equations governing this type of problems are generally stiff, i.e., they exhibit a wide spectrum in the linear range. The algorithms best suited to this type of applications are those which accurately integrate the low frequency content of the response without necessitating the resolution of the high frequency modes. This means that the algorithms must be unconditionally stable, which in turn rules out explicit integration. The most exciting possibility in the algorithms development area in recent years has been the advent of parallel computers with multiprocessing capabilities. So, this work is mainly concerned with the development of parallel algorithms in the area of structural dynamics. A primary objective is to devise unconditionally stable and accurate time stepping procedures which lend themselves to an efficient implementation in concurrent machines. Some features of the new computer architecture are summarized. A brief survey of current efforts in the area is presented. A new class of concurrent procedures, or Group Implicit algorithms is introduced and analyzed. The numerical simulation shows that GI algorithms hold considerable promise for application in coarse grain as well as medium grain parallel computers.
Computing at DESY — current setup, trends and strategic directions
NASA Astrophysics Data System (ADS)
Ernst, Michael
1998-05-01
Since the HERA experiments H1 and ZEUS started data taking in '92, the computing environment at DESY has changed dramatically. Running a mainframe centred computing for more than 20 years, DESY switched to a heterogeneous, fully distributed computing environment within only about two years in almost every corner where computing has its applications. The computing strategy was highly influenced by the needs of the user community. The collaborations are usually limited by current technology, and their ever increasing demands are the driving force for central computing to always move close to the technology edge. While DESY's central computing has a multidecade experience in running Central Data Recording/Central Data Processing for HEP experiments, the most challenging task today is to provide for clear and homogeneous concepts in the desktop area. Given that lowest-level commodity hardware draws more and more attention, combined with the financial constraints we are facing already today, we quickly need concepts for integrated support of a versatile device which has the potential to move into basically any computing area in HEP. Though commercial solutions, especially addressing the PC management/support issues, are expected to come to market in the next 2-3 years, we need to provide for suitable solutions now. Buying PCs at DESY currently at a rate of about 30/month will otherwise absorb any available manpower in central computing and still leave hundreds of unhappy people alone. Though certainly not the only region, the desktop issue is one of the most important ones where we need HEP-wide collaboration to a large extent, and right now. Taking into account that there is traditionally no room for R&D at DESY, collaboration, meaning sharing experience and development resources within the HEP community, is a predominant factor for us.
Semantics-based distributed I/O with the ParaMEDIC framework.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Balaji, P.; Feng, W.; Lin, H.
2008-01-01
Many large-scale applications simultaneously rely on multiple resources for efficient execution. For example, such applications may require both large compute and storage resources; however, very few supercomputing centers can provide large quantities of both. Thus, data generated at the compute site oftentimes has to be moved to a remote storage site for either storage or visualization and analysis. Clearly, this is not an efficient model, especially when the two sites are distributed over a wide-area network. Thus, we present a framework called 'ParaMEDIC: Parallel Metadata Environment for Distributed I/O and Computing' which uses application-specific semantic information to convert the generated data to orders-of-magnitude smaller metadata at the compute site, transfer the metadata to the storage site, and re-process the metadata at the storage site to regenerate the output. Specifically, ParaMEDIC trades a small amount of additional computation (in the form of data post-processing) for a potentially significant reduction in data that needs to be transferred in distributed environments.
A survey of GPU-based medical image computing techniques
Shi, Lin; Liu, Wen; Zhang, Heye; Xie, Yongming
2012-01-01
Medical imaging currently plays a crucial role throughout the entire clinical applications from medical scientific research to diagnostics and treatment planning. However, medical imaging procedures are often computationally demanding due to the large three-dimensional (3D) medical datasets to process in practical clinical applications. With the rapidly enhancing performances of graphics processors, improved programming support, and excellent price-to-performance ratio, the graphics processing unit (GPU) has emerged as a competitive parallel computing platform for computationally expensive and demanding tasks in a wide range of medical image applications. The major purpose of this survey is to provide a comprehensive reference source for the starters or researchers involved in GPU-based medical image processing. Within this survey, the continuous advancement of GPU computing is reviewed and the existing traditional applications in three areas of medical image processing, namely, segmentation, registration and visualization, are surveyed. The potential advantages and associated challenges of current GPU-based medical imaging are also discussed to inspire future applications in medicine. PMID:23256080
Unified Computational Methods for Regression Analysis of Zero-Inflated and Bound-Inflated Data
Yang, Yan; Simpson, Douglas
2010-01-01
Bounded data with excess observations at the boundary are common in many areas of application. Various individual cases of inflated mixture models have been studied in the literature for bound-inflated data, yet the computational methods have been developed separately for each type of model. In this article we use a common framework for computing these models, and expand the range of models for both discrete and semi-continuous data with point inflation at the lower boundary. The quasi-Newton and EM algorithms are adapted and compared for estimation of model parameters. The numerical Hessian and generalized Louis method are investigated as means for computing standard errors after optimization. Correlated data are included in this framework via generalized estimating equations. The estimation of parameters and effectiveness of standard errors are demonstrated through simulation and in the analysis of data from an ultrasound bioeffect study. The unified approach enables reliable computation for a wide class of inflated mixture models and comparison of competing models. PMID:20228950
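As a minimal illustration of the EM approach for one member of this model class - an intercept-only zero-inflated Poisson with no covariates, far simpler than the regression setting treated in the article - the sketch below alternates a responsibility step for the excess zeros with closed-form parameter updates.

    import numpy as np

    def zip_em(y, n_iter=200, tol=1e-8):
        # EM for an intercept-only zero-inflated Poisson: returns (pi, lam),
        # the zero-inflation probability and the Poisson mean.
        y = np.asarray(y, dtype=float)
        pi, lam = 0.5, max(y.mean(), 1e-6)
        for _ in range(n_iter):
            # E-step: posterior probability that each observed zero came from
            # the point-mass (inflation) component rather than the Poisson.
            z = np.where(y == 0, pi / (pi + (1 - pi) * np.exp(-lam)), 0.0)
            # M-step: update the mixture weight and the Poisson mean.
            pi_new = z.mean()
            lam_new = ((1 - z) * y).sum() / (1 - z).sum()
            if abs(pi_new - pi) + abs(lam_new - lam) < tol:
                return pi_new, lam_new
            pi, lam = pi_new, lam_new
        return pi, lam

In the full regression setting, the M-step updates become weighted logistic and Poisson regressions on covariates, which is where the quasi-Newton alternative and the standard-error machinery discussed in the article come in.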
Ultra wide band 3-D cross section (RCS) holography
NASA Astrophysics Data System (ADS)
Collins, H. D.; Hall, T. E.
1992-07-01
Ultra wide band impulse holography is an exciting new concept for predictive radar cross section (RCS) evaluation employing near-field measurements. Reconstruction of the near-field hologram data maps the target's scattering areas, and uniquely identifies the 'hot spot' locations on the target. In addition, the target and calibration sphere's plane wave angular spectrums are computed (via digital algorithm) and used to generate the target's far-field RCS values in three dimensions for each frequency component in the impulse. Thin and thick targets are defined in terms of their near-field amplitude variations in range. Range gating and computer holographic techniques are applied to correct these variations. Preliminary experimental results on various targets verify the concept of RCS holography. The unique 3-D presentation (i.e., typically containing 524,288 RCS values for a 1024 × 512 sampled aperture for every frequency component) illustrates the efficacy of target recognition in terms of its far-field plane wave angular spectrum image. RCS images can then be viewed at different angles for target recognition, etc.
Calibrating a Rainfall-Runoff and Routing Model for the Continental United States
NASA Astrophysics Data System (ADS)
Jankowfsky, S.; Li, S.; Assteerawatt, A.; Tillmanns, S.; Hilberts, A.
2014-12-01
Catastrophe risk models are widely used in the insurance industry to estimate the cost of risk. The models consist of hazard models linked to vulnerability and financial loss models. In flood risk models, the hazard model generates inundation maps. In order to develop country wide inundation maps for different return periods a rainfall-runoff and routing model is run using stochastic rainfall data. The simulated discharge and runoff is then input to a two dimensional inundation model, which produces the flood maps. In order to get realistic flood maps, the rainfall-runoff and routing models have to be calibrated with observed discharge data. The rainfall-runoff model applied here is a semi-distributed model based on the Topmodel (Beven and Kirkby, 1979) approach which includes additional snowmelt and evapotranspiration models. The routing model is based on the Muskingum-Cunge (Cunge, 1969) approach and includes the simulation of lakes and reservoirs using the linear reservoir approach. Both models were calibrated using the multiobjective NSGA-II (Deb et al., 2002) genetic algorithm with NLDAS forcing data and around 4500 USGS discharge gauges for the period from 1979-2013. Additional gauges having no data after 1979 were calibrated using CPC rainfall data. The model performed well in wetter regions and shows the difficulty of simulating areas with sinks such as karstic areas or dry areas. Beven, K., Kirkby, M., 1979. A physically based, variable contributing area model of basin hydrology. Hydrol. Sci. Bull. 24 (1), 43-69. Cunge, J.A., 1969. On the subject of a flood propagation computation method (Muskingum method), J. Hydr. Research, 7(2), 205-230. Deb, K., Pratap, A., Agarwal, S., Meyarivan, T., 2002. A fast and elitist multiobjective genetic algorithm: NSGA-II, IEEE Transactions on evolutionary computation, 6(2), 182-197.
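For reference, a minimal sketch of the linear Muskingum routing step underlying the Muskingum-Cunge approach is shown below; in the Muskingum-Cunge variant, K and X would be derived from channel geometry and wave celerity, whereas here they are simply supplied by the caller (an illustrative simplification, not the authors' implementation).

    def muskingum_route(inflow, K, X, dt, initial_outflow=None):
        # Route an inflow hydrograph through a reach with Muskingum storage
        # S = K * (X*I + (1-X)*O); K and dt share the same time unit, 0 <= X <= 0.5.
        denom = 2.0 * K * (1.0 - X) + dt
        c1 = (dt - 2.0 * K * X) / denom
        c2 = (dt + 2.0 * K * X) / denom
        c3 = (2.0 * K * (1.0 - X) - dt) / denom    # c1 + c2 + c3 == 1
        out = [inflow[0] if initial_outflow is None else initial_outflow]
        for j in range(1, len(inflow)):
            out.append(c1 * inflow[j] + c2 * inflow[j - 1] + c3 * out[-1])
        return out

Calibration in the study then amounts to searching parameters such as K and X (together with the rainfall-runoff parameters) so that routed flows match the gauge records, which is what the NSGA-II optimization performs.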
Fast Particle Methods for Multiscale Phenomena Simulations
NASA Technical Reports Server (NTRS)
Koumoutsakos, P.; Wray, A.; Shariff, K.; Pohorille, Andrew
2000-01-01
We are developing particle methods oriented at improving computational modeling capabilities of multiscale physical phenomena in: (i) high Reynolds number unsteady vortical flows, (ii) particle laden and interfacial flows, (iii) molecular dynamics studies of nanoscale droplets and studies of the structure, functions, and evolution of the earliest living cell. The unifying computational approach involves particle methods implemented in parallel computer architectures. The inherent adaptivity, robustness and efficiency of particle methods makes them a multidisciplinary computational tool capable of bridging the gap between micro-scale and continuum flow simulations. Using efficient tree data structures, multipole expansion algorithms, and improved particle-grid interpolation, particle methods allow for simulations using millions of computational elements, making possible the resolution of a wide range of length and time scales of these important physical phenomena. The current challenges in these simulations are: [i] the proper formulation of particle methods at the molecular and continuous level for the discretization of the governing equations, [ii] the resolution of the wide range of time and length scales governing the phenomena under investigation, [iii] the minimization of numerical artifacts that may interfere with the physics of the systems under consideration, and [iv] the parallelization of processes such as tree traversal and grid-particle interpolations. We are conducting simulations using vortex methods, molecular dynamics and smooth particle hydrodynamics, exploiting their unifying concepts such as: the solution of the N-body problem in parallel computers, highly accurate particle-particle and grid-particle interpolations, parallel FFTs and the formulation of processes such as diffusion in the context of particle methods. This approach enables us to transcend among seemingly unrelated areas of research.
2008 Homeland Security S and T Stakeholders Conference West-Volume 3 Tuesday
2008-01-16
[Slide fragments] Architecture (PNNL SRS): online data collection/entry; data warehouse; on-demand analysis and reporting tools; reports, charts and graphs; visual/data... Sustainability 2007-2016: our region-wide investment includes all PANYNJ business areas. Computer statistical analysis (COMPSTAT): NYPD 1990s; personnel management... Coast Guard, and public health expertise, depth, agility. Our value-added capabilities: risk analysis; operations analysis.
Real-Time, Wide Area Dispatch of Mobil Tank Trucks
1987-01-01
...human dispatchers it assists. Using CAD, Mobil has substantially reduced costs and staff while improving customer service. In the spring of 1985, a...process by establishing the Mobil order response center (MORC). To use MORC, the customer dials a toll-free number, available 24 hours a day, seven...MATS. Figure 3: Mobil light products order and dispatch information flow. Customers call an audio response computer system named MORC (Mobil order...
Transonic Unsteady Aerodynamics and Aeroelasticity 1987, part 1
NASA Technical Reports Server (NTRS)
Bland, Samuel R. (Compiler)
1989-01-01
Computational fluid dynamics methods have been widely accepted for transonic aeroelastic analysis. Previously, calculations with the TSD methods were used for 2-D airfoils, but now the TSD methods are applied to the aeroelastic analysis of the complete aircraft. The Symposium papers are grouped into five subject areas, two of which are covered in this part: (1) Transonic Small Disturbance (TSD) theory for complete aircraft configurations; and (2) Full potential and Euler equation methods.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rosso, A.
Since the large North Eastern power system blackout on August 14, 2003, U.S. electric utilities have spent a lot of effort on preventing power system cascading outages. Two of the main causes of the August 14, 2003 blackout were inadequate situational awareness and inadequate operator training. In addition to the enhancements of the infrastructure of the interconnected power systems, more research and development of advanced power system applications are required for improving the wide-area security monitoring, operation and planning in order to prevent large-scale cascading outages of interconnected power systems. It is critically important to improve the wide-area situation awareness of the operators or operational engineers and regional reliability coordinators of large interconnected systems. With the installation of a large number of phasor measurement units (PMU) and the related communication infrastructure, it will be possible to improve the operators' situation awareness and to quickly identify the sequence of events during a large system disturbance for post-event analysis using real-time or historical synchrophasor data. The purpose of this project was to develop and demonstrate a novel synchrophasor-based comprehensive situational awareness system for control centers of power transmission systems. The developed system, named WASA (Wide Area Situation Awareness), is intended to improve situational awareness at control centers of the power system operators and regional reliability coordinators. It consists of the following main software modules: • Wide-area visualizations of real-time frequency, voltage, and phase angle measurements and their contour displays for security monitoring. • Online detection and location of a major event (location, time, size, and type, such as generator or line outage). • Near-real-time event replay (in seconds) after a major event occurs. • Early warning of potential wide-area stability problems. The system has been deployed and demonstrated at the Tennessee Valley Authority (TVA) and the ISO New England system using real-time synchrophasor data from openPDC. Apart from the software product, the outcome of this project consists of a set of technical reports and papers describing the mathematical foundations and computational approaches of the different tools and modules, implementation issues and considerations, lessons learned, and the results of validation processes.
Dimitrakopoulos, P; Kuriakose, S
2015-04-14
Determination of the elastic properties of the membrane of artificial capsules is essential for the better design of the various devices that are utilized in their engineering and biomedical applications. However this task is complicated owing to the combined effects of the shear and area-dilatation moduli on the capsule deformation. Based on computational investigation, we propose a new methodology to determine a membrane's shear modulus, independent of its area-dilatation modulus, by flowing strain-hardening capsules in a converging micro-capillary of comparable size under Stokes flow conditions, and comparing the experimental measurements of the capsule elongation overshooting with computational data. The capsule prestress, if any, can also be determined with the same methodology. The elongation overshooting is practically independent of the viscosity ratio for low and moderate viscosity ratios, and thus a wide range of capsule fluids can be employed. Our proposed experimental device can be readily produced via glass fabrication while owing to the continuous flow in the micro-capillary, the characterization of a large number of artificial capsules is possible.
Earth-Space Link Attenuation Estimation via Ground Radar Kdp
NASA Technical Reports Server (NTRS)
Bolen, Steven M.; Benjamin, Andrew L.; Chandrasekar, V.
2003-01-01
A method of predicting attenuation on microwave Earth/spacecraft communication links, over wide areas and under various atmospheric conditions, has been developed. In the area around the ground station locations, a nearly horizontally aimed polarimetric S-band ground radar measures the specific differential phase (Kdp) along the Earth-space path. The specific attenuation along a path of interest is then computed by use of a theoretical model of the relationship between the measured S-band specific differential phase and the specific attenuation at the frequency to be used on the communication link. The model includes effects of rain, wet ice, and other forms of precipitation. The attenuation on the path of interest is then computed by integrating the specific attenuation over the length of the path. This method can be used to determine statistics of signal degradation on Earth/spacecraft communication links. It can also be used to obtain real-time estimates of attenuation along multiple Earth/spacecraft links that are parts of a communication network operating within the radar coverage area, thereby enabling better management of the network through appropriate dynamic routing along the best combination of links.
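The theoretical model relating Kdp to specific attenuation is not reproduced in this summary; a common simplification treats specific attenuation as a power law of Kdp and integrates it along the path, as in the sketch below (the coefficients a and b depend on link frequency and precipitation type and are placeholders, not values from this work).

    import numpy as np

    def path_attenuation_db(kdp_profile, range_step_km, a, b):
        # Specific attenuation (dB/km) approximated as a power law of Kdp (deg/km),
        # then integrated along the Earth-space path by a simple Riemann sum.
        kdp = np.clip(np.asarray(kdp_profile, dtype=float), 0.0, None)
        specific_att = a * kdp ** b
        return float(specific_att.sum() * range_step_km)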
Martins, Luciana Flaquer; Vigorito, Julio Wilson
2013-01-01
To determine the characteristics of facial soft tissues at rest and wide smile, and their possible relation to the facial type. We analyzed a sample of forty-eight young female adults, aged between 19.10 and 40 years old, with a mean age of 30.9 years, who had balanced profile and passive lip seal. Cone beam computed tomographies were performed at rest and wide smile postures on the entire sample which was divided into three groups according to individual facial types. Soft tissue features analysis of the lips, nose, zygoma and chin were done in sagittal, axial and frontal axis tomographic views. No differences were observed in any of the facial type variables for the static analysis of facial structures at both rest and wide smile postures. Dynamic analysis showed that brachifacial types are more sensitive to movement, presenting greater sagittal lip contraction. However, the lip movement produced by this type of face results in a narrow smile, with smaller tooth exposure area when compared with other facial types. Findings pointed out that the position of the upper lip should be ahead of the lower lip, and the latter, ahead of the pogonion. It was also found that the facial type does not impact the positioning of these structures. Additionally, the use of cone beam computed tomography may be a valuable method to study craniofacial features.
National Fusion Collaboratory: Grid Computing for Simulations and Experiments
NASA Astrophysics Data System (ADS)
Greenwald, Martin
2004-05-01
The National Fusion Collaboratory Project is creating a computational grid designed to advance scientific understanding and innovation in magnetic fusion research by facilitating collaborations, enabling more effective integration of experiments, theory and modeling and allowing more efficient use of experimental facilities. The philosophy of FusionGrid is that data, codes, analysis routines, visualization tools, and communication tools should be thought of as network available services, easily used by the fusion scientist. In such an environment, access to services is stressed rather than portability. By building on a foundation of established computer science toolkits, deployment time can be minimized. These services all share the same basic infrastructure that allows for secure authentication and resource authorization which allows stakeholders to control their own resources such as computers, data and experiments. Code developers can control intellectual property, and fair use of shared resources can be demonstrated and controlled. A key goal is to shield scientific users from the implementation details such that transparency and ease-of-use are maximized. The first FusionGrid service deployed was the TRANSP code, a widely used tool for transport analysis. Tools for run preparation, submission, monitoring and management have been developed and shared among a wide user base. This approach saves user sites from the laborious effort of maintaining such a large and complex code while at the same time reducing the burden on the development team by avoiding the need to support a large number of heterogeneous installations. Shared visualization and A/V tools are being developed and deployed to enhance long-distance collaborations. These include desktop versions of the Access Grid, a highly capable multi-point remote conferencing tool and capabilities for sharing displays and analysis tools over local and wide-area networks.
Brown, C Hendricks; Mohr, David C; Gallo, Carlos G; Mader, Christopher; Palinkas, Lawrence; Wingood, Gina; Prado, Guillermo; Kellam, Sheppard G; Pantin, Hilda; Poduska, Jeanne; Gibbons, Robert; McManus, John; Ogihara, Mitsunori; Valente, Thomas; Wulczyn, Fred; Czaja, Sara; Sutcliffe, Geoff; Villamar, Juan; Jacobs, Christopher
2013-06-01
African Americans and Hispanics in the United States have much higher rates of HIV than non-minorities. There is now strong evidence that a range of behavioral interventions are efficacious in reducing sexual risk behavior in these populations. Although a handful of these programs are just beginning to be disseminated widely, we still have not implemented effective programs to a level that would reduce the population incidence of HIV for minorities. We proposed that innovative approaches involving computational technologies be explored for their use in both developing new interventions and in supporting wide-scale implementation of effective behavioral interventions. Mobile technologies have a place in both of these activities. First, mobile technologies can be used in sensing contexts and interacting to the unique preferences and needs of individuals at times where intervention to reduce risk would be most impactful. Second, mobile technologies can be used to improve the delivery of interventions by facilitators and their agencies. Systems science methods including social network analysis, agent-based models, computational linguistics, intelligent data analysis, and systems and software engineering all have strategic roles that can bring about advances in HIV prevention in minority communities. Using an existing mobile technology for depression and 3 effective HIV prevention programs, we illustrated how 8 areas in the intervention/implementation process can use innovative computational approaches to advance intervention adoption, fidelity, and sustainability.
Parallelization of fine-scale computation in Agile Multiscale Modelling Methodology
NASA Astrophysics Data System (ADS)
Macioł, Piotr; Michalik, Kazimierz
2016-10-01
Nowadays, multiscale modelling of material behavior is an extensively developed area. An important obstacle to its wide application is its high computational demand. Among other solutions, the parallelization of multiscale computations is promising. Heterogeneous multiscale models are good candidates for parallelization, since communication between sub-models is limited. In this paper, the possibility of parallelizing multiscale models based on the Agile Multiscale Modelling Methodology framework is discussed. A sequential, FEM-based macroscopic model has been combined with concurrently computed fine-scale models, employing the MatCalc thermodynamic simulator. The main issues investigated in this work are (i) the speed-up of multiscale models, with special focus on fine-scale computations, and (ii) the decrease in computation quality caused by parallel execution. Speed-up has been evaluated on the basis of Amdahl's law. The problem of `delay error', arising from the parallel execution of fine-scale sub-models controlled by the sequential macroscopic sub-model, is discussed. Some technical aspects of combining third-party commercial modelling software with an in-house multiscale framework and an MPI library are also discussed.
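A rough numerical illustration of the Amdahl's-law estimate mentioned above is sketched below in Python. It assumes a serial fraction for the sequential macroscopic sub-model and perfectly parallel fine-scale sub-models; the fraction and worker counts are illustrative, not values from the paper.

def amdahl_speedup(serial_fraction, workers):
    # Amdahl's law: the serial part never speeds up, the rest scales with the workers.
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / workers)

# Assume 10% of the wall time stays sequential (the macroscopic FEM step).
for p in (2, 4, 8, 16, 32):
    print(p, "workers ->", round(amdahl_speedup(0.1, p), 2), "x speed-up")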
The Center for Computational Biology: resources, achievements, and challenges
Dinov, Ivo D; Thompson, Paul M; Woods, Roger P; Van Horn, John D; Shattuck, David W; Parker, D Stott
2011-01-01
The Center for Computational Biology (CCB) is a multidisciplinary program where biomedical scientists, engineers, and clinicians work jointly to combine modern mathematical and computational techniques to perform phenotypic and genotypic studies of biological structure, function, and physiology in health and disease. CCB has developed a computational framework built around the Manifold Atlas, an integrated biomedical computing environment that enables statistical inference on biological manifolds. These manifolds model biological structures, features, shapes, and flows, and support sophisticated morphometric and statistical analyses. The Manifold Atlas includes tools, workflows, and services for multimodal population-based modeling and analysis of biological manifolds. The broad spectrum of biomedical topics explored by CCB investigators includes the study of normal and pathological brain development, maturation and aging, discovery of associations between neuroimaging and genetic biomarkers, and the modeling, analysis, and visualization of biological shape, form, and size. CCB supports a wide range of short-term and long-term collaborations with outside investigators, which drive the center's computational developments and focus the validation and dissemination of CCB resources to new areas and scientific domains. PMID:22081221
The Center for Computational Biology: resources, achievements, and challenges.
Toga, Arthur W; Dinov, Ivo D; Thompson, Paul M; Woods, Roger P; Van Horn, John D; Shattuck, David W; Parker, D Stott
2012-01-01
The Center for Computational Biology (CCB) is a multidisciplinary program where biomedical scientists, engineers, and clinicians work jointly to combine modern mathematical and computational techniques to perform phenotypic and genotypic studies of biological structure, function, and physiology in health and disease. CCB has developed a computational framework built around the Manifold Atlas, an integrated biomedical computing environment that enables statistical inference on biological manifolds. These manifolds model biological structures, features, shapes, and flows, and support sophisticated morphometric and statistical analyses. The Manifold Atlas includes tools, workflows, and services for multimodal population-based modeling and analysis of biological manifolds. The broad spectrum of biomedical topics explored by CCB investigators includes the study of normal and pathological brain development, maturation and aging, discovery of associations between neuroimaging and genetic biomarkers, and the modeling, analysis, and visualization of biological shape, form, and size. CCB supports a wide range of short-term and long-term collaborations with outside investigators, which drive the center's computational developments and focus the validation and dissemination of CCB resources to new areas and scientific domains.
Research and Development Annual Report, 1992
NASA Technical Reports Server (NTRS)
1993-01-01
Issued as a companion to Johnson Space Center's Research and Technology Annual Report, which reports JSC accomplishments under NASA Research and Technology Operating Plan (RTOP) funding, this report describes 42 additional JSC projects that are funded through sources other than the RTOP. Emerging technologies in four major disciplines are summarized: space systems technology, medical and life sciences, mission operations, and computer systems. Although these projects focus on support of human spacecraft design, development, and safety, most have wide civil and commercial applications in areas such as advanced materials, superconductors, advanced semiconductors, digital imaging, high density data storage, high performance computers, optoelectronics, artificial intelligence, robotics and automation, sensors, biotechnology, medical devices and diagnosis, and human factors engineering.
NASA Astrophysics Data System (ADS)
Neradilová, Hana; Fedorko, Gabriel
2016-12-01
Automated logistic systems are becoming more widely used within enterprise logistics processes. Their main advantage is that they increase the efficiency and reliability of logistics processes. In evaluating their effectiveness, it is necessary to take into account the economic aspect of the entire process. However, many users ignore or underestimate this aspect, which is a mistake. One of the reasons the economic aspect is overlooked is that obtaining information for such an analysis is not easy. The aim of this paper is to present the possibilities of computer simulation methods for obtaining the data needed for a full-scale economic analysis.
The JSC Research and Development Annual Report 1993
NASA Technical Reports Server (NTRS)
1994-01-01
Issued as a companion to Johnson Space Center's Research and Technology Annual Report, which reports JSC accomplishments under NASA Research and Technology Operating Plan (RTOP) funding, this report describes 47 additional projects that are funded through sources other than the RTOP. Emerging technologies in four major disciplines are summarized: space systems technology, medical and life sciences, mission operations, and computer systems. Although these projects focus on support of human spacecraft design, development, and safety, most have wide civil and commercial applications in areas such as advanced materials, superconductors, advanced semiconductors, digital imaging, high density data storage, high performance computers, optoelectronics, artificial intelligence, robotics and automation, sensors, biotechnology, medical devices and diagnosis, and human factors engineering.
Implementation of an optimum profile guidance system on STOLAND
NASA Technical Reports Server (NTRS)
Flanagan, P. F.
1978-01-01
The implementation on the STOLAND airborne digital computer of an optimum profile guidance system for the augmentor wing jet STOL research aircraft is described. The major tasks were to implement the guidance and control logic in airborne computer software and to integrate the module with the existing STOLAND navigation, display, and autopilot routines. The optimum profile guidance system comprises an algorithm for synthesizing minimum-fuel trajectories for a wide range of starting positions in the terminal area and a control law for flying the aircraft automatically along the trajectory. The avionics software developed is described along with a FORTRAN program that was constructed to reflect the modular structure and algorithms implemented in the avionics software.
NASA Technical Reports Server (NTRS)
Kwak, Dochan
2005-01-01
Over the past 30 years, numerical methods and simulation tools for fluid dynamic problems have advanced as a new discipline, namely, computational fluid dynamics (CFD). Although a wide spectrum of flow regimes are encountered in many areas of science and engineering, simulation of compressible flow has been the major driver for developing computational algorithms and tools. This is probably due to a large demand for predicting the aerodynamic performance characteristics of flight vehicles, such as commercial, military, and space vehicles. As flow analysis is required to be more accurate and computationally efficient for both commercial and mission-oriented applications (such as those encountered in meteorology, aerospace vehicle development, general fluid engineering and biofluid analysis) CFD tools for engineering become increasingly important for predicting safety, performance and cost. This paper presents the author's perspective on the maturity of CFD, especially from an aerospace engineering point of view.
The Berlin Brain-Computer Interface: Progress Beyond Communication and Control
Blankertz, Benjamin; Acqualagna, Laura; Dähne, Sven; Haufe, Stefan; Schultze-Kraft, Matthias; Sturm, Irene; Ušćumlic, Marija; Wenzel, Markus A.; Curio, Gabriel; Müller, Klaus-Robert
2016-01-01
The combined effect of fundamental results about neurocognitive processes and advancements in decoding mental states from ongoing brain signals has brought forth a whole range of potential neurotechnological applications. In this article, we review our developments in this area and put them into perspective. These examples cover a wide range of maturity levels with respect to their applicability. While we assume we are still a long way away from integrating Brain-Computer Interface (BCI) technology in general interaction with computers, or from implementing neurotechnological measures in safety-critical workplaces, results have already been obtained involving a BCI as a research tool. In this article, we discuss the reasons why, in some of the prospective application domains, considerable effort is still required to make the systems ready to deal with the full complexity of the real world. PMID:27917107
The Berlin Brain-Computer Interface: Progress Beyond Communication and Control.
Blankertz, Benjamin; Acqualagna, Laura; Dähne, Sven; Haufe, Stefan; Schultze-Kraft, Matthias; Sturm, Irene; Ušćumlic, Marija; Wenzel, Markus A; Curio, Gabriel; Müller, Klaus-Robert
2016-01-01
The combined effect of fundamental results about neurocognitive processes and advancements in decoding mental states from ongoing brain signals has brought forth a whole range of potential neurotechnological applications. In this article, we review our developments in this area and put them into perspective. These examples cover a wide range of maturity levels with respect to their applicability. While we assume we are still a long way away from integrating Brain-Computer Interface (BCI) technology in general interaction with computers, or from implementing neurotechnological measures in safety-critical workplaces, results have already been obtained involving a BCI as a research tool. In this article, we discuss the reasons why, in some of the prospective application domains, considerable effort is still required to make the systems ready to deal with the full complexity of the real world.
Visualization of the tire-soil interaction area by means of ObjectARX programming interface
NASA Astrophysics Data System (ADS)
Mueller, W.; Gruszczyński, M.; Raba, B.; Lewicki, A.; Przybył, K.; Zaborowicz, M.; Koszela, K.; Boniecki, P.
2014-04-01
The process of data visualization, which is important for data analysis, becomes problematic when large data sets generated by computer simulations must be handled. This problem concerns, among others, the models that describe the geometry of tire-soil interaction. For the purpose of graphical representation of this area and implementation of various geometric calculations, the authors have developed a plug-in application for AutoCAD based on the latest technologies, including ObjectARX, LINQ and the Visual Studio platform. The selected programming tools offer a wide variety of IT structures that enable data visualization and data analysis and are important, e.g., in model verification.
Enhancing LoRaWAN Security through a Lightweight and Authenticated Key Management Approach.
Sanchez-Iborra, Ramon; Sánchez-Gómez, Jesús; Pérez, Salvador; Fernández, Pedro J; Santa, José; Hernández-Ramos, José L; Skarmeta, Antonio F
2018-06-05
Luckily, new communication technologies and protocols are nowadays designed considering security issues. A clear example of this can be found in the Internet of Things (IoT) field, a quite recent area where communication technologies such as ZigBee or IPv6 over Low power Wireless Personal Area Networks (6LoWPAN) already include security features to guarantee authentication, confidentiality and integrity. More recent technologies are Low-Power Wide-Area Networks (LP-WAN), which also consider security, but present initial approaches that can be further improved. An example of this can be found in Long Range (LoRa) and its layer-two supporter LoRa Wide Area Network (LoRaWAN), which include a security scheme based on pre-shared cryptographic material lacking flexibility when a key update is necessary. Because of this, in this work, we evaluate the security vulnerabilities of LoRaWAN in the area of key management and propose different alternative schemes. Concretely, the application of an approach based on the recently specified Ephemeral Diffie-Hellman Over COSE (EDHOC) is found to be a convenient solution, given its flexibility in the update of session keys, its low computational cost and the limited message exchanges needed. A comparative conceptual analysis considering the overhead of different security schemes for LoRaWAN is carried out in order to evaluate their benefits in the challenging area of LP-WAN.
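To make the key-update idea concrete, the Python sketch below shows only the underlying mechanism: an ephemeral Diffie-Hellman exchange whose shared secret feeds a KDF to derive fresh session keys. It uses the third-party cryptography package, is not an implementation of EDHOC or of the LoRaWAN key schedule, and the key labels are illustrative assumptions.

from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives import hashes

# Each side generates an ephemeral key pair and exchanges only the public part.
device_priv = X25519PrivateKey.generate()
server_priv = X25519PrivateKey.generate()

device_shared = device_priv.exchange(server_priv.public_key())
server_shared = server_priv.exchange(device_priv.public_key())
assert device_shared == server_shared   # both sides derive the same secret

def derive_session_key(shared_secret, label):
    # One fresh 128-bit session key per label (hypothetical labels, not the LoRaWAN ones).
    return HKDF(algorithm=hashes.SHA256(), length=16, salt=None, info=label).derive(shared_secret)

nwk_s_key = derive_session_key(device_shared, b"illustrative-network-session")
app_s_key = derive_session_key(device_shared, b"illustrative-application-session")
print(nwk_s_key.hex(), app_s_key.hex())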
SETI reloaded: Next generation radio telescopes, transients and cognitive computing
NASA Astrophysics Data System (ADS)
Garrett, Michael A.
2015-08-01
The Search for Extra-terrestrial Intelligence (SETI) using radio telescopes is an area of research that is now more than 50 years old. Thus far, both targeted and wide-area surveys have yet to detect artificial signals from intelligent civilisations. In this paper, I argue that the incidence of co-existing intelligent and communicating civilisations is probably small in the Milky Way. While this makes successful SETI searches a very difficult pursuit indeed, the huge impact of even a single detection requires us to continue the search. A substantial increase in the overall performance of radio telescopes (and in particular future wide-field instruments such as the Square Kilometre Array - SKA), provide renewed optimism in the field. Evidence for this is already to be seen in the success of SETI researchers in acquiring observations on some of the world's most sensitive radio telescope facilities via open, peer-reviewed processes. The increasing interest in the dynamic radio sky, and our ability to detect new and rapid transient phenomena such as Fast Radio Bursts (FRB) is also greatly encouraging. While the nature of FRBs is not yet fully understood, I argue they are unlikely to be the signature of distant extra-terrestrial civilisations. As astronomers face a data avalanche on all sides, advances made in related areas such as advanced Big Data analytics, and cognitive computing are crucial to enable serendipitous discoveries to be made. In any case, as the era of the SKA fast approaches, the prospects of a SETI detection have never been better.
Supercritical entanglement in local systems: Counterexample to the area law for quantum matter.
Movassagh, Ramis; Shor, Peter W
2016-11-22
Quantum entanglement is the most surprising feature of quantum mechanics. Entanglement is simultaneously responsible for the difficulty of simulating quantum matter on a classical computer and the exponential speedups afforded by quantum computers. Ground states of quantum many-body systems typically satisfy an "area law": The amount of entanglement between a subsystem and the rest of the system is proportional to the area of the boundary. A system that obeys an area law has less entanglement and can be simulated more efficiently than a generic quantum state whose entanglement could be proportional to the total system's size. Moreover, an area law provides useful information about the low-energy physics of the system. It is widely believed that for physically reasonable quantum systems, the area law cannot be violated by more than a logarithmic factor in the system's size. We introduce a class of exactly solvable one-dimensional physical models which we can prove have exponentially more entanglement than suggested by the area law, and violate the area law by a square-root factor. This work suggests that simple quantum matter is richer and can provide much more quantum resources (i.e., entanglement) than expected. In addition to using recent advances in quantum information and condensed matter theory, we have drawn upon various branches of mathematics such as combinatorics of random walks, Brownian excursions, and fractional matching theory. We hope that the techniques developed herein may be useful for other problems in physics as well.
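In symbols, and hedging on constants and possible logarithmic corrections, the contrast described in this abstract for a chain of n spins can be written in LaTeX as

\[
  S_{\mathrm{1D\ area\ law}}(n) = O(1),
  \qquad
  S_{\mathrm{models\ in\ this\ work}}(n) = \Theta\!\left(\sqrt{n}\right),
\]

where $S$ denotes the half-chain entanglement entropy.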
Supercritical entanglement in local systems: Counterexample to the area law for quantum matter
Movassagh, Ramis; Shor, Peter W.
2016-01-01
Quantum entanglement is the most surprising feature of quantum mechanics. Entanglement is simultaneously responsible for the difficulty of simulating quantum matter on a classical computer and the exponential speedups afforded by quantum computers. Ground states of quantum many-body systems typically satisfy an “area law”: The amount of entanglement between a subsystem and the rest of the system is proportional to the area of the boundary. A system that obeys an area law has less entanglement and can be simulated more efficiently than a generic quantum state whose entanglement could be proportional to the total system’s size. Moreover, an area law provides useful information about the low-energy physics of the system. It is widely believed that for physically reasonable quantum systems, the area law cannot be violated by more than a logarithmic factor in the system’s size. We introduce a class of exactly solvable one-dimensional physical models which we can prove have exponentially more entanglement than suggested by the area law, and violate the area law by a square-root factor. This work suggests that simple quantum matter is richer and can provide much more quantum resources (i.e., entanglement) than expected. In addition to using recent advances in quantum information and condensed matter theory, we have drawn upon various branches of mathematics such as combinatorics of random walks, Brownian excursions, and fractional matching theory. We hope that the techniques developed herein may be useful for other problems in physics as well. PMID:27821725
Update on Bayesian Blocks: Segmented Models for Sequential Data
NASA Technical Reports Server (NTRS)
Scargle, Jeff
2017-01-01
The Bayesian Blocks algorithm, in wide use in astronomy and other areas, has been improved in several ways. The model for block shape has been generalized to include shapes other than a constant signal rate, e.g., linear, exponential, or other parametric models. In addition, the computational efficiency has been improved, so that the basic algorithm is O(N) in most cases instead of O(N^2). Other improvements in the theory and application of segmented representations will be described.
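A minimal usage sketch follows for readers who want to try the method. It relies on the reference implementation distributed with astropy (astropy.stats.bayesian_blocks), which follows the classic dynamic-programming formulation; the refinements described above are not assumed to be part of that package, and the synthetic event times are illustrative.

import numpy as np
from astropy.stats import bayesian_blocks

rng = np.random.default_rng(0)
# Piecewise-constant event rate: a quiet interval followed by a short burst.
events = np.concatenate([
    rng.uniform(0.0, 50.0, size=100),    # low-rate background
    rng.uniform(50.0, 60.0, size=200),   # high-rate block
])
edges = bayesian_blocks(np.sort(events), fitness="events", p0=0.01)
print(edges)   # change points should appear near t = 50 and t = 60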
CASL Dakota Capabilities Summary
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adams, Brian M.; Simmons, Chris; Williams, Brian J.
2017-10-10
The Dakota software project serves the mission of Sandia National Laboratories and supports a worldwide user community by delivering state-of-the-art research and robust, usable software for optimization and uncertainty quantification. These capabilities enable advanced exploration and risk-informed prediction with a wide range of computational science and engineering models. Dakota is the verification and validation (V&V) / uncertainty quantification (UQ) software delivery vehicle for CASL, allowing analysts across focus areas to apply these capabilities to myriad nuclear engineering analyses.
A distributed parallel storage architecture and its potential application within EOSDIS
NASA Technical Reports Server (NTRS)
Johnston, William E.; Tierney, Brian; Feuquay, Jay; Butzer, Tony
1994-01-01
We describe the architecture, implementation, and use of a scalable, high-performance, distributed-parallel data storage system developed in the ARPA-funded MAGIC gigabit testbed. A collection of wide-area distributed disk servers operates in parallel to provide logical block-level access to large data sets. Operated primarily as a network-based cache, the architecture supports cooperation among independently owned resources to provide fast, large-scale, on-demand storage to support data handling, simulation, and computation.
Future applications of artificial intelligence to Mission Control Centers
NASA Technical Reports Server (NTRS)
Friedland, Peter
1991-01-01
Future applications of artificial intelligence to Mission Control Centers are presented in the form of viewgraphs. The following subject areas are covered: basic objectives of the NASA-wide AI program; the in-house research program; constraint-based scheduling; learning and performance improvement for scheduling; the GEMPLAN multi-agent planner; planning, scheduling, and control; Bayesian learning; efficient learning algorithms; ICARUS (an integrated architecture for learning); design knowledge acquisition and retention; computer-integrated documentation; and some speculation on future applications.
Gross, Seth A; Smith, Michael S; Kaul, Vivek
2017-01-01
Background Barrett’s esophagus (BE) and esophageal dysplasia (ED) are frequently missed during screening and surveillance esophagoscopy because of sampling error associated with four-quadrant random forceps biopsy (FB). Aim The aim of this article is to determine if wide-area transepithelial sampling with three-dimensional computer-assisted analysis (WATS) used adjunctively with FB can increase the detection of BE and ED. Methods In this multicenter prospective trial, patients screened for suspected BE and those with known BE undergoing surveillance were enrolled. Patients at 25 community-based practices underwent WATS adjunctively to targeted FB and random four-quadrant FB. Results Of 4203 patients, 594 were diagnosed with BE by FB alone, and 493 additional cases were detected by adding WATS, increasing the overall detection of BE by 83% (493/594, 95% CI 74%–93%). Low-grade dysplasia (LGD) was diagnosed in 26 patients by FB alone, and 23 additional cases were detected by adding WATS, increasing the detection of LGD by 88.5% (23/26, 95% CI 48%–160%). Conclusions Adjunctive use of WATS to FB significantly improves the detection of both BE and ED. Sampling error, an inherent limitation associated with screening and surveillance, can be improved with WATS allowing better informed decisions to be made about the management and subsequent treatment of these patients. PMID:29881608
NASA Astrophysics Data System (ADS)
1996-02-01
Computational Chemistry for the Masses: Not long ago, chemical computation was considered a specialty area requiring extensive computer knowledge, power, and time. Over the past decade, however, it has changed from the arcane pursuit of a few advanced university researchers in the area of physical chemistry to a familiar tool used by a wide range of chemists. Nevertheless, it has required its practitioners to have extensive knowledge of computer programming and a thorough understanding of theoretical chemical concepts, and as a result it was usually reserved for the graduate curriculum. Now a further metamorphosis is in progress, as computational chemistry moves into the undergraduate curriculum, often using off-the-shelf software--commercial packages or adaptations of them that are readily shared by their creators. As we put this issue together, we realized that many of the articles involved sophisticated computations that would not have been possible a few years ago in the courses described. Further, the hardware and software used were widely available at a reasonable cost. Some of the articles focus on the teaching of computational methods and others simply incorporate it as a facet of their overall strategy; however, taken together, they reflect a strong trend to utilize a diverse set of readily available methods and products in the undergraduate curriculum. The most familiar recent use of computational chemistry is the computer design of molecules in organic, medicinal, and biochemistry. However, computational chemistry is useful for inorganic chemists as well and is now migrating to undergraduate courses. Lipkowitz, Pearl, Robertson, and Schultz (page 105) make a strong case for its inclusion and present a two-week component they have developed for their senior-level laboratory course. Comba and Zimmer (page 108) offer a review of inorganic molecular mechanics calculations, which is designed for the novice and includes the basic equations, their application to inorganic molecules, and a discussion of how to evaluate the reliability of the results. A computational experiment has been specifically designed for the undergraduate laboratory by Bakalbassis, Stiakaki, Tsipis, and Tsipis (page 111). The students use an atom-superposition and electron-delocalization molecular orbital model to predict the structural, spectroscopic, and energetic properties of highly ionic metal-containing systems. The exercise introduces students to the value of computational experiments as an alternative to wet-lab work and teaches enough quantum theory to make them comfortable with the current literature. For teachers of organic chemistry, Delaware and Fountain (page 116) analyze how models can actually hinder learning in the introductory course if presented passively and describe how to use computer visualizations of reactions in an active, cooperative learning mode. They argue that these computational exercises need to be embedded in a carefully planned learning system to be effective. In similar fashion, Sauers (page 114) finds that a computer-assisted molecular modeling experiment is an effective way of making the concept of "steric interactions" more accessible. The theoretical number of isomers and derivatives of organic compounds is another concept that is difficult to visualize, and the calculations that would be used for enumeration are complex enough that they are not usually brought into the undergraduate curriculum.
However, Novak (page 120) demonstrates that widely available PC software, such as Mathematica, can be used by undergraduates along with the Polya enumeration method to enumerate derivatives and see the connection between these numbers and the symmetry of the parent molecule. A different use of computational software in biochemistry than the usual computer-assisted design of molecules is the main focus of a Computer Series article by Letkeman (page 165), who models the complex interactions of metal ions in human blood serum.
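As a small illustration of the Polya-style counting that Novak performs in Mathematica, the Python sketch below applies Burnside's lemma to a six-membered ring with two substituent types (say H and X), averaging the number of fixed colorings over the twelve symmetries of the hexagon; it reproduces the classical count of 13 distinct substitution patterns. The code is independent of the article's Mathematica notebooks and is illustrative only.

def cycle_count(perm):
    # Number of cycles of a permutation given as a tuple of images.
    seen, cycles = set(), 0
    for start in range(len(perm)):
        if start not in seen:
            cycles += 1
            j = start
            while j not in seen:
                seen.add(j)
                j = perm[j]
    return cycles

n = 6
rotations = [tuple((i + k) % n for i in range(n)) for k in range(n)]
reflections = [tuple((k - i) % n for i in range(n)) for k in range(n)]
group = rotations + reflections          # dihedral symmetry group of the hexagon, 12 elements

k_colors = 2                             # e.g., H versus one substituent X
count = sum(k_colors ** cycle_count(g) for g in group) // len(group)   # Burnside average
print(count)                             # 13 distinct patterns, from C6H6 through C6X6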
KOLAM: a cross-platform architecture for scalable visualization and tracking in wide-area imagery
NASA Astrophysics Data System (ADS)
Fraser, Joshua; Haridas, Anoop; Seetharaman, Guna; Rao, Raghuveer M.; Palaniappan, Kannappan
2013-05-01
KOLAM is an open, cross-platform, interoperable, scalable and extensible framework supporting a novel multiscale spatiotemporal dual-cache data structure for big data visualization and visual analytics. This paper focuses on the use of KOLAM for target tracking in high-resolution, high-throughput, wide-format video also known as wide-area motion imagery (WAMI). It was originally developed for the interactive visualization of extremely large geospatial imagery of high spatial and spectral resolution. KOLAM is platform, operating system and (graphics) hardware independent, and supports embedded datasets scalable from hundreds of gigabytes to potentially petabytes in size on clusters, workstations, desktops and mobile computers. In addition to rapid roam, zoom and hyper-jump spatial operations, a large number of simultaneously viewable embedded pyramid layers (also referred to as multiscale or sparse imagery), interactive colormap and histogram enhancement, spherical projection and terrain maps are supported. The KOLAM software architecture was extended to support airborne wide-area motion imagery by organizing spatiotemporal tiles in very large format video frames using a temporal cache of tiled-pyramid data structures. The current version supports WAMI animation, fast intelligent inspection, trajectory visualization and target tracking (digital tagging); the latter by interfacing with external automatic tracking software. One of the critical needs for working with WAMI is a supervised tracking and visualization tool that allows analysts to digitally tag multiple targets, quickly review and correct tracking results and apply geospatial visual analytic tools on the generated trajectories. One-click manual tracking combined with multiple automated tracking algorithms are available to assist the analyst and increase human effectiveness.
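A minimal sketch of the general tiled-pyramid indexing and caching idea behind viewers of this kind is given below; it is not KOLAM's actual data structure, and the tile size and cache capacity are assumptions.

from collections import OrderedDict

TILE = 256   # tile edge length in pixels, illustrative

def tile_key(frame, level, x, y, tile=TILE):
    # Which tile holds full-resolution pixel (x, y) of frame `frame` at a pyramid level
    # where each level halves the resolution (level 0 = full resolution).
    scale = 2 ** level
    col = (x // scale) // tile
    row = (y // scale) // tile
    return (frame, level, col, row)

class TileCache:
    # Tiny LRU cache keyed by (frame, level, col, row), standing in for the
    # spatiotemporal dual cache described above.
    def __init__(self, capacity=1024):
        self.capacity, self.store = capacity, OrderedDict()
    def get(self, key, loader):
        if key in self.store:
            self.store.move_to_end(key)
            return self.store[key]
        tile = loader(key)                     # e.g., read the tile bytes from disk
        self.store[key] = tile
        if len(self.store) > self.capacity:
            self.store.popitem(last=False)     # evict the least recently used tile
        return tile

cache = TileCache()
print(tile_key(frame=42, level=3, x=70000, y=31000))   # (42, 3, 34, 15)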
Sharing digital micrographs and other data files between computers.
Entwistle, A
2004-01-01
It ought to be easy to exchange digital micrographs and other computer data files with a colleague even on another continent. In practice, this often is not the case. The advantages and disadvantages of various methods that are available for exchanging data files between computers are discussed. When possible, data should be transferred through computer networking. When data are to be exchanged locally between computers with similar operating systems, the use of a local area network is recommended. For computers in commercial or academic environments that have dissimilar operating systems or are more widely separated, the use of FTP is recommended. Failing this, posting the data on a website and transferring by hypertext transfer protocol is suggested. If peer-to-peer exchange between computers in domestic environments is needed, the use of Messenger services such as Microsoft Messenger or Yahoo Messenger is the method of choice. When it is not possible to transfer the data files over the internet, single-use writable CD-ROMs are the best media for transferring data. If for some reason this is not possible, DVD-R/RW, DVD+R/RW, 100 MB ZIP disks and USB flash media are potentially useful media for exchanging data files.
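For the FTP route recommended above, a minimal Python sketch using the standard ftplib module is shown below; the host name, credentials, and file names are placeholders, and an FTPS (TLS-capable) server is assumed.

from ftplib import FTP_TLS

def upload_micrograph(path, host, user, password, remote_name):
    with FTP_TLS(host) as ftp:          # TLS-protected FTP where the server supports it
        ftp.login(user, password)
        ftp.prot_p()                    # also encrypt the data channel
        with open(path, "rb") as fh:
            ftp.storbinary(f"STOR {remote_name}", fh)

# upload_micrograph("cells_63x.tif", "ftp.example.org", "user", "secret", "cells_63x.tif")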
JPARSS: A Java Parallel Network Package for Grid Computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Jie; Akers, Walter; Chen, Ying
2002-03-01
The emergence of high-speed wide area networks makes grid computing a reality. However, grid applications that need reliable data transfer still have difficulty achieving optimal TCP performance because the TCP window size must be tuned to improve bandwidth and reduce latency on a high-speed wide area network. This paper presents a Java package called JPARSS (Java Parallel Secure Stream (Socket)) that divides data into partitions that are sent over several parallel Java streams simultaneously and allows Java or Web applications to achieve optimal TCP performance in a grid environment without the necessity of tuning the TCP window size. This package enables single sign-on, certificate delegation and secure or plain-text data transfer using several security components based on X.509 certificates and SSL. Several experiments will be presented to show that using Java parallel streams is more effective than tuning the TCP window size. In addition a simple architecture using Web services
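The parallel-stream idea is easy to sketch outside Java as well. The self-contained Python example below splits a payload into partitions, sends each partition over its own byte stream (socket pairs stand in for WAN TCP connections), and reassembles the result by offset; it illustrates the general technique only and is not the JPARSS API.

import socket
import threading

def send_partition(sock, offset, chunk):
    # Prefix each partition with its offset and length so the receiver can place it.
    header = offset.to_bytes(8, "big") + len(chunk).to_bytes(8, "big")
    sock.sendall(header + chunk)
    sock.shutdown(socket.SHUT_WR)

def recv_exact(sock, n):
    data = b""
    while len(data) < n:
        part = sock.recv(n - len(data))
        if not part:
            raise ConnectionError("stream closed early")
        data += part
    return data

def recv_partition(sock, buffer):
    header = recv_exact(sock, 16)
    offset = int.from_bytes(header[:8], "big")
    length = int.from_bytes(header[8:], "big")
    buffer[offset:offset + length] = recv_exact(sock, length)

payload = bytes(range(256)) * 4096          # ~1 MiB of test data
n_streams = 4
chunk_size = -(-len(payload) // n_streams)  # ceiling division
result = bytearray(len(payload))

threads = []
for i in range(n_streams):
    sender_sock, receiver_sock = socket.socketpair()   # stand-in for one WAN TCP connection
    offset = i * chunk_size
    chunk = payload[offset:offset + chunk_size]
    threads.append(threading.Thread(target=send_partition, args=(sender_sock, offset, chunk)))
    threads.append(threading.Thread(target=recv_partition, args=(receiver_sock, result)))

for t in threads:
    t.start()
for t in threads:
    t.join()

assert bytes(result) == payload             # the partitions reassemble to the original payload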
Comparison of adaptive critic-based and classical wide-area controllers for power systems.
Ray, Swakshar; Venayagamoorthy, Ganesh Kumar; Chaudhuri, Balarko; Majumder, Rajat
2008-08-01
An adaptive critic design (ACD)-based damping controller is developed for a thyristor-controlled series capacitor (TCSC) installed in a power system with multiple poorly damped interarea modes. The performance of this ACD computational intelligence-based method is compared with two classical techniques, which are observer-based state-feedback (SF) control and linear matrix inequality (LMI) H∞ robust control. Remote measurements are used as feedback signals to the wide-area damping controller for modulating the compensation of the TCSC. The classical methods use a linearized model of the system whereas the ACD method is purely measurement-based, leading to a nonlinear controller with fixed parameters. A comparative analysis of the controllers' performances is carried out under different disturbance scenarios. The ACD-based design has shown promising performance with very little knowledge of the system compared to classical model-based controllers. This paper also discusses the advantages and disadvantages of ACDs, SF, and LMI-H∞.
The Development of Ontology from Multiple Databases
NASA Astrophysics Data System (ADS)
Kasim, Shahreen; Aswa Omar, Nurul; Fudzee, Mohd Farhan Md; Azhar Ramli, Azizul; Aizi Salamat, Mohamad; Mahdin, Hairulnizam
2017-08-01
The halal industry is the fastest growing global business in the world. The halal food industry is thus crucial for Muslims all over the world, as it serves to ensure them that the food items they consume daily are syariah compliant. Currently, ontologies are widely used in computer science areas such as heterogeneous information processing on the web, the semantic web, and information retrieval. However, ontologies have still not been used widely in the halal industry. Today, the Muslim community still has problems verifying the halal status of products in the market, especially foods containing E numbers. This research tried to solve the problem of validating halal status from various halal sources. Various chemical ontologies from multiple databases were found to support this ontology development. The E numbers in these chemical ontologies are codes for chemicals that can be used as food additives. With this E-number ontology, the Muslim community can identify and verify the halal status of products in the market effectively.
An exploration of neuromorphic systems and related design issues/challenges in dark silicon era
NASA Astrophysics Data System (ADS)
Chandaliya, Mudit; Chaturvedi, Nitin; Gurunarayanan, S.
2018-03-01
Current microprocessors have shown remarkable improvements in performance and memory capacity since their introduction. However, due to power and thermal limitations, only a fraction of the cores can operate at full frequency at any instant, irrespective of the advantages of each new technology generation. This phenomenon of microprocessor under-utilization is called dark silicon, and it hinders innovation in computing. To overcome the limitation of the utilization wall, IBM explored and invented neurosynaptic system chips. This has opened a wide scope of research in the fields of innovative computing, technology, materials science, machine learning, and more. In this paper, we first review the diverse stages of research that have been influential in the innovation of neurosynaptic architectures. These architectures focus on the development of brain-like frameworks that are efficient enough to execute a broad set of computations in real time while keeping power consumption ultra-low and chip area small. We also discuss the challenges and opportunities of designing neuromorphic systems with existing technologies in the dark silicon era, which constitute a major area of future research.
Multistate Memristive Tantalum Oxide Devices for Ternary Arithmetic
Kim, Wonjoo; Chattopadhyay, Anupam; Siemon, Anne; Linn, Eike; Waser, Rainer; Rana, Vikas
2016-01-01
Redox-based resistive switching random access memory (ReRAM) offers excellent properties for implementing future non-volatile memory arrays. Recently, the capability of two-state ReRAMs to implement Boolean logic functionality has gained wide interest. Here, we report on seven-state tantalum oxide devices, which enable the realization of intrinsic modular arithmetic using a ternary number system. Modular arithmetic, a fundamental system for operating on numbers within the limit of a modulus, has been known to mathematicians since the days of Euclid and finds applications in diverse areas ranging from e-commerce to musical notation. We demonstrate that multistate devices not only reduce the storage area consumption drastically, but also enable novel in-memory operations, such as computing with high-radix number systems, which could not be implemented using two-state devices. The use of a high-radix number system reduces the computational complexity by reducing the number of digits needed; thus, the number of operations in an addition and the number of logic devices can both be reduced. PMID:27834352
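To illustrate the arithmetic argument (not the device physics), the short Python sketch below encodes integers as base-3 digit lists, performs digit-wise addition with carries, i.e., repeated addition modulo 3, and compares the digit counts against binary; the example values are arbitrary.

def to_base(n, base):
    # Digits of n in the given base, least significant first.
    digits = []
    while True:
        n, d = divmod(n, base)
        digits.append(d)
        if n == 0:
            return digits

def add_base(a_digits, b_digits, base):
    # Digit-wise addition with carry; each digit operation is an addition modulo `base`.
    out, carry = [], 0
    for i in range(max(len(a_digits), len(b_digits))):
        s = carry
        s += a_digits[i] if i < len(a_digits) else 0
        s += b_digits[i] if i < len(b_digits) else 0
        carry, digit = divmod(s, base)
        out.append(digit)
    if carry:
        out.append(carry)
    return out

a, b = 1234, 5678
ternary_sum = add_base(to_base(a, 3), to_base(b, 3), 3)
assert sum(d * 3 ** i for i, d in enumerate(ternary_sum)) == a + b
print(len(to_base(a, 2)), "binary digits versus", len(to_base(a, 3)), "ternary digits for", a)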
Multistate Memristive Tantalum Oxide Devices for Ternary Arithmetic.
Kim, Wonjoo; Chattopadhyay, Anupam; Siemon, Anne; Linn, Eike; Waser, Rainer; Rana, Vikas
2016-11-11
Redox-based resistive switching random access memory (ReRAM) offers excellent properties for implementing future non-volatile memory arrays. Recently, the capability of two-state ReRAMs to implement Boolean logic functionality has gained wide interest. Here, we report on seven-state tantalum oxide devices, which enable the realization of intrinsic modular arithmetic using a ternary number system. Modular arithmetic, a fundamental system for operating on numbers within the limit of a modulus, has been known to mathematicians since the days of Euclid and finds applications in diverse areas ranging from e-commerce to musical notation. We demonstrate that multistate devices not only reduce the storage area consumption drastically, but also enable novel in-memory operations, such as computing with high-radix number systems, which could not be implemented using two-state devices. The use of a high-radix number system reduces the computational complexity by reducing the number of digits needed; thus, the number of operations in an addition and the number of logic devices can both be reduced.
Multistate Memristive Tantalum Oxide Devices for Ternary Arithmetic
NASA Astrophysics Data System (ADS)
Kim, Wonjoo; Chattopadhyay, Anupam; Siemon, Anne; Linn, Eike; Waser, Rainer; Rana, Vikas
2016-11-01
Redox-based resistive switching random access memory (ReRAM) offers excellent properties for implementing future non-volatile memory arrays. Recently, the capability of two-state ReRAMs to implement Boolean logic functionality has gained wide interest. Here, we report on seven-state tantalum oxide devices, which enable the realization of intrinsic modular arithmetic using a ternary number system. Modular arithmetic, a fundamental system for operating on numbers within the limit of a modulus, has been known to mathematicians since the days of Euclid and finds applications in diverse areas ranging from e-commerce to musical notation. We demonstrate that multistate devices not only reduce the storage area consumption drastically, but also enable novel in-memory operations, such as computing with high-radix number systems, which could not be implemented using two-state devices. The use of a high-radix number system reduces the computational complexity by reducing the number of digits needed; thus, the number of operations in an addition and the number of logic devices can both be reduced.
The Role of Remote Sensing in Assessing Forest Biomass in Appalachian South Carolina
NASA Technical Reports Server (NTRS)
Shain, W.; Nix, L.
1982-01-01
Information is presented on the use of color infrared aerial photographs and ground sampling methods to quantify standing forest biomass in Appalachian South Carolina. Local tree biomass equations are given and subsequent evaluation of stand density and size classes using remote sensing methods is presented. Methods of terrain analysis, environmental hazard rating, and subsequent determination of accessibility of forest biomass are discussed. Computer-based statistical analyses are used to expand individual cover-type specific ground sample data to area-wide cover type inventory figures based on aerial photographic interpretation and area measurement. Forest biomass data are presented for the study area in terms of discriminant size classes, merchantability limits, accessibility (as related to terrain and yield/harvest constraints), and potential environmental impact of harvest.
Yu, Dantong; Katramatos, Dimitrios; Sim, Alexander; Shoshani, Arie
2014-04-22
A cross-domain network resource reservation scheduler configured to schedule a path from at least one end-site includes a management plane device configured to monitor and provide information representing at least one of functionality, performance, faults, and fault recovery associated with a network resource; a control plane device configured to at least one of schedule the network resource, provision local area network quality of service, provision local area network bandwidth, and provision wide area network bandwidth; and a service plane device configured to interface with the control plane device to reserve the network resource based on a reservation request and the information from the management plane device. Corresponding methods and computer-readable medium are also disclosed.
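As a rough and purely illustrative sketch of the kind of admission check such a reservation scheduler might perform on a single link (not the patented design), consider the following; the capacity, times, and rates are invented.

from dataclasses import dataclass

@dataclass
class Reservation:
    start: float        # seconds
    end: float
    rate_mbps: float

def admissible(existing, request, capacity_mbps):
    # Grant the request only if, at every instant it covers, the overlapping reservations
    # plus the request stay within the link capacity (load is piecewise constant, so it is
    # enough to check the request start and the boundaries of overlapping reservations).
    points = {request.start}
    for r in existing:
        if r.end > request.start and r.start < request.end:
            points.update(p for p in (r.start, r.end) if request.start <= p < request.end)
    for t in points:
        load = sum(r.rate_mbps for r in existing if r.start <= t < r.end)
        if load + request.rate_mbps > capacity_mbps:
            return False
    return True

booked = [Reservation(0, 600, 400), Reservation(300, 900, 300)]
print(admissible(booked, Reservation(500, 800, 350), capacity_mbps=1000))   # False: 700 + 350 > 1000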
Use of ``virtual'' field trips in teaching introductory geology
NASA Astrophysics Data System (ADS)
Hurst, Stephen D.
1998-08-01
We designed a series of case studies for Introductory Geology Laboratory courses using computer visualization techniques integrated with traditional laboratory materials. These consist of a comprehensive case study which requires three two-hour long laboratory periods to complete, and several shorter case studies requiring one or two, two-hour laboratory periods. Currently we have prototypes of the Yellowstone National Park, Hawaii volcanoes and the Mid-Atlantic Ridge case studies. The Yellowstone prototype can be used to learn about a wide variety of rocks and minerals, about geothermal activity and hydrology, about volcanic hazards and the hot-spot theory of plate tectonics. The Hawaiian exercise goes into more depth about volcanoes, volcanic rocks and their relationship to plate movements. The Mid-Atlantic Ridge project focuses on formation of new ocean crust and mineral-rich hydrothermal deposits at spreading centers. With new improvements in visualization technology that are making their way to personal computers, we are now closer to the ideal of a "virtual" field trip. We are currently making scenes of field areas in Hawaii and Yellowstone which allow the student to pan around the area and zoom in on interesting objects. Specific rocks in the scene will be able to be "picked up" and studied in three dimensions. This technology improves the ability of the computer to present a realistic simulation of the field area and allows the student to have more control over the presentation. This advanced interactive technology is intuitive to control, relatively cheap and easy to add to existing computer programs and documents.
del-Moral-Martínez, Ignacio; Rosell-Polo, Joan R.; Company, Joaquim; Sanz, Ricardo; Escolà, Alexandre; Masip, Joan; Martínez-Casasnovas, José A.; Arnó, Jaume
2016-01-01
The leaf area index (LAI) is defined as the one-sided leaf area per unit ground area, and is probably the most widely used index to characterize grapevine vigor. However, LAI varies spatially within vineyard plots. Mapping and quantifying this variability is very important for improving management decisions and agricultural practices. In this study, a mobile terrestrial laser scanner (MTLS) was used to map the LAI of a vineyard, and then to examine how different scanning methods (on-the-go or discontinuous systematic sampling) may affect the reliability of the resulting raster maps. The use of the MTLS allows calculating the enveloping vegetative area of the canopy, which is the sum of the leaf wall areas for both sides of the row (excluding gaps) and the projected upper area. Obtaining the enveloping areas requires scanning a one-meter-long section along the row from both sides at each systematic sampling point. By converting the enveloping areas into LAI values, a raster map of the latter can be obtained by spatial interpolation (kriging). However, the user can opt for scanning on-the-go in a continuous way and computing 1-m LAI values along the rows, or instead perform the scanning at discontinuous systematic sampling points within the plot. An analysis of correlation between maps indicated that the MTLS can be used discontinuously in specific sampling sections separated by up to 15 m along the rows. This capability significantly reduces the amount of data to be acquired at field level, the required data storage capacity, and the computer processing power needed. PMID:26797618
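A minimal sketch of the map-building step only, i.e., spatial interpolation of sampled LAI values by ordinary kriging, is given below. It uses the third-party pykrige package with synthetic sample positions and values, which are assumptions rather than the authors' data or software.

import numpy as np
from pykrige.ok import OrdinaryKriging

rng = np.random.default_rng(1)
# Samples every 15 m along three rows spaced 3 m apart (x across rows, y along rows).
row_x = np.repeat([0.0, 3.0, 6.0], 7)
row_y = np.tile(np.arange(0.0, 105.0, 15.0), 3)
lai = 1.5 + 0.5 * np.sin(row_y / 30.0) + rng.normal(0.0, 0.1, row_y.size)

ok = OrdinaryKriging(row_x, row_y, lai, variogram_model="spherical")
grid_x = np.arange(0.0, 6.5, 0.5)
grid_y = np.arange(0.0, 100.5, 1.0)
lai_raster, variance = ok.execute("grid", grid_x, grid_y)   # raster of kriged LAI estimates
print(lai_raster.shape)                                     # (len(grid_y), len(grid_x))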
del-Moral-Martínez, Ignacio; Rosell-Polo, Joan R; Company, Joaquim; Sanz, Ricardo; Escolà, Alexandre; Masip, Joan; Martínez-Casasnovas, José A; Arnó, Jaume
2016-01-19
The leaf area index (LAI) is defined as the one-sided leaf area per unit ground area, and is probably the most widely used index to characterize grapevine vigor. However, LAI varies spatially within vineyard plots. Mapping and quantifying this variability is very important for improving management decisions and agricultural practices. In this study, a mobile terrestrial laser scanner (MTLS) was used to map the LAI of a vineyard, and then to examine how different scanning methods (on-the-go or discontinuous systematic sampling) may affect the reliability of the resulting raster maps. The use of the MTLS allows calculating the enveloping vegetative area of the canopy, which is the sum of the leaf wall areas for both sides of the row (excluding gaps) and the projected upper area. Obtaining the enveloping areas requires scanning a one-meter-long section along the row from both sides at each systematic sampling point. By converting the enveloping areas into LAI values, a raster map of the latter can be obtained by spatial interpolation (kriging). However, the user can opt for scanning on-the-go in a continuous way and computing 1-m LAI values along the rows, or instead perform the scanning at discontinuous systematic sampling points within the plot. An analysis of correlation between maps indicated that the MTLS can be used discontinuously in specific sampling sections separated by up to 15 m along the rows. This capability significantly reduces the amount of data to be acquired at field level, the required data storage capacity, and the computer processing power needed.
Solving optimization problems on computational grids.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wright, S. J.; Mathematics and Computer Science
2001-05-01
Multiprocessor computing platforms, which have become more and more widely available since the mid-1980s, are now heavily used by organizations that need to solve very demanding computational problems. Parallel computing is now central to the culture of many research communities. Novel parallel approaches were developed for global optimization, network optimization, and direct-search methods for nonlinear optimization. Activity was particularly widespread in parallel branch-and-bound approaches for various problems in combinatorial and network optimization. As the cost of personal computers and low-end workstations has continued to fall, while the speed and capacity of processors and networks have increased dramatically, 'cluster' platforms have become popular in many settings. A somewhat different type of parallel computing platform known as a computational grid (alternatively, metacomputer) has arisen in comparatively recent times. Broadly speaking, this term refers not to a multiprocessor with identical processing nodes but rather to a heterogeneous collection of devices that are widely distributed, possibly around the globe. The advantage of such platforms is obvious: they have the potential to deliver enormous computing power. Just as obviously, however, the complexity of grids makes them very difficult to use. The Condor team, headed by Miron Livny at the University of Wisconsin, was among the pioneers in providing infrastructure for grid computations. More recently, the Globus project has developed technologies to support computations on geographically distributed platforms consisting of high-end computers, storage and visualization devices, and other scientific instruments. In 1997, we started the metaneos project as a collaborative effort between optimization specialists and the Condor and Globus groups. Our aim was to address complex, difficult optimization problems in several areas, designing and implementing the algorithms and the software infrastructure needed to solve these problems on computational grids. This article describes some of the results we have obtained during the first three years of the metaneos project. Our efforts have led to the development of the runtime support library MW for implementing algorithms with a master-worker control structure on Condor platforms. This work is discussed here, along with work on algorithms and codes for integer linear programming, the quadratic assignment problem, and stochastic linear programming. Our experiences in the metaneos project have shown that cheap, powerful computational grids can be used to tackle large optimization problems of various types. In an industrial or commercial setting, the results demonstrate that one may not have to buy powerful computational servers to solve many of the large problems arising in areas such as scheduling, portfolio optimization, or logistics; the idle time on employee workstations (or, at worst, an investment in a modest cluster of PCs) may do the job. For the optimization research community, our results motivate further work on parallel, grid-enabled algorithms for solving very large problems of other types. The fact that very large problems can be solved cheaply allows researchers to better understand issues of 'practical' complexity and of the role of heuristics.
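A minimal sketch of the master-worker control structure mentioned above is shown below, using only Python's standard library rather than the MW/Condor toolchain; the toy objective being evaluated is illustrative.

from concurrent.futures import ProcessPoolExecutor, as_completed

def evaluate(candidate):
    # Worker-side task: score one candidate (a stand-in for an expensive solve).
    x, y = candidate
    return candidate, (x - 3) ** 2 + (y + 1) ** 2

def master(candidates, max_workers=4):
    # Master: farm the candidates out, collect results as they complete, keep the best.
    best = None
    with ProcessPoolExecutor(max_workers=max_workers) as pool:
        futures = [pool.submit(evaluate, c) for c in candidates]
        for fut in as_completed(futures):
            candidate, score = fut.result()
            if best is None or score < best[1]:
                best = (candidate, score)
    return best

if __name__ == "__main__":
    grid = [(x, y) for x in range(-5, 6) for y in range(-5, 6)]
    print(master(grid))   # expected near ((3, -1), 0)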
DInSAR time series generation within a cloud computing environment: from ERS to Sentinel-1 scenario
NASA Astrophysics Data System (ADS)
Casu, Francesco; Elefante, Stefano; Imperatore, Pasquale; Lanari, Riccardo; Manunta, Michele; Zinno, Ivana; Mathot, Emmanuel; Brito, Fabrice; Farres, Jordi; Lengert, Wolfgang
2013-04-01
One of the techniques that will strongly benefit from the advent of the Sentinel-1 system is Differential SAR Interferometry (DInSAR), which has been demonstrated to be an effective tool to detect and monitor ground displacements with centimetre accuracy. The geoscience communities (volcanology, seismicity, …), as well as those related to hazard monitoring and risk mitigation, make extensive use of the DInSAR technique and will take advantage of the huge amount of SAR data acquired by Sentinel-1. Indeed, such data will permit the generation of Earth's surface displacement maps and time series over both large areas and long time spans. However, the issue of managing, processing and analysing the large Sentinel data stream is envisaged by the scientific community to be a major bottleneck, particularly during crisis phases. The emerging need to create a common ecosystem in which data, results and processing tools are shared is envisaged to be a successful way to address this problem and to contribute to the spreading of information and knowledge. The Supersites initiative, as well as the ESA SuperSites Exploitation Platform (SSEP) and the ESA Cloud Computing Operational Pilot (CIOP) projects, provide effective answers to this need and are pushing towards the development of such an ecosystem. It is clear that all the current tools for querying, processing and analysing SAR data need not only to be updated to manage the large data stream of the Sentinel-1 satellite, but also to be reorganized to reply quickly to simultaneous and highly demanding user requests, mainly during emergency situations. This translates into the automatic and unsupervised processing of large amounts of data as well as the availability of scalable, widely accessible and high performance computing capabilities. The cloud computing environment permits all of these objectives to be achieved, particularly in the case of spikes and peaks in requests for processing resources linked to disaster events. This work presents a parallel computational model for the widely used DInSAR algorithm named Small BAseline Subset (SBAS), which has been implemented within the cloud computing environment provided by the ESA-CIOP platform. This activity has resulted in a scalable, unsupervised, portable, and widely accessible (through a web portal) parallel DInSAR computational tool. The activity involved rewriting and developing the SBAS algorithm within a parallel system environment, i.e., in a form that allows us to benefit from multiple processing units. This required devising a parallel version of the SBAS algorithm and its subsequent implementation, implying additional complexity in algorithm design and efficient multiprocessor programming, with the final aim of parallel performance optimization. Although the presented algorithm has been designed to work with Sentinel-1 data, it can also process other satellite SAR data (ERS, ENVISAT, CSK, TSX, ALOS). The performance of the implemented SBAS parallel version has been tested on the full ASAR archive (64 acquisitions) acquired over the Napoli Bay, a volcanic and densely urbanized area in Southern Italy. The full processing - from the raw data download to the generation of DInSAR time series - was carried out by engaging 4 nodes, each with 2 cores and 16 GB of RAM, and took about 36 hours, compared to about 135 hours for the sequential version.
Extensive analyses of other test areas that are significant from a DInSAR and geophysical viewpoint will be presented. Finally, a preliminary performance evaluation of the presented approach within the Sentinel-1 scenario will be provided.
Real-time WAMI streaming target tracking in fog
NASA Astrophysics Data System (ADS)
Chen, Yu; Blasch, Erik; Chen, Ning; Deng, Anna; Ling, Haibin; Chen, Genshe
2016-05-01
Real-time information fusion based on WAMI (Wide-Area Motion Imagery), FMV (Full Motion Video), and text data is highly desired for many mission-critical emergency or security applications. Cloud Computing has been considered promising for achieving big data integration from multi-modal sources. In many mission-critical tasks, however, powerful Cloud technology cannot satisfy tight latency tolerances because the servers are located far from the sensing platform, and in emergency situations there may be no guaranteed connection at all. Therefore, data processing, information fusion, and decision making must be executed on-site (i.e., near where the data are collected). Fog Computing, a recently proposed extension and complement to Cloud Computing, enables computing on-site without outsourcing jobs to a remote Cloud. In this work, we have investigated the feasibility of processing streaming WAMI in the Fog for real-time, online, uninterrupted target tracking. Using a single-target tracking algorithm, we studied the performance of a Fog Computing prototype. The experimental results are very encouraging and validate the effectiveness of our Fog approach in achieving real-time frame rates.
Exploiting volatile opportunistic computing resources with Lobster
NASA Astrophysics Data System (ADS)
Woodard, Anna; Wolf, Matthias; Mueller, Charles; Tovar, Ben; Donnelly, Patrick; Hurtado Anampa, Kenyi; Brenner, Paul; Lannon, Kevin; Hildreth, Mike; Thain, Douglas
2015-12-01
Analysis of high energy physics experiments using the Compact Muon Solenoid (CMS) at the Large Hadron Collider (LHC) can be limited by availability of computing resources. As a joint effort involving computer scientists and CMS physicists at Notre Dame, we have developed an opportunistic workflow management tool, Lobster, to harvest available cycles from university campus computing pools. Lobster consists of a management server, file server, and worker processes which can be submitted to any available computing resource without requiring root access. Lobster makes use of the Work Queue system to perform task management, while the CMS specific software environment is provided via CVMFS and Parrot. Data is handled via Chirp and Hadoop for local data storage and XrootD for access to the CMS wide-area data federation. An extensive set of monitoring and diagnostic tools have been developed to facilitate system optimisation. We have tested Lobster using the 20 000-core cluster at Notre Dame, achieving approximately 8-10k tasks running simultaneously, sustaining approximately 9 Gbit/s of input data and 340 Mbit/s of output data.
Service Migration from Cloud to Multi-tier Fog Nodes for Multimedia Dissemination with QoE Support
Camargo, João; Rochol, Juergen; Gerla, Mario
2018-01-01
A wide range of multimedia services is expected to be offered to mobile users via various wireless access networks. Even the integration of Cloud Computing in such networks does not support an adequate Quality of Experience (QoE) in areas with high demand for multimedia content. Fog computing has been conceptualized to facilitate the deployment of new services that cloud computing cannot provide, particularly those demanding QoE guarantees. These services are provided using fog nodes located at the network edge, which are capable of virtualizing their functions/applications. Service migration from the cloud to fog nodes can be triggered by request patterns and timing issues. To the best of our knowledge, existing works on fog computing focus on architecture and fog node deployment issues. In this article, we describe the operational impacts and benefits associated with service migration from the cloud to multi-tier fog computing for video distribution with QoE support. In addition, we evaluate such migration of video services. Finally, we present potential research challenges and trends. PMID:29364172
Distributed storage and cloud computing: a test case
NASA Astrophysics Data System (ADS)
Piano, S.; Della Ricca, G.
2014-06-01
Since 2003, the computing farm hosted by the INFN Tier3 facility in Trieste has supported the activities of many scientific communities. Hundreds of jobs from 45 different VOs, including those of the LHC experiments, are processed simultaneously. Given that the requirements of the different computational communities are not normally synchronized, the probability that at any given time the resources owned by one of the participants are not fully utilized is quite high. A balanced compensation should in principle allocate the free resources to other users, but there are limits to this mechanism. In fact, the Trieste site may not hold the amount of data needed to attract enough analysis jobs, and even in that case there could be a lack of bandwidth for their access. The Trieste ALICE and CMS computing groups, in collaboration with other Italian groups, aim to overcome the limitations of existing solutions using two approaches: sharing the data among all the participants, taking full advantage of the GARR-X wide area network (10 GB/s), and integrating the resources dedicated to batch analysis with those reserved for dynamic interactive analysis, through modern solutions such as cloud computing.
Service Migration from Cloud to Multi-tier Fog Nodes for Multimedia Dissemination with QoE Support.
Rosário, Denis; Schimuneck, Matias; Camargo, João; Nobre, Jéferson; Both, Cristiano; Rochol, Juergen; Gerla, Mario
2018-01-24
A wide range of multimedia services is expected to be offered to mobile users via various wireless access networks. Even the integration of Cloud Computing in such networks does not support an adequate Quality of Experience (QoE) in areas with high demand for multimedia content. Fog computing has been conceptualized to facilitate the deployment of new services that cloud computing cannot provide, particularly those demanding QoE guarantees. These services are provided using fog nodes located at the network edge, which are capable of virtualizing their functions/applications. Service migration from the cloud to fog nodes can be triggered by request patterns and timing issues. To the best of our knowledge, existing works on fog computing focus on architecture and fog node deployment issues. In this article, we describe the operational impacts and benefits associated with service migration from the cloud to multi-tier fog computing for video distribution with QoE support. In addition, we evaluate such migration of video services. Finally, we present potential research challenges and trends.
Resource Efficient Hardware Architecture for Fast Computation of Running Max/Min Filters
Torres-Huitzil, Cesar
2013-01-01
Running max/min filters on rectangular kernels are widely used in many digital signal and image processing applications. Filtering with a k × k kernel requires k² − 1 comparisons per sample for a direct implementation; thus, performance scales expensively with the kernel size k. Faster computation can be achieved by kernel decomposition and by using constant-time one-dimensional algorithms on custom hardware. This paper presents a hardware architecture for real-time computation of running max/min filters based on the van Herk/Gil-Werman (HGW) algorithm. The proposed architecture uses fewer computation and memory resources than previously reported architectures when targeted to Field Programmable Gate Array (FPGA) devices. Implementation results show that the architecture is able to compute max/min filters on 1024 × 1024 images with up to 255 × 255 kernels in around 8.4 milliseconds, i.e., 120 frames per second, at a clock frequency of 250 MHz. The implementation is highly scalable with the kernel size, with a good performance/area trade-off suitable for embedded applications. The applicability of the architecture is shown for local adaptive image thresholding. PMID:24288456
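As context for the architecture above, the van Herk/Gil-Werman idea can be sketched in a few lines: each block of k samples gets a forward and a backward cumulative maximum, after which every window maximum costs one extra comparison, independent of k. The sketch below is a software illustration of the 1-D algorithm only, not the FPGA architecture of the paper.

```python
import numpy as np

def running_max_1d(x, k):
    """Van Herk/Gil-Werman running maximum over windows of size k.

    Two cumulative scans per block of k samples plus one comparison per
    output keep the cost per sample roughly constant, independent of k.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    pad = (-n) % k                                   # pad so the length is a multiple of k
    xp = np.concatenate([x, np.full(pad, -np.inf)])

    g = np.empty_like(xp)                            # forward cumulative max within each block
    h = np.empty_like(xp)                            # backward cumulative max within each block
    for start in range(0, len(xp), k):
        block = xp[start:start + k]
        g[start:start + k] = np.maximum.accumulate(block)
        h[start:start + k] = np.maximum.accumulate(block[::-1])[::-1]

    # The maximum over window [i, i+k-1] is max(h[i], g[i+k-1]).
    return np.array([max(h[i], g[i + k - 1]) for i in range(n - k + 1)])

print(running_max_1d([3, 1, 4, 1, 5, 9, 2, 6], 3))   # -> [4. 4. 5. 9. 9. 9.]
```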
The Nimrod computational workbench: a case study in desktop metacomputing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abramson, D.; Sosic, R.; Foster, I.
The coordinated use of geographically distributed computers, or metacomputing, can in principle provide more accessible and cost-effective supercomputing than conventional high-performance systems. However, we lack evidence that metacomputing systems can be made easily usable, or that there exist large numbers of applications able to exploit metacomputing resources. In this paper, we present work that addresses both these concerns. The basis for this work is a system called Nimrod that provides a desktop problem-solving environment for parametric experiments. We describe how Nimrod has been extended to support the scheduling of computational resources located in a wide-area environment, and report on an experiment in which Nimrod was used to schedule a large parametric study across the Australian Internet. The experiment provided both new scientific results and insights into Nimrod capabilities. We relate the results of this experiment to lessons learned from the I-WAY distributed computing experiment, and draw conclusions as to how Nimrod and I-WAY-like computing environments should be developed to support desktop metacomputing.
Tools, techniques, organisation and culture of the CADD group at Sygnature Discovery.
St-Gallay, Steve A; Sambrook-Smith, Colin P
2017-03-01
Computer-aided drug design encompasses a wide variety of tools and techniques, and can be implemented with a range of organisational structures and focus in different organisations. Here we outline the computational chemistry skills within Sygnature Discovery, along with the software and hardware at our disposal, and briefly discuss the methods that are not employed and why. The goal of the group is to provide support for design and analysis in order to improve the quality of compounds synthesised and reduce the timelines of drug discovery projects, and we reveal how this is achieved at Sygnature. Impact on medicinal chemistry is vital to demonstrating the value of computational chemistry, and we discuss the approaches taken to influence the list of compounds for synthesis, and how we recognise success. Finally we touch on some of the areas being developed within the team in order to provide further value to the projects and clients.
Kelemen, Arpad; Vasilakos, Athanasios V; Liang, Yulan
2009-09-01
Comprehensive evaluation of common genetic variations through association of single-nucleotide polymorphism (SNP) structure with common complex disease in the genome-wide scale is currently a hot area in human genome research due to the recent development of the Human Genome Project and HapMap Project. Computational science, which includes computational intelligence (CI), has recently become the third method of scientific enquiry besides theory and experimentation. There have been fast growing interests in developing and applying CI in disease mapping using SNP and haplotype data. Some of the recent studies have demonstrated the promise and importance of CI for common complex diseases in genomic association study using SNP/haplotype data, especially for tackling challenges, such as gene-gene and gene-environment interactions, and the notorious "curse of dimensionality" problem. This review provides coverage of recent developments of CI approaches for complex diseases in genetic association study with SNP/haplotype data.
Using high-performance networks to enable computational aerosciences applications
NASA Technical Reports Server (NTRS)
Johnson, Marjory J.
1992-01-01
One component of the U.S. Federal High Performance Computing and Communications Program (HPCCP) is the establishment of a gigabit network to provide a communications infrastructure for researchers across the nation. This gigabit network will provide new services and capabilities, in addition to increased bandwidth, to enable future applications. An understanding of these applications is necessary to guide the development of the gigabit network and other high-performance networks of the future. In this paper we focus on computational aerosciences applications run remotely using the Numerical Aerodynamic Simulation (NAS) facility located at NASA Ames Research Center. We characterize these applications in terms of network-related parameters and relate user experiences that reveal limitations imposed by the current wide-area networking infrastructure. Then we investigate how the development of a nationwide gigabit network would enable users of the NAS facility to work in new, more productive ways.
Tools, techniques, organisation and culture of the CADD group at Sygnature Discovery
NASA Astrophysics Data System (ADS)
St-Gallay, Steve A.; Sambrook-Smith, Colin P.
2017-03-01
Computer-aided drug design encompasses a wide variety of tools and techniques, and can be implemented with a range of organisational structures and focus in different organisations. Here we outline the computational chemistry skills within Sygnature Discovery, along with the software and hardware at our disposal, and briefly discuss the methods that are not employed and why. The goal of the group is to provide support for design and analysis in order to improve the quality of compounds synthesised and reduce the timelines of drug discovery projects, and we reveal how this is achieved at Sygnature. Impact on medicinal chemistry is vital to demonstrating the value of computational chemistry, and we discuss the approaches taken to influence the list of compounds for synthesis, and how we recognise success. Finally we touch on some of the areas being developed within the team in order to provide further value to the projects and clients.
NASA Astrophysics Data System (ADS)
Papers are presented on ISDN, mobile radio systems and techniques for digital connectivity, centralized and distributed algorithms in computer networks, communications networks, quality assurance and impact on cost, adaptive filters in communications, the spread spectrum, signal processing, video communication techniques, and digital satellite services. Topics discussed include performance evaluation issues for integrated protocols, packet network operations, the computer network theory and multiple-access, microwave single sideband systems, switching architectures, fiber optic systems, wireless local communications, modulation, coding, and synchronization, remote switching, software quality, transmission, and expert systems in network operations. Consideration is given to wide area networks, image and speech processing, office communications application protocols, multimedia systems, customer-controlled network operations, digital radio systems, channel modeling and signal processing in digital communications, earth station/on-board modems, computer communications system performance evaluation, source encoding, compression, and quantization, and adaptive communications systems.
Partitioning problems in parallel, pipelined and distributed computing
NASA Technical Reports Server (NTRS)
Bokhari, S.
1985-01-01
The problem of optimally assigning the modules of a parallel program over the processors of a multiple computer system is addressed. A Sum-Bottleneck path algorithm is developed that permits the efficient solution of many variants of this problem under some constraints on the structure of the partitions. In particular, the following problems are solved optimally for a single-host, multiple satellite system: partitioning multiple chain structured parallel programs, multiple arbitrarily structured serial programs and single tree structured parallel programs. In addition, the problems of partitioning chain structured parallel programs across chain connected systems and across shared memory (or shared bus) systems are also solved under certain constraints. All solutions for parallel programs are equally applicable to pipelined programs. These results extend prior research in this area by explicitly taking concurrency into account and permit the efficient utilization of multiple computer architectures for a wide range of problems of practical interest.
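To make the flavour of these partitioning problems concrete, the sketch below solves the simplest variant mentioned above with dynamic programming: splitting a chain-structured program into contiguous blocks across a chain of processors so that the heaviest block (the bottleneck) is as light as possible. It is an illustrative simplification, not Bokhari's Sum-Bottleneck path algorithm itself, and the module weights are made up.

```python
from itertools import accumulate

def chain_partition_bottleneck(weights, p):
    """Minimum achievable bottleneck when splitting a chain of module weights
    into p contiguous blocks (one block per processor in a processor chain)."""
    n = len(weights)
    prefix = [0] + list(accumulate(weights))
    block = lambda a, b: prefix[b] - prefix[a]        # total weight of modules a..b-1

    INF = float("inf")
    # dp[j][i]: best bottleneck for the first i modules placed on j processors
    dp = [[INF] * (n + 1) for _ in range(p + 1)]
    dp[0][0] = 0
    for j in range(1, p + 1):
        for i in range(1, n + 1):
            for k in range(i):
                cand = max(dp[j - 1][k], block(k, i))
                if cand < dp[j][i]:
                    dp[j][i] = cand
    return dp[p][n]

# Example: 8 chained modules assigned to a 3-processor chain
print(chain_partition_bottleneck([4, 2, 7, 1, 3, 5, 6, 2], 3))   # -> 13
```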
Matsushima, Kyoji; Sonobe, Noriaki
2018-01-01
Digitized holography techniques are used to reconstruct three-dimensional (3D) images of physical objects using large-scale computer-generated holograms (CGHs). The object field is captured at three wavelengths over a wide area at high densities. Synthetic aperture techniques using single sensors are used for image capture in phase-shifting digital holography. The captured object field is incorporated into a virtual 3D scene that includes nonphysical objects, e.g., polygon-meshed CG models. The synthetic object field is optically reconstructed as a large-scale full-color CGH using red-green-blue color filters. The CGH has a wide full-parallax viewing zone and reconstructs a deep 3D scene with natural motion parallax.
Computer-Drawn Field Lines and Potential Surfaces for a Wide Range of Field Configurations
ERIC Educational Resources Information Center
Brandt, Siegmund; Schneider, Hermann
1976-01-01
Describes a computer program that computes field lines and equipotential surfaces for a wide range of field configurations. Presents the mathematical technique and details of the program, the input data, and different modes of graphical representation. (MLH)
NASA Technical Reports Server (NTRS)
1983-01-01
A computer code which can account for plastic deformation effects on stress generated in silicon sheet grown at high speeds is fully operative. Stress and strain rate distributions are presented for two different sheet temperature profiles. The calculations show that residual stress levels are very sensitive to details of the cooling profile in a sheet with creep. Experimental work has been started in several areas to improve understanding of ribbon temperature profiles and stress distributions associated with a 10 cm wide ribbon cartridge system.
1991-03-01
…management methodologies claim to be "expert systems" with security intelligence built into them to derive a body of both facts and speculative data… Data Administration considerations… Artificial Intelligence: description of technologies… such as intelligent gateways, wide area networks, and distributed databases for the distribution of logistics products. The integrity of CALS data and the…
Distributed telemedicine for the National Information Infrastructure
DOE Office of Scientific and Technical Information (OSTI.GOV)
Forslund, D.W.; Lee, Seong H.; Reverbel, F.C.
1997-08-01
TeleMed is an advanced system that provides a distributed multimedia electronic medical record available over a wide area network. It uses object-based computing, distributed data repositories, advanced graphical user interfaces, and visualization tools along with innovative concept extraction of image information for storing and accessing medical records developed in a separate project from 1994-5. In 1996, we began the transition to Java, extended the infrastructure, and worked to begin deploying TeleMed-like technologies throughout the nation. Other applications are mentioned.
2000-10-14
…without any knowledge of the problem area. Therefore, Darwinian-type evolutionary computation has found a very wide range of applications, including many… The author examined many biomedical studies that included literature searches. The Science Citation Index (SCI) abstracts of these studies… yield many records that are non-relevant to the main technical themes of the study. In summary, these types of simple limited queries can result in two…
Alternative communication network designs for an operational Plato 4 CAI system
NASA Technical Reports Server (NTRS)
Mobley, R. E., Jr.; Eastwood, L. F., Jr.
1975-01-01
The cost of alternative communications networks for the dissemination of PLATO IV computer-aided instruction (CAI) was studied. Four communication techniques are compared: leased telephone lines, satellite communication, UHF TV, and low-power microwave radio. For each network design, costs per student contact hour are computed. These costs are derived as functions of student population density, a parameter which can be calculated from census data for one potential market for CAI, the public primary and secondary schools. Calculating costs in this way allows one to determine which of the four communications alternatives can serve this market least expensively for any given area in the U.S. The analysis indicates that radio distribution techniques are cost optimum over a wide range of conditions.
NASA Astrophysics Data System (ADS)
Mitrofanova, O. V.; Bayramukov, A. S.; Fedorinov, A. V.
2017-11-01
Some results are presented from computational-theoretical research on identifying the thermophysical features and topology of high-velocity curved and swirling flows that occur inside the complicated channels of collector systems, active zones, and the equipment of nuclear power installations with pressurized water reactors. Cylindrical curved channels of different configurations, with various combinations of bends and cross-sectional areas, were considered as modelling objects. Results of computational experiments to determine the velocity, pressure, vorticity and temperature fields in transverse and longitudinal sections of the pipeline showed that the complicated geometry of the channels can cause large-scale swirling of the flow, cavitation effects, and the generation of acoustic fluctuations with a wide spectrum of sound frequencies in the coolant under dynamic operating modes.
NASA Technical Reports Server (NTRS)
Field, Richard T.
1990-01-01
SOILSIM, a digital model of energy and moisture fluxes in the soil and above the soil surface, is presented. It simulates the time evolution of soil temperature and moisture, the temperature of the soil surface and of the plant canopy above the surface, and the fluxes of sensible and latent heat into the atmosphere in response to surface weather conditions. The model is driven by simple weather observations, including wind speed, air temperature, air humidity, and incident radiation. The model is intended to be useful in conjunction with remotely sensed information on the land surface state, such as surface brightness temperature and soil moisture, for computing wide-area evapotranspiration.
Continuous stacking computational approach based automated microscope slide scanner
NASA Astrophysics Data System (ADS)
Murali, Swetha; Adhikari, Jayesh Vasudeva; Jagannadh, Veerendra Kalyan; Gorthi, Sai Siva
2018-02-01
Cost-effective and automated acquisition of whole slide images is a bottleneck for wide-scale deployment of digital pathology. In this article, a computation augmented approach for the development of an automated microscope slide scanner is presented. The realization of a prototype device built using inexpensive off-the-shelf optical components and motors is detailed. The applicability of the developed prototype to clinical diagnostic testing is demonstrated by generating good quality digital images of malaria-infected blood smears. Further, the acquired slide images have been processed to identify and count the number of malaria-infected red blood cells and thereby perform quantitative parasitemia level estimation. The presented prototype would enable cost-effective deployment of slide-based cyto-diagnostic testing in endemic areas.
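As a small illustration of the quantitative read-out mentioned above, parasitemia is usually reported as the fraction of red blood cells that are infected; the counts below are made-up placeholders, not results from the prototype.

```python
# Toy parasitemia estimate from cell counts produced by an image-analysis pipeline.
infected_rbc = 37        # placeholder count of infected red blood cells
total_rbc = 5240         # placeholder count of all red blood cells in the scanned smear
parasitemia = 100.0 * infected_rbc / total_rbc
print(f"estimated parasitemia: {parasitemia:.2f}%")   # -> 0.71%
```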
Swingle, Brian
2013-09-06
We compute the entanglement entropy of a wide class of models that may be characterized as describing matter coupled to gauge fields. Our principal result is an entanglement sum rule that states that the entropy of the full system is the sum of the entropies of the two components. In the context of the models we consider, this result applies to the full entropy, but more generally it is a statement about the additivity of universal terms in the entropy. Our proof simultaneously extends and simplifies previous arguments, with extensions including new models at zero temperature as well as the ability to treat finite temperature crossovers. We emphasize that while the additivity is an exact statement, each term in the sum may still be difficult to compute. Our results apply to a wide variety of phases including Fermi liquids, spin liquids, and some non-Fermi liquid metals. For example, we prove that our model of an interacting Fermi liquid has exactly the log violation of the area law for entanglement entropy predicted by the Widom formula in agreement with earlier arguments.
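Stated schematically (with symbols that are ours rather than the paper's notation: A the spatial region, L its linear size, k_F the Fermi momentum, d the spatial dimension), the sum rule and the Widom-type log violation for the Fermi-liquid example read

$$ S_A^{\mathrm{full}} \;=\; S_A^{\mathrm{gauge}} \;+\; S_A^{\mathrm{matter}}, \qquad S_A^{\mathrm{FL}}(L) \;\propto\; (k_F L)^{d-1}\,\ln(k_F L). $$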
Root-like enamel pearl: a case report
2014-01-01
Introduction In general, enamel pearls are found in maxillary molars as a small globule of enamel. However, this case report describes an enamel pearl with a prolate spheroid shape, 1.8 mm wide and 8 mm long. The unusual type of enamel pearl found in my clinic has significantly improved our understanding of enamel pearl etiology and pathophysiology. Case presentation A 42-year-old Han Chinese woman with severe toothache received treatment in my Department of Endodontics. She had no significant past medical history. A dental examination revealed extensive distal decay in her left mandibular first molar and tenderness to percussion and palpation of the periradicular zone, and a deep periodontal pocket was found on the buccal aspect. Vitality testing was negative. Periapical radiographic images revealed radiolucency around the mesial apex. Cone beam computed tomography detected an opaque enamel pearl in the furcation area with a prolate spheroid shape, 1.8 mm wide and 8 mm long. Conclusion The enamel pearl described in this case report resembles a very long dental root. Cone beam computed tomography may be used for evaluating enamel pearls. PMID:25008098
Transverse Injection into Subsonic Crossflow with Various Injector Orifice Geometries
NASA Technical Reports Server (NTRS)
Foster, Lancert E.; Zaman, Khairul B.
2010-01-01
Computational and experimental results are presented for a case study of single injectors employed in 90 deg transverse injection into a non-reacting subsonic flow. Different injector orifice shapes are used (including circular, square, diamond-shaped, and wide rectangular slot), all with constant cross-sectional area, to observe the effects of this variation on injector penetration and mixing. Whereas the circle, square, and diamond injector produce similar jet plumes, the wide rectangular slot produces a plume with less vertical penetration than the others. There is also some evidence that the diamond injector produces slightly faster penetration with less mixing of the injected fluid. In addition, a variety of rectangular injectors were analyzed, with varying length/width ratios. Both experimental and computational data show improved plume penetration with increased streamwise orifice length. 3-D Reynolds-Averaged Navier-Stokes (RANS) results are obtained for the various injector geometries using NCC (National Combustion Code) with the kappa-epsilon turbulence model in multi-species modes on an unstructured grid. Grid sensitivity results are also presented which indicate consistent qualitative trends in the injector performance comparisons with increasing grid refinement.
Lee, Hsiang-Chieh; Ahsen, Osman Oguz; Liang, Kaicheng; Wang, Zhao; Cleveland, Cody; Booth, Lucas; Potsaid, Benjamin; Jayaraman, Vijaysekhar; Cable, Alex E; Mashimo, Hiroshi; Langer, Robert; Traverso, Giovanni; Fujimoto, James G
2016-08-01
We demonstrate a micromotor balloon imaging catheter for ultrahigh speed endoscopic optical coherence tomography (OCT) which provides wide area, circumferential structural and angiographic imaging of the esophagus without contrast agents. Using a 1310 nm MEMS tunable wavelength swept VCSEL light source, the system has a 1.2 MHz A-scan rate and ~8.5 µm axial resolution in tissue. The micromotor balloon catheter enables circumferential imaging of the esophagus at 240 frames per second (fps) with a ~30 µm (FWHM) spot size. Volumetric imaging is achieved by proximal pullback of the micromotor assembly within the balloon at 1.5 mm/sec. Volumetric data consisting of 4200 circumferential images of 5,000 A-scans each over a 2.6 cm length, covering a ~13 cm² area, is acquired in <18 seconds. A non-rigid image registration algorithm is used to suppress motion artifacts from non-uniform rotational distortion (NURD), cardiac motion or respiration. En face OCT images at various depths can be generated. OCT angiography (OCTA) is computed using intensity decorrelation between sequential pairs of circumferential scans and enables three-dimensional visualization of vasculature. Wide area volumetric OCT and OCTA imaging of the swine esophagus in vivo is demonstrated.
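The decorrelation step mentioned above admits a compact numerical illustration. The sketch below uses one common normalized-intensity decorrelation estimator on toy frames; the actual estimator, averaging and thresholding used by the authors may differ.

```python
# Toy OCT-angiography style decorrelation between two sequential intensity scans:
# static tissue decorrelates little, moving scatterers (blood) decorrelate more.
import numpy as np

def decorrelation(i1, i2, eps=1e-12):
    """1 minus the normalized cross-correlation of two intensity frames (0 = identical)."""
    i1 = i1.astype(np.float64)
    i2 = i2.astype(np.float64)
    return 1.0 - np.sum(i1 * i2) / (np.sqrt(np.sum(i1**2) * np.sum(i2**2)) + eps)

rng = np.random.default_rng(1)
static = rng.random((64, 64))
frame_similar = static + 0.01 * rng.standard_normal((64, 64))   # nearly identical pair
frame_uncorrelated = rng.random((64, 64))                        # independent pair
print(decorrelation(static, frame_similar), decorrelation(static, frame_uncorrelated))
```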
Protein binding hot spots prediction from sequence only by a new ensemble learning method.
Hu, Shan-Shan; Chen, Peng; Wang, Bing; Li, Jinyan
2017-10-01
Hot spots are interfacial core areas of binding proteins, which have been used as targets in drug design. Experimental methods for locating hot spot areas are costly in both time and expense. Recently, in silico computational methods have been widely used for hot spot prediction through sequence or structure characterization. As the structural information of proteins is not always available, hot spot identification from amino acid sequences alone is more useful for real-life applications. This work proposes a new sequence-based model that combines physicochemical features with the relative accessible surface area of amino acid sequences for hot spot prediction. The model consists of 83 classifiers based on the IBk (instance-based k-nearest neighbour) algorithm, where instances are encoded by important properties extracted from a total of 544 properties in the AAindex1 (Amino Acid Index) database. Top-performing classifiers are then selected to form an ensemble via a majority voting technique. The ensemble classifier outperforms state-of-the-art computational methods, yielding an F1 score of 0.80 on the benchmark Binding Interface Database (BID) test set. http://www2.ahu.edu.cn/pchen/web/HotspotEC.htm
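A minimal sketch of the ensemble strategy described above is given below, assuming scikit-learn and treating IBk as a k-nearest-neighbour learner; the feature matrix, the number of ensemble members and the feature-subset selection are placeholders rather than the published pipeline (which draws its features from AAindex1 properties and relative accessible surface area).

```python
# Sketch: many k-NN ("IBk"-style) members, each on its own feature subset, combined by majority vote.
import numpy as np
from sklearn.ensemble import VotingClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import FunctionTransformer

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 50))        # placeholder: 50 sequence-derived features per residue
y = rng.integers(0, 2, size=200)      # placeholder labels: 1 = hot spot, 0 = non-hot spot

def select_columns(cols):
    """Build a transformer that keeps only the given feature columns."""
    return FunctionTransformer(lambda data: data[:, cols])

members = []
for i in range(10):                   # the paper uses 83 members; 10 kept here for brevity
    cols = rng.choice(X.shape[1], size=8, replace=False)
    members.append((f"ibk_{i}", make_pipeline(select_columns(cols),
                                              KNeighborsClassifier(n_neighbors=3))))

ensemble = VotingClassifier(members, voting="hard")   # hard voting = majority vote
ensemble.fit(X, y)
print(ensemble.predict(X[:5]))
```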
A learnable parallel processing architecture towards unity of memory and computing
NASA Astrophysics Data System (ADS)
Li, H.; Gao, B.; Chen, Z.; Zhao, Y.; Huang, P.; Ye, H.; Liu, L.; Liu, X.; Kang, J.
2015-08-01
Developing energy-efficient parallel information processing systems beyond von Neumann architecture is a long-standing goal of modern information technologies. The widely used von Neumann computer architecture separates memory and computing units, which leads to energy-hungry data movement when computers work. In order to meet the need of efficient information processing for the data-driven applications such as big data and Internet of Things, an energy-efficient processing architecture beyond von Neumann is critical for the information society. Here we show a non-von Neumann architecture built of resistive switching (RS) devices named “iMemComp”, where memory and logic are unified with single-type devices. Leveraging nonvolatile nature and structural parallelism of crossbar RS arrays, we have equipped “iMemComp” with capabilities of computing in parallel and learning user-defined logic functions for large-scale information processing tasks. Such architecture eliminates the energy-hungry data movement in von Neumann computers. Compared with contemporary silicon technology, adder circuits based on “iMemComp” can improve the speed by 76.8% and the power dissipation by 60.3%, together with a 700 times aggressive reduction in the circuit area.
A learnable parallel processing architecture towards unity of memory and computing.
Li, H; Gao, B; Chen, Z; Zhao, Y; Huang, P; Ye, H; Liu, L; Liu, X; Kang, J
2015-08-14
Developing energy-efficient parallel information processing systems beyond von Neumann architecture is a long-standing goal of modern information technologies. The widely used von Neumann computer architecture separates memory and computing units, which leads to energy-hungry data movement when computers work. In order to meet the need of efficient information processing for the data-driven applications such as big data and Internet of Things, an energy-efficient processing architecture beyond von Neumann is critical for the information society. Here we show a non-von Neumann architecture built of resistive switching (RS) devices named "iMemComp", where memory and logic are unified with single-type devices. Leveraging nonvolatile nature and structural parallelism of crossbar RS arrays, we have equipped "iMemComp" with capabilities of computing in parallel and learning user-defined logic functions for large-scale information processing tasks. Such architecture eliminates the energy-hungry data movement in von Neumann computers. Compared with contemporary silicon technology, adder circuits based on "iMemComp" can improve the speed by 76.8% and the power dissipation by 60.3%, together with a 700 times aggressive reduction in the circuit area.
An Overview of Computational Aeroacoustic Modeling at NASA Langley
NASA Technical Reports Server (NTRS)
Lockard, David P.
2001-01-01
The use of computational techniques in the area of acoustics is known as computational aeroacoustics and has shown great promise in recent years. Although an ultimate goal is to use computational simulations as a virtual wind tunnel, the problem is so complex that blind applications of traditional algorithms are typically unable to produce acceptable results. The phenomena of interest are inherently unsteady and cover a wide range of frequencies and amplitudes. Nonetheless, with appropriate simplifications and special care to resolve specific phenomena, currently available methods can be used to solve important acoustic problems. These simulations can be used to complement experiments, and often give much more detailed information than can be obtained in a wind tunnel. The use of acoustic analogy methods to inexpensively determine far-field acoustics from near-field unsteadiness has greatly reduced the computational requirements. A few examples of current applications of computational aeroacoustics at NASA Langley are given. There remains a large class of problems that require more accurate and efficient methods. Research to develop more advanced methods that are able to handle the geometric complexity of realistic problems using block-structured and unstructured grids are highlighted.
Geospatial Data as a Service: The GEOGLAM Rangelands and Pasture Productivity Map Experience
NASA Astrophysics Data System (ADS)
Evans, B. J. K.; Antony, J.; Guerschman, J. P.; Larraondo, P. R.; Richards, C. J.
2017-12-01
Empowering end-users like pastoralists, land management specialists and land policy makers in the use of earth observation data for both day-to-day and seasonal planning requires both interactive delivery of multiple geospatial datasets and the capability to support on-the-fly dynamic queries, while simultaneously fostering a community around the effort. The use and wide adoption of large data archives, like those produced by earth observation missions, are often limited by the compute and storage capabilities of the remote user. We demonstrate that wide-scale use of large data archives can be facilitated by end-users dynamically requesting value-added products using open standards (WCS, WMS, WPS), with compute running in the cloud or dedicated data centres and outputs visualized on web front ends. As an example, we will demonstrate how a tool called GSKY can empower a remote end-user by providing the data delivery and analytics capabilities for the GEOGLAM Rangelands and Pasture Productivity (RAPP) Map tool. The GEOGLAM RAPP initiative from the Group on Earth Observations (GEO) and its Agricultural Monitoring subgroup aims at providing practical tools to end-users, focusing on the important role of rangelands and pasture systems in providing food production security from both agricultural crops and animal protein. Figure 1 is a screen capture from the RAPP Map interface for an important pasture area in the Namibian rangelands. The RAPP Map has been in production for six months and has garnered significant interest from groups and users all over the world. GSKY, built around the theme of open geospatial data-as-a-service capabilities, uses distributed computing and storage to facilitate this. It works behind the scenes, accepting OGC standard requests in WCS, WMS and WPS. Results from these requests are rendered on a web front end. In this way, the complexities of data locality and compute execution are masked from the end user. On-the-fly computation of products such as NDVI, Leaf Area Index and vegetation cover from original source data, including MODIS, is achieved, with Landsat and Sentinel-2 on the horizon. Innovative use of cloud computing and storage, along with flexible front ends, allows the democratization of data dissemination and, we hope, better outcomes for the planet.
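As a flavour of the on-the-fly products mentioned above, the sketch below computes NDVI from red and near-infrared reflectance with NumPy; the arrays stand in for tiles that GSKY would fetch and process server-side, and none of this is the GSKY code itself.

```python
# Minimal NDVI computation of the kind a geospatial data service performs on the fly.
import numpy as np

def ndvi(nir, red, eps=1e-6):
    """Normalised Difference Vegetation Index, clipped to [-1, 1]."""
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    return np.clip((nir - red) / (nir + red + eps), -1.0, 1.0)

# Toy reflectance tiles standing in for a WCS response over a rangeland area
nir = np.array([[0.45, 0.50], [0.40, 0.55]])
red = np.array([[0.10, 0.12], [0.20, 0.08]])
print(ndvi(nir, red))
```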
Wide-angle display developments by computer graphics
NASA Technical Reports Server (NTRS)
Fetter, William A.
1989-01-01
Computer graphics can now expand its new subset, wide-angle projection, to be as significant a generic capability as computer graphics itself. Some prior work in computer graphics is presented which leads to an attractive further subset of wide-angle projection, called hemispheric projection, as a major communication medium. Hemispheric film systems have long been present, and such computer graphics systems are in use in simulators. This is the leading edge of capabilities which should ultimately be as ubiquitous as CRTs (cathode-ray tubes). These assertions rest not on degrees in science, nor solely on a degree in graphic design, but on a history of computer graphics innovations that laid the groundwork by demonstration. The author believes that it is timely to look at several development strategies, since hemispheric projection is now at a point comparable to the early stages of computer graphics, requiring similar patterns of development again.
NASA Technical Reports Server (NTRS)
Shooman, Martin L.
1991-01-01
Many of the most challenging reliability problems of our present decade involve complex distributed systems such as interconnected telephone switching computers, air traffic control centers, aircraft and space vehicles, and local area and wide area computer networks. In addition to the challenge of complexity, modern fault-tolerant computer systems require very high levels of reliability, e.g., avionic computers with MTTF goals of one billion hours. Most analysts find that it is too difficult to model such complex systems without computer aided design programs. In response to this need, NASA has developed a suite of computer aided reliability modeling programs beginning with CARE 3 and including a group of new programs such as: HARP, HARP-PC, Reliability Analysts Workbench (Combination of model solvers SURE, STEM, PAWS, and common front-end model ASSIST), and the Fault Tree Compiler. The HARP program is studied and how well the user can model systems using this program is investigated. One of the important objectives will be to study how user friendly this program is, e.g., how easy it is to model the system, provide the input information, and interpret the results. The experiences of the author and his graduate students who used HARP in two graduate courses are described. Some brief comparisons were made with the ARIES program which the students also used. Theoretical studies of the modeling techniques used in HARP are also included. Of course no answer can be any more accurate than the fidelity of the model, thus an Appendix is included which discusses modeling accuracy. A broad viewpoint is taken and all problems which occurred in the use of HARP are discussed. Such problems include: computer system problems, installation manual problems, user manual problems, program inconsistencies, program limitations, confusing notation, long run times, accuracy problems, etc.
Merging of multi-temporal SST data at South China Sea
NASA Astrophysics Data System (ADS)
Ng, H. G.; MatJafri, M. Z.; Abdullah, K.; Lim, H. S.
2008-10-01
Sea surface temperature (SST) mapping can be performed over a wide spatial and temporal extent within a reasonable time. The space-borne AVHRR sensor has been widely used for this purpose. However, current SST retrieval techniques for infrared channels are limited to cloud-free areas, because electromagnetic waves at infrared wavelengths cannot penetrate cloud. As a result, SST availability is low for a single image. To overcome this problem, we studied the production of a three-day composite SST map. Diurnal changes in SST are quite stable over a short period of time if no abrupt natural disaster occurs. Therefore, SST data from three consecutive days with nearly coincident daily acquisition times were merged to create a three-day composite SST dataset. The composite image could increase SST availability. In this study, we acquired level 1b AVHRR (Advanced Very High Resolution Radiometer) images from the Malaysia Center of Remote Sensing (MACRES). The images were first preprocessed, and the cloud and land areas were masked. We made some modifications to the technique for obtaining the threshold value for cloud masking. The SST was estimated using the day split MCSST algorithm. The availability of cloud-free water pixels was computed and compared, the mean SST of the three-day composite data was calculated, and an SST map was generated. The SST data availability was increased by merging the SST data.
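The compositing step described above amounts to a per-pixel average over the days with valid (cloud-free water) observations. The sketch below shows that step on toy grids, with NaN marking masked pixels; it is illustrative only and not the processing chain applied to the MACRES AVHRR data.

```python
# Three-day SST compositing sketch: per-pixel mean over days with valid observations.
import numpy as np

day1 = np.array([[28.1, np.nan], [27.5, 28.9]])   # NaN = cloud- or land-masked pixel
day2 = np.array([[28.3, 29.0], [np.nan, 28.7]])
day3 = np.array([[np.nan, 29.2], [27.8, np.nan]])

stack = np.stack([day1, day2, day3])
composite = np.nanmean(stack, axis=0)                          # mean over the days with valid SST
per_day_valid = np.isfinite(stack).mean(axis=(1, 2))           # valid-pixel fraction of each daily map
composite_valid = np.isfinite(stack).any(axis=0).mean()        # valid-pixel fraction of the composite

print(composite)
print(per_day_valid, composite_valid)   # the composite covers more pixels than any single day
```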
Radke, Oliver C; Schneider, Thomas; Braune, Anja; Pirracchio, Romain; Fischer, Felix; Koch, Thea
2016-09-28
Both Electrical Impedance Tomography (EIT) and Computed Tomography (CT) allow estimation of the lung area. We compared two algorithms for detecting the lung area per quadrant from EIT images with the lung areas derived from CT images. 39 outpatients who were scheduled for an elective CT scan of the thorax were included in the study. For each patient we recorded EIT images immediately before the CT scan. The lung area per quadrant was estimated from both CT and EIT data, using two different algorithms for the EIT data. The data showed considerable variation during spontaneous breathing. Overall correlation between EIT and CT was poor (0.58-0.77), while the correlation between the two EIT algorithms was better (0.90-0.92). Bland-Altman analysis revealed an absence of bias but wide limits of agreement. Lung area estimation from CT and EIT differs significantly, most probably because of the fundamental difference in image generation.
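For readers unfamiliar with the agreement analysis mentioned above, a Bland-Altman comparison reduces to the mean difference (bias) and the limits of agreement at bias ± 1.96 standard deviations of the differences. The sketch below applies this to made-up paired area estimates; it is not the study data.

```python
# Toy Bland-Altman summary for paired lung-area estimates (arbitrary units).
import numpy as np

eit = np.array([12.1, 15.4, 9.8, 14.2, 11.0, 13.7])   # placeholder EIT-derived areas
ct  = np.array([11.5, 16.0, 10.5, 13.1, 12.2, 13.0])  # placeholder CT-derived areas

diff = eit - ct
bias = diff.mean()
half_width = 1.96 * diff.std(ddof=1)                   # half-width of the limits of agreement
print(f"bias = {bias:.2f}, limits of agreement = [{bias - half_width:.2f}, {bias + half_width:.2f}]")
```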
A CFD Heterogeneous Parallel Solver Based on Collaborating CPU and GPU
NASA Astrophysics Data System (ADS)
Lai, Jianqi; Tian, Zhengyu; Li, Hua; Pan, Sha
2018-03-01
Since the Graphics Processing Unit (GPU) has strong floating-point computation ability and high memory bandwidth for data parallelism, it has been widely used in general-purpose computing areas such as molecular dynamics (MD) and computational fluid dynamics (CFD). The emergence of the Compute Unified Device Architecture (CUDA), which reduces the complexity of programming, brings great opportunities to CFD. There are three different modes for the parallel solution of the Navier-Stokes (NS) equations: a parallel solver based on the CPU, a parallel solver based on the GPU, and a heterogeneous parallel solver based on collaborating CPU and GPU. GPUs are relatively rich in compute capacity but poor in memory capacity, and CPUs are the opposite. To make full use of both GPUs and CPUs, a CFD heterogeneous parallel solver based on collaborating CPU and GPU has been established. Three cases are presented to analyse the solver’s computational accuracy and heterogeneous parallel efficiency. The numerical results agree well with experimental results, demonstrating that the heterogeneous parallel solver has high computational precision. The speedup on a single GPU is more than 40 for laminar flow; it decreases for turbulent flow, but can still reach more than 20. Moreover, the speedup increases as the grid size becomes larger.
Advanced Computational Methods in Bio-Mechanics.
Al Qahtani, Waleed M S; El-Anwar, Mohamed I
2018-04-15
A novel partnership between surgeons and machines, made possible by advances in computing and engineering technology, could overcome many of the limitations of traditional surgery. By extending surgeons' ability to plan and carry out surgical interventions more accurately and with less trauma, computer-integrated surgery (CIS) systems could help to improve clinical outcomes and the efficiency of healthcare delivery. CIS systems could have a similar impact on surgery to that long since realised in computer-integrated manufacturing. Mathematical modelling and computer simulation have proved tremendously successful in engineering. Computational mechanics has enabled technological developments in virtually every area of our lives. One of the greatest challenges for mechanists is to extend the success of computational mechanics to fields outside traditional engineering, in particular to biology, the biomedical sciences, and medicine. Biomechanics has significant potential for applications in the orthopaedic industry and the performing arts, since the skills needed for these activities are visibly related to the human musculoskeletal and nervous systems. Although biomechanics is widely used nowadays in the orthopaedic industry to design orthopaedic implants for human joints, dental parts, external fixations and other medical purposes, numerous research efforts funded by billions of dollars are still under way to build a new future for sports and human healthcare in what is called the biomechanics era.
Juan, Hsu-Cheng; Lin, Hung-Yu; Chou, Yii-Her; Yang, Yi-Hsin; Shih, Paul Ming-Chen; Chuang, Shu-Mien; Shen, Jung-Tsung; Juan, Yung-Shun
2012-08-01
The aim was to assess the effects of abdominal fat on shock wave lithotripsy (SWL). We used pre-SWL unenhanced computed tomography (CT) to evaluate the impact of abdominal fat distribution and calculus characteristics on the outcome of SWL. One hundred and eighty-five patients with a solitary ureteric calculus treated with SWL were retrospectively reviewed. Each patient underwent unenhanced CT within 1 month before SWL treatment. Treatment outcomes were evaluated 1 month later. Unenhanced CT parameters, including calculus surface area, Hounsfield unit (HU) density, abdominal fat area and skin-to-calculus distance (SSD), were analysed. One hundred and twenty-eight of the 185 patients were found to be calculus-free following treatment. HU density, total fat area, visceral fat area and SSD were identified as significant variables on multivariate logistic regression analysis. The receiver-operating characteristic analyses showed that total fat area, para/perirenal fat area and visceral fat area were sensitive predictors of SWL outcomes. This study revealed that higher quantities of abdominal fat, especially visceral fat, are associated with a lower calculus-free rate following SWL treatment. Unenhanced CT is a convenient technique for diagnosing the presence of a calculus, assessing the intra-abdominal fat distribution and thereby helping to predict the outcome of SWL. • Unenhanced CT is now widely used to assess ureteric calculi. • The same CT protocol can provide measurements of abdominal fat distribution. • Ureteric calculi are usually treated by shock wave lithotripsy (SWL). • Greater intra-abdominal fat stores are generally associated with poorer SWL results.
Active Control Technology at NASA Langley Research Center
NASA Technical Reports Server (NTRS)
Antcliff, Richard R.; McGowan, Anna-Marie R.
2000-01-01
NASA Langley has a long history of attacking important technical opportunities from a broad base of supporting disciplines. The research and development at Langley in this subject area range from the test tube to the test flight. The information covered here ranges from the development of innovative new materials, sensors and actuators, to the incorporation of smart sensors and actuators in practical devices, to the optimization of the location of these devices, and finally to a wide variety of applications of these devices utilizing Langley's facilities and expertise. Advanced materials are being developed for sensors and actuators, as well as polymers for integrating smart devices into composite structures. Contributions reside in three key areas: computational materials; advanced piezoelectric materials; and integrated composite structures. The computational materials effort is focused on developing predictive tools for the efficient design of new materials with the appropriate combination of properties for next-generation smart airframe systems. Research in the area of advanced piezoelectrics includes optimizing the efficiency, force output, use temperature, and energy transfer between the structure and device for both ceramic and polymeric materials. For structural health monitoring, advanced non-destructive techniques, including fiber optics, are being developed for the detection of delaminations, cracks and environmental deterioration in aircraft structures. Innovative fabrication techniques for processing structural composites with integrated sensors and actuators are being developed.
NASA Astrophysics Data System (ADS)
Kokkinaki, A.; Sleep, B. E.; Chambers, J. E.; Cirpka, O. A.; Nowak, W.
2010-12-01
Electrical Resistance Tomography (ERT) is a popular method for investigating subsurface heterogeneity. The method relies on measuring electrical potential differences and obtaining, through inverse modeling, the underlying electrical conductivity field, which can be related to hydraulic conductivities. The quality of site characterization strongly depends on the utilized inversion technique. Standard ERT inversion methods, though highly computationally efficient, do not consider spatial correlation of soil properties; as a result, they often underestimate the spatial variability observed in earth materials, thereby producing unrealistic subsurface models. Also, these methods do not quantify the uncertainty of the estimated properties, thus limiting their use in subsequent investigations. Geostatistical inverse methods can be used to overcome both these limitations; however, they are computationally expensive, which has hindered their wide use in practice. In this work, we compare a standard Gauss-Newton smoothness constrained least squares inversion method against the quasi-linear geostatistical approach using the three-dimensional ERT dataset of the SABRe (Source Area Bioremediation) project. The two methods are evaluated for their ability to: a) produce physically realistic electrical conductivity fields that agree with the wide range of data available for the SABRe site while being computationally efficient, and b) provide information on the spatial statistics of other parameters of interest, such as hydraulic conductivity. To explore the trade-off between inversion quality and computational efficiency, we also employ a 2.5-D forward model with corrections for boundary conditions and source singularities. The 2.5-D model accelerates the 3-D geostatistical inversion method. New adjoint equations are developed for the 2.5-D forward model for the efficient calculation of sensitivities. Our work shows that spatial statistics can be incorporated in large-scale ERT inversions to improve the inversion results without making them computationally prohibitive.
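For orientation, the smoothness-constrained Gauss-Newton update referred to above is usually written as below (our notation, not necessarily that of the codes compared: m the model of log-conductivities, d the data, f the forward ERT operator, J its Jacobian, W_d the data weighting, R a roughness operator, lambda the regularization weight):

$$ \mathbf{m}_{k+1} = \mathbf{m}_k + \left(\mathbf{J}_k^{\top}\mathbf{W}_d^{\top}\mathbf{W}_d\,\mathbf{J}_k + \lambda\,\mathbf{R}^{\top}\mathbf{R}\right)^{-1} \left[\mathbf{J}_k^{\top}\mathbf{W}_d^{\top}\mathbf{W}_d\big(\mathbf{d} - f(\mathbf{m}_k)\big) - \lambda\,\mathbf{R}^{\top}\mathbf{R}\,\mathbf{m}_k\right], $$

which minimizes the usual data-misfit-plus-roughness objective; loosely speaking, the quasi-linear geostatistical approach replaces the generic roughness term with a prior covariance built from the assumed spatial correlation of the properties.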
NASA Astrophysics Data System (ADS)
Becker, Matthew Rand
I present a new algorithm, CALCLENS, for efficiently computing weak gravitational lensing shear signals from large N-body light cone simulations over a curved sky. This new algorithm properly accounts for the sky curvature and boundary conditions, is able to produce redshift-dependent shear signals including corrections to the Born approximation by using multiple-plane ray tracing, and properly computes the lensed images of source galaxies in the light cone. The key feature of this algorithm is a new, computationally efficient Poisson solver for the sphere that combines spherical harmonic transform and multigrid methods. As a result, large areas of sky (~10,000 square degrees) can be ray traced efficiently at high resolution using only a few hundred cores. Using this new algorithm and curved-sky calculations that only use a slower but more accurate spherical harmonic transform Poisson solver, I study the convergence, shear E-mode, shear B-mode and rotation mode power spectra. Employing full-sky E/B-mode decompositions, I confirm that the numerically computed shear B-mode and rotation mode power spectra are equal at high accuracy (≲1%) as expected from perturbation theory up to second order. Coupled with realistic galaxy populations placed in large N-body light cone simulations, this new algorithm is ideally suited for the construction of synthetic weak lensing shear catalogs to be used to test for systematic effects in data analysis procedures for upcoming large-area sky surveys. The implementation presented in this work, written in C and employing widely available software libraries to maintain portability, is publicly available at http://code.google.com/p/calclens.
NASA Astrophysics Data System (ADS)
Becker, Matthew R.
2013-10-01
I present a new algorithm, Curved-sky grAvitational Lensing for Cosmological Light conE simulatioNS (CALCLENS), for efficiently computing weak gravitational lensing shear signals from large N-body light cone simulations over a curved sky. This new algorithm properly accounts for the sky curvature and boundary conditions, is able to produce redshift-dependent shear signals including corrections to the Born approximation by using multiple-plane ray tracing and properly computes the lensed images of source galaxies in the light cone. The key feature of this algorithm is a new, computationally efficient Poisson solver for the sphere that combines spherical harmonic transform and multigrid methods. As a result, large areas of sky (˜10 000 square degrees) can be ray traced efficiently at high resolution using only a few hundred cores. Using this new algorithm and curved-sky calculations that only use a slower but more accurate spherical harmonic transform Poisson solver, I study the convergence, shear E-mode, shear B-mode and rotation mode power spectra. Employing full-sky E/B-mode decompositions, I confirm that the numerically computed shear B-mode and rotation mode power spectra are equal at high accuracy (≲1 per cent) as expected from perturbation theory up to second order. Coupled with realistic galaxy populations placed in large N-body light cone simulations, this new algorithm is ideally suited for the construction of synthetic weak lensing shear catalogues to be used to test for systematic effects in data analysis procedures for upcoming large-area sky surveys. The implementation presented in this work, written in C and employing widely available software libraries to maintain portability, is publicly available at http://code.google.com/p/calclens.
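For reference, the spherical-harmonic part of such a Poisson solver follows from the eigenvalue relation of the Laplacian on the sphere; expanding the convergence kappa and the lensing potential psi in spherical harmonics gives the standard relation below (conventions in CALCLENS may differ):

$$ \nabla^2 \psi = 2\kappa \;\Longrightarrow\; \psi_{\ell m} = -\,\frac{2\,\kappa_{\ell m}}{\ell(\ell+1)} \quad (\ell \ge 1), $$

with the monopole term set to zero.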
A methodology for secure recovery of spacecrafts based on a trusted hardware platform
NASA Astrophysics Data System (ADS)
Juliato, Marcio; Gebotys, Catherine
2017-02-01
This paper proposes a methodology for the secure recovery of spacecraft and of their cryptographic capabilities in emergency scenarios resulting from major unintentional failures and malicious attacks. The proposed approach employs trusted modules to achieve higher reliability and security levels in space missions due to the presence of integrity check capabilities as well as secure recovery mechanisms. Additionally, several recovery protocols are thoroughly discussed and analyzed against a wide variety of attacks. Exhaustive search attacks are considered in a wide variety of contexts and shown to be infeasible, independently of the computational power of attackers. Experimental results have shown that the proposed methodology allows for the fast and secure recovery of spacecraft, demanding minimal implementation area, power consumption and bandwidth.
Computational mechanics needs study
NASA Technical Reports Server (NTRS)
Griffin, O. Hayden, Jr.
1993-01-01
In order to assess the needs in computational mechanics over the next decade, we formulated a questionnaire and contacted computational mechanics researchers and users in industry, government, and academia. As expected, we found a wide variety of computational mechanics usage and research. This report outlines the activity discussed with those contacts, as well as that in our own organizations. It should be noted that most of the contacts were made before the recent decline of the defense industry; therefore, areas which are strongly defense-oriented may decrease in relative importance. In order to facilitate updating of this study, names of a few key researchers in each area are included as starting points for future literature surveys. These lists of names are not intended to represent those persons doing the best research in that area, nor are they intended to be comprehensive; they are offered simply as starting points for future literature searches. Overall, there is currently broad activity in computational mechanics in this country, with the breadth and depth increasing as more sophisticated software and faster computers become more available. The needs and desires of the workers in this field are as diverse as their backgrounds and organizational products. There seems to be some degree of software development in any organization that has a research component in its mission, although the level of activity is highly variable from one organization to another. There is, however, considerable use of commercial software in almost all organizations. In most industrial research organizations, it appears that very little actual software development is contracted out; most is done in-house, using a mixture of funding sources. Government agencies vary widely in the ratio of in-house to contracted-out development. There is a considerable amount of experimental verification in most, but not all, organizations; generally, the amount of experimental verification is more than we expected. Of all the survey contacts, only one or two believe that the resources they are allocated are sufficient; most do not. Some believe they have only half the resources they need. Some see their resource deficits as short-term, while others see them as a trend which will continue or perhaps worsen. The pessimism is stronger in the defense and aerospace industry. Considering only the nonlinear development efforts, there appears to be an even mix of geometric and material nonlinearity. There is little particular emphasis on linear analysis except for extending current analysis capabilities to larger problems. The primary exception is concern about the modeling of composites, where proven methodologies have trailed element and computer hardware development. Most of the people we spoke to use finite element techniques, but there is some ongoing finite difference and boundary element work. There is also some interest in multiple methods; coupling of finite elements and boundary elements appears to be of high interest, since the two analysis types are complementary.
Workshop report on large-scale matrix diagonalization methods in chemistry theory institute
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bischof, C.H.; Shepard, R.L.; Huss-Lederman, S.
The Large-Scale Matrix Diagonalization Methods in Chemistry theory institute brought together 41 computational chemists and numerical analysts. The goal was to understand the needs of the computational chemistry community in problems that utilize matrix diagonalization techniques. This was accomplished by reviewing the current state of the art and looking toward future directions in matrix diagonalization techniques. This institute occurred about 20 years after a related meeting of similar size. During those 20 years the Davidson method continued to dominate the problem of finding a few extremal eigenvalues for many computational chemistry problems. Work on non-diagonally dominant and non-Hermitian problems as well as parallel computing has also brought new methods to bear. The changes and similarities in problems and methods over the past two decades offered an interesting perspective on the successes in this area. One important area covered by the talks was overviews of the source and nature of the chemistry problems. The numerical analysts were uniformly grateful for the efforts to convey a better understanding of the problems and issues faced in computational chemistry. An important outcome was an understanding of the wide range of eigenproblems encountered in computational chemistry. The workshop covered problems involving self-consistent-field (SCF), configuration interaction (CI), intramolecular vibrational relaxation (IVR), and scattering problems. In atomic structure calculations using the Hartree-Fock method (SCF), the symmetric matrices can range from order hundreds to thousands. These matrices often include large clusters of eigenvalues which can be as much as 25% of the spectrum. However, if CI methods are also used, the matrix size can be between 10^4 and 10^9, where only one or a few extremal eigenvalues and eigenvectors are needed. Working with very large matrices has led to the development of …
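As a point of reference for the eigenproblems discussed above, the sketch below (a minimal illustration, not drawn from the workshop itself) shows how a few extremal eigenvalues of a large sparse symmetric matrix can be obtained with an iterative sparse eigensolver; the tridiagonal matrix here is a synthetic, diagonally dominant stand-in for a CI-type Hamiltonian.

# Minimal sketch: a few lowest eigenvalues of a large sparse symmetric matrix,
# the kind of task the Davidson method addresses in CI calculations.
# The matrix below is a synthetic stand-in, not chemistry data.
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import eigsh

n = 100_000
diag = np.arange(1, n + 1, dtype=float)          # dominant diagonal (model Hamiltonian)
offdiag = 0.01 * np.ones(n - 1)                  # weak coupling between states
H = sp.diags([offdiag, diag, offdiag], [-1, 0, 1], format="csr")

# Only a few extremal eigenpairs are needed, as in typical CI problems;
# shift-invert around sigma=0 targets the lowest part of the spectrum.
vals, vecs = eigsh(H, k=3, sigma=0.0, which="LM")
print("three lowest eigenvalues:", np.sort(vals))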
Classification of dried vegetables using computer image analysis and artificial neural networks
NASA Astrophysics Data System (ADS)
Koszela, K.; Łukomski, M.; Mueller, W.; Górna, K.; Okoń, P.; Boniecki, P.; Zaborowicz, M.; Wojcieszak, D.
2017-07-01
In the recent years, there has been a continuously increasing demand for vegetables and dried vegetables. This trend affects the growth of the dehydration industry in Poland helping to exploit excess production. More and more often dried vegetables are used in various sectors of the food industry, both due to their high nutritional qualities and changes in consumers' food preferences. As we observe an increase in consumer awareness regarding a healthy lifestyle and a boom in health food, there is also an increase in the consumption of such food, which means that the production and crop area can increase further. Among the dried vegetables, dried carrots play a strategic role due to their wide application range and high nutritional value. They contain high concentrations of carotene and sugar which is present in the form of crystals. Carrots are also the vegetables which are most often subjected to a wide range of dehydration processes; this makes it difficult to perform a reliable qualitative assessment and classification of this dried product. The many qualitative properties of dried carrots determining their positive or negative quality assessment include colour and shape. The aim of the research was to develop and implement the model of a computer system for the recognition and classification of freeze-dried, convection-dried and microwave vacuum dried products using the methods of computer image analysis and artificial neural networks.
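A minimal sketch of the general approach (image-derived features fed to a small neural network classifier) is given below; the feature columns and class labels are placeholders for illustration, not the descriptors or model actually used by the authors.

# Sketch: classifying dried carrot samples from image features with a small neural network.
# Features and labels here are synthetic placeholders.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

# Hypothetical per-sample features, e.g. mean colour channels and simple shape descriptors.
X = np.random.rand(300, 5)                        # stand-in for extracted image features
y = np.random.randint(0, 3, size=300)             # 0=freeze-dried, 1=convection, 2=microwave-vacuum

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000, random_state=0)
clf.fit(X_tr, y_tr)
print("hold-out accuracy on synthetic data:", clf.score(X_te, y_te))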
Qiao, Ning; Mostafa, Hesham; Corradi, Federico; Osswald, Marc; Stefanini, Fabio; Sumislawska, Dora; Indiveri, Giacomo
2015-01-01
Implementing compact, low-power artificial neural processing systems with real-time on-line learning abilities is still an open challenge. In this paper we present a full-custom mixed-signal VLSI device with neuromorphic learning circuits that emulate the biophysics of real spiking neurons and dynamic synapses for exploring the properties of computational neuroscience models and for building brain-inspired computing systems. The proposed architecture allows the on-chip configuration of a wide range of network connectivities, including recurrent and deep networks, with short-term and long-term plasticity. The device comprises 128 K analog synapse and 256 neuron circuits with biologically plausible dynamics and bi-stable spike-based plasticity mechanisms that endow it with on-line learning abilities. In addition to the analog circuits, the device comprises also asynchronous digital logic circuits for setting different synapse and neuron properties as well as different network configurations. This prototype device, fabricated using a 180 nm 1P6M CMOS process, occupies an area of 51.4 mm(2), and consumes approximately 4 mW for typical experiments, for example involving attractor networks. Here we describe the details of the overall architecture and of the individual circuits and present experimental results that showcase its potential. By supporting a wide range of cortical-like computational modules comprising plasticity mechanisms, this device will enable the realization of intelligent autonomous systems with on-line learning capabilities.
Morphology control in polymer blend fibers—a high throughput computing approach
NASA Astrophysics Data System (ADS)
Sesha Sarath Pokuri, Balaji; Ganapathysubramanian, Baskar
2016-08-01
Fibers made from polymer blends have conventionally enjoyed wide use, particularly in textiles. This wide applicability is primarily aided by the ease of manufacturing such fibers. More recently, the ability to tailor the internal morphology of polymer blend fibers by carefully designing processing conditions has enabled such fibers to be used in technologically relevant applications. Some examples include anisotropic insulating properties for heat and anisotropic wicking of moisture, coaxial morphologies for optical applications as well as fibers with high internal surface area for filtration and catalysis applications. However, identifying the appropriate processing conditions from the large space of possibilities using conventional trial-and-error approaches is a tedious and resource-intensive process. Here, we illustrate a high throughput computational approach to rapidly explore and characterize how processing conditions (specifically blend ratio and evaporation rates) affect the internal morphology of polymer blends during solvent based fabrication. We focus on a PS: PMMA system and identify two distinct classes of morphologies formed due to variations in the processing conditions. We subsequently map the processing conditions to the morphology class, thus constructing a ‘phase diagram’ that enables rapid identification of processing parameters for specific morphology class. We finally demonstrate the potential for time dependent processing conditions to get desired features of the morphology. This opens up the possibility of rational stage-wise design of processing pathways for tailored fiber morphology using high throughput computing.
Human-technology interaction for standoff IED detection
NASA Astrophysics Data System (ADS)
Zhang, Evan; Zou, Yiyang; Zachrich, Liping; Fulton, Jack
2011-03-01
IEDs kill our soldiers and innocent people every day. Lessons learned from Iraq and Afghanistan clearly indicate that IEDs cannot be detected or defeated by technology alone; human-technology interaction must be engaged. In most cases the eye is the best detector and the brain is the best computer; technologies are tools, and only when used properly by human beings can they achieve full functionality. In this paper, a UV Raman/fluorescence, CCD, and LWIR three-sensor fusion system for standoff IED detection and a handheld fusion system for close-range IED detection are developed and demonstrated. Soldiers must be trained to use their eyes or the CCD/LWIR cameras to perform a wide-area search while on the move and find a small suspected area first, and only then use the spectrometer, because the laser spot is very small: scanning a one-mile-long, 2-meter-wide road would take 185 days, even though our fusion system can detect an IED at 30 m with a 1 s interrogation time. Even when a small suspected area (e.g., 0.5 m x 0.5 m) is found, human eyes still cannot detect the IED; soldiers must interact with the technology, a laser-based spectrometer, to scan the area, which enables them to detect and identify the IED in 10 minutes rather than 185 days. Therefore, the human-technology interaction approach is the best solution for IED detection.
Uncovering novel repositioning opportunities using the Open Targets platform.
Khaladkar, Mugdha; Koscielny, Gautier; Hasan, Samiul; Agarwal, Pankaj; Dunham, Ian; Rajpal, Deepak; Sanseau, Philippe
2017-12-01
The recently developed Open Targets platform consolidates a wide range of comprehensive evidence associating known and potential drug targets with human diseases. We have harnessed the integrated data from this platform for novel drug repositioning opportunities. Our computational workflow systematically mines data from various evidence categories and presents potential repositioning opportunities for drugs that are marketed or being investigated in ongoing human clinical trials, based on evidence strength on target-disease pairing. We classified these novel target-disease opportunities in several ways: (i) number of independent counts of evidence; (ii) broad therapy area of origin; and (iii) repositioning within or across therapy areas. Finally, we elaborate on one example that was identified by this approach. Copyright © 2017 Elsevier Ltd. All rights reserved.
Active Control Technology at NASA Langley Research Center
NASA Technical Reports Server (NTRS)
Antcliff, Richard R.; McGowan, Anna-Marie R.
2000-01-01
NASA Langley has a long history of attacking important technical opportunities from a broad base of supporting disciplines. The research and development at Langley in this subject area ranges from the test tube to the test flight. The information covered here ranges from the development of innovative new materials, sensors, and actuators, to the incorporation of smart sensors and actuators in practical devices, to the optimization of the location of these devices, and, finally, to a wide variety of applications of these devices utilizing Langley's facilities and expertise. Advanced materials are being developed for sensors and actuators, as well as polymers for integrating smart devices into composite structures. Contributions reside in three key areas: computational materials; advanced piezoelectric materials; and integrated composite structures.
CONTACT: An Air Force technical report on military satellite control technology
NASA Astrophysics Data System (ADS)
Weakley, Christopher K.
1993-07-01
This technical report focuses on Military Satellite Control Technologies and their application to the Air Force Satellite Control Network (AFSCN). This report is a compilation of articles that provide an overview of the AFSCN and the Advanced Technology Program, and discusses relevant technical issues and developments applicable to the AFSCN. Among the topics covered are articles on Future Technology Projections; Future AFSCN Topologies; Modeling of the AFSCN; Wide Area Communications Technology Evolution; Automating AFSCN Resource Scheduling; Health & Status Monitoring at Remote Tracking Stations; Software Metrics and Tools for Measuring AFSCN Software Performance; Human-Computer Interface Working Group; Trusted Systems Workshop; and the University Technical Interaction Program. In addition, Key Technology Area points of contact are listed in the report.
Cooperative high-performance storage in the accelerated strategic computing initiative
NASA Technical Reports Server (NTRS)
Gary, Mark; Howard, Barry; Louis, Steve; Minuzzo, Kim; Seager, Mark
1996-01-01
The use and acceptance of new high-performance, parallel computing platforms will be impeded by the absence of an infrastructure capable of supporting orders-of-magnitude improvement in hierarchical storage and high-speed I/O (Input/Output). The distribution of these high-performance platforms and supporting infrastructures across a wide-area network further compounds this problem. We describe an architectural design and phased implementation plan for a distributed, Cooperative Storage Environment (CSE) to achieve the necessary performance, user transparency, site autonomy, communication, and security features needed to support the Accelerated Strategic Computing Initiative (ASCI). ASCI is a Department of Energy (DOE) program attempting to apply terascale platforms and Problem-Solving Environments (PSEs) toward real-world computational modeling and simulation problems. The ASCI mission must be carried out through a unified, multilaboratory effort, and will require highly secure, efficient access to vast amounts of data. The CSE provides a logically simple, geographically distributed, storage infrastructure of semi-autonomous cooperating sites to meet the strategic ASCI PSE goal of high-performance data storage and access at the user desktop.
Proteinortho: detection of (co-)orthologs in large-scale analysis.
Lechner, Marcus; Findeiss, Sven; Steiner, Lydia; Marz, Manja; Stadler, Peter F; Prohaska, Sonja J
2011-04-28
Orthology analysis is an important part of data analysis in many areas of bioinformatics such as comparative genomics and molecular phylogenetics. The ever-increasing flood of sequence data, and hence the rapidly increasing number of genomes that can be compared simultaneously, calls for efficient software tools as brute-force approaches with quadratic memory requirements become infeasible in practice. The rapid pace at which new data become available, furthermore, makes it desirable to compute genome-wide orthology relations for a given dataset rather than relying on relations listed in databases. The program Proteinortho described here is a stand-alone tool that is geared towards large datasets and makes use of distributed computing techniques when run on multi-core hardware. It implements an extended version of the reciprocal best alignment heuristic. We apply Proteinortho to compute orthologous proteins in the complete set of all 717 eubacterial genomes available at NCBI at the beginning of 2009. We identified thirty proteins present in 99% of all bacterial proteomes. Proteinortho significantly reduces the required amount of memory for orthology analysis compared to existing tools, allowing such computations to be performed on off-the-shelf hardware.
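The core idea of the reciprocal best alignment heuristic can be sketched as follows (a simplified illustration, not Proteinortho's actual implementation): two proteins are called putative orthologs when each is the other's best-scoring hit against the other genome.

# Simplified reciprocal-best-hit sketch; `hits` maps (genome, protein) to a list of
# (score, target_genome, target_protein) alignment hits, e.g. parsed from BLAST output.
def best_hit(hits, query, target_genome):
    """Best-scoring hit of `query` against `target_genome`, or None."""
    candidates = [h for h in hits.get(query, []) if h[1] == target_genome]
    return max(candidates, default=None)

def reciprocal_best_pairs(hits, genome_a, genome_b):
    pairs = []
    for (genome, protein) in list(hits):
        if genome != genome_a:
            continue
        fwd = best_hit(hits, (genome_a, protein), genome_b)
        if fwd is None:
            continue
        rev = best_hit(hits, (genome_b, fwd[2]), genome_a)
        if rev is not None and rev[2] == protein:
            pairs.append((protein, fwd[2]))      # reciprocal best hits -> putative orthologs
    return pairs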
NASA Technical Reports Server (NTRS)
Weaver, W. L.; Green, R. N.
1980-01-01
Geometric shape factors were computed and applied to satellite simulated irradiance measurements to estimate Earth emitted flux densities for global and zonal scales and for areas smaller than the detector field of view (FOV). Wide field of view flat plate detectors were emphasized, but spherical detectors were also studied. The radiation field was modeled after data from the Nimbus 2 and 3 satellites. At a satellite altitude of 600 km, zonal estimates were in error 1.0 to 1.2 percent and global estimates were in error less than 0.2 percent. Estimates with unrestricted field of view (UFOV) detectors were about the same for Lambertian and limb darkening radiation models. The opposite was found for restricted field of view detectors. The UFOV detectors are found to be poor estimators of flux density from the total FOV and are shown to be much better as estimators of flux density from a circle centered at the FOV with an area significantly smaller than that for the total FOV.
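As an idealized illustration of the geometric shape factors involved (a textbook configuration offered under stated assumptions, not the detector model used in the study), the view factor from a small nadir-pointing flat plate to a uniform spherical Earth is (R/(R+h))^2, where R is the Earth radius and h the satellite altitude.

# Idealized geometric shape factor: nadir-pointing differential flat plate viewing a
# uniform spherical Earth; F = (R / (R + h))^2 for this textbook configuration.
# Illustrative only; not the shape factors computed in the study.
import math

R_EARTH_KM = 6371.0

def flat_plate_shape_factor(altitude_km: float) -> float:
    H = R_EARTH_KM + altitude_km                  # distance from Earth's centre
    return (R_EARTH_KM / H) ** 2

print(f"F at 600 km: {flat_plate_shape_factor(600.0):.3f}")   # roughly 0.835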
Efficient feature extraction from wide-area motion imagery by MapReduce in Hadoop
NASA Astrophysics Data System (ADS)
Cheng, Erkang; Ma, Liya; Blaisse, Adam; Blasch, Erik; Sheaff, Carolyn; Chen, Genshe; Wu, Jie; Ling, Haibin
2014-06-01
Wide-Area Motion Imagery (WAMI) feature extraction is important for applications such as target tracking, traffic management and accident discovery. With the increasing amount of WAMI collections and feature extraction from the data, a scalable framework is needed to handle the large amount of information. Cloud computing is one of the approaches recently applied in large scale or big data. In this paper, MapReduce in Hadoop is investigated for large scale feature extraction tasks for WAMI. Specifically, a large dataset of WAMI images is divided into several splits. Each split has a small subset of WAMI images. The feature extractions of WAMI images in each split are distributed to slave nodes in the Hadoop system. Feature extraction of each image is performed individually in the assigned slave node. Finally, the feature extraction results are sent to the Hadoop File System (HDFS) to aggregate the feature information over the collected imagery. Experiments of feature extraction with and without MapReduce are conducted to illustrate the effectiveness of our proposed Cloud-Enabled WAMI Exploitation (CAWE) approach.
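The division of labor described above can be sketched with a Hadoop Streaming style mapper (a hedged illustration; the actual CAWE implementation, feature extractor, and file layout are not specified here). Each input line names one image in a split; the mapper emits per-image results that the downstream aggregation step collects in HDFS.

# Hadoop Streaming style mapper sketch: one input line = path to one WAMI image.
# `extract_features` is a placeholder for whatever detector/descriptor is used.
import sys
import json

def extract_features(image_path):
    # Placeholder: in practice this would load the image and run the
    # feature-extraction algorithm (e.g. corner or blob detection).
    return [0.0, 0.0, 0.0]

for line in sys.stdin:
    image_path = line.strip()
    if not image_path:
        continue
    features = extract_features(image_path)
    # key = image path, value = JSON-encoded feature vector; the reducer (or HDFS
    # aggregation step) collects these per-image results over the whole collection.
    print(f"{image_path}\t{json.dumps(features)}")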
47 CFR 54.518 - Support for wide area networks.
Code of Federal Regulations, 2010 CFR
2010-10-01
Title 47 (Telecommunication), Section 54.518, Support for wide area networks: To the extent that states, schools, or libraries build or purchase a wide area network to provide telecommunications services, the cost of such wide area networks shall not be eligible for...
Junwei Ma; Han Yuan; Sunderam, Sridhar; Besio, Walter; Lei Ding
2017-07-01
Neural activity inside the human brain generates electrical signals that can be detected on the scalp. Electroencephalography (EEG) is one of the most widely utilized techniques helping physicians and researchers diagnose and understand various brain diseases. By their nature, EEG signals have very high temporal resolution but poor spatial resolution. To achieve higher spatial resolution, a novel tri-polar concentric ring electrode (TCRE) has been developed to directly measure the surface Laplacian (SL). The objective of the present study is to accurately calculate the SL for the TCRE based on a realistic-geometry head model. A locally dense mesh was proposed to represent the head surface, with the locally dense parts matching the small structural components of the TCRE; other areas were meshed less densely to reduce the computational load. We conducted computer simulations to evaluate the performance of the proposed mesh and to assess possible numerical errors in comparison with a low-density model. Finally, with this accuracy achieved, we present the computed forward lead field of the SL for the TCRE for the first time in a realistic-geometry head model and demonstrate that it has better spatial resolution than the SL computed from classic EEG recordings.
Sharma, Nandita; Gedeon, Tom
2012-12-01
Stress is a major and growing concern in our day and age, adversely impacting both individuals and society. Stress research has a wide range of benefits, from improving personal operations, learning, and work productivity to benefiting society, making it an interesting and socially beneficial area of research. This survey reviews sensors that have been used to measure stress and investigates techniques for modelling stress. It discusses non-invasive and unobtrusive sensors for measuring computed stress, a term we coin in the paper. The focus of the discussion is sensors that do not impede everyday activities and that could be used by those who would like to monitor stress levels on a regular basis (e.g. vehicle drivers, patients with illnesses linked to stress). Computational techniques have the capacity to determine optimal sensor fusion and automate data analysis for stress recognition and classification. Several computational techniques have been developed to model stress, based on methods such as Bayesian networks, artificial neural networks, and support vector machines, which this survey investigates. The survey concludes with a summary and provides possible directions for further computational stress research. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
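As a minimal, hedged illustration of the kind of computational modelling the survey covers, the sketch below trains a support vector machine on synthetic sensor features (heart rate and skin conductance stand-ins); it is not drawn from any specific study in the survey.

# Minimal sketch: SVM-based stress classification from synthetic sensor features.
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
# Columns: heart rate, skin conductance, respiration rate (synthetic stand-ins).
X = rng.normal(size=(200, 3))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=200) > 0).astype(int)  # 1 = "stressed"

model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
model.fit(X, y)
print("training accuracy on synthetic data:", model.score(X, y))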
NASA Astrophysics Data System (ADS)
Fedosov, Dmitry
2011-03-01
Computational biophysics is a large and rapidly growing area of computational physics. In this talk, we will focus on a number of biophysical problems related to blood cells and blood flow in health and disease. Blood flow plays a fundamental role in a wide range of physiological processes and pathologies in the organism. To understand and, if necessary, manipulate the course of these processes it is essential to investigate blood flow under realistic conditions including deformability of blood cells, their interactions, and behavior in the complex microvascular network. Using a multiscale cell model we are able to accurately capture red blood cell mechanics, rheology, and dynamics in agreement with a number of single cell experiments. Further, this validated model yields accurate predictions of the blood rheological properties, cell migration, cell-free layer, and hemodynamic resistance in microvessels. In addition, we investigate blood related changes in malaria, which include a considerable stiffening of red blood cells and their cytoadherence to endothelium. For these biophysical problems computational modeling is able to provide new physical insights and capabilities for quantitative predictions of blood flow in health and disease.
Valle, Benoît; Simonneau, Thierry; Boulord, Romain; Sourd, Francis; Frisson, Thibault; Ryckewaert, Maxime; Hamard, Philippe; Brichet, Nicolas; Dauzat, Myriam; Christophe, Angélique
2017-01-01
Plant science uses increasing amounts of phenotypic data to unravel the complex interactions between biological systems and their variable environments. Originally, phenotyping approaches were limited by manual, often destructive operations, causing large errors. Plant imaging emerged as a viable alternative allowing non-invasive and automated data acquisition. Several procedures based on image analysis were developed to monitor leaf growth as a major phenotyping target. However, in most proposals, a time-consuming parameterization of the analysis pipeline is required to handle variable conditions between images, particularly in the field due to unstable light and interferences with soil surface or weeds. To cope with these difficulties, we developed a low-cost, 2D imaging method, hereafter called PYM. The method is based on plant leaf ability to absorb blue light while reflecting infrared wavelengths. PYM consists of a Raspberry Pi computer equipped with an infrared camera and a blue filter and is associated with scripts that compute projected leaf area. This new method was tested on diverse species placed in contrasting conditions. Application to field conditions was evaluated on lettuces grown under photovoltaic panels. The objective was to look for possible acclimation of leaf expansion under photovoltaic panels to optimise the use of solar radiation per unit soil area. The new PYM device proved to be efficient and accurate for screening leaf area of various species in wide ranges of environments. In the most challenging conditions that we tested, error on plant leaf area was reduced to 5% using PYM compared to 100% when using a recently published method. A high-throughput phenotyping cart, holding 6 chained PYM devices, was designed to capture up to 2000 pictures of field-grown lettuce plants in less than 2 h. Automated analysis of image stacks of individual plants over their growth cycles revealed unexpected differences in leaf expansion rate between lettuce rows depending on their position below or between the photovoltaic panels. The imaging device described here has several benefits, such as affordability, reliability and flexibility for online analysis and storage. It should be easily adopted and customized to meet the needs of various users.
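The projected-leaf-area computation lends itself to a very small sketch (an illustration under stated assumptions, not the published PYM script): with a blue filter in front of an infrared-sensitive camera, leaf pixels appear bright in the infrared channel and dark in blue, so a per-pixel ratio threshold followed by pixel counting yields leaf area.

# Sketch of projected leaf area from a blue-filtered infrared image.
# Channel assignment, threshold and the 50 px/cm scale are assumptions for illustration.
import numpy as np
from PIL import Image

img = np.asarray(Image.open("plant.jpg"), dtype=float)    # hypothetical input image
infrared = img[:, :, 0]                                    # assume IR ends up in the red channel
blue = img[:, :, 2]

# Leaves reflect IR and absorb blue, so their IR/blue ratio is high.
ratio = infrared / (blue + 1.0)
leaf_mask = ratio > 2.0                                     # empirical threshold (assumed)

pixels_per_cm = 50.0                                        # assumed calibration
leaf_area_cm2 = leaf_mask.sum() / pixels_per_cm**2
print(f"projected leaf area: {leaf_area_cm2:.1f} cm^2")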
The Effects of Terrain Properties on Determining Crater Model Ages of Lunar Surfaces
NASA Astrophysics Data System (ADS)
Kirchoff, M. R.; Marchi, S.
2017-12-01
Analyzing crater size-frequency distributions (SFDs) and using them to determine model ages of surfaces is an important technique for understanding the Moon's geologic history and evolution. Small craters with diameters (D) < 1 km are frequently used, especially given the very high resolution imaging now available from Lunar Reconnaissance Orbiter Narrow and Wide Angle Cameras (LROC-NAC/WAC) and the Selene Terrain Camera. However, for these diameters, final crater sizes and shapes are affected by the properties of the terrains on which they are formed [1], which alters crater SFD shapes [2]. We use the Model Production Function (MPF; [2]), which includes terrain properties in computing crater production functions, to explore how incorporating terrain properties affects the estimation of crater model ages. First, crater SFDs are compiled utilizing LROC-WAC/NAC images to measure craters with diameters from 10 m up to 20 km (size of largest crater measured depends on the terrain). A nested technique is used to obtain this wide diameter range: D ≥ 0.5 km craters are measured in the largest area, D = 0.09-0.5 km craters are measured in a smaller area within the largest area, and D = 0.01-0.1 km craters are measured in the smallest area located in both of the larger areas. Then, we quantitatively fit the crater SFD with distinct MPFs that use broadly different terrain properties. Terrain properties are varied through coarsely altering the parameters in the crater scaling law [1] that represent material type (consolidated, unconsolidated, porous), material tensile strength, and material density (for further details see [2]). We also discuss the effect of changing terrain properties with depth (i.e., layering). Finally, fits are used to compute the D = 1 km crater model ages for the terrains. We discuss the new constraints on how terrain properties affect crater model ages from our analyses of a variety of lunar terrains from highlands to mare and impact melt to continuous ejecta deposits. References: [1] Holsapple, K. A & Housen, K. R., Icarus 187, 345-356, 2007. [2] Marchi, S., et al., AJ 137, 4936-4948, 2009.
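For readers unfamiliar with the crater SFDs used in such fits, the cumulative form can be computed in a few lines (a generic sketch with made-up diameters; the MPF fitting that folds in terrain properties is not reproduced here).

# Generic cumulative crater size-frequency distribution: N(>D) per unit area.
import numpy as np

def cumulative_sfd(diameters_km, area_km2, bins_km):
    """Number density of craters at least as large as each bin edge (per km^2)."""
    d = np.asarray(diameters_km)
    return np.array([(d >= b).sum() / area_km2 for b in bins_km])

# Hypothetical counts from a nested counting area.
diams = np.array([0.02, 0.05, 0.05, 0.12, 0.4, 0.7, 1.5, 3.0])
edges = np.array([0.01, 0.1, 1.0])
print(cumulative_sfd(diams, area_km2=100.0, bins_km=edges))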
Are You Connected to the Best Apps?
Gaudette, Robert F
2015-11-01
While the vast majority of pharmacists use computers to access medical information, many prefer a mobile device to find information quickly. This review discusses pharmacists' use of mobile device applications (apps) and highlights an assortment of apps that are particularly helpful. Epocrates, which provides drug information and clinical content, was the first popular smartphone app developed in this area and was used to introduce the concept. Today, apps that provide a wide range of drug information can be supplemented with apps that fine-tune specific information about drug monitoring, disease states, and cost.
Coordinated Fault Tolerance for High-Performance Computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dongarra, Jack; Bosilca, George; et al.
2013-04-08
Our work to meet our goal of end-to-end fault tolerance has focused on two areas: (1) improving fault tolerance in various software currently available and widely used throughout the HEC domain and (2) using fault information exchange and coordination to achieve holistic, systemwide fault tolerance and understanding how to design and implement interfaces for integrating fault tolerance features for multiple layers of the software stack—from the application, math libraries, and programming language runtime to other common system software such as jobs schedulers, resource managers, and monitoring tools.
Simulation with quantum mechanics/molecular mechanics for drug discovery.
Barbault, Florent; Maurel, François
2015-08-08
Biological macromolecules, such as proteins or nucleic acids, are (still) molecules and thus they follow the same chemical rules that any simple molecule follows, even if their size generally renders accurate studies unhelpful. However, in the context of drug discovery, a detailed analysis of ligand association is required for understanding or predicting their interactions and hybrid quantum mechanics/molecular mechanics (QM/MM) computations are relevant tools to help elucidate this process. Areas covered: In this review, the authors explore the use of QM/MM for drug discovery. After a brief description of the molecular mechanics (MM) technique, the authors describe the subtractive and additive techniques for QM/MM computations. The authors then present several application cases in topics involved in drug discovery. Expert opinion: QM/MM have been widely employed during the last decades to study chemical processes such as enzyme-inhibitor interactions. However, despite the enthusiasm around this area, plain MM simulations may be more meaningful than QM/MM. To obtain reliable results, the authors suggest fixing several keystone parameters according to the underlying chemistry of each studied system.
Automatic Mosaicking of Satellite Imagery Considering the Clouds
NASA Astrophysics Data System (ADS)
Kang, Yifei; Pan, Li; Chen, Qi; Zhang, Tong; Zhang, Shasha; Liu, Zhang
2016-06-01
With the rapid development of high resolution remote sensing for earth observation technology, satellite imagery is widely used in the fields of resource investigation, environment protection, and agricultural research. Image mosaicking is an important part of satellite imagery production. However, the existence of clouds leads to lots of disadvantages for automatic image mosaicking, mainly in two aspects: 1) Image blurring may be caused during the process of image dodging, 2) Cloudy areas may be passed through by automatically generated seamlines. To address these problems, an automatic mosaicking method is proposed for cloudy satellite imagery in this paper. Firstly, modified Otsu thresholding and morphological processing are employed to extract cloudy areas and obtain the percentage of cloud cover. Then, cloud detection results are used to optimize the process of dodging and mosaicking. Thus, the mosaic image can be combined with more clear-sky areas instead of cloudy areas. Besides, clear-sky areas will be clear and distortionless. The Chinese GF-1 wide-field-of-view orthoimages are employed as experimental data. The performance of the proposed approach is evaluated in four aspects: the effect of cloud detection, the sharpness of clear-sky areas, the rationality of seamlines and efficiency. The evaluation results demonstrated that the mosaic image obtained by our method has fewer clouds, better internal color consistency and better visual clarity compared with that obtained by traditional method. The time consumed by the proposed method for 17 scenes of GF-1 orthoimages is within 4 hours on a desktop computer. The efficiency can meet the general production requirements for massive satellite imagery.
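A minimal sketch of the cloud-masking step (Otsu thresholding plus morphological clean-up) is shown below; the band choice and structuring-element size are assumptions for illustration, not the parameters used for the GF-1 imagery.

# Sketch: cloud mask from a single band via Otsu thresholding + morphological opening.
import numpy as np
from skimage.filters import threshold_otsu
from skimage.morphology import binary_opening, disk

def cloud_mask(band: np.ndarray) -> np.ndarray:
    """Boolean mask of cloudy (bright) pixels in one image band."""
    t = threshold_otsu(band)
    mask = band > t                                 # clouds are brighter than the threshold
    return binary_opening(mask, disk(3))            # remove small bright speckles

def cloud_cover_percent(band: np.ndarray) -> float:
    return 100.0 * cloud_mask(band).mean()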
NASA Astrophysics Data System (ADS)
Nicotra, M. A.; Anun, S.; Montes, M.; Goldes, G.; Carranza, G.
We describe the outline of a project for interconnecting the Astrophysical Station of Bosque Alegre with the wide area network of the University of Córdoba. The Astrophysical Station is located 38.55 km (23.96 miles) from the Observatory of Córdoba, a distance suitable for radio links at centimeter wavelengths. In recent years, spread-spectrum equipment has become popular. Spread-spectrum signals, in contrast to narrow-band radio signals, operate within a bandwidth 20 to 200 times broader than that of the modulated information. The signals are modulated by special spreading codes, known generically as pseudo-random or pseudo-noise codes, in such a way that they resemble noise; in addition, the wide band corresponds to a low power density in the emitted signals. Spread-spectrum links are stable, exhibit low interference with conventional radio transmitters, and their commercial prices are remarkably lower than those of conventional microwave devices. The data links are compliant with Ethernet protocol networks and operate at data transmission rates of up to 4 Mbits per second. The described equipment will give visiting astronomers at Bosque Alegre access to full Internet services, and will also make possible fast transfer of observational data from the telescope to computers on the local area network at Córdoba. This project should be considered the second stage of a wider project whose main purpose is to transform the Bosque Alegre Station into a fully robotic station controlled from the computational center at the Observatory in Córdoba. The advantages of robotic telescopes have recently been the subject of several discussions. It is now widely accepted that an automatic station enables some important options in the use of astronomical instruments, such as the possibility of running parallel programs, one of which is selected according to the environmental conditions at the time of observation.
Distributed solar radiation fast dynamic measurement for PV cells
NASA Astrophysics Data System (ADS)
Wan, Xuefen; Yang, Yi; Cui, Jian; Du, Xingjing; Zheng, Tao; Sardar, Muhammad Sohail
2017-10-01
To study the operating characteristics of PV cells, attention must be given to the dynamic behavior of solar radiation. The dynamic behaviors of annual, monthly, daily, and hourly averages of solar radiation have been studied in detail, but faster dynamic behaviors of solar radiation require further research. Random fluctuations of solar radiation on the minute or second scale, which produce alternating irradiation and frequently cool down or warm up a PV cell, decrease conversion efficiency. Fast dynamic processes of solar radiation are mainly caused by the stochastic movement of clouds; even under clear-sky conditions, solar irradiation shows a certain degree of fast variation. To evaluate the operating characteristics of PV cells under fast dynamic irradiation, a solar radiation measuring array (SRMA) based on large-active-area photodiodes, LoRa spread-spectrum communication, and nanoWatt MCUs is proposed. Its cross-shaped photodiode arrangement tracks the fast stochastic movement of clouds. To compensate for the response time of the pyranometer and reduce system cost, terminal nodes with low-cost, fast-response, large-active-area photodiodes are placed beside the tested PV cells. A central node, consisting of a pyranometer, a large-active-area photodiode, a wind detector, and a host computer, is placed at the center of the network topology to record the temporal envelope of solar irradiation and to obtain calibration information between the pyranometer and the large-active-area photodiodes. In our SRMA system, the terminal nodes are designed around Microchip's nanoWatt XLP PIC16F1947, and the FDS-100 is adopted as the large-active-area photodiode in the terminal nodes and at the host computer. The output current and voltage of each PV cell are monitored by I/V measurement. AS62-T27/SX1278 LoRa communication modules are used for communication between the terminal nodes and the host computer. Because the LoRa LPWAN (Low Power Wide Area Network) specification provides seamless interoperability among smart things without the need for complex local installation, configuring the SRMA system is very easy; LoRa also allows SRMA to overcome the short communication distances and weather-related signal propagation losses of technologies such as ZigBee and WiFi. The host computer in the SRMA system is the low-power single-board PC EMB-3870 produced by NORCO. A wind direction sensor (SM5386B) and a wind-force sensor (SM5387B) are connected to the host computer through an RS-485 bus for wind reference data collection, and a Davis 6450 solar radiation sensor, a precision instrument that detects radiation at wavelengths of 300 to 1100 nanometers, allows the host computer to follow real-time solar radiation. A LoRa polling scheme is adopted for communication between the host computer and the terminal nodes. An experimental SRMA was established and tested in Ganyu, Jiangsu province, from May to August 2016, with distances between the nodes and the host computer of 100 m to 1900 m. In operation, the SRMA system showed high reliability: the terminal nodes followed the instructions from the host computer and collected solar radiation data from the distributed PV cells effectively, the host computer managed the SRMA and collected the reference parameters well, and communication between the host computer and the terminal nodes was almost unaffected by the weather. In conclusion, the test results show that SRMA can be a capable method for fast dynamic measurement of solar radiation and of related PV cell operating characteristics.
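The polling scheme between the host computer and the terminal nodes could look roughly like the sketch below (a hypothetical serial-port illustration; the node addresses, framing, and "SOLAR?" request string are invented for this example, not the SRMA protocol).

# Hypothetical LoRa polling loop on the host-computer side, via a serial-attached
# LoRa module (e.g. an AS62-T27/SX1278 on a USB-serial adapter). The message
# framing ("<addr>:SOLAR?") is invented for illustration.
import time
import serial   # pyserial

NODE_ADDRESSES = [1, 2, 3, 4]          # hypothetical terminal-node IDs

with serial.Serial("/dev/ttyUSB0", 9600, timeout=2) as port:
    while True:
        for addr in NODE_ADDRESSES:
            port.write(f"{addr}:SOLAR?\n".encode())       # poll one node at a time
            reply = port.readline().decode(errors="ignore").strip()
            if reply:
                print(f"node {addr}: {reply}")             # e.g. irradiance reading
        time.sleep(1.0)                                    # polling period (assumed)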
Efficient Use of Distributed Systems for Scientific Applications
NASA Technical Reports Server (NTRS)
Taylor, Valerie; Chen, Jian; Canfield, Thomas; Richard, Jacques
2000-01-01
Distributed computing has been regarded as the future of high performance computing. Nationwide high speed networks such as vBNS are becoming widely available to interconnect high-speed computers, virtual environments, scientific instruments and large data sets. One of the major issues to be addressed with distributed systems is the development of computational tools that facilitate the efficient execution of parallel applications on such systems. These tools must exploit the heterogeneous resources (networks and compute nodes) in distributed systems. This paper presents a tool, called PART, which addresses this issue for mesh partitioning. PART takes advantage of the following heterogeneous system features: (1) processor speed; (2) number of processors; (3) local network performance; and (4) wide area network performance. Further, different finite element applications under consideration may have different computational complexities, different communication patterns, and different element types, which also must be taken into consideration when partitioning. PART uses parallel simulated annealing to partition the domain, taking into consideration network and processor heterogeneity. The results of using PART for an explicit finite element application executing on two IBM SPs (located at Argonne National Laboratory and the San Diego Supercomputer Center) indicate an increase in efficiency by up to 36% as compared to METIS, a widely used mesh partitioning tool. The input to METIS was modified to take into consideration heterogeneous processor performance; METIS does not take into consideration heterogeneous networks. The execution times for these applications were reduced by up to 30% as compared to METIS. These results are given in Figure 1 for four irregular meshes with number of elements ranging from 30,269 elements for the Barth5 mesh to 11,451 elements for the Barth4 mesh. Future work with PART entails using the tool with an integrated application requiring distributed systems. In particular this application, illustrated in the document entails an integration of finite element and fluid dynamic simulations to address the cooling of turbine blades of a gas turbine engine design. It is not uncommon to encounter high-temperature, film-cooled turbine airfoils with 1,000,000s of degrees of freedom. This results because of the complexity of the various components of the airfoils, requiring fine-grain meshing for accuracy. Additional information is contained in the original.
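The heterogeneity-aware objective that a partitioner like PART optimizes can be sketched as a weighted cost function (a schematic illustration under assumed weights, not PART's actual cost model): computational load is scaled by each processor's relative speed and cut edges are scaled by link cost.

# Schematic heterogeneity-aware partition cost: per-partition work weighted by processor
# speed, plus cut edges weighted by link cost. Not PART's actual model.
import numpy as np

def partition_cost(elem_work, part_of, proc_speed, edges, link_cost):
    """elem_work[i]: work of element i; part_of[i]: its partition;
    proc_speed[p]: relative speed of processor p; edges: list of (i, j) adjacencies;
    link_cost[p][q]: relative cost of communication between partitions p and q."""
    nparts = len(proc_speed)
    load = np.zeros(nparts)
    for i, w in enumerate(elem_work):
        load[part_of[i]] += w / proc_speed[part_of[i]]     # slower processors accrue more time
    comm = sum(link_cost[part_of[i]][part_of[j]]
               for i, j in edges if part_of[i] != part_of[j])
    return load.max() + 0.1 * comm                         # makespan + weighted cut (0.1 assumed)

# Simulated annealing would repeatedly move one element to another partition and
# accept the move with probability exp(-delta_cost / T) as the temperature T is lowered.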
Lee, Hsiang-Chieh; Ahsen, Osman Oguz; Liang, Kaicheng; Wang, Zhao; Cleveland, Cody; Booth, Lucas; Potsaid, Benjamin; Jayaraman, Vijaysekhar; Cable, Alex E.; Mashimo, Hiroshi; Langer, Robert; Traverso, Giovanni; Fujimoto, James G.
2016-01-01
We demonstrate a micromotor balloon imaging catheter for ultrahigh speed endoscopic optical coherence tomography (OCT) which provides wide area, circumferential structural and angiographic imaging of the esophagus without contrast agents. Using a 1310 nm MEMS tunable wavelength swept VCSEL light source, the system has a 1.2 MHz A-scan rate and ~8.5 µm axial resolution in tissue. The micromotor balloon catheter enables circumferential imaging of the esophagus at 240 frames per second (fps) with a ~30 µm (FWHM) spot size. Volumetric imaging is achieved by proximal pullback of the micromotor assembly within the balloon at 1.5 mm/sec. Volumetric data consisting of 4200 circumferential images of 5,000 A-scans each over a 2.6 cm length, covering a ~13 cm2 area is acquired in <18 seconds. A non-rigid image registration algorithm is used to suppress motion artifacts from non-uniform rotational distortion (NURD), cardiac motion or respiration. En face OCT images at various depths can be generated. OCT angiography (OCTA) is computed using intensity decorrelation between sequential pairs of circumferential scans and enables three-dimensional visualization of vasculature. Wide area volumetric OCT and OCTA imaging of the swine esophagus in vivo is demonstrated. PMID:27570688
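The angiography computation rests on intensity decorrelation between sequential circumferential frames; the per-pixel sketch below uses a common decorrelation form from intensity-based OCT angiography and omits the registration and averaging steps described above, so it is an illustration rather than the exact computation in this work.

# Minimal OCT angiography sketch: per-pixel intensity decorrelation between two
# sequential circumferential frames. Registration/averaging steps are omitted.
import numpy as np

def decorrelation(frame_a: np.ndarray, frame_b: np.ndarray, eps: float = 1e-9) -> np.ndarray:
    """D = 1 - (A*B) / (0.5*(A^2 + B^2)); D is near 0 for static tissue and larger
    where intensity fluctuates between frames (flow)."""
    a = frame_a.astype(float)
    b = frame_b.astype(float)
    return 1.0 - (a * b) / (0.5 * (a**2 + b**2) + eps)

# Typically the decorrelation of several sequential frame pairs is averaged to
# suppress noise before rendering the en face angiogram.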
Efficient large-scale graph data optimization for intelligent video surveillance
NASA Astrophysics Data System (ADS)
Shang, Quanhong; Zhang, Shujun; Wang, Yanbo; Sun, Chen; Wang, Zepeng; Zhang, Luming
2017-08-01
Society is rapidly adopting cameras in a wide variety of locations and applications: traffic monitoring, parking lot surveillance, in-car systems, and smart spaces. These cameras deliver data every day that must be analyzed in an effective way. Recent advances in sensor manufacturing, communications, and computing are stimulating the development of new applications that can transform traditional vision systems into pervasive smart camera networks. The analysis of visual cues in multi-camera networks enables a wide range of applications, from smart home and office automation to large-area surveillance and traffic monitoring. Dense camera networks, in which most cameras have large overlapping fields of view, have been studied extensively; here we focus on sparse camera networks. A sparse camera network performs large-area surveillance with as few cameras as possible, so most cameras do not overlap each other's fields of view. This task is challenging because of the lack of knowledge of the network topology, the changes in target appearance and motion across different views, and the difficulty of understanding complex events in such a network. In this review paper, we present a comprehensive survey of recent research results addressing the problems of topology learning, object appearance modeling, and global activity understanding in sparse camera networks. In addition, some current open research issues are discussed.
NASA Technical Reports Server (NTRS)
Lyon, Richard G. (Inventor); Leisawitz, David T. (Inventor); Rinehart, Stephen A. (Inventor); Memarsadeghi, Nargess (Inventor)
2012-01-01
Disclosed herein are systems, computer-implemented methods, and tangible computer-readable storage media for wide field imaging interferometry. The method includes for each point in a two dimensional detector array over a field of view of an image: gathering a first interferogram from a first detector and a second interferogram from a second detector, modulating a path-length for a signal from an image associated with the first interferogram in the first detector, overlaying first data from the modulated first detector and second data from the second detector, and tracking the modulating at every point in a two dimensional detector array comprising the first detector and the second detector over a field of view for the image. The method then generates a wide-field data cube based on the overlaid first data and second data for each point. The method can generate an image from the wide-field data cube.
Performance Evaluation of Communication Software Systems for Distributed Computing
NASA Technical Reports Server (NTRS)
Fatoohi, Rod
1996-01-01
In recent years there has been an increasing interest in object-oriented distributed computing since it is better equipped to deal with complex systems while providing extensibility, maintainability, and reusability. At the same time, several new high-speed network technologies have emerged for local and wide area networks. However, the performance of networking software is not improving as fast as the networking hardware and the workstation microprocessors. This paper gives an overview and evaluates the performance of the Common Object Request Broker Architecture (CORBA) standard in a distributed computing environment at NASA Ames Research Center. The environment consists of two testbeds of SGI workstations connected by four networks: Ethernet, FDDI, HiPPI, and ATM. The performance results for three communication software systems are presented, analyzed and compared. These systems are: BSD socket programming interface, IONA's Orbix, an implementation of the CORBA specification, and the PVM message passing library. The results show that high-level communication interfaces, such as CORBA and PVM, can achieve reasonable performance under certain conditions.
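The kind of measurement behind such comparisons can be illustrated with a small socket round-trip benchmark (a generic sketch, not the instrumentation used in the paper; the echo server address, payload size, and round count are arbitrary choices).

# Generic round-trip latency sketch over TCP sockets; assumes an echo server is
# already listening at HOST:PORT (loopback used here purely for illustration).
import socket
import time

HOST, PORT = "127.0.0.1", 5000        # hypothetical echo server
MESSAGE = b"x" * 1024                 # 1 KB payload
ROUNDS = 1000

with socket.create_connection((HOST, PORT)) as s:
    start = time.perf_counter()
    for _ in range(ROUNDS):
        s.sendall(MESSAGE)
        received = 0
        while received < len(MESSAGE):            # read back the echoed payload
            received += len(s.recv(4096))
    elapsed = time.perf_counter() - start

print(f"mean round-trip time: {1e6 * elapsed / ROUNDS:.1f} microseconds")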
Sawyer, Travis W; Petersburg, Ryan; Bohndiek, Sarah E
2017-04-20
Optical fiber technology is found in a wide variety of applications to flexibly relay light between two points, enabling information transfer across long distances and allowing access to hard-to-reach areas. Large-core optical fibers and light guides find frequent use in illumination and spectroscopic applications, for example, endoscopy and high-resolution astronomical spectroscopy. Proper alignment is critical for maximizing throughput in optical fiber coupling systems; however, there currently are no formal approaches to tolerancing the alignment of a light-guide coupling system. Here, we propose a Fourier alignment sensitivity (FAS) algorithm to determine the optimal tolerances on the alignment of a light guide by computing the alignment sensitivity. The algorithm shows excellent agreement with both simulated and experimentally measured values and improves on the computation time of equivalent ray-tracing simulations by two orders of magnitude. We then apply FAS to tolerance and fabricate a coupling system, which is shown to meet specifications, thus validating FAS as a tolerancing technique. These results indicate that FAS is a flexible and rapid means to quantify the alignment sensitivity of a light guide, widely informing the design and tolerancing of coupling systems.
NASA Astrophysics Data System (ADS)
Clifford, Corey; Kimber, Mark
2017-11-01
Over the last 30 years, an industry-wide shift within the nuclear community has led to increased utilization of computational fluid dynamics (CFD) to supplement nuclear reactor safety analyses. One such area that is of particular interest to the nuclear community, specifically to those performing loss-of-flow accident (LOFA) analyses for next-generation very-high temperature reactors (VHTR), is the capacity of current computational models to predict heat transfer across a wide range of buoyancy conditions. In the present investigation, a critical evaluation of Reynolds-averaged Navier-Stokes (RANS) and large-eddy simulation (LES) turbulence modeling techniques is conducted based on CFD validation data collected from the Rotatable Buoyancy Tunnel (RoBuT) at Utah State University. Four different experimental flow conditions are investigated: (1) buoyancy-aided forced convection; (2) buoyancy-opposed forced convection; (3) buoyancy-aided mixed convection; (4) buoyancy-opposed mixed convection. Overall, good agreement is found for both forced convection-dominated scenarios, but an overly-diffusive prediction of the normal Reynolds stress is observed for the RANS-based turbulence models. Low-Reynolds number RANS models perform adequately for mixed convection, while higher-order RANS approaches underestimate the influence of buoyancy on the production of turbulence.
Simple sequence repeats in Escherichia coli: abundance, distribution, composition, and polymorphism.
Gur-Arie, R; Cohen, C J; Eitan, Y; Shelef, L; Hallerman, E M; Kashi, Y
2000-01-01
Computer-based genome-wide screening of the DNA sequence of Escherichia coli strain K12 revealed tens of thousands of tandem simple sequence repeat (SSR) tracts, with motifs ranging from 1 to 6 nucleotides. SSRs were well distributed throughout the genome. Mononucleotide SSRs were over-represented in noncoding regions and under-represented in open reading frames (ORFs). Nucleotide composition of mono- and dinucleotide SSRs, both in ORFs and in noncoding regions, differed from that of the genomic region in which they occurred, with 93% of all mononucleotide SSRs proving to be of A or T. Computer-based analysis of the fine position of every SSR locus in the noncoding portion of the genome relative to downstream ORFs showed SSRs located in areas that could affect gene regulation. DNA sequences at 14 arbitrarily chosen SSR tracts were compared among E. coli strains. Polymorphisms of SSR copy number were observed at four of seven mononucleotide SSR tracts screened, with all polymorphisms occurring in noncoding regions. SSR polymorphism could prove important as a genome-wide source of variation, both for practical applications (including rapid detection, strain identification, and detection of loci affecting key phenotypes) and for evolutionary adaptation of microbes.
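The genome-wide mononucleotide-SSR screen described above can be approximated in a few lines (a simplified sketch; the published screen also covered 2 to 6 nucleotide motifs and mapped tracts relative to ORFs, which is omitted here).

# Simplified scan for mononucleotide simple sequence repeat (SSR) tracts.
import re

def mononucleotide_ssrs(sequence: str, min_len: int = 6):
    """Yield (start, base, length) for runs of a single base of at least min_len."""
    for m in re.finditer(r"(A+|C+|G+|T+)", sequence.upper()):
        if len(m.group()) >= min_len:
            yield m.start(), m.group()[0], len(m.group())

demo = "ATGAAAAAAAACGTTTTTTTCGATCG"   # toy sequence, not E. coli K-12 data
print(list(mononucleotide_ssrs(demo)))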
Clague, D.A.; Frey, F.A.; Thompson, G.; Rindge, S.
1981-01-01
A wide range of rock types (abyssal tholeiite, Fe-Ti-rich basalt, andesite, and rhyodacite) were dredged from near 95°W and 85°W on the Galapagos spreading center. Computer modeling of major element compositions has shown that these rocks could be derived from common parental magmas by successive degrees of fractional crystallization. However, the P2O5/K2O ratio implies distinct mantle source compositions for the two areas. These source regions also have different rare earth element (REE) abundance patterns. The sequence of fractionated lavas differs for the two areas and indicates earlier fractionation of apatite and titanomagnetite in the lavas from 95°W. The mantle source regions for these two areas are interpreted to be depleted in incompatible (and volatile?) elements, although the source region beneath 95°W is less severely depleted in La and K. -Authors
Psychophysics and Neuronal Bases of Sound Localization in Humans
Ahveninen, Jyrki; Kopco, Norbert; Jääskeläinen, Iiro P.
2013-01-01
Localization of sound sources is a considerable computational challenge for the human brain. Whereas the visual system can process basic spatial information in parallel, the auditory system lacks a straightforward correspondence between external spatial locations and sensory receptive fields. Consequently, the question how different acoustic features supporting spatial hearing are represented in the central nervous system is still open. Functional neuroimaging studies in humans have provided evidence for a posterior auditory “where” pathway that encompasses non-primary auditory cortex areas, including the planum temporale (PT) and posterior superior temporal gyrus (STG), which are strongly activated by horizontal sound direction changes, distance changes, and movement. However, these areas are also activated by a wide variety of other stimulus features, posing a challenge for the interpretation that the underlying areas are purely spatial. This review discusses behavioral and neuroimaging studies on sound localization, and some of the competing models of representation of auditory space in humans. PMID:23886698
NASA Technical Reports Server (NTRS)
Richards, Stephen F.
1991-01-01
Although computerized operations have realized significant gains in many areas, one area, scheduling, has enjoyed few benefits from automation. The traditional methods of industrial engineering and operations research have not proven robust enough to handle the complexities associated with the scheduling of realistic problems. To address this need, NASA has developed the computer-aided scheduling system (COMPASS), a sophisticated, interactive scheduling tool that is in widespread use within NASA and the contractor community. However, COMPASS provides no explicit support for the large class of problems in which several people, perhaps at various locations, build separate schedules that share a common pool of resources. This research examines the issue of distributed scheduling, as applied to application domains characterized by the partial ordering of tasks, limited resources, and time restrictions. The focus of this research is on identifying issues related to distributed scheduling, locating applicable problem domains within NASA, and suggesting areas for ongoing research. The issues identified are goals, rescheduling requirements, database support, the need for communication and coordination among individual schedulers, the potential for expert system support for scheduling, and the possibility of integrating artificially intelligent schedulers into a network of human schedulers.
Characteristic analysis-1981: Final program and a possible discovery
McCammon, R.B.; Botbol, J.M.; Sinding-Larsen, R.; Bowen, R.W.
1983-01-01
The latest or newest version of the characteristic analysis (NCHARAN) computer program offers the exploration geologist a wide variety of options for integrating regionalized multivariate data. The options include the selection of regional cells for characterizing deposit models, the selection of variables that constitute the models, and the choice of logical combinations of variables that best represent these models. Moreover, the program provides for the display of results which, in turn, makes possible review, reselection, and refinement of a model. Most important, the performance of the above-mentioned steps in an interactive computing mode can result in a timely and meaningful interpretation of the data available to the exploration geologist. The most recent application of characteristic analysis has resulted in the possible discovery of economic sulfide mineralization in the Grong area in central Norway. Exploration data for 27 geophysical, geological, and geochemical variables were used to construct a mineralized and a lithogeochemical model for an area that contained a known massive sulfide deposit. The models were applied to exploration data collected from the Gjersvik area in the Grong mining district and resulted in the identification of two localities of possible mineralization. Detailed field examination revealed the presence of a sulfide vein system and a partially inverted stratigraphic sequence indicating the possible presence of a massive sulfide deposit at depth. © 1983 Plenum Publishing Corporation.
A scattering model for forested area
NASA Technical Reports Server (NTRS)
Karam, M. A.; Fung, A. K.
1988-01-01
A forested area is modeled as a volume of randomly oriented and distributed disc-shaped, or needle-shaped leaves shading a distribution of branches modeled as randomly oriented finite-length, dielectric cylinders above an irregular soil surface. Since the radii of branches have a wide range of sizes, the model only requires the length of a branch to be large compared with its radius which may be any size relative to the incident wavelength. In addition, the model also assumes the thickness of a disc-shaped leaf or the radius of a needle-shaped leaf is much smaller than the electromagnetic wavelength. The scattering phase matrices for disc, needle, and cylinder are developed in terms of the scattering amplitudes of the corresponding fields which are computed by the forward scattering theorem. These quantities along with the Kirchoff scattering model for a randomly rough surface are used in the standard radiative transfer formulation to compute the backscattering coefficient. Numerical illustrations for the backscattering coefficient are given as a function of the shading factor, incidence angle, leaf orientation distribution, branch orientation distribution, and the number density of leaves. Also illustrated are the properties of the extinction coefficient as a function of leaf and branch orientation distributions. Comparisons are made with measured backscattering coefficients from forested areas reported in the literature.
ERIC Educational Resources Information Center
Chou, Huey-Wen; Wang, Yu-Fang
1999-01-01
Compares the effects of two training methods on computer attitude and performance in a World Wide Web page design program in a field experiment with high school students in Taiwan. Discusses individual differences, Kolb's Experiential Learning Theory and Learning Style Inventory, Computer Attitude Scale, and results of statistical analyses.…
Planetary-Scale Geospatial Data Analysis Techniques in Google's Earth Engine Platform (Invited)
NASA Astrophysics Data System (ADS)
Hancher, M.
2013-12-01
Geoscientists have ever-increasing access to new tools for large-scale computing. With any tool, some tasks are easy and other tasks hard. It is natural to look to new computing platforms to increase the scale and efficiency of existing techniques, but there is a more exciting opportunity to discover and develop a new vocabulary of fundamental analysis idioms that are made easy and effective by these new tools. Google's Earth Engine platform is a cloud computing environment for earth data analysis that combines a public data catalog with a large-scale computational facility optimized for parallel processing of geospatial data. The data catalog includes a nearly complete archive of scenes from Landsat 4, 5, 7, and 8 that have been processed by the USGS, as well as a wide variety of other remotely-sensed and ancillary data products. Earth Engine supports a just-in-time computation model that enables real-time preview during algorithm development and debugging as well as during experimental data analysis and open-ended data exploration. Data processing operations are performed in parallel across many computers in Google's datacenters. The platform automatically handles many traditionally-onerous data management tasks, such as data format conversion, reprojection, resampling, and associating image metadata with pixel data. Early applications of Earth Engine have included the development of Google's global cloud-free fifteen-meter base map and global multi-decadal time-lapse animations, as well as numerous large and small experimental analyses by scientists from a range of academic, government, and non-governmental institutions, working in a wide variety of application areas including forestry, agriculture, urban mapping, and species habitat modeling. Patterns in the successes and failures of these early efforts have begun to emerge, sketching the outlines of a new set of simple and effective approaches to geospatial data analysis.
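To make the just-in-time, deferred-computation model concrete, the following is a minimal sketch of a typical analysis in the Earth Engine Python client. The catalog ID, band names, and study location are illustrative assumptions, not taken from the abstract; a real script should use current catalog identifiers and an authenticated account.

```python
# Minimal sketch of a deferred Earth Engine analysis (assumed catalog ID,
# band names, and coordinates); computation runs in Google's datacenters
# only when a result such as getInfo() is requested.
import ee

ee.Initialize()  # assumes prior authentication with an Earth Engine account

point = ee.Geometry.Point([-122.29, 37.90])               # hypothetical study site
landsat = (ee.ImageCollection('LANDSAT/LC08/C02/T1_L2')   # assumed collection ID
           .filterBounds(point)
           .filterDate('2014-01-01', '2015-01-01'))

# Cloud-reduced composite and a derived NDVI layer; both are lazy expressions.
composite = landsat.median()
ndvi = composite.normalizedDifference(['SR_B5', 'SR_B4'])  # assumed band names

# Pull a single summary statistic back to the client.
mean_ndvi = ndvi.reduceRegion(reducer=ee.Reducer.mean(),
                              geometry=point.buffer(5000),
                              scale=30).getInfo()
print(mean_ndvi)
```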
Karpievitch, Yuliya V; Almeida, Jonas S
2006-01-01
Background Matlab, a powerful and productive language that allows for rapid prototyping, modeling and simulation, is widely used in computational biology. Modeling and simulation of large biological systems often require more computational resources than are available on a single computer. Existing distributed computing environments like the Distributed Computing Toolbox, MatlabMPI, Matlab*G and others allow for the remote (and possibly parallel) execution of Matlab commands with varying support for features like an easy-to-use application programming interface, load-balanced utilization of resources, extensibility over the wide area network, and minimal system administration skill requirements. However, all of these environments require some level of access to participating machines to manually distribute the user-defined libraries that the remote call may invoke. Results mGrid augments the usual process distribution seen in other similar distributed systems by adding facilities for user code distribution. mGrid's client-side interface is an easy-to-use native Matlab toolbox that transparently executes user-defined code on remote machines (i.e. the user is unaware that the code is executing somewhere else). Run-time variables are automatically packed and distributed with the user-defined code and automated load-balancing of remote resources enables smooth concurrent execution. mGrid is an open source environment. Apart from the programming language itself, all other components are also open source, freely available tools: light-weight PHP scripts and the Apache web server. Conclusion Transparent, load-balanced distribution of user-defined Matlab toolboxes and rapid prototyping of many simple parallel applications can now be done with a single easy-to-use Matlab command. Because mGrid utilizes only Matlab, light-weight PHP scripts and the Apache web server, installation and configuration are very simple. Moreover, the web-based infrastructure of mGrid allows for it to be easily extensible over the Internet. PMID:16539707
Karpievitch, Yuliya V; Almeida, Jonas S
2006-03-15
Matlab, a powerful and productive language that allows for rapid prototyping, modeling and simulation, is widely used in computational biology. Modeling and simulation of large biological systems often require more computational resources than are available on a single computer. Existing distributed computing environments like the Distributed Computing Toolbox, MatlabMPI, Matlab*G and others allow for the remote (and possibly parallel) execution of Matlab commands with varying support for features like an easy-to-use application programming interface, load-balanced utilization of resources, extensibility over the wide area network, and minimal system administration skill requirements. However, all of these environments require some level of access to participating machines to manually distribute the user-defined libraries that the remote call may invoke. mGrid augments the usual process distribution seen in other similar distributed systems by adding facilities for user code distribution. mGrid's client-side interface is an easy-to-use native Matlab toolbox that transparently executes user-defined code on remote machines (i.e. the user is unaware that the code is executing somewhere else). Run-time variables are automatically packed and distributed with the user-defined code and automated load-balancing of remote resources enables smooth concurrent execution. mGrid is an open source environment. Apart from the programming language itself, all other components are also open source, freely available tools: light-weight PHP scripts and the Apache web server. Transparent, load-balanced distribution of user-defined Matlab toolboxes and rapid prototyping of many simple parallel applications can now be done with a single easy-to-use Matlab command. Because mGrid utilizes only Matlab, light-weight PHP scripts and the Apache web server, installation and configuration are very simple. Moreover, the web-based infrastructure of mGrid allows for it to be easily extensible over the Internet.
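The central idea, shipping the user-defined code together with its packed run-time variables to a remote worker, is illustrated below in a deliberately simplified Python sketch. mGrid itself is a Matlab toolbox backed by PHP scripts and Apache; the endpoint URL, payload layout, and server behaviour here are assumptions used only to convey the pattern.

```python
# Illustrative sketch only (not mGrid's actual protocol): user code and its
# run-time arguments are packed and posted to a hypothetical HTTP worker.
import pickle
import requests

def remote_apply(url, func_source, func_name, *args):
    """Send user-defined code plus packed arguments to a hypothetical worker."""
    payload = pickle.dumps({
        'source': func_source,   # the user-defined code travels with the call
        'name': func_name,
        'args': args,            # run-time variables packed automatically
    })
    reply = requests.post(url, data=payload, timeout=600)
    reply.raise_for_status()
    return pickle.loads(reply.content)

# Example use against a hypothetical worker endpoint:
code = "def square_all(xs):\n    return [x * x for x in xs]\n"
# result = remote_apply('http://worker.example.org/run', code, 'square_all', [1, 2, 3])
```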
Morphologic Evolution of the Mount St. Helens Crater Area, Washington
NASA Technical Reports Server (NTRS)
Beach, G. L.
1985-01-01
The large rockslide-avalanche that preceded the eruption of Mount St. Helens on 18 May 1980 removed approximately 2.8 cubic km of material from the summit and north flank of the volcano, forming a horseshoe-shaped crater 2.0 km wide and 3.9 km long. A variety of erosional and depositional processes, notably mass wasting and gully development, acted to modify the topographic configuration of the crater area. To document this morphologic evolution, a series of annual large-scale topographic maps is being produced as a base for comparative geomorphic analysis. Four topographic maps of the Mount St. Helens crater area at a scale of 1:4000 were produced by the National Mapping Division of the U. S. Geological Survey. Stereo aerial photography for the maps was obtained on 23 October 1980, 10 September 1981, 1 September 1982, and 17 August 1983. To quantify topographic changes in the study area, each topographic map is being digitized and corresponding X, Y, and Z values from successive maps are being computer-compared.
NASA Technical Reports Server (NTRS)
1975-01-01
A project was undertaken in Meade County, South Dakota to provide (1) a general county-wide resource survey of land use and soils and (2) a detailed survey of land use for the environmentally sensitive area adjacent to the Black Hills. Imagery from LANDSAT-1 was visually interpreted to provide land use information and a general soils map. A detailed land use map for the Black Hills area was interpreted from RB-57 photographs and interpretations of soil characteristics were input into a computer data base and mapped. The detailed land use data were then used in conjunction with soil maps to provide information for the development of zoning ordinance maps and other land use planning in the Black Hills area. The use of photographs as base maps was also demonstrated. In addition, the use of airborne thermography to locate spoilage areas in sugar beet piles and to determine the apparent temperature of rooftops was evaluated.
Effects of the 2008 flood on economic performance and food security in Yemen: a simulation analysis.
Breisinger, Clemens; Ecker, Olivier; Thiele, Rainer; Wiebelt, Manfred
2016-04-01
Extreme weather events such as floods and droughts can have devastating consequences for individual well being and economic development, in particular in poor societies with limited availability of coping mechanisms. Combining a dynamic computable general equilibrium model of the Yemeni economy with a household-level calorie consumption simulation model, this paper assesses the economy-wide, agricultural and food security effects of the 2008 tropical storm and flash flood that hit the Hadramout and Al-Mahrah governorates. The estimation results suggest that agricultural value added, farm household incomes and rural food security deteriorated long term in the flood-affected areas. Due to economic spillover effects, significant income losses and increases in food insecurity also occurred in areas that were unaffected by flooding. This finding suggests that while most relief efforts are typically concentrated in directly affected areas, future efforts should also consider surrounding areas and indirectly affected people. © 2016 The Author(s). Disasters © Overseas Development Institute, 2016.
Memory management in genome-wide association studies
2009-01-01
Genome-wide association is a powerful tool for the identification of genes that underlie common diseases. Genome-wide association studies generate billions of genotypes and pose significant computational challenges for most users including limited computer memory. We applied a recently developed memory management tool to two analyses of North American Rheumatoid Arthritis Consortium studies and measured the performance in terms of central processing unit and memory usage. We conclude that our memory management approach is simple, efficient, and effective for genome-wide association studies. PMID:20018047
Airside HVAC BESTEST: HVAC Air-Distribution System Model Test Cases for ASHRAE Standard 140
DOE Office of Scientific and Technical Information (OSTI.GOV)
Judkoff, Ronald; Neymark, Joel; Kennedy, Mike D.
This paper summarizes recent work to develop new airside HVAC equipment model analytical verification test cases for ANSI/ASHRAE Standard 140, Standard Method of Test for the Evaluation of Building Energy Analysis Computer Programs. The analytical verification test method allows comparison of simulation results from a wide variety of building energy simulation programs with quasi-analytical solutions, further described below. Standard 140 is widely cited for evaluating software for use with performance-path energy efficiency analysis, in conjunction with well-known energy-efficiency standards including ASHRAE Standard 90.1, the International Energy Conservation Code, and other international standards. Airside HVAC equipment is a common area of modelling not previously explicitly tested by Standard 140. Integration of the completed test suite into Standard 140 is in progress.
Intelligent Information Fusion in the Aviation Domain: A Semantic-Web based Approach
NASA Technical Reports Server (NTRS)
Ashish, Naveen; Goforth, Andre
2005-01-01
Information fusion from multiple sources is a critical requirement for System Wide Information Management in the National Airspace (NAS). NASA and the FAA envision creating an "integrated pool" of information originally coming from different sources, which users, intelligent agents and NAS decision support tools can tap into. In this paper we present the results of our initial investigations into the requirements and prototype development of such an integrated information pool for the NAS. We have attempted to ascertain key requirements for such an integrated pool based on a survey of DSS tools that will benefit from this integrated pool. We then advocate key technologies from computer science research areas such as the semantic web, information integration, and intelligent agents that we believe are well suited to achieving the envisioned system wide information management capabilities.
Open Babel: An open chemical toolbox
2011-01-01
Background A frequent problem in computational modeling is the interconversion of chemical structures between different formats. While standard interchange formats exist (for example, Chemical Markup Language) and de facto standards have arisen (for example, SMILES format), the need to interconvert formats is a continuing problem due to the multitude of different application areas for chemistry data, differences in the data stored by different formats (0D versus 3D, for example), and competition between software along with a lack of vendor-neutral formats. Results We discuss, for the first time, Open Babel, an open-source chemical toolbox that speaks the many languages of chemical data. Open Babel version 2.3 interconverts over 110 formats. The need to represent such a wide variety of chemical and molecular data requires a library that implements a wide range of cheminformatics algorithms, from partial charge assignment and aromaticity detection, to bond order perception and canonicalization. We detail the implementation of Open Babel, describe key advances in the 2.3 release, and outline a variety of uses both in terms of software products and scientific research, including applications far beyond simple format interconversion. Conclusions Open Babel presents a solution to the proliferation of multiple chemical file formats. In addition, it provides a variety of useful utilities from conformer searching and 2D depiction, to filtering, batch conversion, and substructure and similarity searching. For developers, it can be used as a programming library to handle chemical data in areas such as organic chemistry, drug design, materials science, and computational chemistry. It is freely available under an open-source license from http://openbabel.org. PMID:21982300
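As a concrete illustration of the format-interconversion workflow described above, the sketch below uses Open Babel's Python bindings (pybel). Note the import path differs between Open Babel 2.x (`import pybel`) and 3.x (`from openbabel import pybel`); the SMILES string and file names are illustrative assumptions.

```python
# Minimal sketch of format interconversion with Open Babel's pybel bindings
# (Open Babel 3.x import style assumed; molecule and file names illustrative).
from openbabel import pybel

mol = pybel.readstring("smi", "c1ccccc1O")   # phenol, read from a SMILES string
mol.make3D()                                  # generate 3D coordinates
print(mol.write("mol2"))                      # emit the same molecule as MOL2 text

# Batch conversion of a hypothetical multi-molecule SDF file to SMILES:
# out = pybel.Outputfile("smi", "library.smi")
# for m in pybel.readfile("sdf", "library.sdf"):
#     out.write(m)
# out.close()
```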
The information systems heritage. [overview of technology developments over past five decades
NASA Technical Reports Server (NTRS)
Kurzhals, P. R.; Bricker, R. W.; Jensen, A. S.; Smith, A. T.
1981-01-01
This paper addresses key developments in the evolution of information systems over the past five decades. Major areas covered include the growth of imaging sensors from such pioneering devices as the iconoscope and orthicon which ushered in television, through a wide range of vidicon tubes, to the solid-state arrays which characterize current systems; the phenomenal expansion of electronic communications from telegraph and telephone wires, through the introduction of broadcast and microwave relay services, to the present era of worldwide satellite communications and data networks; and the key role of digital computers from their ancient precursors like the abacus and the mechanical calculating engines, through the appearance of the first large-scale electronic computers and their transistorized successors, to the rapid proliferation of miniaturized processors which impact every aspect of aerospace systems today.
Preface for the special issue of Mathematical Biosciences and Engineering, BIOCOMP 2012.
Buonocore, Aniello; Di Crescenzo, Antonio; Hastings, Alan
2014-04-01
The International Conference "BIOCOMP2012 - Mathematical Modeling and Computational Topics in Biosciences" was held in Vietri sul Mare (Italy), June 4-8, 2012. It was dedicated to the memory of Professor Luigi M. Ricciardi (1942-2011), who was a visionary and tireless promoter of the three previous editions of the BIOCOMP conference series. We thought that the best way to honor his memory was to continue the BIOCOMP program. Over the years, this conference promoted scientific activities related to his wide interests and scientific expertise, which ranged over various areas of application of mathematics, probability and statistics to biosciences and cybernetics, also with emphasis on computational problems. We are pleased that many of his friends and colleagues, as well as many other scientists, were attracted by the goals of this recent event and offered to contribute to its success.
Detailed description of the Mayo/IBM PACS
NASA Astrophysics Data System (ADS)
Gehring, Dale G.; Persons, Kenneth R.; Rothman, Melvyn L.; Salutz, James R.; Morin, Richard L.
1991-07-01
The Mayo Clinic and IBM/Rochester have jointly developed a picture archiving system (PACS) for use with Mayo's MRI and Neuro-CT imaging modalities. The system was developed to replace the imaging system's vendor-supplied magnetic tape archiving capability. The system consists of seven MR imagers and nine CT scanners, each interfaced to the PACS via IBM Personal System/2(tm) (PS/2) computers, which act as gateways from the imaging modality to the PACS network. The PAC system operates on the token-ring component of Mayo's city-wide local area network. Also on the PACS network are four optical storage subsystems used for image archival, three optical subsystems used for image retrieval, an IBM Application System/400(tm) (AS/400) computer used for database management and multiple PS/2-based image display systems and their image servers.
A novel parallel architecture for local histogram equalization
NASA Astrophysics Data System (ADS)
Ohannessian, Mesrob I.; Choueiter, Ghinwa F.; Diab, Hassan
2005-07-01
Local histogram equalization is an image enhancement algorithm that has found wide application in the pre-processing stage of areas such as computer vision, pattern recognition and medical imaging. The computationally intensive nature of the procedure, however, is a main limitation when real time interactive applications are in question. This work explores the possibility of performing parallel local histogram equalization, using an array of special purpose elementary processors, through an HDL implementation that targets FPGA or ASIC platforms. A novel parallelization scheme is presented and the corresponding architecture is derived. The algorithm is reduced to pixel-level operations. Processing elements are assigned image blocks, to maintain a reasonable performance-cost ratio. To further simplify both processor and memory organizations, a bit-serial access scheme is used. A brief performance assessment is provided to illustrate and quantify the merit of the approach.
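The block-based parallelization idea, assigning image blocks to processing elements, can be seen in the short NumPy sketch below. This is not the paper's HDL design; block size, bit depth, and the sequential loop standing in for the processor array are assumptions for illustration.

```python
# Block-wise local histogram equalization: each tile is equalized
# independently, mirroring the assignment of image blocks to processing
# elements. Block size (32) and 8-bit depth are assumptions.
import numpy as np

def equalize_tile(tile, levels=256):
    hist = np.bincount(tile.ravel(), minlength=levels)
    cdf = np.cumsum(hist).astype(np.float64)
    cdf = (levels - 1) * cdf / cdf[-1]          # normalized cumulative histogram
    return cdf[tile].astype(np.uint8)           # map each pixel through the CDF

def local_histogram_equalization(image, block=32):
    out = np.empty_like(image)
    rows, cols = image.shape
    for r in range(0, rows, block):             # each block could run on its
        for c in range(0, cols, block):         # own processing element
            sub = image[r:r + block, c:c + block]
            out[r:r + block, c:c + block] = equalize_tile(sub)
    return out

# Example: equalize a random 8-bit test image.
img = np.random.randint(0, 256, size=(128, 128), dtype=np.uint8)
enhanced = local_histogram_equalization(img)
```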
NASA Astrophysics Data System (ADS)
Zadkov, Victor N.; Koroteev, Nikolai I.
1995-10-01
The experience of managing continuing education and retraining programs at the International Laser Center (ILC) of Moscow State University is discussed. The programs offered span a wide range of areas, namely laser physics and technology, laser biophysics and biomedicine, laser chemistry, and computers in laser physics. The attendees, typically scientists, engineers, technical managers, and graduate students, can join these programs through the annual ILC term (6 months), individual training and research programs (up to a year), the annual ILC Laser Graduate School, graduate study, and a post-doc program, all of which are reviewed in the paper. A curriculum that includes basic and specialized courses is described in detail. A brief description of the ILC Laser Teaching and Computer Labs that support all the educational courses is given as well.
Latency Hiding in Dynamic Partitioning and Load Balancing of Grid Computing Applications
NASA Technical Reports Server (NTRS)
Das, Sajal K.; Harvey, Daniel J.; Biswas, Rupak
2001-01-01
The Information Power Grid (IPG) concept developed by NASA aims to provide a metacomputing platform for large-scale distributed computations by hiding the intricacies of a highly heterogeneous environment while maintaining adequate security. In this paper, we propose a latency-tolerant partitioning scheme that dynamically balances processor workloads on the IPG and minimizes data movement and runtime communication. By simulating an unsteady adaptive mesh application on a wide area network, we study the performance of our load balancer under the Globus environment. The number of IPG nodes, the number of processors per node, and the interconnect speeds are parameterized to derive conditions under which the IPG would be suitable for parallel distributed processing of such applications. Experimental results demonstrate that effective solutions are achieved when the IPG nodes are connected by a high-speed asynchronous interconnection network.
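The paper's latency-tolerant partitioner is not reproduced here, but the underlying balancing idea can be sketched with a simple greedy heuristic that always assigns the next task to the least-loaded node. Node count and workloads below are made up for illustration.

```python
# Not the paper's scheme: a greedy (LPT-style) sketch of workload balancing,
# assigning each task to the currently least-loaded node.
import heapq

def greedy_balance(task_loads, num_nodes):
    """Return a list of task indices per node, balancing total load greedily."""
    heap = [(0.0, node) for node in range(num_nodes)]   # (current load, node id)
    heapq.heapify(heap)
    assignment = [[] for _ in range(num_nodes)]
    # Placing the largest tasks first gives the classic LPT heuristic.
    for task, load in sorted(enumerate(task_loads), key=lambda t: -t[1]):
        node_load, node = heapq.heappop(heap)
        assignment[node].append(task)
        heapq.heappush(heap, (node_load + load, node))
    return assignment

print(greedy_balance([5.0, 3.0, 8.0, 2.0, 7.0, 1.0], num_nodes=3))
```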
[Computerized evaluation of reparative processes of the cervix uteri].
Pasquinucci, C; Contini, V
1990-01-01
This study aimed to evaluate the effect of polydeoxyribonucleotide (PDRN), as reported in relevant literature, on cervical epithelial dynamics. In particular, the interactions taking place between the columnar epithelium and the squamous epithelium have been examined. For the purposes of the study, the following computerized techniques, already widely known, have been used: the colposcope is joined to a video camera connected to a computer (AT compatible). The computer is equipped with a graphics card capable of recording and digitizing the image, i.e. making it recognizable by the computer itself. Thereafter, many operations can be performed on the colposcopic images: reduction, enlargement, retouching, recording, recall, analysis, etc. Moreover, irregular epithelial areas can be easily determined to a good approximation and, using pre-established enlargement ratios, their evolution can be evaluated. By means of this technique 12 out-patients with uterine cervix ectopias, with or without a normal transformation zone (NTZ), have been examined. The monthly therapy was 12 pessaries, each containing 5 mg polydeoxyribonucleotide (POLIDES 5--Farmigea), from the 7th to the 18th day of the cycle, repeated for 3 months. Since the first month of treatment a reduction of the ectopic columnar epithelium has been noted in most patients (9 of 12), with an increase of squamous epithelium (peripheral reparative process). This process kept increasing during the following months in the 9 patients responding to the treatment, whose ectopic areas were covered by squamous epithelium (average 55% of the area; range 33%-78%). No response to the treatment was shown in 3 cases.(ABSTRACT TRUNCATED AT 250 WORDS)
Computational nuclear quantum many-body problem: The UNEDF project
NASA Astrophysics Data System (ADS)
Bogner, S.; Bulgac, A.; Carlson, J.; Engel, J.; Fann, G.; Furnstahl, R. J.; Gandolfi, S.; Hagen, G.; Horoi, M.; Johnson, C.; Kortelainen, M.; Lusk, E.; Maris, P.; Nam, H.; Navratil, P.; Nazarewicz, W.; Ng, E.; Nobre, G. P. A.; Ormand, E.; Papenbrock, T.; Pei, J.; Pieper, S. C.; Quaglioni, S.; Roche, K. J.; Sarich, J.; Schunck, N.; Sosonkina, M.; Terasaki, J.; Thompson, I.; Vary, J. P.; Wild, S. M.
2013-10-01
The UNEDF project was a large-scale collaborative effort that applied high-performance computing to the nuclear quantum many-body problem. The primary focus of the project was on constructing, validating, and applying an optimized nuclear energy density functional, which entailed a wide range of pioneering developments in microscopic nuclear structure and reactions, algorithms, high-performance computing, and uncertainty quantification. UNEDF demonstrated that close associations among nuclear physicists, mathematicians, and computer scientists can lead to novel physics outcomes built on algorithmic innovations and computational developments. This review showcases a wide range of UNEDF science results to illustrate this interplay.
Ekstrom, Arne D.; Arnold, Aiden E. G. F.; Iaria, Giuseppe
2014-01-01
While the widely studied allocentric spatial representation holds a special status in neuroscience research, its exact nature and neural underpinnings continue to be the topic of debate, particularly in humans. Here, based on a review of human behavioral research, we argue that allocentric representations do not provide the kind of map-like, metric representation one might expect based on past theoretical work. Instead, we suggest that almost all tasks used in past studies involve a combination of egocentric and allocentric representation, complicating both the investigation of the cognitive basis of an allocentric representation and the task of identifying a brain region specifically dedicated to it. Indeed, as we discuss in detail, past studies suggest numerous brain regions important to allocentric spatial memory in addition to the hippocampus, including parahippocampal, retrosplenial, and prefrontal cortices. We thus argue that although allocentric computations will often require the hippocampus, particularly those involving extracting details across temporally specific routes, the hippocampus is not necessary for all allocentric computations. We instead suggest that a non-aggregate network process involving multiple interacting brain areas, including hippocampus and extra-hippocampal areas such as parahippocampal, retrosplenial, prefrontal, and parietal cortices, better characterizes the neural basis of spatial representation during navigation. According to this model, an allocentric representation does not emerge from the computations of a single brain region (i.e., hippocampus) nor is it readily decomposable into additive computations performed by separate brain regions. Instead, an allocentric representation emerges from computations partially shared across numerous interacting brain regions. We discuss our non-aggregate network model in light of existing data and provide several key predictions for future experiments. PMID:25346679
Computer Vision Techniques for Transcatheter Intervention
Zhao, Feng; Roach, Matthew
2015-01-01
Minimally invasive transcatheter technologies have demonstrated substantial promise for the diagnosis and the treatment of cardiovascular diseases. For example, transcatheter aortic valve implantation is an alternative to aortic valve replacement for the treatment of severe aortic stenosis, and transcatheter atrial fibrillation ablation is widely used for the treatment and the cure of atrial fibrillation. In addition, catheter-based intravascular ultrasound and optical coherence tomography imaging of coronary arteries provides important information about the coronary lumen, wall, and plaque characteristics. Qualitative and quantitative analysis of these cross-sectional image data will be beneficial to the evaluation and the treatment of coronary artery diseases such as atherosclerosis. In all the phases (preoperative, intraoperative, and postoperative) during the transcatheter intervention procedure, computer vision techniques (e.g., image segmentation and motion tracking) have been largely applied in the field to accomplish tasks like annulus measurement, valve selection, catheter placement control, and vessel centerline extraction. This provides beneficial guidance for the clinicians in surgical planning, disease diagnosis, and treatment assessment. In this paper, we present a systematical review on these state-of-the-art methods. We aim to give a comprehensive overview for researchers in the area of computer vision on the subject of transcatheter intervention. Research in medical computing is multi-disciplinary due to its nature, and hence, it is important to understand the application domain, clinical background, and imaging modality, so that methods and quantitative measurements derived from analyzing the imaging data are appropriate and meaningful. We thus provide an overview on the background information of the transcatheter intervention procedures, as well as a review of the computer vision techniques and methodologies applied in this area. PMID:27170893
Computational analysis of conserved RNA secondary structure in transcriptomes and genomes.
Eddy, Sean R
2014-01-01
Transcriptomics experiments and computational predictions both enable systematic discovery of new functional RNAs. However, many putative noncoding transcripts arise instead from artifacts and biological noise, and current computational prediction methods have high false positive rates. I discuss prospects for improving computational methods for analyzing and identifying functional RNAs, with a focus on detecting signatures of conserved RNA secondary structure. An interesting new front is the application of chemical and enzymatic experiments that probe RNA structure on a transcriptome-wide scale. I review several proposed approaches for incorporating structure probing data into the computational prediction of RNA secondary structure. Using probabilistic inference formalisms, I show how all these approaches can be unified in a well-principled framework, which in turn allows RNA probing data to be easily integrated into a wide range of analyses that depend on RNA secondary structure inference. Such analyses include homology search and genome-wide detection of new structural RNAs.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-21
... Defense Federal Acquisition Regulation Supplement; Updates to Wide Area WorkFlow (DFARS Case 2011-D027... Wide Area WorkFlow (WAWF) and TRICARE Encounter Data System (TEDS). WAWF, which electronically... civil emergencies, when access to Wide Area WorkFlow by those contractors is not feasible; (4) Purchases...
Machine learning properties of binary wurtzite superlattices
Pilania, G.; Liu, X. -Y.
2018-01-12
The burgeoning paradigm of high-throughput computations and materials informatics brings new opportunities in terms of targeted materials design and discovery. The discovery process can be significantly accelerated and streamlined if one can learn effectively from available knowledge and past data to predict materials properties efficiently. Indeed, a very active area in materials science research is to develop machine learning based methods that can deliver automated and cross-validated predictive models using either already available materials data or new data generated in a targeted manner. In the present paper, we show that fast and accurate predictions of a wide range of properties of binary wurtzite superlattices, formed by a diverse set of chemistries, can be made by employing state-of-the-art statistical learning methods trained on quantum mechanical computations in combination with a judiciously chosen numerical representation to encode materials’ similarity. These surrogate learning models then allow for efficient screening of vast chemical spaces by providing instant predictions of the targeted properties. Moreover, the models can be systematically improved in an adaptive manner, incorporate properties computed at different levels of fidelities and are naturally amenable to inverse materials design strategies. Finally, while the learning approach to make predictions for a wide range of properties (including structural, elastic and electronic properties) is demonstrated here for a specific example set containing more than 1200 binary wurtzite superlattices, the adopted framework is equally applicable to other classes of materials as well.
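The generic fingerprint-to-property surrogate-model workflow described above can be sketched as follows. This is not the paper's machinery: the descriptors, target property, kernel choice, and hyperparameter grid are stand-in assumptions on synthetic data, shown only to illustrate cross-validated statistical learning for screening.

```python
# Illustrative surrogate-model sketch on synthetic stand-in data: kernel
# ridge regression with cross-validated hyperparameters, then screening new
# candidates with model.predict(...).
import numpy as np
from sklearn.kernel_ridge import KernelRidge
from sklearn.model_selection import GridSearchCV

rng = np.random.default_rng(0)
X = rng.normal(size=(1200, 16))                              # stand-in fingerprints
y = X @ rng.normal(size=16) + 0.1 * rng.normal(size=1200)    # stand-in property

model = GridSearchCV(
    KernelRidge(kernel="rbf"),
    param_grid={"alpha": [1e-3, 1e-2, 1e-1], "gamma": [1e-2, 1e-1, 1.0]},
    cv=5,
)
model.fit(X, y)
print("cross-validated R^2:", model.best_score_)
# New candidate superlattices would be screened by calling model.predict(...)
```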
Qiao, Ning; Mostafa, Hesham; Corradi, Federico; Osswald, Marc; Stefanini, Fabio; Sumislawska, Dora; Indiveri, Giacomo
2015-01-01
Implementing compact, low-power artificial neural processing systems with real-time on-line learning abilities is still an open challenge. In this paper we present a full-custom mixed-signal VLSI device with neuromorphic learning circuits that emulate the biophysics of real spiking neurons and dynamic synapses for exploring the properties of computational neuroscience models and for building brain-inspired computing systems. The proposed architecture allows the on-chip configuration of a wide range of network connectivities, including recurrent and deep networks, with short-term and long-term plasticity. The device comprises 128 K analog synapse and 256 neuron circuits with biologically plausible dynamics and bi-stable spike-based plasticity mechanisms that endow it with on-line learning abilities. In addition to the analog circuits, the device comprises also asynchronous digital logic circuits for setting different synapse and neuron properties as well as different network configurations. This prototype device, fabricated using a 180 nm 1P6M CMOS process, occupies an area of 51.4 mm2, and consumes approximately 4 mW for typical experiments, for example involving attractor networks. Here we describe the details of the overall architecture and of the individual circuits and present experimental results that showcase its potential. By supporting a wide range of cortical-like computational modules comprising plasticity mechanisms, this device will enable the realization of intelligent autonomous systems with on-line learning capabilities. PMID:25972778
Machine learning properties of binary wurtzite superlattices
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pilania, G.; Liu, X. -Y.
The burgeoning paradigm of high-throughput computations and materials informatics brings new opportunities in terms of targeted materials design and discovery. The discovery process can be significantly accelerated and streamlined if one can learn effectively from available knowledge and past data to predict materials properties efficiently. Indeed, a very active area in materials science research is to develop machine learning based methods that can deliver automated and cross-validated predictive models using either already available materials data or new data generated in a targeted manner. In the present paper, we show that fast and accurate predictions of a wide range of properties of binary wurtzite superlattices, formed by a diverse set of chemistries, can be made by employing state-of-the-art statistical learning methods trained on quantum mechanical computations in combination with a judiciously chosen numerical representation to encode materials’ similarity. These surrogate learning models then allow for efficient screening of vast chemical spaces by providing instant predictions of the targeted properties. Moreover, the models can be systematically improved in an adaptive manner, incorporate properties computed at different levels of fidelities and are naturally amenable to inverse materials design strategies. Finally, while the learning approach to make predictions for a wide range of properties (including structural, elastic and electronic properties) is demonstrated here for a specific example set containing more than 1200 binary wurtzite superlattices, the adopted framework is equally applicable to other classes of materials as well.
Computational systems chemical biology.
Oprea, Tudor I; May, Elebeoba E; Leitão, Andrei; Tropsha, Alexander
2011-01-01
There is a critical need for improving the level of chemistry awareness in systems biology. The data and information related to modulation of genes and proteins by small molecules continue to accumulate at the same time as simulation tools in systems biology and whole body physiologically based pharmacokinetics (PBPK) continue to evolve. We called this emerging area at the interface between chemical biology and systems biology systems chemical biology (SCB) (Nat Chem Biol 3: 447-450, 2007).The overarching goal of computational SCB is to develop tools for integrated chemical-biological data acquisition, filtering and processing, by taking into account relevant information related to interactions between proteins and small molecules, possible metabolic transformations of small molecules, as well as associated information related to genes, networks, small molecules, and, where applicable, mutants and variants of those proteins. There is yet an unmet need to develop an integrated in silico pharmacology/systems biology continuum that embeds drug-target-clinical outcome (DTCO) triplets, a capability that is vital to the future of chemical biology, pharmacology, and systems biology. Through the development of the SCB approach, scientists will be able to start addressing, in an integrated simulation environment, questions that make the best use of our ever-growing chemical and biological data repositories at the system-wide level. This chapter reviews some of the major research concepts and describes key components that constitute the emerging area of computational systems chemical biology.
Computational Systems Chemical Biology
Oprea, Tudor I.; May, Elebeoba E.; Leitão, Andrei; Tropsha, Alexander
2013-01-01
There is a critical need for improving the level of chemistry awareness in systems biology. The data and information related to modulation of genes and proteins by small molecules continue to accumulate at the same time as simulation tools in systems biology and whole body physiologically-based pharmacokinetics (PBPK) continue to evolve. We called this emerging area at the interface between chemical biology and systems biology systems chemical biology, SCB (Oprea et al., 2007). The overarching goal of computational SCB is to develop tools for integrated chemical-biological data acquisition, filtering and processing, by taking into account relevant information related to interactions between proteins and small molecules, possible metabolic transformations of small molecules, as well as associated information related to genes, networks, small molecules and, where applicable, mutants and variants of those proteins. There is yet an unmet need to develop an integrated in silico pharmacology / systems biology continuum that embeds drug-target-clinical outcome (DTCO) triplets, a capability that is vital to the future of chemical biology, pharmacology and systems biology. Through the development of the SCB approach, scientists will be able to start addressing, in an integrated simulation environment, questions that make the best use of our ever-growing chemical and biological data repositories at the system-wide level. This chapter reviews some of the major research concepts and describes key components that constitute the emerging area of computational systems chemical biology. PMID:20838980
Requirements for company-wide management
NASA Technical Reports Server (NTRS)
Southall, J. W.
1980-01-01
Computing system requirements were developed for company-wide management of information and computer programs in an engineering data processing environment. The requirements are essential to the successful implementation of a computer-based engineering data management system; they exceed the capabilities provided by the commercially available data base management systems. These requirements were derived from a study entitled The Design Process, which was prepared by design engineers experienced in development of aerospace products.
Data management and analysis for the Earth System Grid
NASA Astrophysics Data System (ADS)
Williams, D. N.; Ananthakrishnan, R.; Bernholdt, D. E.; Bharathi, S.; Brown, D.; Chen, M.; Chervenak, A. L.; Cinquini, L.; Drach, R.; Foster, I. T.; Fox, P.; Hankin, S.; Henson, V. E.; Jones, P.; Middleton, D. E.; Schwidder, J.; Schweitzer, R.; Schuler, R.; Shoshani, A.; Siebenlist, F.; Sim, A.; Strand, W. G.; Wilhelmi, N.; Su, M.
2008-07-01
The international climate community is expected to generate hundreds of petabytes of simulation data within the next five to seven years. This data must be accessed and analyzed by thousands of analysts worldwide in order to provide accurate and timely estimates of the likely impact of climate change on physical, biological, and human systems. Climate change is thus not only a scientific challenge of the first order but also a major technological challenge. In order to address this technological challenge, the Earth System Grid Center for Enabling Technologies (ESG-CET) has been established within the U.S. Department of Energy's Scientific Discovery through Advanced Computing (SciDAC)-2 program, with support from the offices of Advanced Scientific Computing Research and Biological and Environmental Research. ESG-CET's mission is to provide climate researchers worldwide with access to the data, information, models, analysis tools, and computational capabilities required to make sense of enormous climate simulation datasets. Its specific goals are to (1) make data more useful to climate researchers by developing Grid technology that enhances data usability; (2) meet specific distributed database, data access, and data movement needs of national and international climate projects; (3) provide a universal and secure web-based data access portal for broad multi-model data collections; and (4) provide a wide range of Grid-enabled climate data analysis tools and diagnostic methods to international climate centers and U.S. government agencies. Building on the successes of the previous Earth System Grid (ESG) project, which has enabled thousands of researchers to access tens of terabytes of data from a small number of ESG sites, ESG-CET is working to integrate a far larger number of distributed data providers, high-bandwidth wide-area networks, and remote computers in a highly collaborative problem-solving environment.
Protein alignment algorithms with an efficient backtracking routine on multiple GPUs.
Blazewicz, Jacek; Frohmberg, Wojciech; Kierzynka, Michal; Pesch, Erwin; Wojciechowski, Pawel
2011-05-20
Pairwise sequence alignment methods are widely used in biological research. The increasing number of sequences is perceived as one of the upcoming challenges for sequence alignment methods in the nearest future. To overcome this challenge several GPU (Graphics Processing Unit) computing approaches have been proposed lately. These solutions show a great potential of a GPU platform but in most cases address the problem of sequence database scanning and computing only the alignment score whereas the alignment itself is omitted. Thus, the need arose to implement the global and semiglobal Needleman-Wunsch and Smith-Waterman algorithms with a backtracking procedure which is needed to construct the alignment. In this paper we present the solution that performs the alignment of every given sequence pair, which is a required step for progressive multiple sequence alignment methods, as well as for DNA recognition at the DNA assembly stage. Performed tests show that the implementation, with performance up to 6.3 GCUPS on a single GPU for affine gap penalties, is very efficient in comparison to other CPU and GPU-based solutions. Moreover, multiple-GPU support with load balancing makes the application very scalable. The article shows that the backtracking procedure of the sequence alignment algorithms may be designed to fit in with the GPU architecture. Therefore, our algorithm, apart from scores, is able to compute pairwise alignments. This opens a wide range of new possibilities, allowing other methods from the area of molecular biology to take advantage of the new computational architecture. Performed tests show that the efficiency of the implementation is excellent. Moreover, the speed of our GPU-based algorithms can be almost linearly increased when using more than one graphics card.
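For readers unfamiliar with the backtracking step that the paper emphasizes, the sketch below shows a compact CPU reference implementation of global Needleman-Wunsch alignment. Linear (not affine) gap penalties are used to keep it short, and the scoring values are arbitrary; it is not the authors' GPU code.

```python
# Reference sketch: Needleman-Wunsch with backtracking (linear gap penalty,
# arbitrary scores); backtracking reconstructs the alignment, not just its score.
def needleman_wunsch(a, b, match=2, mismatch=-1, gap=-2):
    n, m = len(a), len(b)
    score = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        score[i][0] = i * gap
    for j in range(1, m + 1):
        score[0][j] = j * gap
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            diag = score[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            score[i][j] = max(diag, score[i - 1][j] + gap, score[i][j - 1] + gap)

    # Backtrack from the bottom-right cell to recover the aligned strings.
    ali_a, ali_b, i, j = [], [], n, m
    while i > 0 or j > 0:
        if i > 0 and j > 0 and score[i][j] == score[i - 1][j - 1] + (
                match if a[i - 1] == b[j - 1] else mismatch):
            ali_a.append(a[i - 1]); ali_b.append(b[j - 1]); i -= 1; j -= 1
        elif i > 0 and score[i][j] == score[i - 1][j] + gap:
            ali_a.append(a[i - 1]); ali_b.append('-'); i -= 1
        else:
            ali_a.append('-'); ali_b.append(b[j - 1]); j -= 1
    return score[n][m], ''.join(reversed(ali_a)), ''.join(reversed(ali_b))

print(needleman_wunsch("GATTACA", "GCATGCU"))
```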
Novel computer-based endoscopic camera
NASA Astrophysics Data System (ADS)
Rabinovitz, R.; Hai, N.; Abraham, Martin D.; Adler, Doron; Nissani, M.; Fridental, Ron; Vitsnudel, Ilia
1995-05-01
We have introduced a computer-based endoscopic camera which includes (a) unique real-time digital image processing to optimize image visualization by reducing overexposed, glared areas and brightening dark areas, and by accentuating sharpness and fine structures, and (b) patient data documentation and management. The image processing is based on i Sight's iSP1000™ digital video processor chip and patented Adaptive Sensitivity™ scheme for capturing and displaying images with a wide dynamic range of light, taking into account local neighborhood image conditions and global image statistics. It provides the medical user with the ability to view images under difficult lighting conditions, without losing details 'in the dark' or in completely saturated areas. The patient data documentation and management allows storage of images (approximately 1 MB per image for a full 24-bit color image) to any storage device installed into the camera, or to an external host media via network. The patient data included with every image describes essential information on the patient and procedure. The operator can assign custom data descriptors, and can search for the stored image/data by typing any image descriptor. The camera optics has an extended zoom range of f = 20-45 mm, allowing control of the diameter of the field which is displayed on the monitor such that the complete field of view of the endoscope can be displayed on all the area of the screen. All these features provide a versatile endoscopic camera with excellent image quality and documentation capabilities.
Gridless, pattern-driven point cloud completion and extension
NASA Astrophysics Data System (ADS)
Gravey, Mathieu; Mariethoz, Gregoire
2016-04-01
While satellites offer Earth observation with wide coverage, other remote sensing techniques such as terrestrial LiDAR can acquire very high-resolution data on an area that is limited in extent and often discontinuous due to shadow effects. Here we propose a numerical approach to merge these two types of information, thereby reconstructing high-resolution data on a continuous large area. It is based on a pattern matching process that completes the areas where only low-resolution data is available, using bootstrapped high-resolution patterns. Currently, the most common approach to pattern matching is to interpolate the point data on a grid. While this approach is computationally efficient, it presents major drawbacks for point cloud processing because a significant part of the information is lost in the point-to-grid resampling, and a prohibitive amount of memory is needed to store large grids. To address these issues, we propose a gridless method that compares point cloud subsets without the need to use a grid. On-the-fly interpolation involves a heavy computational load, which is met by using a highly optimized GPU implementation and a hierarchical pattern searching strategy. The method is illustrated using data from the Val d'Arolla, Swiss Alps, where high-resolution terrestrial LiDAR data are fused with lower-resolution Landsat and WorldView-3 acquisitions, such that the density of points is homogenized (data completion) and the coverage is extended to a larger area (data extension).
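The authors' GPU implementation and hierarchical search are not reproduced here, but the gridless pattern-matching idea, comparing point-cloud neighbourhoods directly rather than resampling to a grid, can be sketched with a k-d tree. The point clouds and the completion step hinted at in the final comment are illustrative assumptions.

```python
# Conceptual sketch: gridless neighbourhood lookup with a k-d tree, finding
# for each low-resolution query point its nearest high-resolution points
# without resampling either cloud onto a grid. Data are synthetic stand-ins.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(0)
high_res = rng.uniform(0, 100, size=(50_000, 3))   # stand-in terrestrial LiDAR points
queries = rng.uniform(0, 100, size=(1_000, 3))     # stand-in low-resolution locations

tree = cKDTree(high_res)                           # built once, queried many times
dist, idx = tree.query(queries, k=8)               # 8 nearest high-res points each

# A completion step could now borrow local patterns (e.g. mean positions or
# elevation offsets) from high_res[idx] to densify the low-resolution data.
local_means = high_res[idx].mean(axis=1)
print(local_means.shape)
```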
Olson, Scott A.; with a section by Veilleux, Andrea G.
2014-01-01
This report provides estimates of flood discharges at selected annual exceedance probabilities (AEPs) for streamgages in and adjacent to Vermont and equations for estimating flood discharges at AEPs of 50-, 20-, 10-, 4-, 2-, 1-, 0.5-, and 0.2-percent (recurrence intervals of 2-, 5-, 10-, 25-, 50-, 100-, 200-, and 500-years, respectively) for ungaged, unregulated, rural streams in Vermont. The equations were developed using generalized least-squares regression. Flood-frequency and drainage-basin characteristics from 145 streamgages were used in developing the equations. The drainage-basin characteristics used as explanatory variables in the regression equations include drainage area, percentage of wetland area, and the basin-wide mean of the average annual precipitation. The average standard errors of prediction for estimating the flood discharges at the 50-, 20-, 10-, 4-, 2-, 1-, 0.5-, and 0.2-percent AEP with these equations are 34.9, 36.0, 38.7, 42.4, 44.9, 47.3, 50.7, and 55.1 percent, respectively. Flood discharges at selected AEPs for streamgages were computed by using the Expected Moments Algorithm. To improve estimates of the flood discharges for given exceedance probabilities at streamgages in Vermont, a new generalized skew coefficient was developed. The new generalized skew for the region is a constant, 0.44. The mean square error of the generalized skew coefficient is 0.078. This report describes a technique for using results from the regression equations to adjust an AEP discharge computed from a streamgage record. This report also describes a technique for using a drainage-area adjustment to estimate flood discharge at a selected AEP for an ungaged site upstream or downstream from a streamgage. The final regression equations and the flood-discharge frequency data used in this study will be available in StreamStats. StreamStats is a World Wide Web application providing automated regression-equation solutions for user-selected sites on streams.
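The report's own weighting and adjustment procedure is not reproduced here; as a worked illustration of the drainage-area transfer idea mentioned in the abstract, a commonly used drainage-area-ratio form is Q_u = Q_g (A_u / A_g)^b, where b is an exponent typically taken from the regional regression. The numbers and exponent below are hypothetical.

```python
# Worked illustration only (hypothetical values): transferring an AEP
# discharge from a streamgage to a nearby ungaged site with a
# drainage-area-ratio adjustment.
def area_adjusted_discharge(q_gage, area_gage_mi2, area_ungaged_mi2, exponent):
    """Transfer an AEP discharge from a streamgage to a nearby ungaged site."""
    return q_gage * (area_ungaged_mi2 / area_gage_mi2) ** exponent

# Hypothetical example: 1-percent AEP discharge of 4200 ft^3/s at a gage
# draining 120 mi^2, transferred to an ungaged site draining 95 mi^2.
q_ungaged = area_adjusted_discharge(4200.0, 120.0, 95.0, exponent=0.8)
print(round(q_ungaged), "ft^3/s")
```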
Advances in Spatial Data Infrastructure, Acquisition, Analysis, Archiving and Dissemination
NASA Technical Reports Server (NTRS)
Ramapriyan, Hampapuran K.; Rochon, Gilbert L.; Duerr, Ruth; Rank, Robert; Nativi, Stefano; Stocker, Erich Franz
2010-01-01
The authors review recent contributions to the state-of-the-science and benign proliferation of satellite remote sensing, spatial data infrastructure, near-real-time data acquisition, analysis on high performance computing platforms, sapient archiving, multi-modal dissemination and utilization for a wide array of scientific applications. The authors also address advances in Geoinformatics and its growing ubiquity, as evidenced by its inclusion as a focus area within the American Geophysical Union (AGU), European Geosciences Union (EGU), as well as by the evolution of the IEEE Geoscience and Remote Sensing Society's (GRSS) Data Archiving and Distribution Technical Committee (DAD TC).
Mechanical Models of Fault-Related Folding
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, A. M.
2003-01-09
The subject of the proposed research is fault-related folding and ground deformation. The results are relevant to oil-producing structures throughout the world, to understanding of damage that has been observed along and near earthquake ruptures, and to earthquake-producing structures in California and other tectonically-active areas. The objectives of the proposed research were to provide both a unified mechanical infrastructure for studies of fault-related folding and to present the results in computer programs that have graphical user interfaces (GUIs) so that structural geologists and geophysicists can model a wide variety of fault-related folds (FaRFs).
Dynamic programming and graph algorithms in computer vision.
Felzenszwalb, Pedro F; Zabih, Ramin
2011-04-01
Optimization is a powerful paradigm for expressing and solving problems in a wide range of areas, and has been successfully applied to many vision problems. Discrete optimization techniques are especially interesting since, by carefully exploiting problem structure, they often provide nontrivial guarantees concerning solution quality. In this paper, we review dynamic programming and graph algorithms, and discuss representative examples of how these discrete optimization techniques have been applied to some classical vision problems. We focus on the low-level vision problem of stereo, the mid-level problem of interactive object segmentation, and the high-level problem of model-based recognition.
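The graph-cut construction that underlies interactive binary segmentation (one of the examples the paper reviews) is illustrated below on a toy one-dimensional signal: unary data costs become edges to the source and sink terminals, a smoothness penalty links neighbouring pixels, and the minimum s-t cut yields the labelling. This is not the paper's code; the intensities, models, and smoothness weight are made up.

```python
# Toy min-cut segmentation of a 1D signal (made-up costs): terminal edges
# carry the unary data costs, neighbour edges the Potts smoothness penalty.
import networkx as nx

pixels = [0.1, 0.2, 0.15, 0.8, 0.9, 0.85]   # toy intensities
fg_mean, bg_mean, smoothness = 0.9, 0.1, 0.5

G = nx.DiGraph()
for i, v in enumerate(pixels):
    G.add_edge('source', i, capacity=(v - bg_mean) ** 2)   # paid if i is labelled background
    G.add_edge(i, 'sink', capacity=(v - fg_mean) ** 2)     # paid if i is labelled foreground
for i in range(len(pixels) - 1):                           # Potts-style smoothness term
    G.add_edge(i, i + 1, capacity=smoothness)
    G.add_edge(i + 1, i, capacity=smoothness)

cut_value, (source_side, sink_side) = nx.minimum_cut(G, 'source', 'sink')
labels = ['foreground' if i in source_side else 'background' for i in range(len(pixels))]
print(cut_value, labels)
```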
Unconditionally energy stable numerical schemes for phase-field vesicle membrane model
NASA Astrophysics Data System (ADS)
Guillén-González, F.; Tierra, G.
2018-02-01
Numerical schemes to simulate the deformation of vesicle membranes via minimizing the bending energy have been widely studied in recent times due to their connection with many biologically motivated problems. In this work we propose a new unconditionally energy stable numerical scheme for a vesicle membrane model that satisfies exactly the conservation of volume constraint and penalizes the surface area constraint. Moreover, we extend these ideas to present an unconditionally energy stable splitting scheme decoupling the interaction of the vesicle with a surrounding fluid. Finally, the good behavior of the proposed schemes is illustrated through several computational experiments.
NASA Astrophysics Data System (ADS)
Jiang, Zhen-Yu; Li, Lin; Huang, Yi-Fan
2009-07-01
The segmented mirror telescope is widely used. The aberrations of segmented mirror systems are different from single mirror systems. This paper uses the Fourier optics theory to analyse the Zernike aberrations of segmented mirror systems. It concludes that the Zernike aberrations of segmented mirror systems obey the linearity theorem. The design of a segmented space telescope and segmented schemes are discussed, and its optical model is constructed. The computer simulation experiment is performed with this optical model to verify the suppositions. The experimental results confirm the correctness of the model.
Polytopol computing for multi-core and distributed systems
NASA Astrophysics Data System (ADS)
Spaanenburg, Henk; Spaanenburg, Lambert; Ranefors, Johan
2009-05-01
Multi-core computing provides new challenges to software engineering. The paper addresses such issues in the general setting of polytopol computing, which takes multi-core problems in such widely differing areas as ambient intelligence sensor networks and cloud computing into account. It argues that the essence lies in a suitable allocation of free moving tasks. Where hardware is ubiquitous and pervasive, the network is virtualized into a connection of software snippets judiciously injected into the hardware so that a system function again appears as a whole. The concept of polytopol computing provides a further formalization in terms of the partitioning of labor between collector and sensor nodes. Collectors provide functions such as a knowledge integrator, awareness collector, situation displayer/reporter, communicator of clues and an inquiry-interface provider. Sensors provide functions such as anomaly detection (only communicating singularities, not continuous observation); they are generally powered or self-powered, amorphous (not on a grid) with generation-and-attrition, field re-programmable, and sensor plug-and-play-able. Together the collector and the sensor are part of the skeleton injector mechanism, added to every node, which gives the network the ability to organize itself into one of many topologies. Finally we discuss a number of applications and indicate how a multi-core architecture supports the security aspects of the skeleton injector.
Asia-Pacific POPIN workshop on Internet.
1996-01-01
This brief article announces the accomplishments of the ESCAP Population Division of the Department of Economic and Social Information and Policy Analysis (DESIPA) in conjunction with the Asia-Pacific POPIN Internet (Information Superhighway) Training Workshop in popularizing useful new computer information technologies. A successful workshop was held in Bangkok in November 1996 for 18 people from 8 countries in the Asian and Pacific region, many of whom were from population information centers. Participants were taught some techniques for disseminating population data and information through use of the Internet computer facility. Participants learned 1) how to use Windows software in the ESCAP local area network (LAN), 2) about concepts such as HTML (hypertext mark-up language), and 3) detailed information about computer language. Computer practices involved "surfing the Net (Internet)" and linking with the global POPIN site on the Internet. Participants learned about computer programs for information handling and learned how to prepare documents using HTML, how to mount information on the World Wide Web (WWW) of the Internet, how to convert existing documents into "HTML-style" files, and how to scan graphics, such as logos, photographs, and maps, for visual display on the Internet. The Workshop and the three training modules was funded by the UN Population Fund (UNFPA). The POPIN Coordinator was pleased that competency was accomplished in such a short period of time.
An electrically reconfigurable logic gate intrinsically enabled by spin-orbit materials.
Kazemi, Mohammad
2017-11-10
The spin degree of freedom in magnetic devices has been discussed widely for computing, since it could significantly reduce energy dissipation, might enable beyond-Von Neumann computing, and could have applications in quantum computing. For spin-based computing to become widespread, however, energy efficient logic gates comprising as few devices as possible are required. Considerable recent progress has been reported in this area. However, proposals for spin-based logic either require ancillary charge-based devices and circuits in each individual gate or adopt principles underlying charge-based computing by employing ancillary spin-based devices, which largely negates possible advantages. Here, we show that spin-orbit materials possess an intrinsic basis for the execution of logic operations. We present a spin-orbit logic gate that performs a universal logic operation utilizing the minimum possible number of devices, that is, the essential devices required for representing the logic operands. Also, whereas the previous proposals for spin-based logic require extra devices in each individual gate to provide reconfigurability, the proposed gate is 'electrically' reconfigurable at run-time simply by setting the amplitude of the clock pulse applied to the gate. We demonstrate, analytically and numerically with experimentally benchmarked models, that the gate performs logic operations and simultaneously stores the result, realizing the 'stateful' spin-based logic scalable to ultralow energy dissipation.
New computer program solves wide variety of heat flow problems
NASA Technical Reports Server (NTRS)
Almond, J. C.
1966-01-01
Boeing Engineering Thermal Analyzer /BETA/ computer program uses numerical methods to provide accurate heat transfer solutions to a wide variety of heat flow problems. The program solves steady-state and transient problems in almost any situation that can be represented by a resistance-capacitance network.
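The resistance-capacitance network formulation mentioned above lends itself to a compact numerical sketch. The following is a minimal illustration of an explicit transient solution for a lumped thermal network, not the BETA program itself; the node values, resistances, heat loads, and time step are all hypothetical.

    import numpy as np

    # Minimal explicit (forward-Euler) solver for a lumped RC thermal network.
    # T: node temperatures [K], C: node capacitances [J/K],
    # R[i][j]: thermal resistance between nodes i and j [K/W] (np.inf = no link),
    # Q: heat input per node [W]. All values are illustrative, not from BETA.

    def step(T, C, R, Q, dt):
        n = len(T)
        dT = np.zeros(n)
        for i in range(n):
            flow = sum((T[j] - T[i]) / R[i][j] for j in range(n) if np.isfinite(R[i][j]))
            dT[i] = dt * (flow + Q[i]) / C[i]
        return T + dT

    T = np.array([300.0, 300.0, 300.0])          # initial temperatures
    C = np.array([50.0, 80.0, 120.0])            # capacitances
    R = np.array([[np.inf, 2.0, np.inf],
                  [2.0, np.inf, 3.0],
                  [np.inf, 3.0, np.inf]])        # resistance network
    Q = np.array([10.0, 0.0, -5.0])              # sources/sinks

    for _ in range(1000):                        # march in time (dt chosen small for stability)
        T = step(T, C, R, Q, dt=0.1)
    print(T)

Setting the time derivative to zero and solving the resulting linear system instead would give the steady-state counterpart of the same network.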
Sreenivas, K; Sekhar, N Seshadri; Saxena, Manoj; Paliwal, R; Pathak, S; Porwal, M C; Fyzee, M A; Rao, S V C Kameswara; Wadodkar, M; Anasuya, T; Murthy, M S R; Ravisankar, T; Dadhwal, V K
2015-09-15
The present study analyzes spatial and temporal variability in agricultural land cover during 2005-06 and 2011-12 from an ongoing program of annual land use mapping using multidate Advanced Wide Field Sensor (AWiFS) data aboard Resourcesat-1 and 2. About 640-690 multi-temporal AWiFS quadrant data products per year (depending on cloud cover) were co-registered and radiometrically normalized to prepare state (administrative unit) mosaics. An 18-fold classification was adopted in this project. Rule-based techniques along with a maximum-likelihood algorithm were employed to derive land cover information as well as changes within agricultural land cover classes. The agricultural land cover classes include kharif (June-October), rabi (November-April), zaid (April-June), area sown more than once, fallow lands, and plantation crops. Mean kappa accuracy of these estimates varied from 0.87 to 0.96 for the various classes. The standard error of estimate was computed for each class annually, and the area estimates were corrected using it. The corrected estimates range between 99 and 116 Mha for kharif and 77-91 Mha for rabi. The kharif, rabi, and net sown area were aggregated at a 10 km × 10 km grid on an annual basis for all of India, and the coefficient of variation (CV) was computed at each grid cell using the temporal spatially-aggregated area as input. This spatial variability of agricultural land cover classes was analyzed across meteorological zones, irrigated command areas, and administrative boundaries. The results indicate that, of the various states/meteorological zones, Punjab was consistently cropped during both the kharif and rabi seasons. Of all irrigated commands, the Tawa irrigated command was consistently cropped during the rabi season.
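As an illustration of the per-cell variability analysis described above, a minimal sketch of the coefficient of variation computed per grid cell from a stack of annual cropped-area grids might look like this; the array shape and values are invented for illustration and do not reproduce the AWiFS estimates.

    import numpy as np

    # annual_area: array of shape (n_years, n_rows, n_cols) holding the spatially
    # aggregated cropped area per 10 km x 10 km grid cell (values are illustrative).
    rng = np.random.default_rng(0)
    annual_area = rng.uniform(20.0, 100.0, size=(7, 4, 5))

    mean_area = annual_area.mean(axis=0)
    std_area = annual_area.std(axis=0, ddof=1)
    cv = np.where(mean_area > 0, std_area / mean_area, np.nan)  # coefficient of variation per cell
    print(cv.round(3))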
Hydrodynamic modeling of urban flooding taking into account detailed data about city infrastructure
NASA Astrophysics Data System (ADS)
Belikov, Vitaly; Norin, Sergey; Aleksyuk, Andrey; Krylenko, Inna; Borisova, Natalya; Rumyantsev, Alexey
2017-04-01
Flood waves moving across urban areas have specific features. Linear infrastructure objects (such as embankments, roads, and dams) can change the direction of flow or block the water movement. In contrast, paved avenues and wide streets in cities contribute to the concentration of flood waters. Buildings create an additional resistance to the movement of water, which depends on the urban density and the type of construction; this effect cannot be completely described by Manning's resistance law. In addition, the part of the surface occupied by buildings is excluded from the flooded area, which results in a substantial increase of the flooding depth relative to undeveloped areas, especially under unsteady flow conditions. An approach to numerical simulation of urban-area flooding is proposed that consists in directly representing all buildings and structures on the computational grid. This can be done almost fully automatically with modern software, and the real geometry of all infrastructure objects can be taken into account on the basis of highly detailed digital maps and satellite images. The calculations are based on the two-dimensional Saint-Venant equations on irregular adaptive computational meshes, which can contain millions of cells and take into account tens of thousands of buildings and other infrastructure objects. Flood maps obtained from the modeling are the basis for damage and risk assessment for urban areas. The main advantages of the developed method are high-precision calculations, realistic modeling results, and appropriate graphical display of the flood dynamics and dam-break wave propagation in urban areas. The method has been verified against experimental data and simulations of real events, including the catastrophic flooding of the city of Krymsk in 2012.
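For reference, the two-dimensional Saint-Venant (shallow water) equations underlying such simulations are commonly written in conservative form as below; the authors' exact formulation of source terms and friction law may differ.

\[
\begin{aligned}
&\frac{\partial h}{\partial t} + \frac{\partial (hu)}{\partial x} + \frac{\partial (hv)}{\partial y} = 0,\\
&\frac{\partial (hu)}{\partial t} + \frac{\partial}{\partial x}\left(hu^{2} + \tfrac{1}{2}gh^{2}\right) + \frac{\partial (huv)}{\partial y} = gh\,(S_{0x} - S_{fx}),\\
&\frac{\partial (hv)}{\partial t} + \frac{\partial (huv)}{\partial x} + \frac{\partial}{\partial y}\left(hv^{2} + \tfrac{1}{2}gh^{2}\right) = gh\,(S_{0y} - S_{fy}),
\end{aligned}
\]

where h is the water depth, (u, v) are the depth-averaged velocity components, g is gravitational acceleration, and S0 and Sf denote the bed and friction slopes in each direction.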
2008-02-01
Final Environmental Assessment, February 2008: Wide Area Coverage Construct Land Mobile Network Communications Infrastructure, Malmstrom Air Force Base, Montana, with Finding of No Significant Impact.
Proteinortho: Detection of (Co-)orthologs in large-scale analysis
2011-01-01
Background: Orthology analysis is an important part of data analysis in many areas of bioinformatics, such as comparative genomics and molecular phylogenetics. The ever-increasing flood of sequence data, and hence the rapidly increasing number of genomes that can be compared simultaneously, calls for efficient software tools, as brute-force approaches with quadratic memory requirements become infeasible in practice. The rapid pace at which new data become available, furthermore, makes it desirable to compute genome-wide orthology relations for a given dataset rather than relying on relations listed in databases. Results: The program Proteinortho described here is a stand-alone tool that is geared towards large datasets and makes use of distributed computing techniques when run on multi-core hardware. It implements an extended version of the reciprocal best alignment heuristic. We apply Proteinortho to compute orthologous proteins in the complete set of all 717 eubacterial genomes available at NCBI at the beginning of 2009. We identified thirty proteins present in 99% of all bacterial proteomes. Conclusions: Proteinortho significantly reduces the amount of memory required for orthology analysis compared to existing tools, allowing such computations to be performed on off-the-shelf hardware. PMID:21526987
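The reciprocal best alignment idea at the core of this heuristic can be illustrated with a minimal sketch: two proteins from different genomes are paired when each is the other's best-scoring hit. The data structures and scores below are invented for illustration and do not reproduce Proteinortho's actual implementation.

    # hits_ab maps each protein of genome A to its best hit in genome B (with a score);
    # hits_ba is the reverse direction. Data shown are hypothetical.
    hits_ab = {"A1": ("B7", 520.0), "A2": ("B3", 310.0), "A3": ("B9", 150.0)}
    hits_ba = {"B7": ("A1", 515.0), "B3": ("A5", 290.0), "B9": ("A3", 149.0)}

    def reciprocal_best_hits(hits_ab, hits_ba):
        """Return pairs (a, b) where a's best hit is b and b's best hit is a."""
        return [(a, b) for a, (b, _) in hits_ab.items()
                if b in hits_ba and hits_ba[b][0] == a]

    print(reciprocal_best_hits(hits_ab, hits_ba))  # [('A1', 'B7'), ('A3', 'B9')]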
A Survey of Computational Intelligence Techniques in Protein Function Prediction
Tiwari, Arvind Kumar; Srivastava, Rajeev
2014-01-01
In recent years there has been massive growth in data on proteins of unknown function with the advancement of high-throughput microarray technologies. Protein function prediction is among the most challenging problems in bioinformatics. In the past, homology-based approaches were used to predict protein function, but they fail when a new protein is dissimilar to previously characterized ones. Therefore, to alleviate the problems associated with traditional homology-based approaches, numerous computational intelligence techniques have been proposed in the recent past. This paper presents a state-of-the-art comprehensive review of various computational intelligence techniques for protein function prediction using sequence, structure, protein-protein interaction network, and gene expression data, applied in areas such as prediction of DNA and RNA binding sites, subcellular localization, enzyme functions, signal peptides, catalytic residues, nuclear/G-protein coupled receptors, membrane proteins, and pathway analysis from gene expression datasets. This paper also summarizes the results obtained by many researchers who have addressed these problems using computational intelligence techniques with appropriate datasets to improve prediction performance. The summary shows that ensemble classifiers and integration of multiple heterogeneous data are useful for protein function prediction. PMID:25574395
Using Computational and Mechanical Models to Study Animal Locomotion
Miller, Laura A.; Goldman, Daniel I.; Hedrick, Tyson L.; Tytell, Eric D.; Wang, Z. Jane; Yen, Jeannette; Alben, Silas
2012-01-01
Recent advances in computational methods have made realistic large-scale simulations of animal locomotion possible. This has resulted in numerous mathematical and computational studies of animal movement through fluids and over substrates with the purpose of better understanding organisms’ performance and improving the design of vehicles moving through air and water and on land. This work has also motivated the development of improved numerical methods and modeling techniques for animal locomotion that is characterized by the interactions of fluids, substrates, and structures. Despite the large body of recent work in this area, the application of mathematical and numerical methods to improve our understanding of organisms in the context of their environment and physiology has remained relatively unexplored. Nature has evolved a wide variety of fascinating mechanisms of locomotion that exploit the properties of complex materials and fluids, but only recently are the mathematical, computational, and robotic tools available to rigorously compare the relative advantages and disadvantages of different methods of locomotion in variable environments. Similarly, advances in computational physiology have only recently allowed investigators to explore how changes at the molecular, cellular, and tissue levels might lead to changes in performance at the organismal level. In this article, we highlight recent examples of how computational, mathematical, and experimental tools can be combined to ultimately answer the questions posed in one of the grand challenges in organismal biology: “Integrating living and physical systems.” PMID:22988026
Pothole Detection System Using a Black-box Camera.
Jo, Youngtae; Ryu, Seungki
2015-11-19
Aging roads and poor road-maintenance systems result in a large number of potholes, and their number increases over time. Potholes jeopardize road safety and transportation efficiency. Moreover, they are often a contributing factor to car accidents. To address the problems associated with potholes, their locations and sizes must be determined quickly. Sophisticated road-maintenance strategies can be developed using a pothole database, which requires a specific pothole-detection system that can collect pothole information at low cost and over a wide area. However, pothole repair has long relied on manual detection efforts. Recent automatic detection systems, such as those based on vibrations or laser scanning, are insufficient to detect potholes correctly and inexpensively owing to the unstable detection of vibration-based methods and the high costs of laser-scanning-based methods. Thus, in this paper, we introduce a new pothole-detection system using a commercial black-box camera. The proposed system detects potholes over a wide area and at low cost. We have developed a novel pothole-detection algorithm specifically designed to work with the embedded computing environments of black-box cameras. Experimental results with the proposed system show that potholes can be detected accurately in real time.
Ion conduction in crystalline superionic solids and its applications
NASA Astrophysics Data System (ADS)
Chandra, Angesh
2014-06-01
Superionic solids, an area of multidisciplinary research activity, involve the study of the physical, chemical, and technological aspects of rapid ion movement within the bulk of a special class of ionic materials. It is an emerging area of materials science, as these solids show tremendous technological scope for developing a wide variety of solid-state electrochemical devices such as batteries, fuel cells, supercapacitors, sensors, electrochromic displays (ECDs), and memories. These devices have a wide range of applicability, from power sources for IC microchips and transport vehicles to novel sensors for controlling atmospheric pollution, new kinds of computer memories, and smart windows/display panels. The field has grown at a rapid pace, especially with regard to designing new materials as well as exploring their device potential. Amongst the known superionic solids, fast Ag+ ion conducting crystalline solid electrolytes have attracted special attention due to their relatively high room-temperature conductivity and the ease of material handling/synthesis. Ion conduction in these electrolytes remains a topic of great current interest. In the present review article, the ion conduction phenomenon and some device applications of crystalline/polycrystalline superionic solid electrolytes are reviewed in brief. Synthesis and characterization tools are also discussed.
Extrinsic local regression on manifold-valued data
Lin, Lizhen; St Thomas, Brian; Zhu, Hongtu; Dunson, David B.
2017-01-01
We propose an extrinsic regression framework for modeling data with manifold-valued responses and Euclidean predictors. Regression with manifold responses has wide applications in shape analysis, neuroscience, medical imaging, and many other areas. Our approach embeds the manifold where the responses lie into a higher-dimensional Euclidean space, obtains a local regression estimate in that space, and then projects this estimate back onto the image of the manifold. Outside the regression setting, both intrinsic and extrinsic approaches have been proposed for modeling i.i.d. manifold-valued data. However, to our knowledge our work is the first to take an extrinsic approach to the regression problem. The proposed extrinsic regression framework is general, computationally efficient, and theoretically appealing. Asymptotic distributions and convergence rates of the extrinsic regression estimates are derived, and a large class of examples is considered, indicating the wide applicability of our approach. PMID:29225385
Modification of YAPE keypoint detection algorithm for wide local contrast range images
NASA Astrophysics Data System (ADS)
Lukoyanov, A.; Nikolaev, D.; Konovalenko, I.
2018-04-01
Keypoint detection is an important tool of image analysis, and among many contemporary keypoint detection algorithms YAPE is known for its computational performance, allowing its use in mobile and embedded systems. One of its shortcomings is high sensitivity to local contrast, which leads to high detection density in high-contrast areas while missing detections in low-contrast ones. In this work we study the contrast sensitivity of YAPE and propose a modification which compensates for this property on images with a wide local contrast range (Yet Another Contrast-Invariant Point Extractor, YACIPE). As a model example, we considered the traffic sign recognition problem, where some signs are well lighted, whereas others are in shadow and thus have low contrast. We show that the number of traffic signs for which no keypoints are detected is 40% lower for the proposed modification than for the original algorithm.
Enabling Wide-Scale Computer Science Education through Improved Automated Assessment Tools
NASA Astrophysics Data System (ADS)
Boe, Bryce A.
There is a proliferating demand for newly trained computer scientists as the number of computer science related jobs continues to increase. University programs will only be able to train enough new computer scientists to meet this demand when two things happen: when there are more primary and secondary school students interested in computer science, and when university departments have the resources to handle the resulting increase in enrollment. To meet these goals, significant effort is being made to both incorporate computational thinking into existing primary school education, and to support larger university computer science class sizes. We contribute to this effort through the creation and use of improved automated assessment tools. To enable wide-scale computer science education we do two things. First, we create a framework called Hairball to support the static analysis of Scratch programs targeted for fourth, fifth, and sixth grade students. Scratch is a popular building-block language utilized to pique interest in and teach the basics of computer science. We observe that Hairball allows for rapid curriculum alterations and thus contributes to wide-scale deployment of computer science curriculum. Second, we create a real-time feedback and assessment system utilized in university computer science classes to provide better feedback to students while reducing assessment time. Insights from our analysis of student submission data show that modifications to the system configuration support the way students learn and progress through course material, making it possible for instructors to tailor assignments to optimize learning in growing computer science classes.
An Application of the Difference Potentials Method to Solving External Problems in CFD
NASA Technical Reports Server (NTRS)
Ryaben'kii, Victor S.; Tsynkov, Semyon V.
1997-01-01
Numerical solution of infinite-domain boundary-value problems requires some special techniques that make the problem amenable to treatment on the computer. Indeed, the problem must be discretized in such a way that the computer operates with only a finite amount of information. Therefore, the original infinite-domain formulation must be altered and/or augmented so that, on one hand, the solution is not changed (or changed only slightly) and, on the other hand, the finite discrete formulation becomes available. One widely used approach to constructing such discretizations consists of truncating the unbounded original domain and then setting artificial boundary conditions (ABC's) at the newly formed external boundary. The role of the ABC's is to close the truncated problem and at the same time to ensure that the solution found inside the finite computational domain is maximally close to (in the ideal case, exactly the same as) the corresponding fragment of the original infinite-domain solution. Let us emphasize that the proper treatment of artificial boundaries may have a profound impact on the overall quality and performance of numerical algorithms. The latter statement is corroborated by numerous computational experiments and especially concerns the area of CFD, in which external problems present a wide class of practically important formulations. In this paper, we review some work that has been done over recent years on constructing highly accurate nonlocal ABC's for the calculation of compressible external flows. The approach is based on implementation of the generalized potentials and pseudodifferential boundary projection operators analogous to those proposed first by Calderon. The difference potentials method (DPM) by Ryaben'kii is used for the effective computation of the generalized potentials and projections. The resulting ABC's clearly outperform the existing methods from the standpoints of accuracy and robustness, in many cases noticeably speed up the multigrid convergence, and at the same time are quite comparable to other methods from the standpoints of geometric universality and simplicity of implementation.
NASA Astrophysics Data System (ADS)
Mahmoudabadi, H.; Briggs, G.
2016-12-01
Gridded data sets, such as geoid models or datum shift grids, are commonly used in coordinate transformation algorithms. Grid files typically contain known or measured values at regular fixed intervals. The process of computing a value at an unknown location from the values in the grid data set is called "interpolation". Generally, interpolation methods predict a value at a given point by computing a weighted average of the known values in the neighborhood of the point. Geostatistical kriging is a widely used interpolation method for irregular networks. Kriging interpolation first analyzes the spatial structure of the input data, then generates a general model to describe spatial dependencies. This model is used to calculate values at unsampled locations by finding the direction, shape, size, and weight of neighborhood points. Because it is based on a linear formulation for the best estimation, kriging is the optimal interpolation method in statistical terms. The kriging interpolation algorithm produces an unbiased prediction, as well as the ability to calculate the spatial distribution of uncertainty, allowing the errors in an interpolation to be estimated for any particular point. Kriging is not widely used in geospatial applications today, especially applications that run on low-power devices or deal with large data files, because of the computational power and memory requirements of standard kriging techniques. In this paper, improvements are introduced in a directional kriging implementation by taking advantage of the structure of the grid files. The regular spacing of points simplifies finding the neighborhood points and computing their pairwise distances, reducing the complexity and improving the execution time of the kriging algorithm. Also, the proposed method iteratively loads small portions of the area of interest in different directions to reduce the amount of required memory. This makes the technique feasible on almost any computer processor. Comparison between kriging and other standard interpolation methods demonstrated more accurate estimates on less dense data files.
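A minimal sketch of ordinary kriging at a single point from a small neighborhood of grid nodes is shown below. The exponential covariance model, its parameters, and the node values are all illustrative assumptions; the paper's directional, grid-optimized variant is not reproduced here.

    import numpy as np

    def cov(h, sill=1.0, corr_range=3.0):
        return sill * np.exp(-h / corr_range)      # exponential covariance model (assumed)

    def ordinary_krige(xy, z, x0):
        """Ordinary kriging estimate at x0 from points xy with values z."""
        n = len(z)
        d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)   # pairwise distances
        A = np.ones((n + 1, n + 1))
        A[:n, :n] = cov(d)
        A[n, n] = 0.0                              # Lagrange-multiplier row/column
        b = np.ones(n + 1)
        b[:n] = cov(np.linalg.norm(xy - x0, axis=1))
        w = np.linalg.solve(A, b)                  # kriging weights (sum to 1)
        return w[:n] @ z

    # four neighboring grid nodes (regular unit spacing) and their values (hypothetical)
    xy = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
    z = np.array([2.0, 2.5, 1.8, 2.2])
    print(ordinary_krige(xy, z, np.array([0.4, 0.6])))

On a regular grid, as the paper notes, the neighborhood search and the pairwise-distance matrix above can be precomputed or derived analytically, which is where most of the speedup comes from.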
A new computer approach to mixed feature classification for forestry application
NASA Technical Reports Server (NTRS)
Kan, E. P.
1976-01-01
A computer approach for mapping mixed forest features (i.e., types, classes) from computer classification maps is discussed. Mixed features such as mixed softwood/hardwood stands are treated as admixtures of softwood and hardwood areas. Large-area mixed features are identified and small-area features neglected when the nominal size of a mixed feature can be specified. The computer program merges small isolated areas into surrounding areas by the iterative manipulation of the postprocessing algorithm that eliminates small connected sets. For a forestry application, computer-classified LANDSAT multispectral scanner data of the Sam Houston National Forest were used to demonstrate the proposed approach. The technique was successful in cleaning the salt-and-pepper appearance of multiclass classification maps and in mapping admixtures of softwood areas and hardwood areas. However, the computer-mapped mixed areas matched very poorly with the ground truth because of inadequate resolution and inappropriate definition of mixed features.
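A modern sketch of the "eliminate small connected sets" postprocessing idea, not the original 1976 program, could use connected-component labeling and merge undersized patches into their most common neighboring class, as below; the demo array and the size threshold are hypothetical.

    import numpy as np
    from scipy import ndimage

    def merge_small_regions(class_map, min_size):
        """Reassign connected regions smaller than min_size pixels to the most
        common class found on their border (illustrative sketch only)."""
        out = class_map.copy()
        for cls in np.unique(class_map):
            labeled, n = ndimage.label(out == cls)
            for lab in range(1, n + 1):
                region = labeled == lab
                if region.sum() < min_size:
                    ring = ndimage.binary_dilation(region) & ~region
                    if ring.any():
                        vals, counts = np.unique(out[ring], return_counts=True)
                        out[region] = vals[np.argmax(counts)]
        return out

    demo = np.array([[1, 1, 1, 2],
                     [1, 3, 1, 2],
                     [1, 1, 2, 2]])
    print(merge_small_regions(demo, min_size=2))   # the single class-3 pixel is absorbed into class 1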
NASA Technical Reports Server (NTRS)
Howard, J. A.
1974-01-01
The United Nations initially contracted with NASA to carry out investigations in three countries; but now, as the result of rapidly increasing interest, ERTS imagery has been, and is being, used in 7 additional projects related to agriculture, forestry, land use, soils, landforms, and hydrology. Initially the ERTS frames were simply used to provide a synoptic view of a large area of a developing country as a basis for regional surveys. From this, interest has extended to using reconstituted false color imagery and, latterly, in co-operation with Purdue University, computer-generated false color mosaics and computer-generated large scale maps. As many developing countries are inadequately mapped and frequently rely on outdated maps, ERTS imagery is considered to provide a very wide spectrum of valuable data. Thematic maps can be readily prepared at a scale of 1:250,000 using standard NASA imagery. These provide coverage of areas not previously mapped, supply supplementary information, and enable existing maps to be updated. There is also increasing evidence that ERTS imagery is useful for temporal studies and for providing a new dimension in integrated surveys.
Clinical application of three-dimensional printing technology in craniofacial plastic surgery.
Choi, Jong Woo; Kim, Namkug
2015-05-01
Three-dimensional (3D) printing has been particularly widely adopted in medical fields. Application of the 3D printing technique has even been extended to bio-cell printing for 3D tissue/organ development, the creation of scaffolds for tissue engineering, and actual clinical application for various medical parts. Of the various medical fields, craniofacial plastic surgery is one of the areas that pioneered the use of the 3D printing concept. Rapid prototype technology was introduced to medicine in the 1990s via computer-aided design and computer-aided manufacturing. To investigate the current status of 3D printing technology and its clinical application, a systematic review of the literature was conducted. In addition, the benefits and possibilities of the clinical application of 3D printing in craniofacial surgery are reviewed, based on personal experience with more than 500 craniofacial cases conducted using 3D-printed tactile prototype models.
Computational Research on Mobile Pastoralism Using Agent-Based Modeling and Satellite Imagery.
Sakamoto, Takuto
2016-01-01
Dryland pastoralism has long attracted considerable attention from researchers in diverse fields. However, rigorous formal study is made difficult by the high level of mobility of pastoralists as well as by the sizable spatio-temporal variability of their environment. This article presents a new computational approach for studying mobile pastoralism that overcomes these issues. Combining multi-temporal satellite images and agent-based modeling allows a comprehensive examination of pastoral resource access over a realistic dryland landscape with unpredictable ecological dynamics. The article demonstrates the analytical potential of this approach through its application to mobile pastoralism in northeast Nigeria. Employing more than 100 satellite images of the area, extensive simulations are conducted under a wide array of circumstances, including different land-use constraints. The simulation results reveal complex dependencies of pastoral resource access on these circumstances along with persistent patterns of seasonal land use observed at the macro level.
FPGA implementation of self organizing map with digital phase locked loops.
Hikawa, Hiroomi
2005-01-01
The self-organizing map (SOM) has found applicability in a wide range of application areas. Recently, new SOM hardware with phase-modulated pulse signals and digital phase-locked loops (DPLLs) has been proposed (Hikawa, 2005). The system uses the DPLL as a computing element, since the operation of the DPLL is very similar to that of the SOM's computation. The system also uses square-waveform phase to hold the value of each input vector element. This paper discusses the hardware implementation of the DPLL SOM architecture. For effective hardware implementation, some components are redesigned to reduce the circuit size. The proposed SOM architecture is described in VHDL and implemented on a field programmable gate array (FPGA). Its feasibility is verified by experiments. Results show that the proposed SOM implemented on the FPGA has good quantization capability, and its circuit size is very small.
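For context, the conventional SOM computation that such hardware emulates consists of a winner search followed by a neighborhood-weighted weight update; in standard notation (the DPLL realization encodes these quantities as pulse phases rather than as explicit vectors):

\[
c = \arg\min_{i}\,\lVert \mathbf{x}(t) - \mathbf{w}_{i}(t) \rVert,
\qquad
\mathbf{w}_{i}(t+1) = \mathbf{w}_{i}(t) + \alpha(t)\, h_{ci}(t)\,\bigl[\mathbf{x}(t) - \mathbf{w}_{i}(t)\bigr],
\]

where x(t) is the input vector, w_i the weight vector of unit i, α(t) the learning rate, and h_{ci}(t) the neighborhood function centered on the winning unit c.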
Optical aurora detectors: using natural optics to motivate education and outreach
NASA Astrophysics Data System (ADS)
Shaw, Joseph A.; Way, Jesse M.; Pust, Nathan J.; Nugent, Paul W.; Coate, Hans; Balster, Daniel
2009-06-01
Natural optical phenomena enjoy a level of interest sufficiently high among a wide array of people to provide ideal education and outreach opportunities. The aurora promotes particularly high interest, perhaps because of its relative rarity in the areas of the world where most people live. A project is being conducted at Montana State University to use common interest and curiosity about auroras to motivate learning and outreach through the design and deployment of optical sensor systems that detect the presence of an auroral display and send cell phone messages to alert interested people. Project participants learn about the physics and optics of the aurora, basic principles of optical system design, radiometric calculations and calibrations, electro-optical detectors, electronics, embedded computer systems, and computer software. The project is moving into a stage where it will provide greatly expanded outreach and education opportunities as optical aurora detector kits are created and distributed to colleges around our region.
Organization of the Drosophila larval visual circuit
Gendre, Nanae; Neagu-Maier, G Larisa; Fetter, Richard D; Schneider-Mizell, Casey M; Truman, James W; Zlatic, Marta; Cardona, Albert
2017-01-01
Visual systems transduce, process and transmit light-dependent environmental cues. Computation of visual features depends on photoreceptor neuron types (PR) present, organization of the eye and wiring of the underlying neural circuit. Here, we describe the circuit architecture of the visual system of Drosophila larvae by mapping the synaptic wiring diagram and neurotransmitters. By contacting different targets, the two larval PR-subtypes create two converging pathways potentially underlying the computation of ambient light intensity and temporal light changes already within this first visual processing center. Locally processed visual information then signals via dedicated projection interneurons to higher brain areas including the lateral horn and mushroom body. The stratified structure of the larval optic neuropil (LON) suggests common organizational principles with the adult fly and vertebrate visual systems. The complete synaptic wiring diagram of the LON paves the way to understanding how circuits with reduced numerical complexity control wide ranges of behaviors.
Modeling Methodologies for Design and Control of Solid Oxide Fuel Cell APUs
NASA Astrophysics Data System (ADS)
Pianese, C.; Sorrentino, M.
2009-08-01
Among the existing fuel cell technologies, Solid Oxide Fuel Cells (SOFC) are particularly suitable for both stationary and mobile applications, due to their high energy conversion efficiencies, modularity, high fuel flexibility, low emissions, and low noise. Moreover, the high working temperatures enable their use for efficient cogeneration applications. SOFCs are entering a pre-industrial era, and interest in design tools has grown strongly in recent years. Optimal system configuration, component sizing, and control and diagnostic system design require computational tools that meet the conflicting needs of accuracy, affordable computational time, limited experimental effort, and flexibility. The paper gives an overview of control-oriented modeling of SOFCs at both the single-cell and stack level. Such an approach provides useful simulation tools for designing and controlling SOFC APUs intended for a wide application area, ranging from automotive to marine and airplane APUs.
NASA Astrophysics Data System (ADS)
Coletti, Cecilia; Corinti, Davide; Paciotti, Roberto; Re, Nazzareno; Crestoni, Maria Elisa; Fornarini, Simonetta
2017-11-01
The investigation of the molecular structure and dynamics of ions in the gas phase is a topic of increasing interest, due to the role such species play in many areas of chemistry and physics, not to mention that they often represent elusive intermediates in more complex reaction mechanisms. Infrared Multiple Photon Dissociation (IRMPD) spectroscopy is today one of the most advanced techniques for this purpose, because of its high sensitivity to even small structural changes. The interpretation of IRMPD spectra relies strongly on high-level quantum mechanical computations, so that a close interplay is needed for a detailed understanding of the structural and kinetic properties that can be gathered from the many applications of this powerful technique. Recent advances in experiment and theory in this field are illustrated here, with emphasis on recent progress in elucidating the mechanism of action of cisplatin, one of the most widely used anticancer drugs.
A Python-based interface to examine motions in time series of solar images
NASA Astrophysics Data System (ADS)
Campos-Rozo, J. I.; Vargas Domínguez, S.
2017-10-01
Python is considered to be a mature programming language, besides being widely accepted as an engaging option for scientific analysis in multiple areas, as will be presented in this work for the particular case of solar physics research. SunPy is an open-source library based on Python that has been recently developed to furnish software tools for solar data analysis and visualization. In this work we present a graphical user interface (GUI) based on Python and Qt to effectively compute proper motions for the analysis of time series of solar data. This user-friendly computing interface, which is intended to be incorporated into the SunPy library, uses a local correlation tracking technique and some extra tools that allow the selection of different parameters to calculate, visualize, and analyze vector velocity fields of solar data, i.e., time series of solar filtergrams and magnetograms.
Management of CAD/CAM information: Key to improved manufacturing productivity
NASA Technical Reports Server (NTRS)
Fulton, R. E.; Brainin, J.
1984-01-01
A key element of improved industry productivity is effective management of CAD/CAM information. To stimulate advancements in this area, a joint NASA/Navy/Industry project designated Integrated Programs for Aerospace-Vehicle Design (IPAD) is underway with the goal of raising aerospace industry productivity through advancement of technology to integrate and manage information involved in the design and manufacturing process. The project complements traditional NASA/DOD research to develop aerospace design technology and the Air Force's Integrated Computer-Aided Manufacturing (ICAM) program to advance CAM technology. IPAD research is guided by an Industry Technical Advisory Board (ITAB) composed of over 100 representatives from aerospace and computer companies. The IPAD accomplishments to date in development of requirements and prototype software for various levels of company-wide CAD/CAM data management are summarized, and plans for development of technology for management of distributed CAD/CAM data and information required to control future knowledge-based CAD/CAM systems are discussed.
Hemodynamics of a Patient-Specific Aneurysm Model with Proper Orthogonal Decomposition
NASA Astrophysics Data System (ADS)
Han, Suyue; Chang, Gary Han; Modarres-Sadeghi, Yahya
2017-11-01
Wall shear stress (WSS) and oscillatory shear index (OSI) are two of the most-widely studied hemodynamic quantities in cardiovascular systems that have been shown to have the ability to elicit biological responses of the arterial wall, which could be used to predict the aneurysm development and rupture. In this study, a reduced-order model (ROM) of the hemodynamics of a patient-specific cerebral aneurysm is studied. The snapshot Proper Orthogonal Decomposition (POD) is utilized to construct the reduced-order bases of the flow using a CFD training set with known inflow parameters. It was shown that the area of low WSS and high OSI is correlated to higher POD modes. The resulting ROM can reproduce both WSS and OSI computationally for future parametric studies with significantly less computational cost. Agreement was observed between the WSS and OSI values obtained using direct CFD results and ROM results.
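The snapshot POD step can be sketched as a singular value decomposition of the mean-subtracted snapshot matrix; this generic outline uses synthetic data in place of the patient-specific CFD fields, and the 99% energy threshold is an illustrative choice, not the study's.

    import numpy as np

    # X: snapshot matrix, one column per CFD snapshot (rows = spatial degrees of freedom).
    # Synthetic data stand in for the hemodynamic fields of the training set.
    rng = np.random.default_rng(1)
    X = rng.standard_normal((2000, 40))

    X_mean = X.mean(axis=1, keepdims=True)
    U, s, Vt = np.linalg.svd(X - X_mean, full_matrices=False)

    energy = s**2 / np.sum(s**2)                            # variance captured per mode
    r = int(np.searchsorted(np.cumsum(energy), 0.99)) + 1   # modes needed for 99% energy
    modes = U[:, :r]                                        # reduced-order basis
    coeffs = modes.T @ (X - X_mean)                         # modal coefficients
    X_approx = X_mean + modes @ coeffs                      # rank-r reconstruction
    print(r, np.linalg.norm(X - X_approx) / np.linalg.norm(X))

New inflow parameters are then handled by interpolating the modal coefficients rather than re-running the full CFD, which is where the computational savings arise.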
Herbei, Radu; Kubatko, Laura
2013-03-26
Markov chains are widely used for modeling in many areas of molecular biology and genetics. As the complexity of such models advances, it becomes increasingly important to assess the rate at which a Markov chain converges to its stationary distribution in order to carry out accurate inference. A common measure of convergence to the stationary distribution is the total variation distance, but this measure can be difficult to compute when the state space of the chain is large. We propose a Monte Carlo method to estimate the total variation distance that can be applied in this situation, and we demonstrate how the method can be efficiently implemented by taking advantage of GPU computing techniques. We apply the method to two Markov chains on the space of phylogenetic trees, and discuss the implications of our findings for the development of algorithms for phylogenetic inference.
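One simple Monte Carlo estimator of the total variation distance, usable when both probability mass functions can be evaluated pointwise but the state space is too large to enumerate, draws states from one distribution and averages |1 - q(x)/p(x)|/2. The sketch below is a generic illustration of that identity on a toy example, not the authors' GPU implementation or their phylogenetic chains.

    import numpy as np
    from scipy.stats import binom

    def tv_monte_carlo(sample_p, pmf_p, pmf_q, n=100_000, seed=0):
        """Estimate TV(p, q) = 0.5 * E_{x~p} |1 - q(x)/p(x)| by sampling from p."""
        rng = np.random.default_rng(seed)
        x = sample_p(rng, n)
        return 0.5 * np.mean(np.abs(1.0 - pmf_q(x) / pmf_p(x)))

    # toy example: two binomial distributions over {0, ..., 50} (hypothetical stand-ins)
    p = binom(50, 0.40)
    q = binom(50, 0.45)
    est = tv_monte_carlo(lambda rng, n: p.rvs(size=n, random_state=rng), p.pmf, q.pmf)
    exact = 0.5 * np.abs(p.pmf(np.arange(51)) - q.pmf(np.arange(51))).sum()
    print(est, exact)

Because each sampled state is processed independently, the averaging step parallelizes naturally, which is what makes GPU acceleration attractive for this kind of estimator.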
NASA Technical Reports Server (NTRS)
Seevers, P. M. (Principal Investigator); Drew, J. V.
1976-01-01
The author has identified the following significant results. Evaluation of ERTS-1 imagery for the Sand Hills region of Nebraska has shown that the data can be used to effectively measure several parameters of inventory needs. (1) Vegetative biomass can be estimated with a high degree of confidence using computer compatible tape data. (2) Soils can be mapped to the subgroup level with high altitude aircraft color infrared photography and to the association level with multitemporal ERTS-1 imagery. (3) Water quality in Sand Hills lakes can be estimated utilizing computer compatible tape data. (4) Center pivot irrigation can be inventoried from satellite data and can be monitored regarding site selection and relative success of establishment from high altitude aircraft color infrared photography. (5) ERTS-1 data is of exceptional value in wide-area inventory of natural resource data in the Sand Hills region of Nebraska.
The evolution of computer monitoring of real time data during the Atlas Centaur launch countdown
NASA Technical Reports Server (NTRS)
Thomas, W. F.
1981-01-01
In the last decade, improvements in computer technology have provided new 'tools' for controlling and monitoring critical missile systems. In this connection, computers have gradually taken a larger role in monitoring all flight and ground systems on the Atlas Centaur. The wide-body Centaur, which will be launched in the Space Shuttle Cargo Bay, will use computers to an even greater extent. It is planned to use the wide-body Centaur to boost the Galileo spacecraft toward Jupiter in 1985. The critical systems which must be monitored prior to liftoff are examined. Computers have now been programmed to monitor all critical parameters continuously. At this time, there are two separate computer systems used to monitor these parameters.
Testing Mercury Porosimetry with 3D Printed Porosity Models
NASA Astrophysics Data System (ADS)
Hasiuk, F.; Ewing, R. P.; Hu, Q.
2014-12-01
Mercury intrusion porosimetry is one of the most widely used techniques to study the porous nature of geological and man-made materials. In the geosciences, it is commonly used to describe petroleum reservoir and seal rocks as well as to grade aggregates for the design of asphalt and portland cement concretes. Its wide utility stems from its ability to characterize a wide range of pore throat sizes (from nanometers to around a millimeter). The fundamental physical model underlying mercury intrusion porosimetry, the Washburn equation, is based on the assumption that rock porosity can be described as a bundle of cylindrical tubes. 3D printing technology, also known as rapid prototyping, allows the construction of intricate and accurate models, exactly what is required to build models of rock porosity. We evaluate the applicability of the Washburn equation by comparing properties (like porosity, pore and pore throat size distribution, and surface area) computed on digital porosity models (built from CT data, CAD designs, or periodic geometries) to properties measured via mercury intrusion porosimetry on 3D-printed versions of the same digital porosity models.
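The Washburn relation referred to above links the applied intrusion pressure P to the largest pore-throat diameter D that mercury can enter:

\[
P = -\frac{4\,\gamma\,\cos\theta}{D}
\qquad\Longleftrightarrow\qquad
D = -\frac{4\,\gamma\,\cos\theta}{P},
\]

where γ is the mercury surface tension and θ the mercury-solid contact angle; commonly quoted values of roughly γ ≈ 0.485 N/m and θ ≈ 130°-140° (so cos θ < 0 and D > 0) are conventions, not measurements from this study.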
Boros, L G; Lepow, C; Ruland, F; Starbuck, V; Jones, S; Flancbaum, L; Townsend, M C
1992-07-01
A powerful method of processing MEDLINE and CINAHL source data uploaded to the IBM 3090 mainframe computer through an IBM/PC is described. Data are first downloaded from the CD-ROM PC devices to floppy disks. These disks are then uploaded to the mainframe computer through an IBM/PC equipped with the WordPerfect text editor and a computer network connection (SONNGATE). Before downloading, keywords specifying the information to be accessed are typed at the FIND prompt of the CD-ROM station. The resulting abstracts are downloaded into a file called DOWNLOAD.DOC. The floppy disks containing the information are simply carried to an IBM/PC which has a terminal emulation (TELNET) connection to the university-wide computer network (SONNET) at the Ohio State University Academic Computing Services (OSU ACS). WordPerfect (5.1) processes and saves the text in DOS format. Using the File Transfer Protocol (FTP, 130,000 bytes/s) of SONNET, the entire text containing the information obtained through the MEDLINE and CINAHL search is transferred to the remote mainframe computer for further processing. At this point, abstracts in the specified area are ready for immediate access and multiple retrieval by any PC having a network switch or dial-in connection after the USER ID, PASSWORD, and ACCOUNT NUMBER are specified by the user. The system provides the user an on-line, very powerful and quick method of searching for words specifying diseases, agents, experimental methods, animals, authors, and journals in the research area downloaded. The user can also copy the TItles, AUthors, and SOurce, with optional parts of abstracts, into papers being edited. This arrangement serves the special demands of a research laboratory by handling MEDLINE and CINAHL source data resulting after a search is performed with keywords specified for ongoing projects. Since the Ohio State University has a centrally funded mainframe system, the data upload, storage, and mainframe operations are free.
47 CFR 54.518 - Support for wide area networks.
Code of Federal Regulations, 2011 CFR
2011-10-01
To the extent that schools, libraries or consortia that include an eligible school or library build or purchase a wide area network to provide telecommunications services, the cost of such...
A coarse-to-fine kernel matching approach for mean-shift based visual tracking
NASA Astrophysics Data System (ADS)
Liangfu, L.; Zuren, F.; Weidong, C.; Ming, J.
2009-03-01
Mean shift is an efficient pattern-matching algorithm. It is widely used in visual tracking because it does not need to perform an exhaustive search of the image space. It employs a gradient optimization method to reduce the time of feature matching and achieve rapid object localization, and it uses the Bhattacharyya coefficient as the similarity measure between the object template and the candidate template. This paper presents a mean shift algorithm based on a coarse-to-fine search for the best kernel matching, addressing object tracking with large inter-frame motion. If the object's positions in two consecutive frames are far apart and do not overlap in image space, the traditional mean shift method can only reach a local optimum by iterating within the old object window, so the true object position cannot be found and tracking fails. The proposed algorithm first uses a similarity measure function to obtain a rough location of the moving object, and then uses mean shift iterations to refine it to an accurate local optimum, thereby realizing object tracking under large motion. Experimental results show good performance in accuracy and speed when compared with the background-weighted histogram algorithm in the literature.
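The similarity measure mentioned above is the Bhattacharyya coefficient between the m-bin histogram p of the candidate region and the histogram q of the target model, together with the distance customarily minimized by mean shift trackers:

\[
\rho(\mathbf{p},\mathbf{q}) = \sum_{u=1}^{m} \sqrt{p_{u}\,q_{u}},
\qquad
d(\mathbf{p},\mathbf{q}) = \sqrt{1 - \rho(\mathbf{p},\mathbf{q})}.
\]

Larger ρ (smaller d) indicates a better kernel match, so the coarse search can rank candidate windows by ρ before the fine mean shift iterations refine the best one.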
NASA Astrophysics Data System (ADS)
Ramamurthy, M. K.
2006-05-01
We live in an era of unprecedented data volumes, multidisciplinary analysis and synthesis, and active, learner-centered education emphasis. For instance, a new generation of satellite instruments is being designed for the GOES-R and NPOESS programs to deliver terabytes of data each day. Similarly, high-resolution, coupled models run over a wide range of temporal scales are generating data at unprecedented rates. Complex environmental problems such as the El Nino/Southern Oscillation, climate change, and the water cycle transcend not only disciplinary but also geographic boundaries, with their impacts and implications touching every region and community of the world. The understanding and solution of these inherently global scientific and social problems require integrated observations that cover all areas of the globe, international sharing and flow of data, and earth system science approaches. Contemporary education strategies recommend adopting an Earth system science approach for teaching the geosciences, employing new pedagogical techniques such as enquiry-based learning and hands-on activities. Needless to add, today's education and research enterprise depends heavily on easy-to-use, robust, flexible and scalable cyberinfrastructure, especially on the ready availability of quality data and appropriate tools to manipulate and integrate those data. Fortunately, rapid advances in computing, communication and information technologies have provided solutions that are being applied to advance teaching, research, and service. The exponential growth in the use of the Internet in education and research, largely due to the advent of the World Wide Web, is well documented. On the other hand, how other technological and community trends have shaped the development and application of cyberinfrastructure, especially in the data services area, is less well understood. For example, the computing industry is converging on an approach called Web services that enables a standard and yet revolutionary way of building applications and methods to connect and exchange information over the Web. This new approach, based on XML - a widely accepted format for exchanging data and corresponding semantics over the Internet - enables applications, computer systems, and information processes to work together in fundamentally different ways. Likewise, the advent of digital libraries, grid computing platforms, interoperable frameworks, standards and protocols, open-source software, and community atmospheric models has been an important driver in shaping the use of a new generation of end-to-end cyberinfrastructure for solving some of the most challenging scientific and educational problems. In this talk, I will present an overview of the scientific, technological, and educational landscape, discuss recent developments in cyberinfrastructure, and describe Unidata's role in and vision for providing easy-to-use, robust, end-to-end data services for solving geoscientific problems and advancing student learning.
A modified conjugate gradient method based on the Tikhonov system for computerized tomography (CT).
Wang, Qi; Wang, Huaxiang
2011-04-01
During the past few decades, computerized tomography (CT) has been widely used for non-destructive testing (NDT) and non-destructive examination (NDE) in industry because of its non-invasiveness and visibility. Recently, CT technology has been applied to multi-phase flow measurement. Using the principle of radiation attenuation measurements along different directions through the investigated object, together with a special reconstruction algorithm, cross-sectional information of the scanned object can be worked out. It is a typical inverse problem and has always been a challenge because of its nonlinearity and ill-conditioning. The Tikhonov regularization method is widely used for similar ill-posed problems. However, the conventional Tikhonov method does not provide reconstructions of sufficient quality: the relative errors between the reconstructed images and the real distribution need to be further reduced. In this paper, a modified conjugate gradient (CG) method is applied to a Tikhonov system (MCGT method) for reconstructing CT images. The computational load is dominated by the number of independent measurements m, and a preconditioner is introduced to lower the condition number of the Tikhonov system. Both simulation and experimental results indicate that the proposed method can reduce the computational time and improve the quality of image reconstruction.
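A generic sketch of the idea is to solve the Tikhonov normal equations (AᵀA + λI)x = Aᵀb with a preconditioned conjugate gradient iteration. The toy system sizes, the regularization parameter, and the Jacobi preconditioner below are illustrative assumptions, not the authors' exact MCGT preconditioner.

    import numpy as np
    from scipy.sparse.linalg import LinearOperator, cg

    rng = np.random.default_rng(2)
    A = rng.standard_normal((120, 400))      # underdetermined, CT-like toy system matrix
    b = A @ rng.standard_normal(400) + 0.01 * rng.standard_normal(120)
    lam = 1e-2                               # Tikhonov regularization parameter (assumed)

    n = A.shape[1]
    T = LinearOperator((n, n), matvec=lambda x: A.T @ (A @ x) + lam * x)  # (A^T A + lam I) x

    diag = np.sum(A * A, axis=0) + lam                     # diagonal of A^T A + lam I
    M = LinearOperator((n, n), matvec=lambda x: x / diag)  # Jacobi preconditioner

    x, info = cg(T, A.T @ b, M=M, maxiter=500)
    print(info, np.linalg.norm(A @ x - b))                 # 0 means CG converged

Because AᵀA + λI is symmetric positive definite, CG is applicable, and a preconditioner that lowers its condition number reduces the iteration count, which is the effect the paper exploits.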
A flexible spatial scan statistic with a restricted likelihood ratio for detecting disease clusters.
Tango, Toshiro; Takahashi, Kunihiko
2012-12-30
Spatial scan statistics are widely used tools for the detection of disease clusters. In particular, the circular spatial scan statistic proposed by Kulldorff (1997) has been utilized in a wide variety of epidemiological studies and disease surveillance. However, as it cannot detect noncircular, irregularly shaped clusters, many authors have proposed different spatial scan statistics, including the elliptic version of Kulldorff's scan statistic. The flexible spatial scan statistic proposed by Tango and Takahashi (2005) has also been used for detecting irregularly shaped clusters. However, this method imposes a practical limit of at most 30 nearest neighbors when searching candidate clusters because of its heavy computational load. In this paper, we present a flexible spatial scan statistic implemented with the restricted likelihood ratio proposed by Tango (2008) that (1) eliminates the 30-nearest-neighbor limitation and (2) requires far less computational time than the original flexible spatial scan statistic. As a side effect, Monte Carlo simulation shows that it can detect clusters of any shape reasonably well as the relative risk of the cluster becomes large. We illustrate the proposed spatial scan statistic with data on mortality from cerebrovascular disease in the Tokyo metropolitan area, Japan.
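For reference, the Poisson-model log-likelihood ratio that scan statistics of this family maximize over candidate zones Z is commonly written as below, where c(Z) is the observed and e(Z) the expected case count in Z and C is the total number of cases; the restricted version of Tango (2008) additionally screens the regions composing Z before evaluating this quantity (details omitted here).

\[
\mathrm{LLR}(Z) =
\begin{cases}
c(Z)\,\log\dfrac{c(Z)}{e(Z)} + \bigl(C - c(Z)\bigr)\,\log\dfrac{C - c(Z)}{C - e(Z)}, & c(Z) > e(Z),\\[1.2ex]
0, & \text{otherwise}.
\end{cases}
\]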
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arp, J.A.; Bower, J.C.; Burnett, R.A.
The Federal Emergency Management Information System (FEMIS) is an emergency management planning and response tool that was developed by the Pacific Northwest National Laboratory (PNNL) under the direction of the U.S. Army Chemical Biological Defense Command. The FEMIS System Administration Guide provides information necessary for the system administrator to maintain the FEMIS system. The FEMIS system is designed for a single Chemical Stockpile Emergency Preparedness Program (CSEPP) site that has multiple Emergency Operations Centers (EOCs). Each EOC has personal computers (PCs) that emergency planners and operations personnel use to do their jobs. These PCs are connected via a local area network (LAN) to servers that provide EOC-wide services. Each EOC is interconnected to other EOCs via a Wide Area Network (WAN). Thus, FEMIS is an integrated software product that resides on a client/server computer architecture. The main body of FEMIS software, referred to as the FEMIS Application Software, resides on the PC client(s) and is directly accessible to emergency management personnel. The remainder of the FEMIS software, referred to as the FEMIS Support Software, resides on the UNIX server. The Support Software provides the communication, data distribution, and notification functionality necessary to operate FEMIS in a networked, client/server environment.
Federal Emergency Management Information System (FEMIS), Installation Guide for FEMIS 1.4.6
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arp, J.A.; Burnett, R.A.; Carter, R.J.
The Federal Emergency Management Information System (FEMIS) is an emergency management planning and response tool that was developed by the Pacific Northwest National Laboratory (PNNL) under the direction of the U.S. Army Chemical Biological Defense Command. The FEMIS System Administration Guide provides information necessary for the system administrator to maintain the FEMIS system. The FEMIS system is designed for a single Chemical Stockpile Emergency Preparedness Program (CSEPP) site that has multiple Emergency Operations Centers (EOCs). Each EOC has personal computers (PCs) that emergency planners and operations personnel use to do their jobs. These PCs are connected via a local area network (LAN) to servers that provide EOC-wide services. Each EOC is interconnected to other EOCs via a Wide Area Network (WAN). Thus, FEMIS is an integrated software product that resides on a client/server computer architecture. The main body of FEMIS software, referred to as the FEMIS Application Software, resides on the PC client(s) and is directly accessible to emergency management personnel. The remainder of the FEMIS software, referred to as the FEMIS Support Software, resides on the UNIX server. The Support Software provides the communication, data distribution, and notification functionality necessary to operate FEMIS in a networked, client/server environment.
Flexible session management in a distributed environment
NASA Astrophysics Data System (ADS)
Miller, Zach; Bradley, Dan; Tannenbaum, Todd; Sfiligoi, Igor
2010-04-01
Many secure communication libraries used by distributed systems, such as SSL, TLS, and Kerberos, fail to make a clear distinction between the authentication, session, and communication layers. In this paper we introduce CEDAR, the secure communication library used by the Condor High Throughput Computing software, and present the advantages to a distributed computing system resulting from CEDAR's separation of these layers. Regardless of the authentication method used, CEDAR establishes a secure session key, which has the flexibility to be used for multiple capabilities. We demonstrate how a layered approach to security sessions can avoid round-trips and latency inherent in network authentication. The creation of a distinct session management layer allows for optimizations to improve scalability by way of delegating sessions to other components in the system. This session delegation creates a chain of trust that reduces the overhead of establishing secure connections and enables centralized enforcement of system-wide security policies. Additionally, secure channels based upon UDP datagrams are often overlooked by existing libraries; we show how CEDAR's structure accommodates this as well. As an example of the utility of this work, we show how the use of delegated security sessions and other techniques inherent in CEDAR's architecture enables US CMS to meet their scalability requirements in deploying Condor over large-scale, wide-area grid systems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Angel, L.K.; Bower, J.C.; Burnett, R.A.
1999-06-29
The Federal Emergency Management Information System (FEMIS) is an emergency management planning and response tool that was developed by the Pacific Northwest National Laboratory (PNNL) under the direction of the U.S. Army Chemical Biological Defense Command. The FEMIS System Administration Guide provides information necessary for the system administrator to maintain the FEMIS system. The FEMIS system is designed for a single Chemical Stockpile Emergency Preparedness Program (CSEPP) site that has multiple Emergency Operations Centers (EOCs). Each EOC has personal computers (PCs) that emergency planners and operations personnel use to do their jobs. These PCs are connected via a local area network (LAN) to servers that provide EOC-wide services. Each EOC is interconnected to other EOCs via a Wide Area Network (WAN). Thus, FEMIS is an integrated software product that resides on client/server computer architecture. The main body of FEMIS software, referred to as the FEMIS Application Software, resides on the PC client(s) and is directly accessible to emergency management personnel. The remainder of the FEMIS software, referred to as the FEMIS Support Software, resides on the UNIX server. The Support Software provides the communication, data distribution, and notification functionality necessary to operate FEMIS in a networked, client/server environment.
Gao, Li; Shigeta, Kazuki; Vazquez-Guardado, Abraham; Progler, Christopher J; Bogart, Gregory R; Rogers, John A; Chanda, Debashis
2014-06-24
We report advances in materials, designs, and fabrication schemes for large-area negative index metamaterials (NIMs) in multilayer "fishnet" layouts that offer negative index behavior at wavelengths into the visible regime. A simple nanoimprinting scheme capable of implementation using standard, widely available tools followed by a subtractive, physical liftoff step provides an enabling route for the fabrication. Computational analysis of reflection and transmission measurements suggests that the resulting structures offer negative index of refraction that spans both the visible wavelength range (529-720 nm) and the telecommunication band (1.35-1.6 μm). The data reveal that these large (>75 cm(2)) imprinted NIMs have predictable behaviors, good spatial uniformity in properties, and figures of merit as high as 4.3 in the visible range.
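For reference, the figure of merit quoted above is commonly defined (an assumption here; the paper may use an equivalent form) as the ratio of the real to the imaginary part of the retrieved refractive index:

```latex
\mathrm{FOM} \;=\; \frac{\lvert \operatorname{Re}(n) \rvert}{\operatorname{Im}(n)}
```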
Using wide area differential GPS to improve total system error for precision flight operations
NASA Astrophysics Data System (ADS)
Alter, Keith Warren
Total System Error (TSE) refers to an aircraft's total deviation from the desired flight path. TSE can be divided into Navigational System Error (NSE), the error attributable to the aircraft's navigation system, and Flight Technical Error (FTE), the error attributable to pilot or autopilot control. Improvement in either NSE or FTE reduces TSE and leads to the capability to fly more precise flight trajectories. The Federal Aviation Administration's Wide Area Augmentation System (WAAS) became operational for non-safety critical applications in 2000 and will become operational for safety critical applications in 2002. This navigation service will provide precise 3-D positioning (demonstrated to better than 5 meters horizontal and vertical accuracy) for civil aircraft in the United States. Perhaps more importantly, this navigation system, which provides continuous operation across large regions, enables new flight instrumentation concepts which allow pilots to fly aircraft significantly more precisely, both for straight and curved flight paths. This research investigates the capabilities of some of these new concepts, including the Highway-In-The-Sky (HITS) display, which not only improves FTE but also reduces pilot workload when compared to conventional flight instrumentation. Augmentation to the HITS display, including perspective terrain and terrain alerting, improves pilot situational awareness. Flight test results from demonstrations in Juneau, AK, and Lake Tahoe, CA, provide evidence of the overall feasibility of integrated, low-cost flight navigation systems based on these concepts. These systems, requiring no more computational power than current-generation low-end desktop computers, have immediate applicability to general aviation flight from Cessnas to business jets and can support safer and ultimately more economical flight operations. Commercial airlines may also, over time, benefit from these new technologies.
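A common convention (an assumption here, not necessarily the one used in this work) treats NSE and FTE as independent error components that combine root-sum-square:

```latex
\mathrm{TSE} \;\approx\; \sqrt{\mathrm{NSE}^{2} + \mathrm{FTE}^{2}}
```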
Monte Carlo Methodology Serves Up a Software Success
NASA Technical Reports Server (NTRS)
2003-01-01
Widely used for the modeling of gas flows through the computation of the motion and collisions of representative molecules, the Direct Simulation Monte Carlo method has become the gold standard for producing research and engineering predictions in the field of rarefied gas dynamics. Direct Simulation Monte Carlo was first introduced in the early 1960s by Dr. Graeme Bird, a professor at the University of Sydney, Australia. It has since proved to be a valuable tool to the aerospace and defense industries in providing design and operational support data, as well as flight data analysis. In 2002, NASA brought to the forefront a software product that maintains the same basic physics formulation of Dr. Bird's method, but provides effective modeling of complex, three-dimensional, real vehicle simulations and parallel processing capabilities to handle additional computational requirements, especially in areas where computational fluid dynamics (CFD) is not applicable. NASA's Direct Simulation Monte Carlo Analysis Code (DAC) software package is now considered the Agency's premier high-fidelity simulation tool for predicting vehicle aerodynamics and aerothermodynamic environments in rarefied, or low-density, gas flows.
A software tool for modeling and simulation of numerical P systems.
Buiu, Catalin; Arsene, Octavian; Cipu, Corina; Patrascu, Monica
2011-03-01
A P system represents a distributed and parallel bio-inspired computing model in which basic data structures are multi-sets or strings. Numerical P systems have been recently introduced and they use numerical variables and local programs (or evolution rules), usually in a deterministic way. They may find interesting applications in areas such as computational biology, process control or robotics. The first simulator of numerical P systems (SNUPS) has been designed, implemented and made available to the scientific community by the authors of this paper. SNUPS allows a wide range of applications, from modeling and simulation of ordinary differential equations, to the use of membrane systems as computational blocks of cognitive architectures, and as controllers for autonomous mobile robots. This paper describes the functioning of a numerical P system and presents an overview of SNUPS capabilities together with an illustrative example. SNUPS is freely available to researchers as a standalone application and may be downloaded from a dedicated website, http://snups.ics.pub.ro/, which includes a user manual and sample membrane structures. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
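As a rough illustration of the model being simulated, the sketch below performs one evolution step of a deterministic numerical P system with a single membrane, assuming the usual formulation in which a program evaluates its production function, resets the consumed variables, and distributes the produced value in proportion to repartition coefficients. It is not SNUPS code; all names and values are illustrative.

```python
# Minimal sketch of one evolution step of a (deterministic) numerical P system.
# Not SNUPS itself; the single membrane and variable names are illustrative.

def np_system_step(variables, programs):
    """variables: dict name -> value.
    programs: list of (production_fn, consumed_vars, [(coeff, target_var), ...])."""
    produced = {name: 0.0 for name in variables}
    for production_fn, consumed, repartition in programs:
        value = production_fn(variables)          # evaluate production function
        for name in consumed:                     # consumed variables are reset
            variables[name] = 0.0
        total = sum(c for c, _ in repartition)    # distribute proportionally
        for coeff, target in repartition:
            produced[target] += value * coeff / total
    for name, extra in produced.items():          # add the distributed amounts
        variables[name] += extra
    return variables

# Example program: 2*x1 + x2 -> 1|x1 + 1|x2 (equal repartition)
state = {"x1": 3.0, "x2": 1.0}
progs = [(lambda v: 2 * v["x1"] + v["x2"], ["x1", "x2"], [(1, "x1"), (1, "x2")])]
print(np_system_step(state, progs))   # {'x1': 3.5, 'x2': 3.5}
```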
Supercomputers ready for use as discovery machines for neuroscience.
Helias, Moritz; Kunkel, Susanne; Masumoto, Gen; Igarashi, Jun; Eppler, Jochen Martin; Ishii, Shin; Fukai, Tomoki; Morrison, Abigail; Diesmann, Markus
2012-01-01
NEST is a widely used tool to simulate biological spiking neural networks. Here we explain the improvements, guided by a mathematical model of memory consumption, that enable us to exploit for the first time the computational power of the K supercomputer for neuroscience. Multi-threaded components for wiring and simulation combine 8 cores per MPI process to achieve excellent scaling. K is capable of simulating networks corresponding to a brain area with 10^8 neurons and 10^12 synapses in the worst case scenario of random connectivity; for larger networks of the brain its hierarchical organization can be exploited to constrain the number of communicating computer nodes. We discuss the limits of the software technology, comparing maximum filling scaling plots for K and the JUGENE BG/P system. The usability of these machines for network simulations has become comparable to running simulations on a single PC. Turn-around times in the range of minutes even for the largest systems enable a quasi interactive working style and render simulations on this scale a practical tool for computational neuroscience.
A Low Complexity System Based on Multiple Weighted Decision Trees for Indoor Localization
Sánchez-Rodríguez, David; Hernández-Morera, Pablo; Quinteiro, José Ma.; Alonso-González, Itziar
2015-01-01
Indoor position estimation has become an attractive research topic due to growing interest in location-aware services. Nevertheless, satisfying solutions have not been found with the considerations of both accuracy and system complexity. From the perspective of lightweight mobile devices, these are extremely important characteristics, because both the processor power and energy availability are limited. Hence, an indoor localization system with high computational complexity can cause complete battery drain within a few hours. In our research, we use a data mining technique named boosting to develop a localization system based on multiple weighted decision trees to predict the device location, since it has high accuracy and low computational complexity. The localization system is built using a dataset from sensor fusion, which combines the strength of radio signals from different wireless local area network access points with device orientation information from a digital compass built into the mobile device, so that extra sensors are unnecessary. Experimental results indicate that the proposed system leads to substantial improvements in computational complexity over the widely used traditional fingerprinting methods, while also achieving better accuracy. PMID:26110413
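A hedged sketch of the boosting idea on synthetic data (not the paper's dataset or code): shallow decision trees are boosted on Wi-Fi RSSI readings plus a compass heading to classify the room.

```python
# Illustrative sketch (synthetic data, not the paper's dataset): boosting of
# shallow decision trees on Wi-Fi RSSI plus compass heading to classify the room.
import numpy as np
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_samples, n_aps = 600, 5                      # 5 access points (assumed)
rooms = rng.integers(0, 4, n_samples)          # 4 candidate rooms (labels)
rssi = -40 - 10 * rooms[:, None] + rng.normal(0, 4, (n_samples, n_aps))
heading = rng.uniform(0, 360, (n_samples, 1))  # digital-compass orientation
X = np.hstack([rssi, heading])

X_tr, X_te, y_tr, y_te = train_test_split(X, rooms, random_state=0)
clf = AdaBoostClassifier(n_estimators=50, random_state=0)  # boosted shallow trees
clf.fit(X_tr, y_tr)
print(f"hold-out accuracy: {clf.score(X_te, y_te):.2f}")
```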
JANUS: A Compilation System for Balancing Parallelism and Performance in OpenVX
NASA Astrophysics Data System (ADS)
Omidian, Hossein; Lemieux, Guy G. F.
2018-04-01
Embedded systems typically do not have enough on-chip memory for an entire image buffer. Programming systems like OpenCV operate on entire image frames at each step, making them use excessive memory bandwidth and power. In contrast, the paradigm used by OpenVX is much more efficient; it uses image tiling, and the compilation system is allowed to analyze and optimize the operation sequence, specified as a compute graph, before doing any pixel processing. In this work, we are building a compilation system for OpenVX that can analyze and optimize the compute graph to take advantage of parallel resources in many-core systems or FPGAs. Using a database of prewritten OpenVX kernels, it automatically adjusts the image tile size as well as using kernel duplication and coalescing to meet a defined area (resource) target, or to meet a specified throughput target. This allows a single compute graph to target implementations with a wide range of performance needs or capabilities, e.g. from handheld to datacenter, that use minimal resources and power to reach the performance target.
Supercomputers Ready for Use as Discovery Machines for Neuroscience
Helias, Moritz; Kunkel, Susanne; Masumoto, Gen; Igarashi, Jun; Eppler, Jochen Martin; Ishii, Shin; Fukai, Tomoki; Morrison, Abigail; Diesmann, Markus
2012-01-01
NEST is a widely used tool to simulate biological spiking neural networks. Here we explain the improvements, guided by a mathematical model of memory consumption, that enable us to exploit for the first time the computational power of the K supercomputer for neuroscience. Multi-threaded components for wiring and simulation combine 8 cores per MPI process to achieve excellent scaling. K is capable of simulating networks corresponding to a brain area with 10^8 neurons and 10^12 synapses in the worst case scenario of random connectivity; for larger networks of the brain its hierarchical organization can be exploited to constrain the number of communicating computer nodes. We discuss the limits of the software technology, comparing maximum filling scaling plots for K and the JUGENE BG/P system. The usability of these machines for network simulations has become comparable to running simulations on a single PC. Turn-around times in the range of minutes even for the largest systems enable a quasi interactive working style and render simulations on this scale a practical tool for computational neuroscience. PMID:23129998
Jungle Computing: Distributed Supercomputing Beyond Clusters, Grids, and Clouds
NASA Astrophysics Data System (ADS)
Seinstra, Frank J.; Maassen, Jason; van Nieuwpoort, Rob V.; Drost, Niels; van Kessel, Timo; van Werkhoven, Ben; Urbani, Jacopo; Jacobs, Ceriel; Kielmann, Thilo; Bal, Henri E.
In recent years, the application of high-performance and distributed computing in scientific practice has become increasingly widespread. Among the most widely available platforms to scientists are clusters, grids, and cloud systems. Such infrastructures currently are undergoing revolutionary change due to the integration of many-core technologies, providing orders-of-magnitude speed improvements for selected compute kernels. With high-performance and distributed computing systems thus becoming more heterogeneous and hierarchical, programming complexity is vastly increased. Further complexities arise because the urgent desire for scalability and issues including data distribution, software heterogeneity, and ad hoc hardware availability commonly force scientists into simultaneous use of multiple platforms (e.g., clusters, grids, and clouds used concurrently). A true computing jungle.
Bathymetric contours of Breckenridge Reservoir, Quantico, Virginia
Wicklein, S.M.; Lotspeich, R.R.; Banks, R.B.
2012-01-01
Breckenridge Reservoir, built in 1938, is fed by Chopawamsic Creek and South Branch Chopawamsic Creek. The Reservoir is a main source of drinking water for the U.S. Marine Corps (USMC) Base in Quantico, Virginia. The U.S. Geological Survey (USGS), in cooperation with the USMC, conducted a bathymetric survey of Breckenridge Reservoir in March 2009. The survey was conducted to provide the USMC Natural Resources and Environmental Affairs (NREA) with information regarding reservoir storage capacity and general bathymetric properties. The bathymetric survey can provide a baseline for future work on sediment loads and deposition rates for the reservoir. Bathymetric data were collected using a boat-mounted Wide Area Augmentation System (WAAS) differential global positioning system (DGPS), echo depth-sounding equipment, and computer software. Data were exported into a geographic information system (GIS) for mapping and calculating area and volume. Reservoir storage volume at the time of the survey was about 22,500,000 cubic feet (517 acre-feet) with a surface area of about 1,820,000 square feet (41.9 acres).
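The area and volume figures come from straightforward grid arithmetic once the bathymetry is in a GIS; the sketch below illustrates that arithmetic on a made-up depth grid (the cell size and depths are invented, not survey data).

```python
# Sketch of the area/volume arithmetic a GIS performs on gridded bathymetry.
# The depth grid and cell size below are made up for illustration.
import numpy as np

cell_size_ft = 10.0                       # square cells, 10 ft on a side (assumed)
depths_ft = np.array([[0.0, 2.5, 3.0],    # water depth at each grid cell
                      [1.5, 6.0, 4.5],
                      [0.0, 3.5, 2.0]])

cell_area_ft2 = cell_size_ft ** 2
wet = depths_ft > 0.0                     # cells inside the reservoir
surface_area_ft2 = wet.sum() * cell_area_ft2
volume_ft3 = depths_ft[wet].sum() * cell_area_ft2

SQFT_PER_ACRE, CUFT_PER_ACRE_FT = 43_560.0, 43_560.0   # 1 acre = 43,560 ft^2
print(f"area:   {surface_area_ft2:.0f} ft^2 "
      f"({surface_area_ft2 / SQFT_PER_ACRE:.3f} acres)")
print(f"volume: {volume_ft3:.0f} ft^3 "
      f"({volume_ft3 / CUFT_PER_ACRE_FT:.3f} acre-ft)")
```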
EPA Brownfields Area-Wide Planning Recipients Selected for FY13 Grant Funding
EPA has selected the following entities as Brownfields Area-Wide Planning grant recipients. These recipients will work with their local community members, other stakeholders and project partners to develop an area-wide plan and implementation strategy for…
Application of computational physics within Northrop
NASA Technical Reports Server (NTRS)
George, M. W.; Ling, R. T.; Mangus, J. F.; Thompkins, W. T.
1987-01-01
An overview of Northrop programs in computational physics is presented. These programs depend on access to today's supercomputers, such as the Numerical Aerodynamical Simulator (NAS), and future growth on the continuing evolution of computational engines. Descriptions here are concentrated on the following areas: computational fluid dynamics (CFD), computational electromagnetics (CEM), computer architectures, and expert systems. Current efforts and future directions in these areas are presented. The impact of advances in the CFD area is described, and parallels are drawn to analogous developments in CEM. The relationship between advances in these areas and the development of advanced (parallel) architectures and expert systems is also presented.
Resource requirements of inclusive urban development in India: insights from ten cities
NASA Astrophysics Data System (ADS)
Singh Nagpure, Ajay; Reiner, Mark; Ramaswami, Anu
2018-02-01
This paper develops a methodology to assess the resource requirements of inclusive urban development in India and compares those requirements to current community-wide material and energy flows. Methods include: (a) identifying minimum service level benchmarks for the provision of infrastructure services including housing, electricity and clean cooking fuels; (b) assessing the percentage of homes that lack access to infrastructure or that consume infrastructure services below the identified benchmarks; (c) quantifying the material requirements to provide basic infrastructure services using India-specific design data; and (d) computing material and energy requirements for inclusive development and comparing them with current community-wide material and energy flows. Applying the method to ten Indian cities, we find that: 1%-6% of households do not have electricity; 14%-71% use electricity below the benchmark of 25 kWh per capita per month; 4%-16% lack structurally sound housing; 50%-75% live in less floor area than the benchmark of 8.75 m² of floor area per capita; 10%-65% lack clean cooking fuel; and 6%-60% lack connection to a sewerage system. Across the ten cities examined, to provide basic electricity (25 kWh per capita per month) to all will require an addition of only 1%-10% to current community-wide electricity use. To provide basic clean LPG fuel (1.2 kg per capita per month) to all requires an increase of 5%-40% over current community-wide LPG use. Providing permanent shelter (implemented over a ten-year period) to populations living in non-permanent housing in Delhi and Chandigarh would require a 6%-14% increase over current annual community-wide cement use. Conversely, to provide permanent housing to all people living in structurally unsound housing and those living in overcrowded housing (<5 m² per capita) would require 32%-115% of current community-wide cement flows. Except for the last scenario, these results suggest that social policies that seek to provide basic infrastructure provisioning for all residents would not dramatically increase current community-wide resource flows.
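A worked example of the benchmark-gap arithmetic described above, with invented city numbers; only the method mirrors the paper's steps.

```python
# Worked example of the benchmark-gap arithmetic described above.
# The city numbers are invented; only the method mirrors the paper's steps.
households = 1_000_000
share_below_benchmark = 0.40          # 40% of homes consume below the benchmark
avg_use_kwh = 10.0                    # their average use, kWh per capita per month
benchmark_kwh = 25.0                  # basic-service benchmark, kWh per capita per month
persons_per_household = 4.5
citywide_use_gwh = 250.0              # current community-wide use, GWh per month

extra_kwh = (households * share_below_benchmark * persons_per_household
             * (benchmark_kwh - avg_use_kwh))
extra_gwh = extra_kwh / 1e6
print(f"additional electricity: {extra_gwh:.1f} GWh/month "
      f"({100 * extra_gwh / citywide_use_gwh:.1f}% of current use)")
```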
Wide-field computational imaging of pathology slides using lens-free on-chip microscopy.
Greenbaum, Alon; Zhang, Yibo; Feizi, Alborz; Chung, Ping-Luen; Luo, Wei; Kandukuri, Shivani R; Ozcan, Aydogan
2014-12-17
Optical examination of microscale features in pathology slides is one of the gold standards to diagnose disease. However, the use of conventional light microscopes is partially limited owing to their relatively high cost, bulkiness of lens-based optics, small field of view (FOV), and requirements for lateral scanning and three-dimensional (3D) focus adjustment. We illustrate the performance of a computational lens-free, holographic on-chip microscope that uses the transport-of-intensity equation, multi-height iterative phase retrieval, and rotational field transformations to perform wide-FOV imaging of pathology samples with comparable image quality to a traditional transmission lens-based microscope. The holographically reconstructed image can be digitally focused at any depth within the object FOV (after image capture) without the need for mechanical focus adjustment and is also digitally corrected for artifacts arising from uncontrolled tilting and height variations between the sample and sensor planes. Using this lens-free on-chip microscope, we successfully imaged invasive carcinoma cells within human breast sections, Papanicolaou smears revealing a high-grade squamous intraepithelial lesion, and sickle cell anemia blood smears over a FOV of 20.5 mm(2). The resulting wide-field lens-free images had sufficient image resolution and contrast for clinical evaluation, as demonstrated by a pathologist's blinded diagnosis of breast cancer tissue samples, achieving an overall accuracy of ~99%. By providing high-resolution images of large-area pathology samples with 3D digital focus adjustment, lens-free on-chip microscopy can be useful in resource-limited and point-of-care settings. Copyright © 2014, American Association for the Advancement of Science.
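Digital refocusing of an in-line hologram is typically built on free-space propagation such as the angular spectrum method; the sketch below shows only that basic operation (it is not the authors' multi-height phase-retrieval pipeline, and the wavelength, pixel size, and propagation distance are illustrative).

```python
# Minimal angular-spectrum free-space propagation, the basic operation behind
# digital refocusing of an in-line hologram. Not the authors' full pipeline
# (no multi-height phase retrieval); all parameters are illustrative.
import numpy as np

def angular_spectrum_propagate(field, wavelength, pixel_size, z):
    """Propagate a complex field by distance z (all lengths in meters)."""
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, d=pixel_size)
    fy = np.fft.fftfreq(ny, d=pixel_size)
    FX, FY = np.meshgrid(fx, fy)
    arg = 1.0 / wavelength**2 - FX**2 - FY**2        # squared axial frequency
    H = np.exp(2j * np.pi * z * np.sqrt(np.maximum(arg, 0.0)))
    H *= (arg > 0)                                   # drop evanescent components
    return np.fft.ifft2(np.fft.fft2(field) * H)

hologram = np.ones((256, 256), dtype=complex)        # placeholder hologram
refocused = angular_spectrum_propagate(hologram, 532e-9, 1.12e-6, 300e-6)
print(refocused.shape)
```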
Empirical Determination of Competence Areas to Computer Science Education
ERIC Educational Resources Information Center
Zendler, Andreas; Klaudt, Dieter; Seitz, Cornelia
2014-01-01
The authors discuss empirically determined competence areas to K-12 computer science education, emphasizing the cognitive level of competence. The results of a questionnaire with 120 professors of computer science serve as a database. By using multi-dimensional scaling and cluster analysis, four competence areas to computer science education…
Computational Infrastructure for Geodynamics (CIG)
NASA Astrophysics Data System (ADS)
Gurnis, M.; Kellogg, L. H.; Bloxham, J.; Hager, B. H.; Spiegelman, M.; Willett, S.; Wysession, M. E.; Aivazis, M.
2004-12-01
Solid earth geophysicists have a long tradition of writing scientific software to address a wide range of problems. In particular, computer simulations came into wide use in geophysics during the decade after the plate tectonic revolution. Solution schemes and numerical algorithms that developed in other areas of science, most notably engineering, fluid mechanics, and physics, were adapted with considerable success to geophysics. This software has largely been the product of individual efforts and although this approach has proven successful, its strength for solving problems of interest is now starting to show its limitations as we try to share codes and algorithms or when we want to recombine codes in novel ways to produce new science. With funding from the NSF, the US community has embarked on a Computational Infrastructure for Geodynamics (CIG) that will develop, support, and disseminate community-accessible software for the greater geodynamics community from model developers to end-users. The software is being developed for problems involving mantle and core dynamics, crustal and earthquake dynamics, magma migration, seismology, and other related topics. With a high level of community participation, CIG is leveraging state-of-the-art scientific computing into a suite of open-source tools and codes. The infrastructure that we are now starting to develop will consist of: (a) a coordinated effort to develop reusable, well-documented and open-source geodynamics software; (b) the basic building blocks - an infrastructure layer - of software by which state-of-the-art modeling codes can be quickly assembled; (c) extension of existing software frameworks to interlink multiple codes and data through a superstructure layer; (d) strategic partnerships with the larger world of computational science and geoinformatics; and (e) specialized training and workshops for both the geodynamics and broader Earth science communities. The CIG initiative has already started to leverage and develop long-term strategic partnerships with open source development efforts within the larger thrusts of scientific computing and geoinformatics. These strategic partnerships are essential as the frontier has moved into multi-scale and multi-physics problems in which many investigators now want to use simulation software for data interpretation, data assimilation, and hypothesis testing.
Computers in Academic Architecture Libraries.
ERIC Educational Resources Information Center
Willis, Alfred; And Others
1992-01-01
Computers are widely used in architectural research and teaching in U.S. schools of architecture. A survey of libraries serving these schools sought information on the emphasis placed on computers by the architectural curriculum, accessibility of computers to library staff, and accessibility of computers to library patrons. Survey results and…
Global Gravity Field Determination by Combination of terrestrial and Satellite Gravity Data
NASA Astrophysics Data System (ADS)
Fecher, T.; Pail, R.; Gruber, T.
2011-12-01
A multitude of impressive results document the success of the satellite gravity field mission GOCE with a wide field of applications in geodesy, geophysics and oceanography. The high performance of GOCE gravity field models can be further improved by combination with GRACE data, which contributes the long wavelength signal content of the gravity field with very high accuracy. An example of such a consistent combination of satellite gravity data is the pair of satellite-only models GOCO01S and GOCO02S. However, only the further combination with terrestrial and altimetric gravity data makes it possible to expand gravity field models to very high spherical harmonic degrees and thus to achieve a spatial resolution down to 20-30 km. First numerical studies for high-resolution global gravity field models combining GOCE, GRACE and terrestrial/altimetric data on the basis of the DTU10 model have already been presented. Computations up to degree/order 600 based on full normal equations systems to preserve the full variance-covariance information, which results mainly from different weights of individual terrestrial/altimetric data sets, have been successfully performed. We could show that such large normal equations systems (degree/order 600 corresponds to a memory demand of almost 1 TByte), representing an immense computational challenge as computation time and memory requirements put high demand on computational resources, can be handled. The DTU10 model includes gravity anomalies computed from the global model EGM08 in continental areas. Therefore, the main focus of this presentation lies on the computation of high-resolution combined gravity field models based on real terrestrial gravity anomaly data sets. This is a challenge due to the inconsistency of these data sets, which also contain systematic error components, but it is a further step toward a truly independent gravity field model. This contribution will present our recent developments and progress by using independent data sets at certain land areas, which are combined with DTU10 in the ocean areas, as well as satellite gravity data. Investigations have been made concerning the preparation and optimum weighting of the different data sources. The results, which should be a major step towards a GOCO-C model, will be validated using external gravity field data and by applying different validation methods.
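The weighted combination of normal equations from heterogeneous data sets can be illustrated with a toy least-squares problem; the dimensions and weights below are invented (the real systems are of degree/order 600), so this only shows the structure of the combination.

```python
# Toy illustration of combining normal equations from two data sets with
# different weights (the real problem is degree/order 600, ~1 TByte of normals).
import numpy as np

rng = np.random.default_rng(1)
n_params = 5
x_true = rng.normal(size=n_params)

def observations(n_obs, noise):
    A = rng.normal(size=(n_obs, n_params))            # design matrix
    y = A @ x_true + rng.normal(scale=noise, size=n_obs)
    return A, y

A1, y1 = observations(200, 0.01)   # e.g. satellite data, low noise
A2, y2 = observations(500, 0.10)   # e.g. terrestrial data, higher noise

w1, w2 = 1 / 0.01**2, 1 / 0.10**2  # weights ~ inverse error variances
N = w1 * A1.T @ A1 + w2 * A2.T @ A2        # combined normal matrix
b = w1 * A1.T @ y1 + w2 * A2.T @ y2        # combined right-hand side
x_hat = np.linalg.solve(N, b)
print(np.max(np.abs(x_hat - x_true)))      # small estimation error
```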
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jon Eisenberg, Director, CSTB
The Computer Science and Telecommunications Board of the National Research Council considers technical and policy issues pertaining to computer science (CS), telecommunications, and information technology (IT). The functions of the board include: (1) monitoring and promoting the health of the CS, IT, and telecommunications fields, including attention as appropriate to issues of human resources and funding levels and program structures for research; (2) initiating studies involving CS, IT, and telecommunications as critical resources and sources of national economic strength; (3) responding to requests from the government, non-profit organizations, and private industry for expert advice on CS, IT, and telecommunications issues; and to requests from the government for expert advice on computer and telecommunications systems planning, utilization, and modernization; (4) fostering interaction among CS, IT, and telecommunications researchers and practitioners, and with other disciplines; and providing a base of expertise in the National Research Council in the areas of CS, IT, and telecommunications. This award has supported the overall operation of CSTB. Reports resulting from the Board's efforts have been widely disseminated in both electronic and print form, and all CSTB reports are available at its World Wide Web home page at cstb.org. The following reports, resulting from projects that were separately funded by a wide array of sponsors, were completed and released during the award period: 2007: * Summary of a Workshop on Software-Intensive Systems and Uncertainty at Scale * Social Security Administration Electronic Service Provision: A Strategic Assessment * Toward a Safer and More Secure Cyberspace * Software for Dependable Systems: Sufficient Evidence? * Engaging Privacy and Information Technology in a Digital Age * Improving Disaster Management: The Role of IT in Mitigation, Preparedness, Response, and Recovery 2006: * Renewing U.S. Telecommunications Research * Letter Report on Electronic Voting * Summary of a Workshop on the Technology, Policy, and Cultural Dimensions of Biometric System 2005: * Catalyzing Inquiry at the Interface of Computing and Biology * Summary of a Workshop on Using IT to Enhance Disaster Management * Asking the Right Questions About Electronic Voting * Building an Electronic Records Archive at NARA: Recommendations for a Long-Term Strategy * Signposts in Cyberspace: The Domain Name System and Internet Navigation 2004: * ITCP: Information Technology and Creative Practices (brochure) * Radio Frequency Identification (RFID) Technologies: A Workshop Summary * Getting up to Speed: The Future of Supercomputing * Summary of a Workshop on Software Certification and Dependability * Computer Science: Reflections on the Field, Reflections from the Field. CSTB conducted numerous briefings of these reports and transmitted copies of these reports to researchers and key decision makers in the public and private sectors. It developed articles for journals based on several of these reports. As requested, and in fulfillment of its congressional charter to act as an independent advisor to the federal government, it arranged for congressional testimony on several of these reports. CSTB also convenes a number of workshops and other events, either as part of studies or in conjunction with meetings of the CSTB members.
These events have included the following: two 2007 workshops explored issues and challenges related to state voter registration databases, record matching, and database interoperability. A Sept. 2007 workshop, Trends in Computing Performance, explored fundamental trends in areas such as power, storage, programming, and applications. An Oct. 2007 workshop presented highlights of CSTB's May 2007 report, Software for Dependable Systems: Sufficient Evidence?, along with several panels discussing the report's conclusions and their implications. A Jan. 2007 workshop, Uncertainty at Scale, explored engineering uncertainty, system complexity, and scale issues in developing large software systems. A Feb. 2007 workshop explored China's and India's roles in the IT R&D ecosystem; observations about the ecosystem over the long term; perspectives from serial entrepreneurs about the evolution of the ecosystem; and a cross-industry, global view of the R&D ecosystem. A Nov. 2006 event brought together participants from government, industry, and academia to share their perspectives on the health of the ecosystem, patterns of funding and investment, and the Potomac-area IT startup environment. A symposium entitled 2016, held in Oct. 2006, featured a number of distinguished speakers who shared their views on how computer science and telecommunications will look in 10 years. This well-attended event was also the subject of an Oct. 31, 2006, feature essay in the New York Times, "Computing, 2016: What Won't Be Possible?"
Designing application software in wide area network settings
NASA Technical Reports Server (NTRS)
Makpangou, Mesaac; Birman, Ken
1990-01-01
Progress in methodologies for developing robust local area network software has not been matched by similar results for wide area settings. The design of application software spanning multiple local area environments is examined. For important classes of applications, simple design techniques are presented that yield fault tolerant wide area programs. An implementation of these techniques as a set of tools for use within the ISIS system is described.
NASA Technical Reports Server (NTRS)
Dominguez, Margaret Z.; Content, David A.; Gong, Qian; Griesmann, Ulf; Hagopian, John G.; Marx, Catherine T; Whipple, Arthur L.
2017-01-01
Infrared Computer Generated Holograms (CGHs) were designed, manufactured and used to measure the performance of the grism (grating prism) prototype which includes testing Diffractive Optical Elements (DOE). The grism in the Wide Field Infrared Survey Telescope (WFIRST) will allow the surveying of a large section of the sky to find bright galaxies.
Industry-Wide Workshop on Computational Turbulence Modeling
NASA Technical Reports Server (NTRS)
Shabbir, Aamir (Compiler)
1995-01-01
This publication contains the presentations made at the Industry-Wide Workshop on Computational Turbulence Modeling which took place on October 6-7, 1994. The purpose of the workshop was to initiate the transfer of technology developed at Lewis Research Center to industry and to discuss the current status and the future needs of turbulence models in industrial CFD.
Gender Equity in Advertising on the World-Wide Web: Can it be Found?
ERIC Educational Resources Information Center
Kramer, Kevin M.; Knupfer, Nancy Nelson
Recent attention to gender equity in computer environments, as well as in print-based and televised advertising for technological products, suggests that gender bias in the computer environment continues. This study examined gender messages within World Wide Web advertisements, specifically the type and number of visual images used in Web banner…
Design and optimization of all-optical networks
NASA Astrophysics Data System (ADS)
Xiao, Gaoxi
1999-10-01
In this thesis, we present our research results on the design and optimization of all-optical networks. We divide our results into the following four parts: 1. In the first part, we consider broadcast-and-select networks. In our research, we propose an alternative and cheaper network configuration to hide the tuning time. In addition, we derive lower bounds on the optimal schedule lengths and prove that they are tighter than the best existing bounds. 2. In the second part, we consider all-optical wide area networks. We propose a set of algorithms for allocating a given number of WCs to the nodes. We adopt a simulation-based optimization approach, in which we collect utilization statistics of WCs from computer simulation and then perform optimization to allocate the WCs. Therefore, our algorithms are widely applicable and they are not restricted to any particular model and assumption. We have conducted extensive computer simulation on regular and irregular networks under both uniform and non-uniform traffic. We see that our method can get nearly the same performance as that of full wavelength conversion by using a much smaller number of WCs. Compared with the best existing method, the results show that our algorithms can significantly reduce (1) the overall blocking probability (i.e., better mean quality of service) and (2) the maximum of the blocking probabilities experienced at all the source nodes (i.e., better fairness). Equivalently, for a given performance requirement on blocking probability, our algorithms can significantly reduce the number of WCs required. 3. In the third part, we design and optimize the physical topology of all-optical wide area networks. We show that the design problem is NP-complete and we propose a heuristic algorithm called two-stage cut saturation algorithm for this problem. Simulation results show that (1) the proposed algorithm can efficiently design networks with low cost and high utilization, and (2) if wavelength converters are available to support full wavelength conversion, the cost of the links can be significantly reduced. 4. In the fourth part, we consider all-optical wide area networks with multiple fibers per link. We design a node configuration for all-optical networks. We exploit the flexibility that, to establish a lightpath across a node, we can select any one of the available channels in the incoming link and any one of the available channels in the outgoing link. As a result, the proposed node configuration requires a small number of small optical switches while it can achieve nearly the same performance as the existing one. And there is no additional crosstalk other than the intrinsic crosstalk within each single-chip optical switch.* (Abstract shortened by UMI.) *Originally published in DAI Vol. 60, No. 2. Reprinted here with corrected author name.
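A hedged sketch of the simulation-based allocation idea in part 2: given per-node converter utilization statistics collected from simulation, a fixed budget of wavelength converters is assigned greedily. This conveys only the flavor of the approach, not the thesis algorithms; the statistics are made up.

```python
# Sketch of allocating a fixed budget of wavelength converters (WCs) from
# per-node utilization statistics collected in simulation. This only conveys
# the flavor of simulation-based optimization, not the thesis algorithms.
def allocate_wcs(utilization, budget):
    """utilization: dict node -> estimated WC demand (e.g. mean busy converters).
    Returns dict node -> number of WCs, greedily giving each WC to the node
    with the largest remaining (unserved) demand."""
    allocation = {node: 0 for node in utilization}
    remaining = dict(utilization)
    for _ in range(budget):
        node = max(remaining, key=remaining.get)   # most under-served node
        allocation[node] += 1
        remaining[node] = max(remaining[node] - 1.0, 0.0)
    return allocation

stats = {"A": 3.2, "B": 0.4, "C": 1.7, "D": 2.6}   # from a simulation run (made up)
print(allocate_wcs(stats, budget=6))               # {'A': 3, 'B': 0, 'C': 1, 'D': 2}
```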
DOE Office of Scientific and Technical Information (OSTI.GOV)
Davis, Kristan D.; Faraj, Daniel A.
In a parallel computer, a largest logical plane from a plurality of logical planes formed of compute nodes of a subcommunicator may be identified by: identifying, by each compute node of the subcommunicator, all logical planes that include the compute node; calculating, by each compute node for each identified logical plane that includes the compute node, an area of the identified logical plane; initiating, by a root node of the subcommunicator, a gather operation; receiving, by the root node from each compute node of the subcommunicator, each node's calculated areas as contribution data to the gather operation; and identifying, by the root node in dependence upon the received calculated areas, a logical plane of the subcommunicator having the greatest area.
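The gather pattern described above might look roughly as follows in mpi4py; `planes_containing` is a hypothetical stand-in for the real plane enumeration on the subcommunicator, so this is an illustrative sketch rather than the patented method.

```python
# mpi4py sketch of the gather pattern described above: every rank reports the
# areas of the logical planes it belongs to, and the root keeps the largest.
# `planes_containing` is a hypothetical stand-in for real plane enumeration.
from mpi4py import MPI

def planes_containing(rank):
    """Return [(plane_id, width, height), ...] for this rank (illustrative)."""
    return [(f"p{rank % 3}", 2 + rank % 3, 4), (f"q{rank % 2}", 3, 3)]

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

local = [(pid, w * h) for pid, w, h in planes_containing(rank)]  # plane areas
gathered = comm.gather(local, root=0)                            # gather at root

if rank == 0:
    best = max((entry for contrib in gathered for entry in contrib),
               key=lambda e: e[1])
    print(f"largest logical plane: {best[0]} with area {best[1]}")
```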
BlueSNP: R package for highly scalable genome-wide association studies using Hadoop clusters.
Huang, Hailiang; Tata, Sandeep; Prill, Robert J
2013-01-01
Computational workloads for genome-wide association studies (GWAS) are growing in scale and complexity, outpacing the capabilities of single-threaded software designed for personal computers. The BlueSNP R package implements GWAS statistical tests in the R programming language and executes the calculations across computer clusters configured with Apache Hadoop, a de facto standard framework for distributed data processing using the MapReduce formalism. BlueSNP makes computationally intensive analyses, such as estimating empirical p-values via data permutation, and searching for expression quantitative trait loci over thousands of genes, feasible for large genotype-phenotype datasets. http://github.com/ibm-bioinformatics/bluesnp
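The kind of computation BlueSNP distributes can be illustrated with a single-SNP permutation test in NumPy; this is not BlueSNP's R/MapReduce code, just the statistic it parallelizes, and the data are synthetic.

```python
# NumPy sketch of a permutation-based empirical p-value for one SNP, the kind
# of computation BlueSNP distributes over a Hadoop cluster.
import numpy as np

rng = np.random.default_rng(0)
genotype = rng.integers(0, 3, 500)                    # 0/1/2 minor-allele counts
phenotype = 0.2 * genotype + rng.normal(size=500)     # weak true association

def assoc_stat(g, p):
    return abs(np.corrcoef(g, p)[0, 1])               # simple test statistic

observed = assoc_stat(genotype, phenotype)
n_perm = 10_000
perm_stats = np.array([assoc_stat(genotype, rng.permutation(phenotype))
                       for _ in range(n_perm)])
p_empirical = (1 + np.sum(perm_stats >= observed)) / (1 + n_perm)
print(f"empirical p-value: {p_empirical:.4g}")
```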
Java and its future in biomedical computing.
Rodgers, R P
1996-01-01
Java, a new object-oriented computing language related to C++, is receiving considerable attention due to its use in creating network-sharable, platform-independent software modules (known as "applets") that can be used with the World Wide Web. The Web has rapidly become the most commonly used information-retrieval tool associated with the global computer network known as the Internet, and Java has the potential to further accelerate the Web's application to medical problems. Java's potentially wide acceptance due to its Web association and its own technical merits also suggests that it may become a popular language for non-Web-based, object-oriented computing. PMID:8880677
India's Computational Biology Growth and Challenges.
Chakraborty, Chiranjib; Bandyopadhyay, Sanghamitra; Agoramoorthy, Govindasamy
2016-09-01
India's computational science is growing swiftly due to the outburst of internet and information technology services. The bioinformatics sector of India has been transforming rapidly by creating a competitive position in the global bioinformatics market. Bioinformatics is widely used across India to address a wide range of biological issues. Recently, computational researchers and biologists have been collaborating on projects such as database development, sequence analysis, genomic prospects and algorithm generation. In this paper, we have presented the Indian computational biology scenario, highlighting bioinformatics-related educational activities, manpower development, internet boom, service industry, research activities, conferences and trainings undertaken by the corporate and government sectors. Nonetheless, this new field of science faces many challenges.
NASA Astrophysics Data System (ADS)
Zhuang, Wei; Mountrakis, Giorgos
2014-09-01
Large footprint waveform LiDAR sensors have been widely used for numerous airborne studies. Ground peak identification in a large footprint waveform is a significant bottleneck in exploring full usage of the waveform datasets. In the current study, an accurate and computationally efficient algorithm was developed for ground peak identification, called Filtering and Clustering Algorithm (FICA). The method was evaluated on Land, Vegetation, and Ice Sensor (LVIS) waveform datasets acquired over Central NY. FICA incorporates a set of multi-scale second derivative filters and a k-means clustering algorithm in order to avoid detecting false ground peaks. FICA was tested in five different land cover types (deciduous trees, coniferous trees, shrub, grass and developed area) and showed more accurate results when compared to existing algorithms. More specifically, compared with Gaussian decomposition, the RMSE of ground peak identification by FICA was 2.82 m (5.29 m for GD) in deciduous plots, 3.25 m (4.57 m for GD) in coniferous plots, 2.63 m (2.83 m for GD) in shrub plots, 0.82 m (0.93 m for GD) in grass plots, and 0.70 m (0.51 m for GD) in plots of developed areas. FICA performance was also relatively consistent under various slope and canopy coverage (CC) conditions. In addition, FICA showed better computational efficiency compared to existing methods. FICA's major computational and accuracy advantage is a result of the adopted multi-scale signal processing procedures that concentrate on local portions of the signal as opposed to the Gaussian decomposition that uses a curve-fitting strategy applied to the entire signal. The FICA algorithm is a good candidate for large-scale implementation on future space-borne waveform LiDAR sensors.
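A rough sketch of the two ingredients named above, multi-scale second-derivative filtering and k-means clustering of candidate returns, applied to a synthetic waveform. This is not the published FICA implementation; the thresholds, scales, and final ground-peak rule are illustrative assumptions.

```python
# Rough sketch: multi-scale second-derivative filtering plus k-means clustering
# of candidate returns on a synthetic waveform. Not the published FICA code;
# thresholds, scales, and the ground-peak rule are illustrative.
import numpy as np
from scipy.ndimage import gaussian_filter1d
from sklearn.cluster import KMeans

t = np.arange(400)                                   # waveform bins
wave = (1.0 * np.exp(-0.5 * ((t - 150) / 12) ** 2)   # canopy return
        + 0.7 * np.exp(-0.5 * ((t - 310) / 6) ** 2)  # ground return
        + np.random.default_rng(2).normal(0, 0.02, t.size))

candidates = []
for sigma in (3, 6, 12):                             # multi-scale analysis
    d2 = gaussian_filter1d(wave, sigma, order=2)     # smoothed 2nd derivative
    thr = d2.mean() - 2 * d2.std()
    candidates.extend(np.flatnonzero(d2 < thr))      # strongly concave-down bins

X = np.array(candidates, dtype=float).reshape(-1, 1)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
centers = sorted(X[labels == k].mean() for k in range(2))
print(f"cluster centers (bins): {centers}; the last is taken as the ground peak")
```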
A Survey of Distributed Optimization and Control Algorithms for Electric Power Systems
Molzahn, Daniel K.; Dorfler, Florian K.; Sandberg, Henrik; ...
2017-07-25
Historically, centrally computed algorithms have been the primary means of power system optimization and control. With increasing penetrations of distributed energy resources requiring optimization and control of power systems with many controllable devices, distributed algorithms have been the subject of significant research interest. Here, this paper surveys the literature of distributed algorithms with applications to optimization and control of power systems. In particular, this paper reviews distributed algorithms for offline solution of optimal power flow (OPF) problems as well as online algorithms for real-time solution of OPF, optimal frequency control, optimal voltage control, and optimal wide-area control problems.
A Survey of Distributed Optimization and Control Algorithms for Electric Power Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Molzahn, Daniel K.; Dorfler, Florian K.; Sandberg, Henrik
Historically, centrally computed algorithms have been the primary means of power system optimization and control. With increasing penetrations of distributed energy resources requiring optimization and control of power systems with many controllable devices, distributed algorithms have been the subject of significant research interest. Here, this paper surveys the literature of distributed algorithms with applications to optimization and control of power systems. In particular, this paper reviews distributed algorithms for offline solution of optimal power flow (OPF) problems as well as online algorithms for real-time solution of OPF, optimal frequency control, optimal voltage control, and optimal wide-area control problems.
Childrens' health, community networks, and the NII: making the connections
NASA Astrophysics Data System (ADS)
Deutsch, Larry; Bronzino, Joseph D.; Farmer, Samuel J.
1996-02-01
To provide quality health care, clinicians need to be well informed. For health care to be cost effective and efficient, redundant services must be eliminated. Urban centers and rural areas need regional health information networks to ensure that primary health care is delivered with good continuity and coordination among providers. This paper describes the development of a city-wide computer-based pediatric health care network to improve decision-making and follow-through, and to provide aggregate data for public health purposes. The design criteria and process for this regional system are presented, addressing issues of network architecture, establishment of a uniform data base, and confidentiality.
Nonhuman gamblers: lessons from rodents, primates, and robots
Paglieri, Fabio; Addessi, Elsa; De Petrillo, Francesca; Laviola, Giovanni; Mirolli, Marco; Parisi, Domenico; Petrosino, Giancarlo; Ventricelli, Marialba; Zoratto, Francesca; Adriani, Walter
2014-01-01
The search for neuronal and psychological underpinnings of pathological gambling in humans would benefit from investigating related phenomena also outside of our species. In this paper, we present a survey of studies in three widely different populations of agents, namely rodents, non-human primates, and robots. Each of these populations offers valuable and complementary insights on the topic, as the literature demonstrates. In addition, we highlight the deep and complex connections between relevant results across these different areas of research (i.e., cognitive and computational neuroscience, neuroethology, cognitive primatology, neuropsychiatry, evolutionary robotics), to make the case for a greater degree of methodological integration in future studies on pathological gambling. PMID:24574984
NASA Technical Reports Server (NTRS)
Moore, Reagan W.; Jagatheesan, Arun; Rajasekar, Arcot; Wan, Michael; Schroeder, Wayne
2004-01-01
The "Grid" is an emerging infrastructure for coordinating access across autonomous organizations to distributed, heterogeneous computation and data resources. Data grids are being built around the world as the next generation data handling systems for sharing, publishing, and preserving data residing on storage systems located in multiple administrative domains. A data grid provides logical namespaces for users, digital entities and storage resources to create persistent identifiers for controlling access, enabling discovery, and managing wide area latencies. This paper introduces data grids and describes data grid use cases. The relevance of data grids to digital libraries and persistent archives is demonstrated, and research issues in data grids and grid dataflow management systems are discussed.
Graphene-Based Josephson-Junction Single-Photon Detector
NASA Astrophysics Data System (ADS)
Walsh, Evan D.; Efetov, Dmitri K.; Lee, Gil-Ho; Heuck, Mikkel; Crossno, Jesse; Ohki, Thomas A.; Kim, Philip; Englund, Dirk; Fong, Kin Chung
2017-08-01
We propose to use graphene-based Josephson junctions (GJJs) to detect single photons in a wide electromagnetic spectrum from visible to radio frequencies. Our approach takes advantage of the exceptionally low electronic heat capacity of monolayer graphene and its constricted thermal conductance to its phonon degrees of freedom. Such a system could provide high-sensitivity photon detection required for research areas including quantum information processing and radio astronomy. As an example, we present our device concepts for GJJ single-photon detectors in both the microwave and infrared regimes. The dark count rate and intrinsic quantum efficiency are computed based on parameters from a measured GJJ, demonstrating feasibility within existing technologies.
Swarm intelligence metaheuristics for enhanced data analysis and optimization.
Hanrahan, Grady
2011-09-21
The swarm intelligence (SI) computing paradigm has proven itself as a comprehensive means of solving complicated analytical chemistry problems by emulating biologically-inspired processes. As global optimum search metaheuristics, associated algorithms have been widely used in training neural networks, function optimization, prediction and classification, and in a variety of process-based analytical applications. The goal of this review is to provide readers with critical insight into the utility of swarm intelligence tools as methods for solving complex chemical problems. Consideration will be given to algorithm development, ease of implementation and model performance, detailing subsequent influences on a number of application areas in the analytical, bioanalytical and detection sciences.
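A minimal particle swarm optimization loop, the canonical SI metaheuristic referenced in the review, minimizing a toy function; it is generic and not tied to any of the chemistry applications discussed.

```python
# Minimal particle swarm optimization (PSO) minimizing a toy 2-D function.
import numpy as np

def pso(objective, dim=2, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    rng = np.random.default_rng(0)
    x = rng.uniform(-5, 5, (n_particles, dim))        # positions
    v = np.zeros_like(x)                              # velocities
    pbest, pbest_val = x.copy(), np.apply_along_axis(objective, 1, x)
    gbest = pbest[pbest_val.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v
        vals = np.apply_along_axis(objective, 1, x)
        better = vals < pbest_val                     # update personal bests
        pbest[better], pbest_val[better] = x[better], vals[better]
        gbest = pbest[pbest_val.argmin()].copy()      # update global best
    return gbest, pbest_val.min()

sphere = lambda p: float(np.sum(p ** 2))
print(pso(sphere))        # converges near the origin, value near 0
```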
Dynamic Programming and Graph Algorithms in Computer Vision*
Felzenszwalb, Pedro F.; Zabih, Ramin
2013-01-01
Optimization is a powerful paradigm for expressing and solving problems in a wide range of areas, and has been successfully applied to many vision problems. Discrete optimization techniques are especially interesting, since by carefully exploiting problem structure they often provide non-trivial guarantees concerning solution quality. In this paper we briefly review dynamic programming and graph algorithms, and discuss representative examples of how these discrete optimization techniques have been applied to some classical vision problems. We focus on the low-level vision problem of stereo; the mid-level problem of interactive object segmentation; and the high-level problem of model-based recognition. PMID:20660950
An Efficient Image Recovery Algorithm for Diffraction Tomography Systems
NASA Technical Reports Server (NTRS)
Jin, Michael Y.
1993-01-01
A diffraction tomography system has potential application in the ultrasonic medical imaging area. It is capable of achieving imagery with the ultimate resolution of one quarter the wavelength by collecting ultrasonic backscattering data from a circular array of sensors and reconstructing the object reflectivity using a digital image recovery algorithm performed by a computer. One advantage of such a system is that it allows a relatively lower frequency wave to penetrate more deeply into the object and still achieve imagery with a reasonable resolution. An efficient image recovery algorithm for the diffraction tomography system was originally developed for processing wide-beam spaceborne SAR data...
Computers in Undergraduate Science Education. Conference Proceedings.
ERIC Educational Resources Information Center
Blum, Ronald, Ed.
Six areas of computer use in undergraduate education, particularly in the fields of mathematics and physics, are discussed in these proceedings. The areas included are: the computational mode; computer graphics; the simulation mode; analog computing; computer-assisted instruction; and the current politics and management of college level computer…
5 CFR 591.226 - How does OPM apply the CPIs?
Code of Federal Regulations, 2013 CFR
2013-01-01
... survey. (1) Step 1. OPM computes the annual or biennial CPI change for the COLA area. (2) Step 2. OPM computes the annual or biennial CPI change for the DC area. (3) Step 3. OPM multiplies the COLA area price index from the last survey by the COLA area CPI change computed in step 1 divided by the DC area CPI...
5 CFR 591.226 - How does OPM apply the CPIs?
Code of Federal Regulations, 2014 CFR
2014-01-01
... survey. (1) Step 1. OPM computes the annual or biennial CPI change for the COLA area. (2) Step 2. OPM computes the annual or biennial CPI change for the DC area. (3) Step 3. OPM multiplies the COLA area price index from the last survey by the COLA area CPI change computed in step 1 divided by the DC area CPI...
5 CFR 591.226 - How does OPM apply the CPIs?
Code of Federal Regulations, 2011 CFR
2011-01-01
... survey. (1) Step 1. OPM computes the annual or biennial CPI change for the COLA area. (2) Step 2. OPM computes the annual or biennial CPI change for the DC area. (3) Step 3. OPM multiplies the COLA area price index from the last survey by the COLA area CPI change computed in step 1 divided by the DC area CPI...
5 CFR 591.226 - How does OPM apply the CPIs?
Code of Federal Regulations, 2012 CFR
2012-01-01
... survey. (1) Step 1. OPM computes the annual or biennial CPI change for the COLA area. (2) Step 2. OPM computes the annual or biennial CPI change for the DC area. (3) Step 3. OPM multiplies the COLA area price index from the last survey by the COLA area CPI change computed in step 1 divided by the DC area CPI...
5 CFR 591.226 - How does OPM apply the CPIs?
Code of Federal Regulations, 2010 CFR
2010-01-01
... survey. (1) Step 1. OPM computes the annual or biennial CPI change for the COLA area. (2) Step 2. OPM computes the annual or biennial CPI change for the DC area. (3) Step 3. OPM multiplies the COLA area price index from the last survey by the COLA area CPI change computed in step 1 divided by the DC area CPI...
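A worked example of the arithmetic in the regulation text above: step 3 scales the previous COLA-area index by the ratio of the COLA-area CPI change to the DC-area CPI change. The numbers below are invented.

```python
# Worked example of the step-3 arithmetic described in the regulation text
# above. The index and CPI change values are invented for illustration.
last_survey_index = 112.0     # COLA area price index from the last survey
cola_cpi_change = 1.031       # step 1: COLA-area CPI change (e.g. +3.1%)
dc_cpi_change = 1.024         # step 2: DC-area CPI change (e.g. +2.4%)

updated_index = last_survey_index * cola_cpi_change / dc_cpi_change  # step 3
print(f"updated COLA area price index: {updated_index:.2f}")         # ~112.77
```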
NASA Technical Reports Server (NTRS)
Johnston, William E.; Gannon, Dennis; Nitzberg, Bill; Feiereisen, William (Technical Monitor)
2000-01-01
The term "Grid" refers to distributed, high performance computing and data handling infrastructure that incorporates geographically and organizationally dispersed, heterogeneous resources that are persistent and supported. The vision for NASN's Information Power Grid - a computing and data Grid - is that it will provide significant new capabilities to scientists and engineers by facilitating routine construction of information based problem solving environments / frameworks that will knit together widely distributed computing, data, instrument, and human resources into just-in-time systems that can address complex and large-scale computing and data analysis problems. IPG development and deployment is addressing requirements obtained by analyzing a number of different application areas, in particular from the NASA Aero-Space Technology Enterprise. This analysis has focussed primarily on two types of users: The scientist / design engineer whose primary interest is problem solving (e.g., determining wing aerodynamic characteristics in many different operating environments), and whose primary interface to IPG will be through various sorts of problem solving frameworks. The second type of user if the tool designer: The computational scientists who convert physics and mathematics into code that can simulate the physical world. These are the two primary users of IPG, and they have rather different requirements. This paper describes the current state of IPG (the operational testbed), the set of capabilities being put into place for the operational prototype IPG, as well as some of the longer term R&D tasks.
Reis, H; Rasulev, B; Papadopoulos, M G; Leszczynski, J
2015-01-01
Fullerene and its derivatives are currently one of the most intensively investigated species in the area of nanomedicine and nanochemistry. Various unique properties of fullerenes are responsible for their wide range applications in industry, biology and medicine. A large pool of functionalized C60 and C70 fullerenes is investigated theoretically at different levels of quantum-mechanical theory. The semiempirial PM6 method, density functional theory with the B3LYP functional, and correlated ab initio MP2 method are employed to compute the optimized structures, and an array of properties for the considered species. In addition to the calculations for isolated molecules, the results of solution calculations are also reported at the DFT level, using the polarizable continuum model (PCM). Ionization potentials (IPs) and electron affinities (EAs) are computed by means of Koopmans' theorem as well as with the more accurate but computationally expensive ΔSCF method. Both procedures yield comparable values, while comparison of IPs and EAs computed with different quantum-mechanical methods shows surprisingly large differences. Harmonic vibrational frequencies are computed at the PM6 and B3LYP levels of theory and compared with each other. A possible application of the frequencies as 3D descriptors in the EVA (EigenVAlues) method is shown. All the computed data are made available, and may be used to replace experimental data in routine applications where large amounts of data are required, e.g. in structure-activity relationship studies of the toxicity of fullerene derivatives.
NASA Astrophysics Data System (ADS)
Contreras Vargas, M. T.; Escauriaza, C. R.; Westerink, J. J.
2017-12-01
In recent years, the occurrence of flash floods and landslides produced by hydrometeorological events in Andean watersheds has had devastating consequences in urban and rural areas near the mountains. Two factors have hindered hazard forecasting in the region: 1) the spatial and temporal variability of climate conditions, which reduces the time range over which storm features can be predicted; and 2) the complexity of the basin morphology that characterizes the Andean region, which increases the velocity and the sediment transport capacity of flows that reach urbanized areas. Hydrodynamic models have become key tools to assess potential flood risks. Two-dimensional (2D) models based on the shallow-water equations are widely used to determine, with high accuracy and resolution, the evolution of flow depths and velocities during floods. However, the high computational requirements and long run times have encouraged research into more efficient methodologies for predicting flood propagation in real time. Our objective is to develop new surrogate models (i.e. metamodeling) to quasi-instantaneously evaluate flood propagation in the Andes foothills. By means of a small set of parameters, we define storms for a wide range of meteorological conditions. Using a 2D hydrodynamic model coupled in mass and momentum with the sediment concentration, we compute the propagation of a set of floods at high fidelity. The results are used as a database for interpolation/regression that efficiently approximates the flow depth and velocities at critical points during real storms. This is the first application of surrogate models to evaluate flood propagation in the Andes foothills, improving the efficiency of flood hazard prediction. The model also opens new opportunities to improve early warning systems, helping decision makers to inform citizens and enhancing the resilience of cities near mountain regions. This work has been supported by CONICYT/FONDAP grant 15110017, and by the Vice Chancellor of Research of the Pontificia Universidad Catolica de Chile, through the Research Internationalization Grant, PUC1566 funded by MINEDUC.
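A hedged sketch of the surrogate (metamodel) idea: fit a fast interpolator that maps storm parameters to a high-fidelity output such as peak flow depth at one critical point, then evaluate it quasi-instantaneously for a new storm. The parameters and data below are synthetic stand-ins; the real database comes from the 2D simulations described above.

```python
# Sketch of the surrogate (metamodel) idea: a fast interpolator mapping storm
# parameters to a high-fidelity output (here, peak flow depth at one point).
# The data are synthetic; the real database comes from 2D hydrodynamic runs.
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(3)
# Storm parameters: total rainfall [mm], duration [h], freezing level [m]
params = rng.uniform([10, 1, 1500], [120, 24, 4000], size=(60, 3))
peak_depth = (0.01 * params[:, 0] * np.sqrt(24 / params[:, 1])
              + 1e-4 * (params[:, 2] - 1500))          # stand-in for simulations

surrogate = RBFInterpolator(params, peak_depth, smoothing=1e-6)

new_storm = np.array([[80.0, 6.0, 3200.0]])            # incoming forecast
print(f"predicted peak depth: {surrogate(new_storm)[0]:.2f} m")
```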
40 CFR 52.326 - Area-wide nitrogen oxides (NOX) exemptions.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 3 2010-07-01 2010-07-01 false Area-wide nitrogen oxides (NOX) exemptions. 52.326 Section 52.326 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) APPROVAL AND PROMULGATION OF IMPLEMENTATION PLANS Colorado § 52.326 Area-wide nitrogen...
40 CFR 52.326 - Area-wide nitrogen oxides (NOX) exemptions.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 3 2011-07-01 2011-07-01 false Area-wide nitrogen oxides (NOX) exemptions. 52.326 Section 52.326 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) APPROVAL AND PROMULGATION OF IMPLEMENTATION PLANS Colorado § 52.326 Area-wide nitrogen...
40 CFR 52.326 - Area-wide nitrogen oxides (NOX) exemptions.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 3 2012-07-01 2012-07-01 false Area-wide nitrogen oxides (NOX) exemptions. 52.326 Section 52.326 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) APPROVAL AND PROMULGATION OF IMPLEMENTATION PLANS Colorado § 52.326 Area-wide nitrogen...
40 CFR 52.326 - Area-wide nitrogen oxides (NOX) exemptions.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 3 2014-07-01 2014-07-01 false Area-wide nitrogen oxides (NOX) exemptions. 52.326 Section 52.326 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) APPROVAL AND PROMULGATION OF IMPLEMENTATION PLANS Colorado § 52.326 Area-wide nitrogen...
40 CFR 52.326 - Area-wide nitrogen oxides (NOX) exemptions.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 3 2013-07-01 2013-07-01 false Area-wide nitrogen oxides (NOX) exemptions. 52.326 Section 52.326 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) APPROVAL AND PROMULGATION OF IMPLEMENTATION PLANS Colorado § 52.326 Area-wide nitrogen...
Scilab software as an alternative low-cost computing in solving the linear equations problem
NASA Astrophysics Data System (ADS)
Agus, Fahrul; Haviluddin
2017-02-01
Numerical computation packages are widely used both in teaching and in research. These packages include proprietary (licensed) and open-source (non-proprietary) software. One reason to use such a package is the complexity of the mathematical functions involved (e.g., linear problems); in addition, the number of variables in linear and non-linear functions keeps increasing. The aim of this paper was to reflect on key aspects related to method, didactics and creative praxis in the teaching of linear equations in higher education. If implemented, this could contribute to better learning in mathematics (i.e., solving simultaneous linear equations), which is essential for future engineers. The focus of this study was to introduce the numerical computation package Scilab as an alternative low-cost computing environment. Activities related to the mathematical models were proposed and carried out in Scilab. In this experiment, four numerical methods were implemented: Gaussian elimination, Gauss-Jordan, inverse matrix, and lower-upper (LU) decomposition. The results of this study showed that routines for these numerical methods can be created and explored using Scilab procedures, and that such routines can then be exploited as teaching material for a course.
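Although the paper works in Scilab, the same solution strategies can be sketched in any numerical environment. The Python/NumPy stand-in below contrasts the inverse-matrix approach, LU factorization, and a library elimination solver on a small teaching system; it illustrates the method comparison, not the authors' Scilab routines.

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve

# Example system A x = b with three unknowns (classic teaching values).
A = np.array([[2.0, 1.0, -1.0],
              [-3.0, -1.0, 2.0],
              [-2.0, 1.0, 2.0]])
b = np.array([8.0, -11.0, -3.0])

# Inverse-matrix method: x = A^{-1} b (pedagogically simple, numerically wasteful).
x_inv = np.linalg.inv(A) @ b

# LU decomposition: factor once, then solve by forward/back substitution.
lu, piv = lu_factor(A)
x_lu = lu_solve((lu, piv), b)

# Library elimination solver as a reference (Gaussian elimination with pivoting).
x_ref = np.linalg.solve(A, b)

print(x_inv, x_lu, x_ref)   # all three should agree: [2. 3. -1.]
```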
Computation of discharge using the index-velocity method in tidally affected areas
Ruhl, Catherine A.; Simpson, Michael R.
2005-01-01
Computation of a discharge time-series in a tidally affected area is a two-step process. First, the cross-sectional area is computed on the basis of measured water levels and the mean cross-sectional velocity is computed on the basis of the measured index velocity. Then discharge is calculated as the product of the area and the mean velocity. Daily mean discharge is computed as the daily average of the low-pass filtered discharge. The Sacramento-San Joaquin River Delta and San Francisco Bay, California, constitute an area that is strongly influenced by the tides and is therefore used as an example of how this methodology is applied.
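A minimal sketch of the two-step computation on synthetic 15-minute data follows; the stage-area and index-velocity rating coefficients and the Butterworth low-pass filter are illustrative assumptions (operational computations use site-specific ratings and tidal filters).

```python
import numpy as np
from scipy.signal import butter, filtfilt

# Hypothetical 15-minute records of stage [m] and index velocity [m/s]
# at a tidally affected station (synthetic data for illustration only).
t = np.arange(0, 30 * 96) * 0.25                      # 30 days, 15-min steps, hours
stage = 2.0 + 0.8 * np.sin(2 * np.pi * t / 12.42)     # tidal stage
v_index = 0.1 + 0.5 * np.cos(2 * np.pi * t / 12.42)   # measured index velocity

# Step 1: apply site-specific ratings (coefficients here are made up).
area = 150.0 + 60.0 * stage          # stage-area rating [m^2]
v_mean = 0.05 + 0.9 * v_index        # index-velocity rating [m/s]

# Step 2: discharge is the product of area and mean velocity.
q = area * v_mean                    # [m^3/s]

# Daily means are taken from the low-pass filtered (tide-removed) discharge.
fs = 4.0                             # samples per hour
b_f, a_f = butter(4, (1.0 / 30.0) / (fs / 2.0))   # ~30 h cutoff low-pass filter
q_filtered = filtfilt(b_f, a_f, q)
q_daily = q_filtered.reshape(30, 96).mean(axis=1)
print(q_daily[:3])
```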
Engineering Near-Field Transport of Energy using Nanostructured Materials
2015-12-12
The transport of heat at the nanometer scale is becoming increasingly important for a wide range of nanotechnology applications. Recent computational studies on near-field radiative heat transfer (NFRHT) suggest that radiative energy transport between suitably chosen…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wardle, K.E.
2013-07-01
Liquid-liquid contacting equipment used in solvent extraction processes has the dual purpose of mixing and separating two immiscible fluids. Consequently, such devices inherently encompass a wide variety of multiphase flow regimes. A hybrid multiphase computational fluid dynamics (CFD) solver which combines the Eulerian multi-fluid method with VOF (volume of fluid) sharp interface capturing has been developed for application to annular centrifugal contactors. This solver has been extended to enable prediction of mean droplet size and liquid-liquid interfacial area through a single moment population balance method. Simulations of liquid-liquid mixing in a simplified geometry and a model annular centrifugal contactor are reported with droplet breakup/coalescence models being calibrated versus available experimental data. Quantitative comparison is made for two different housing vane geometries and it is found that the predicted droplet size is significantly smaller for vane geometries which result in higher annular liquid holdup.
Sandia National Laboratories analysis code data base
NASA Astrophysics Data System (ADS)
Peterson, C. W.
1994-11-01
Sandia National Laboratories' mission is to solve important problems in the areas of national defense, energy security, environmental integrity, and industrial technology. The laboratories' strategy for accomplishing this mission is to conduct research to provide an understanding of the important physical phenomena underlying any problem, and then to construct validated computational models of the phenomena which can be used as tools to solve the problem. In the course of implementing this strategy, Sandia's technical staff has produced a wide variety of numerical problem-solving tools which they use regularly in the design, analysis, performance prediction, and optimization of Sandia components, systems, and manufacturing processes. This report provides the relevant technical and accessibility data on the numerical codes used at Sandia, including information on the technical competency or capability area that each code addresses, code 'ownership' and release status, and references describing the physical models and numerical implementation.
Evaluation of terrestrial photogrammetric point clouds derived from thermal imagery
NASA Astrophysics Data System (ADS)
Metcalf, Jeremy P.; Olsen, Richard C.
2016-05-01
Computer vision and photogrammetric techniques have been widely applied to digital imagery producing high density 3D point clouds. Using thermal imagery as input, the same techniques can be applied to infrared data to produce point clouds in 3D space, providing surface temperature information. The work presented here is an evaluation of the accuracy of 3D reconstruction of point clouds produced using thermal imagery. An urban scene was imaged over an area at the Naval Postgraduate School, Monterey, CA, viewing from above as with an airborne system. Terrestrial thermal and RGB imagery were collected from a rooftop overlooking the site using a FLIR SC8200 MWIR camera and a Canon T1i DSLR. In order to spatially align each dataset, ground control points were placed throughout the study area using Trimble R10 GNSS receivers operating in RTK mode. Each image dataset is processed to produce a dense point cloud for 3D evaluation.
Applications of software-defined radio (SDR) technology in hospital environments.
Chávez-Santiago, Raúl; Mateska, Aleksandra; Chomu, Konstantin; Gavrilovska, Liljana; Balasingham, Ilangko
2013-01-01
A software-defined radio (SDR) is a radio communication system where the major part of its functionality is implemented by means of software in a personal computer or embedded system. Such a design paradigm has the major advantage of producing devices that can receive and transmit widely different radio protocols based solely on the software used. This flexibility opens several application opportunities in hospital environments, where a large number of wired and wireless electronic devices must coexist in confined areas like operating rooms and intensive care units. This paper outlines some possible applications in the 2360-2500 MHz frequency band. These applications include the integration of wireless medical devices in a common communication platform for seamless interoperability, and cognitive radio (CR) for body area networks (BANs) and wireless sensor networks (WSNs) for medical environmental surveillance. The description of a proof-of-concept CR prototype is also presented.
ERIC Educational Resources Information Center
Hofstetter, Fred T.
Dealing exclusively with instructional computing, this paper describes how computers are delivering instruction in a wide variety of subjects to students of all ages and explains why computer-based education is currently having a profound impact on education. After a discussion of roots and origins, computer applications are described for…
ERIC Educational Resources Information Center
Lin, Che-Li; Liang, Jyh-Chong; Su, Yi-Ching; Tsai, Chin-Chung
2013-01-01
Teacher-centered instruction has been widely adopted in college computer science classrooms and has some benefits in training computer science undergraduates. Meanwhile, student-centered contexts have been advocated to promote computer science education. How computer science learners respond to or prefer the two types of teacher authority,…
Johnson, Timothy C.; Versteeg, Roelof J.; Ward, Andy; Day-Lewis, Frederick D.; Revil, André
2010-01-01
Electrical geophysical methods have found wide use in the growing discipline of hydrogeophysics for characterizing the electrical properties of the subsurface and for monitoring subsurface processes in terms of the spatiotemporal changes in subsurface conductivity, chargeability, and source currents they govern. Presently, multichannel and multielectrode data collection systems can collect large data sets in relatively short periods of time. Practitioners, however, often are unable to fully utilize these large data sets and the information they contain because of standard desktop-computer processing limitations. These limitations can be addressed by utilizing the storage and processing capabilities of parallel computing environments. We have developed a parallel distributed-memory forward and inverse modeling algorithm for analyzing resistivity and time-domain induced polarization (IP) data. The primary components of the parallel computations include distributed computation of the pole solutions in forward mode, distributed storage and computation of the Jacobian matrix in inverse mode, and parallel execution of the inverse equation solver. We have tested the corresponding parallel code in three efforts: (1) resistivity characterization of the Hanford 300 Area Integrated Field Research Challenge site in Hanford, Washington, U.S.A., (2) resistivity characterization of a volcanic island in the southern Tyrrhenian Sea in Italy, and (3) resistivity and IP monitoring of biostimulation at a Superfund site in Brandywine, Maryland, U.S.A. Inverse analysis of each of these data sets would be limited or impossible in a standard serial computing environment, which underscores the need for parallel high-performance computing to fully utilize the potential of electrical geophysical methods in hydrogeophysical applications.
Computer Science Research at Langley
NASA Technical Reports Server (NTRS)
Voigt, S. J. (Editor)
1982-01-01
A workshop was held at Langley Research Center, November 2-5, 1981, to highlight ongoing computer science research at Langley and to identify additional areas of research based upon the computer user requirements. A panel discussion was held in each of nine application areas, and these are summarized in the proceedings. Slides presented by the invited speakers are also included. A survey of scientific, business, data reduction, and microprocessor computer users helped identify areas of focus for the workshop. Several areas of computer science which are of most concern to the Langley computer users were identified during the workshop discussions. These include graphics, distributed processing, programmer support systems and tools, database management, and numerical methods.
FAA computer security : recommendations to address continuing weaknesses
DOT National Transportation Integrated Search
2000-12-01
In September, testimony before the Committee on Science, House of Representatives, focused on the Federal Aviation Administration's (FAA) computer security program. In brief, we reported that FAA's agency-wide computer security program has serious, p...
Luckman, Matthew; Hans, Didier; Cortez, Natalia; Nishiyama, Kyle K; Agarawal, Sanchita; Zhang, Chengchen; Nikkel, Lucas; Iyer, Sapna; Fusaro, Maria; Guo, Edward X; McMahon, Donald J; Shane, Elizabeth; Nickolas, Thomas L
2017-04-03
Studies using high-resolution peripheral quantitative computed tomography showed progressive abnormalities in cortical and trabecular microarchitecture and biomechanical competence over the first year after kidney transplantation. However, high-resolution peripheral computed tomography is a research tool lacking wide availability. In contrast, the trabecular bone score is a novel and widely available tool that uses gray-scale variograms of the spine image from dual-energy x-ray absorptiometry to assess trabecular quality. There are no studies assessing whether trabecular bone score characterizes bone quality in kidney transplant recipients. Between 2009 and 2010, we conducted a study to assess changes in peripheral skeletal microarchitecture, measured by high-resolution peripheral computed tomography, during the first year after transplantation in 47 patients managed with early corticosteroid-withdrawal immunosuppression. All adult first-time transplant candidates were eligible. Patients underwent imaging with high-resolution peripheral computed tomography and dual-energy x-ray absorptiometry pretransplantation and 3, 6, and 12 months post-transplantation. We now test if, during the first year after transplantation, trabecular bone score assesses the evolution of bone microarchitecture and biomechanical competence as determined by high-resolution peripheral computed tomography. At baseline and follow-up, among the 72% and 78%, respectively, of patients having normal bone mineral density by dual-energy x-ray absorptiometry, 53% and 50%, respectively, were classified by trabecular bone score as having high fracture risk. At baseline, trabecular bone score correlated with spine, hip, and ultradistal radius bone mineral density by dual-energy x-ray absorptiometry and cortical area, density, thickness, and porosity; trabecular density, thickness, separation, and heterogeneity; and stiffness and failure load by high-resolution peripheral computed tomography. Longitudinally, each percentage increase in trabecular bone score was associated with increases in trabecular number (0.35%±1.4%); decreases in trabecular thickness (-0.45%±0.15%), separation (-0.40%±0.15%), and network heterogeneity (-0.48%±0.20%); and increases in failure load (0.22%±0.09%) by high-resolution peripheral computed tomography (all P <0.05). Trabecular bone score may be a useful method to assess and monitor bone quality and strength and classify fracture risk in kidney transplant recipients. Copyright © 2017 by the American Society of Nephrology.
Luckman, Matthew; Hans, Didier; Cortez, Natalia; Nishiyama, Kyle K.; Agarawal, Sanchita; Zhang, Chengchen; Nikkel, Lucas; Iyer, Sapna; Fusaro, Maria; Guo, Edward X.; McMahon, Donald J.; Shane, Elizabeth
2017-01-01
Background and objectives Studies using high-resolution peripheral quantitative computed tomography showed progressive abnormalities in cortical and trabecular microarchitecture and biomechanical competence over the first year after kidney transplantation. However, high-resolution peripheral computed tomography is a research tool lacking wide availability. In contrast, the trabecular bone score is a novel and widely available tool that uses gray-scale variograms of the spine image from dual-energy x-ray absorptiometry to assess trabecular quality. There are no studies assessing whether trabecular bone score characterizes bone quality in kidney transplant recipients. Design, settings, participants, & measurements Between 2009 and 2010, we conducted a study to assess changes in peripheral skeletal microarchitecture, measured by high-resolution peripheral computed tomography, during the first year after transplantation in 47 patients managed with early corticosteroid–withdrawal immunosuppression. All adult first-time transplant candidates were eligible. Patients underwent imaging with high-resolution peripheral computed tomography and dual-energy x-ray absorptiometry pretransplantation and 3, 6, and 12 months post-transplantation. We now test if, during the first year after transplantation, trabecular bone score assesses the evolution of bone microarchitecture and biomechanical competence as determined by high-resolution peripheral computed tomography. Results At baseline and follow-up, among the 72% and 78%, respectively, of patients having normal bone mineral density by dual-energy x-ray absorptiometry, 53% and 50%, respectively, were classified by trabecular bone score as having high fracture risk. At baseline, trabecular bone score correlated with spine, hip, and ultradistal radius bone mineral density by dual-energy x-ray absorptiometry and cortical area, density, thickness, and porosity; trabecular density, thickness, separation, and heterogeneity; and stiffness and failure load by high-resolution peripheral computed tomography. Longitudinally, each percentage increase in trabecular bone score was associated with increases in trabecular number (0.35%±1.4%); decreases in trabecular thickness (−0.45%±0.15%), separation (−0.40%±0.15%), and network heterogeneity (−0.48%±0.20%); and increases in failure load (0.22%±0.09%) by high-resolution peripheral computed tomography (all P<0.05). Conclusions Trabecular bone score may be a useful method to assess and monitor bone quality and strength and classify fracture risk in kidney transplant recipients. PMID:28348031
Experiments and Analyses of Data Transfers Over Wide-Area Dedicated Connections
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rao, Nageswara S.; Liu, Qiang; Sen, Satyabrata
Dedicated wide-area network connections are increasingly employed in high-performance computing and big data scenarios. One might expect the performance and dynamics of data transfers over such connections to be easy to analyze due to the lack of competing traffic. However, non-linear transport dynamics and end-system complexities (e.g., multi-core hosts and distributed filesystems) can in fact make analysis surprisingly challenging. We present extensive measurements of memory-to-memory and disk-to-disk file transfers over 10 Gbps physical and emulated connections with 0–366 ms round trip times (RTTs). For memory-to-memory transfers, profiles of both TCP and UDT throughput as a function of RTT show concave and convex regions; large buffer sizes and more parallel flows lead to wider concave regions, which are highly desirable. TCP and UDT both also display complex throughput dynamics, as indicated by their Poincaré maps and Lyapunov exponents. For disk-to-disk transfers, we determine that high throughput can be achieved via a combination of parallel I/O threads, parallel network threads, and direct I/O mode. Our measurements also show that Lustre filesystems can be mounted over long-haul connections using LNet routers, although challenges remain in jointly optimizing file I/O and transport method parameters to achieve peak throughput.
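As a small illustration of the throughput-dynamics analysis, the sketch below builds a Poincaré (return) map from a throughput trace; the trace is synthetic and the construction is generic, not the authors' exact procedure.

```python
import numpy as np

# Synthetic throughput trace [Gb/s], one sample per second; in the study such
# traces come from memory-to-memory transfer measurements over the connection.
rng = np.random.default_rng(0)
throughput = np.clip(9.0 + np.cumsum(rng.normal(0.0, 0.05, 600)), 0.0, 9.9)

# Poincare (return) map: pair each sample with its successor. Structure in this
# scatter (bands, loops) hints at deterministic dynamics in the transport
# protocol; a diffuse cloud suggests noise-dominated behavior.
x_t = throughput[:-1]
x_next = throughput[1:]
return_map = np.column_stack([x_t, x_next])
print(return_map[:5])
```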
On Data Transfers Over Wide-Area Dedicated Connections
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rao, Nageswara S.; Liu, Qiang
Dedicated wide-area network connections are employed in big data and high-performance computing scenarios, since the absence of cross-traffic promises to make it easier to analyze and optimize data transfers over them. However, nonlinear transport dynamics and end-system complexity due to multi-core hosts and distributed file systems make these tasks surprisingly challenging. We present an overview of methods to analyze memory and disk file transfers using extensive measurements over 10 Gbps physical and emulated connections with 0–366 ms round trip times (RTTs). For memory transfers, we derive performance profiles of TCP and UDT throughput as a function of RTT, which show concave regions in contrast to entirely convex regions predicted by previous models. These highly desirable concave regions can be expanded by utilizing large buffers and more parallel flows. We also present Poincaré maps and Lyapunov exponents of TCP and UDT throughput traces that indicate complex throughput dynamics. For disk file transfers, we show that throughput can be optimized using a combination of parallel I/O and network threads under direct I/O mode. Our initial throughput measurements of Lustre filesystems mounted over long-haul connections using LNet routers show convex profiles indicative of I/O limits.
A novel 3D shape descriptor for automatic retrieval of anatomical structures from medical images
NASA Astrophysics Data System (ADS)
Nunes, Fátima L. S.; Bergamasco, Leila C. C.; Delmondes, Pedro H.; Valverde, Miguel A. G.; Jackowski, Marcel P.
2017-03-01
Content-based image retrieval (CBIR) aims at retrieving from a database objects that are similar to an object provided by a query, by taking into consideration a set of extracted features. While CBIR has been widely applied in the two-dimensional image domain, the retrieval of 3D objects from medical image datasets using CBIR remains to be explored. In this context, the development of descriptors that can capture information specific to organs or structures is desirable. In this work, we focus on the retrieval of two anatomical structures commonly imaged by Magnetic Resonance Imaging (MRI) and Computed Tomography (CT) techniques, the left ventricle of the heart and blood vessels. Towards this aim, we developed the Area-Distance Local Descriptor (ADLD), a novel 3D local shape descriptor that employs mesh geometry information, namely facet area and distance from centroid to surface, to identify shape changes. Because ADLD only considers surface meshes extracted from volumetric medical images, it substantially diminishes the amount of data to be analyzed. A 90% precision rate was obtained when retrieving both convex (left ventricle) and non-convex structures (blood vessels), allowing for detection of abnormalities associated with changes in shape. Thus, ADLD has the potential to aid in the diagnosis of a wide range of vascular and cardiac diseases.
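The two geometric quantities the descriptor is built from, per-facet area and the distance from the mesh centroid to each facet, are easy to compute from a triangle mesh. The sketch below shows one way to obtain them; it illustrates those quantities only and is not the published ADLD algorithm.

```python
import numpy as np

def facet_area_and_distance(vertices, faces):
    """For a triangle mesh, return per-facet area and the distance from the
    mesh centroid to each facet centroid (sketch of the two quantities used by
    an ADLD-style descriptor, not the published algorithm).
    vertices: (n, 3) float array; faces: (m, 3) int array of vertex indices."""
    tri = vertices[faces]                      # (m, 3, 3) triangle corner coordinates
    e1 = tri[:, 1] - tri[:, 0]
    e2 = tri[:, 2] - tri[:, 0]
    area = 0.5 * np.linalg.norm(np.cross(e1, e2), axis=1)
    mesh_centroid = vertices.mean(axis=0)
    facet_centroid = tri.mean(axis=1)
    dist = np.linalg.norm(facet_centroid - mesh_centroid, axis=1)
    return area, dist

# Tiny example: a single triangle in the plane z = 0.
V = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
F = np.array([[0, 1, 2]])
print(facet_area_and_distance(V, F))   # area 0.5, distance 0.0 (centroid lies on the facet)
```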
ERIC Educational Resources Information Center
Kieren, Thomas E.
This last paper in a set of four reviews research on a wide variety of computer applications in the mathematics classroom. It covers computer-based instruction, especially drill-and-practice and tutorial modes; computer-managed instruction; and computer-augmented problem-solving. Analytical comments on the findings and status of the research are…
Summary of the 2012 Wide Area Recovery and Resiliency Program (WARRP) Waste Management Workshop
Workshop advanced the planning of federal, state and local officials in the area of waste management following a chemical, biological or radiological wide-area incident in the Denver, Colorado urban area.
Deterministic object tracking using Gaussian ringlet and directional edge features
NASA Astrophysics Data System (ADS)
Krieger, Evan W.; Sidike, Paheding; Aspiras, Theus; Asari, Vijayan K.
2017-10-01
Challenges currently existing for intensity-based histogram feature tracking methods in wide area motion imagery (WAMI) data include object structural information distortions, background variations, and object scale change. These issues are caused by different pavement or ground types and by changes in the sensor or altitude. All of these challenges need to be overcome in order to have a robust object tracker while attaining a computation time appropriate for real-time processing. To achieve this, we present a novel method, the Directional Ringlet Intensity Feature Transform (DRIFT), which employs Kirsch kernel filtering for edge features and a ringlet feature mapping for rotational invariance. The method also includes an automatic scale change component to obtain accurate object boundaries and improvements for lowering computation times. We evaluated the DRIFT algorithm on two challenging WAMI datasets, namely Columbus Large Image Format (CLIF) and Large Area Image Recorder (LAIR), to evaluate its robustness and efficiency. Additional evaluations on general tracking video sequences are performed using the Visual Tracker Benchmark and Visual Object Tracking 2014 databases to demonstrate the algorithm's ability with additional challenges in long complex sequences including scale change. Experimental results show that the proposed approach yields competitive results compared to state-of-the-art object tracking methods on the testing datasets.
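The first stage of the feature extraction, Kirsch kernel filtering for directional edges, can be sketched as follows using the standard Kirsch compass operator; this shows only the edge-response step, not the authors' full DRIFT pipeline.

```python
import numpy as np
from scipy.ndimage import convolve

def kirsch_edge_response(image):
    """Maximum response over the 8 Kirsch compass kernels: the directional edge
    feature used as the first stage of a DRIFT-like descriptor (a sketch based
    on the standard Kirsch operator, not the authors' exact pipeline)."""
    base = np.array([[5, 5, 5],
                     [-3, 0, -3],
                     [-3, -3, -3]], dtype=float)
    # Generate the 8 compass orientations by rotating the outer ring of weights.
    kernels = []
    k = base.copy()
    ring = [0, 1, 2, 5, 8, 7, 6, 3]            # clockwise indices of the outer ring
    for _ in range(8):
        kernels.append(k.copy())
        flat = k.flatten()
        flat[ring] = np.roll(flat[ring], 1)
        k = flat.reshape(3, 3)
    responses = [convolve(image.astype(float), kk, mode="nearest") for kk in kernels]
    return np.max(responses, axis=0)

edges = kirsch_edge_response(np.random.default_rng(1).random((64, 64)))
print(edges.shape)
```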
A computer-aided ECG diagnostic tool.
Oweis, Rami; Hijazi, Lily
2006-03-01
Jordan lacks companies that provide local medical facilities with products that are of help in daily performed medical procedures. Because of this, the country imports most of these expensive products. Consequently, a local interest in producing such products has emerged and resulted in serious research efforts in this area. The main goal of this paper is to provide local clinics (in the north of Jordan) with a computer-aided electrocardiogram (ECG) diagnostic tool in an attempt to reduce time and work demands for busy physicians, especially in areas where only one general medicine doctor is employed and a large number of cases must be diagnosed. The tool was designed to help in detecting heart defects such as arrhythmias and heart blocks using ECG signal analysis based on the time-domain representation, the frequency-domain spectrum, and the relationship between them. The application studied here represents a state-of-the-art ECG diagnostic tool that was designed, implemented, and tested in Jordan to serve a wide spectrum of the population, including patients from poor families. The results of applying the tool to a randomly selected representative sample showed about 99% agreement with results obtained at specialized medical facilities. Costs, ease of interface, and accuracy indicated the usefulness of the tool and its value as an assisting diagnostic tool.
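A minimal example of the kind of time-domain ECG analysis such a tool performs is R-peak detection followed by heart-rate estimation, sketched below on a synthetic trace; the thresholds and the signal are illustrative assumptions, not the paper's implementation.

```python
import numpy as np
from scipy.signal import find_peaks

def heart_rate_from_ecg(ecg, fs):
    """Estimate heart rate [bpm] from R-R intervals, a minimal time-domain step
    of the kind such a diagnostic tool performs (simplified sketch)."""
    # Treat R peaks as dominant positive deflections: enforce a 0.4 s refractory
    # distance and require an amplitude above half of the trace maximum.
    peaks, _ = find_peaks(ecg, distance=int(0.4 * fs), height=0.5 * np.max(ecg))
    rr = np.diff(peaks) / fs          # R-R intervals in seconds
    return 60.0 / rr.mean()

# Synthetic ECG-like trace: ~72 bpm impulses plus noise, sampled at 250 Hz.
fs = 250
t = np.arange(0, 30, 1.0 / fs)
ecg = 0.05 * np.random.default_rng(2).normal(size=t.size)
ecg[(np.arange(t.size) % int(fs * 60 / 72)) == 0] += 1.0   # artificial "R peaks"
print(round(heart_rate_from_ecg(ecg, fs), 1))              # approximately 72 bpm
```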
A Versatile Nonlinear Method for Predictive Modeling
NASA Technical Reports Server (NTRS)
Liou, Meng-Sing; Yao, Weigang
2015-01-01
As computational fluid dynamics techniques and tools become widely accepted for real-world practice today, it is intriguing to ask: in what areas can they be utilized to their potential in the future? Some promising areas include design optimization and exploration of fluid dynamics phenomena (the concept of the numerical wind tunnel), both of which share the common feature that some parameters are varied repeatedly and the computation can be costly. We are especially interested in the need for an accurate and efficient approach for handling these applications: (1) capturing complex nonlinear dynamics inherent in a system under consideration and (2) versatility (robustness) to encompass a range of parametric variations. In our previous paper, we proposed to use first-order Taylor expansions collected at numerous sampling points along a trajectory and assembled together via nonlinear weighting functions. The validity and performance of this approach was demonstrated for a number of problems with vastly different input functions. In this study, we are especially interested in enhancing the method's accuracy; we extend it to include the second-order Taylor expansion, which, however, requires a complicated evaluation of Hessian matrices for a system of equations, as in fluid dynamics. We propose a method to avoid these Hessian matrices while maintaining the accuracy. Results based on the method are presented to confirm its validity.
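A toy version of the expansion-and-blend idea, first-order Taylor expansions at sampled points combined through nonlinear weights, is sketched below; the inverse-distance weighting and the test function are assumptions chosen for illustration, not the authors' formulation.

```python
import numpy as np

def taylor_surrogate(p, samples, values, grads, eps=1e-12, power=4):
    """Blend first-order Taylor expansions collected at sampling points using
    normalized inverse-distance weights (a minimal sketch of an
    expansion-plus-nonlinear-weighting surrogate; the weighting function here
    is an assumption, not the authors' choice).
    samples: (n, d); values: (n,); grads: (n, d); p: (d,)"""
    diff = p - samples                                    # (n, d)
    dist = np.linalg.norm(diff, axis=1)
    w = 1.0 / (dist**power + eps)
    w /= w.sum()
    local = values + np.einsum("nd,nd->n", grads, diff)   # f_i + g_i . (p - p_i)
    return np.dot(w, local)

# Toy test on f(p) = sin(p0) * p1 with samples along a trajectory.
f = lambda p: np.sin(p[0]) * p[1]
grad = lambda p: np.array([np.cos(p[0]) * p[1], np.sin(p[0])])
samples = np.array([[0.0, 1.0], [0.5, 1.2], [1.0, 1.5], [1.5, 1.8]])
values = np.array([f(s) for s in samples])
grads = np.array([grad(s) for s in samples])
p_query = np.array([0.8, 1.4])
print(taylor_surrogate(p_query, samples, values, grads), f(p_query))
```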
Fuzzy logic, neural networks, and soft computing
NASA Technical Reports Server (NTRS)
Zadeh, Lofti A.
1994-01-01
The past few years have witnessed a rapid growth of interest in a cluster of modes of modeling and computation which may be described collectively as soft computing. The distinguishing characteristic of soft computing is that its primary aims are to achieve tractability, robustness, low cost, and high MIQ (machine intelligence quotient) through an exploitation of the tolerance for imprecision and uncertainty. Thus, in soft computing what is usually sought is an approximate solution to a precisely formulated problem or, more typically, an approximate solution to an imprecisely formulated problem. A simple case in point is the problem of parking a car. Generally, humans can park a car rather easily because the final position of the car is not specified exactly. If it were specified to within, say, a few millimeters and a fraction of a degree, it would take hours or days of maneuvering and precise measurements of distance and angular position to solve the problem. What this simple example points to is the fact that, in general, high precision carries a high cost. The challenge, then, is to exploit the tolerance for imprecision by devising methods of computation which lead to an acceptable solution at low cost. By its nature, soft computing is much closer to human reasoning than the traditional modes of computation. At this juncture, the major components of soft computing are fuzzy logic (FL), neural network theory (NN), and probabilistic reasoning techniques (PR), including genetic algorithms, chaos theory, and part of learning theory. Increasingly, these techniques are used in combination to achieve significant improvement in performance and adaptability. Among the important application areas for soft computing are control systems, expert systems, data compression techniques, image processing, and decision support systems. It may be argued that it is soft computing, rather than the traditional hard computing, that should be viewed as the foundation for artificial intelligence. In the years ahead, this may well become a widely held position.
Federated data storage system prototype for LHC experiments and data intensive science
NASA Astrophysics Data System (ADS)
Kiryanov, A.; Klimentov, A.; Krasnopevtsev, D.; Ryabinkin, E.; Zarochentsev, A.
2017-10-01
Rapid increase of data volume from the experiments running at the Large Hadron Collider (LHC) prompted the physics computing community to evaluate new data handling and processing solutions. Russian grid sites and university clusters scattered over a large area aim at the task of uniting their resources for future productive work, at the same time giving an opportunity to support large physics collaborations. In our project we address the fundamental problem of designing a computing architecture to integrate distributed storage resources for LHC experiments and other data-intensive science applications and to provide access to data from heterogeneous computing facilities. Studies include development and implementation of a federated data storage prototype for Worldwide LHC Computing Grid (WLCG) centres of different levels and university clusters within one National Cloud. The prototype is based on computing resources located in Moscow, Dubna, Saint Petersburg, Gatchina and Geneva. This project intends to implement a federated distributed storage for all kinds of operations such as read/write/transfer and access via WAN from Grid centres, university clusters, supercomputers, academic and commercial clouds. The efficiency and performance of the system are demonstrated using synthetic and experiment-specific tests including real data processing and analysis workflows from the ATLAS and ALICE experiments, as well as compute-intensive bioinformatics applications (PALEOMIX) running on supercomputers. We present the topology and architecture of the designed system, report performance and statistics for different access patterns and show how federated data storage can be used efficiently by physicists and biologists. We also describe how sharing data on a widely distributed storage system can lead to a new computing model and to reformations of computing style, for instance how a bioinformatics program running on supercomputers can read/write data from the federated storage.
Mapping similarities in temporal parking occupancy behavior based on city-wide parking meter data
NASA Astrophysics Data System (ADS)
Bock, Fabian; Xia, Karen; Sester, Monika
2018-05-01
The search for a parking space is a severe and stressful problem for drivers in many cities. The provision of maps with parking space occupancy information assists drivers in avoiding the most crowded roads at certain times. Since parking occupancy reveals a repetitive pattern per day and per week, typical parking occupancy patterns can be extracted from historical data. In this paper, we analyze city-wide parking meter data from Hannover, Germany, for a full year. We describe an approach of clustering these parking meters to reduce the complexity of this parking occupancy information and to reveal areas with similar parking behavior. The parking occupancy at every parking meter is derived from a timestamp of ticket payment and the validity period of the parking tickets. The similarity of the parking meters is computed as the mean-squared deviation of the average daily patterns in parking occupancy at the parking meters. Based on this similarity measure, a hierarchical clustering is applied. The number of clusters is determined with the Davies-Bouldin Index and the Silhouette Index. Results show that, after extensive data cleansing, the clustering leads to three clusters representing typical parking occupancy day patterns. Those clusters differ mainly in the hour of the maximum occupancy. In addition, the locations of parking meter clusters, computed only based on temporal similarity, also show clear spatial distinctions from other clusters.
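A compact sketch of the workflow, mean-squared-deviation dissimilarity between average daily patterns, average-linkage hierarchical clustering, and cluster-count selection with the Davies-Bouldin and Silhouette indices, is shown below on synthetic occupancy patterns; the data and parameters are illustrative only.

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.metrics import davies_bouldin_score, silhouette_score

# Hypothetical average daily occupancy patterns (24 hourly values in [0, 1])
# for a set of parking meters; real patterns would come from the ticket data.
rng = np.random.default_rng(3)
n_meters, hours = 60, np.arange(24)
peaks = rng.choice([10, 13, 17], size=n_meters)        # hour of maximum occupancy
patterns = np.exp(-0.5 * ((hours - peaks[:, None]) / 3.0) ** 2)
patterns += rng.normal(0.0, 0.05, patterns.shape)

# Pairwise dissimilarity: mean-squared deviation of the daily patterns.
msd = pdist(patterns, metric="sqeuclidean") / patterns.shape[1]
tree = linkage(msd, method="average")

# Scan candidate cluster counts; lower Davies-Bouldin and higher Silhouette
# values indicate better-separated clusters.
for k in range(2, 7):
    labels = fcluster(tree, k, criterion="maxclust")
    print(k, round(davies_bouldin_score(patterns, labels), 3),
          round(silhouette_score(patterns, labels), 3))
```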
Simple Sequence Repeats in Escherichia coli: Abundance, Distribution, Composition, and Polymorphism
Gur-Arie, Riva; Cohen, Cyril J.; Eitan, Yuval; Shelef, Leora; Hallerman, Eric M.; Kashi, Yechezkel
2000-01-01
Computer-based genome-wide screening of the DNA sequence of Escherichia coli strain K12 revealed tens of thousands of tandem simple sequence repeat (SSR) tracts, with motifs ranging from 1 to 6 nucleotides. SSRs were well distributed throughout the genome. Mononucleotide SSRs were over-represented in noncoding regions and under-represented in open reading frames (ORFs). Nucleotide composition of mono- and dinucleotide SSRs, both in ORFs and in noncoding regions, differed from that of the genomic region in which they occurred, with 93% of all mononucleotide SSRs proving to be of A or T. Computer-based analysis of the fine position of every SSR locus in the noncoding portion of the genome relative to downstream ORFs showed SSRs located in areas that could affect gene regulation. DNA sequences at 14 arbitrarily chosen SSR tracts were compared among E. coli strains. Polymorphisms of SSR copy number were observed at four of seven mononucleotide SSR tracts screened, with all polymorphisms occurring in noncoding regions. SSR polymorphism could prove important as a genome-wide source of variation, both for practical applications (including rapid detection, strain identification, and detection of loci affecting key phenotypes) and for evolutionary adaptation of microbes.[The sequence data described in this paper have been submitted to the GenBank data library under accession numbers AF209020–209030 and AF209508–209518.] PMID:10645951
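A toy version of such a genome-wide SSR screen, a regular-expression scan for tandem repeats with motif lengths of 1 to 6 nucleotides, is sketched below; the minimum-repeat threshold and the test sequence are illustrative, not the criteria used in the study.

```python
import re

def find_ssrs(seq, max_motif=6, min_repeats=4):
    """Scan a DNA sequence for tandem simple sequence repeats (SSRs) with motif
    lengths 1..max_motif repeated at least min_repeats times (a toy version of
    a genome-wide SSR screen; thresholds are illustrative)."""
    hits = []
    for k in range(1, max_motif + 1):
        # ([ACGT]{k}) captures a motif; \1{n,} requires at least n further copies.
        pattern = re.compile(r"([ACGT]{%d})\1{%d,}" % (k, min_repeats - 1))
        for m in pattern.finditer(seq):
            hits.append((m.start(), m.group(1), len(m.group(0)) // k))
    return hits

seq = "ATGAAAAAA" + "ACGT" * 4 + "TT" + "GC" * 4 + "TAA"
for start, motif, copies in find_ssrs(seq):
    print(start, motif, copies)
# Expected hits: a poly-A mononucleotide tract, an ACGT tetranucleotide tract,
# and a GC dinucleotide tract.
```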
A survey of World Wide Web use in middle-aged and older adults.
Morrell, R W; Mayhorn, C B; Bennett, J
2000-01-01
We conducted a survey to document World Wide Web use patterns in middle-aged (ages 40-59), young-old (ages 60-74), and old-old adults (ages 75-92). We conducted this survey of 550 adults 40 years old and over in southeastern Michigan, and the overall response rate was approximately 71%. The results suggested that (a) there are distinct age and demographic differences in individuals who use the Web; (b) middle-aged and older Web users are similar in their use patterns; (c) the two primary predictors for not using the Web are lack of access to a computer and lack of knowledge about the Web; (d) old-old adults have the least interest in using the Web compared with middle-aged and young-old adults; and (e) the primary content areas in learning how to use the Web are learning how to use electronic mail and accessing health information and information about traveling for pleasure. This research may serve as a preliminary attempt to ascertain the issues that must be faced in order to increase use of the World Wide Web in middle-aged and older adults.
Sonoma County Office of Education Computer Education Plan. County Level Plans.
ERIC Educational Resources Information Center
Malone, Greg
1986-01-01
This plan describes the educational computing and computer literacy program to be implemented by the schools in Sonoma County, California. Topics covered include the roles, responsibilities, and procedures of the county-wide computer committee; the goals of computer education in the county schools; the results of a needs assessment study; a 3-year…
ERIC Educational Resources Information Center
Huston, Rick, Ed.; Armel, Donald, Ed.
Topics addressed by 40 papers from a conference on microcomputers include: developing a campus wide computer ethics policy; integrating new technologies into professional education; campus computer networks; computer assisted instruction; client/server architecture; competencies for entry-level computing positions; auditing and professional…
ERIC Educational Resources Information Center
Association for Educational Data Systems, Washington, DC.
The 122 papers in this collection were presented in 15 sessions of the 20th annual convention of the Association for Educational Data Systems which was held in Orlando, Florida, May 10-14, 1982. Individual papers covered a wide variety of topics, including computer assisted instruction, computer managed instruction, computer literacy,…
Protecting Your Computer from Viruses
ERIC Educational Resources Information Center
Descy, Don E.
2006-01-01
A computer virus is defined as a software program capable of reproducing itself and usually capable of causing great harm to files or other programs on the same computer. The existence of computer viruses--or the necessity of avoiding viruses--is part of using a computer. With the advent of the Internet, the door was opened wide for these…
A Web of Resources for Introductory Computer Science.
ERIC Educational Resources Information Center
Rebelsky, Samuel A.
As the field of Computer Science has grown, the syllabus of the introductory Computer Science course has changed significantly. No longer is it a simple introduction to programming or a tutorial on computer concepts and applications. Rather, it has become a survey of the field of Computer Science, touching on a wide variety of topics from digital…
A 60 GOPS/W, -1.8 V to 0.9 V body bias ULP cluster in 28 nm UTBB FD-SOI technology
NASA Astrophysics Data System (ADS)
Rossi, Davide; Pullini, Antonio; Loi, Igor; Gautschi, Michael; Gürkaynak, Frank K.; Bartolini, Andrea; Flatresse, Philippe; Benini, Luca
2016-03-01
Ultra-low power operation and extreme energy efficiency are strong requirements for a number of high-growth application areas, such as E-health, Internet of Things, and wearable Human-Computer Interfaces. A promising approach to achieve up to one order of magnitude of improvement in energy efficiency over the current generation of integrated circuits is near-threshold computing. However, frequency degradation due to aggressive voltage scaling may not be acceptable across all performance-constrained applications. Thread-level parallelism over multiple cores can be used to overcome the performance degradation at low voltage. Moreover, enabling the processors to operate on demand over wide supply voltage and body bias ranges makes it possible to achieve the best possible energy efficiency while satisfying a large spectrum of computational demands. In this work we present the first ever implementation of a 4-core cluster fabricated using conventional-well 28 nm UTBB FD-SOI technology. The multi-core architecture we present in this work is able to operate over a wide range of supply voltages, from 0.44 V to 1.2 V. In addition, the architecture allows a wide range of body bias to be applied, from -1.8 V to 0.9 V. The peak energy efficiency of 60 GOPS/W is achieved at 0.5 V supply voltage and 0.5 V forward body bias. Thanks to the extended body bias range of conventional-well FD-SOI technology, high energy efficiency can be guaranteed for a wide range of process and environmental conditions. We demonstrate the ability to compensate process variation for up to 99.7% of chips with only ±0.2 V of body biasing, and to compensate temperature variation in the range -40 °C to 120 °C by exploiting -1.1 V to 0.8 V body biasing. When compared to leading-edge near-threshold RISC processors optimized for extremely low power applications, the multi-core architecture we propose has 144× more performance at comparable energy efficiency levels. Even when compared to other low-power processors with comparable performance, including those implemented in 28 nm technology, our platform provides 1.4× to 3.7× better energy efficiency.
Information Technology: Making It All Fit. Track VIII: Academic Computing Strategy.
ERIC Educational Resources Information Center
CAUSE, Boulder, CO.
Six papers from the 1988 CAUSE conference's Track VIII, Academic Computing Strategy, are presented. They include: "Achieving Institution-Wide Computer Fluency: A Five-Year Retrospective" (Paul J. Plourde); "A Methodology and a Policy for Building and Implementing a Strategic Computer Plan" (Frank B. Thomas); "Aligning…
Color in Computer-Assisted Instruction.
ERIC Educational Resources Information Center
Steinberg, Esther R.
Color monitors are in wide use in computer systems. Thus, it is important to understand how to apply color effectively in computer assisted instruction (CAI) and computer based training (CBT). Color can enhance learning, but it does not automatically do so. Indiscriminate application of color can mislead a student and thereby even interfere with…
Cupek, Rafal; Ziębiński, Adam
2016-01-01
Rheumatoid arthritis is the most common rheumatic disease involving arthritis and causes substantial functional disability in approximately 50% of patients after 10 years. Accurate measurement of the disease activity is crucial to provide adequate treatment and care to the patients. This study focuses on a computer-aided diagnostic system that supports the assessment of synovitis severity. The system was developed within a joint Polish-Norwegian research project on the automated assessment of the severity of synovitis. Semiquantitative ultrasound with power Doppler is a reliable and widely used method of assessing synovitis. Synovitis is estimated by the ultrasound examiner using a scoring system graded from 0 to 3. The activity score is estimated on the basis of the examiner's experience or standardized ultrasound atlases. The method needs trained medical personnel, and the result can be affected by human error. The prototype of a computer-aided diagnostic system and the algorithms essential for the analysis of ultrasonic images of finger joints are the main scientific outputs of the MEDUSA project. The MEDUSA Evaluation System prototype uses bone, skin, joint and synovitis area detectors for a mutual, structural-model-based evaluation of synovitis. Finally, several algorithms that support the semi-automatic or automatic detection of the bone region were prepared, as well as a system that uses a statistical data processing approach to automatically localize the regions of interest. In summary, because the activity score is based on the examiner's experience and can be affected by human error, this paper presented the MEDUSA project, which is focused on a computer-aided diagnostic system that supports the assessment of synovitis severity.
Wide area Hyperspectral Motion Imaging
2017-02-03
Memorandum, Lexington, Massachusetts, 3 February 2017, from Dr. Joseph Lin (joseph.lin@ll.mit.edu), Advanced Imager Technology. Subject: Wide-area Hyperspectral Motion Imaging. Wide-area motion imaging (WAMI) has received increased attention in… Fielded imaging spectrometers use either dispersive or interferometric techniques. A dispersive spectrometer uses a grating or prism to disperse the…
Technical Report for the Demonstration of Wide Area ...
The U.S. Environmental Protection Agency in collaboration with the Department of Homeland Security conducted the "Wide-Area Urban Radiological Contaminant, Mitigation, and Cleanup Technology Demonstration" in Columbus, Ohio on June 22-25, 2015. Five wide-area radiological decontamination technologies (including strippable coatings, gels, and chemical foam technologies) were demonstrated on an urban building.
ERIC Educational Resources Information Center
Osunwusi, Adeyinka Olumuyiwa; Abifarin, Michael Segun
2013-01-01
The aim of this study was to conduct a comparative assessment of computer literacy of private and public secondary school students. Although the definition of computer literacy varies widely, this study treated computer literacy in terms of access to, and use of, computers and the internet, basic knowledge and skills required to use computers and…
Wide area restoration following biological contamination
NASA Astrophysics Data System (ADS)
Yang, Lynn; Hibbard, Wilthea; Edwards, Donna; Franco, David; Fruetel, Julie; Tucker, Mark; Einfeld, Wayne; Knowlton, Robert; Brown, Gary; Brockmann, John; Greenwalt, Robert; Miles, Robin; Raber, Ellen; Carlsen, Tina; Krauter, Paula; Dillon, Michael; MacQueen, Don; Intrepido, Tony; Hoppes, Bill; Wilson, Wendy; Mancieri, Sav
2008-04-01
Current understanding of how to restore a wide area that has been contaminated following a large biological attack is limited. The Department of Homeland Security and Department of Defense are executing a four-year collaborative program named the Interagency Biological Restoration Demonstration (IBRD) program. This program is aimed at developing the technologies, methods, plans and policies necessary to restore a wide area, including military installations and critical infrastructures, in the event of a large outdoor aerosol release of anthrax. The IBRD program's partner pilot city is the Seattle urban area, including Fort Lewis, WA, and McChord Air Force Base. A front-end systems analysis was conducted as part of IBRD to: 1) assess existing technologies and processes for wide area restoration; from this, 2) develop an "as-is" decision framework for wide area restoration; and 3) identify and prioritize capability gaps. Qualitative assessments and quantitative analyses, including sensitivity, timeline and case study analyses, were conducted to evaluate existing processes and rank capability gaps. This paper describes the approach and results from this front-end systems analysis.
Speeding Up Geophysical Research Using Docker Containers Within Multi-Cloud Environment.
NASA Astrophysics Data System (ADS)
Synytsky, R.; Henadiy, S.; Lobzakov, V.; Kolesnikov, L.; Starovoit, Y. O.
2016-12-01
How useful are geophysical observations for minimizing losses from natural disasters today? Do they help decrease the number of human victims during tsunamis and earthquakes? Unfortunately, such use is still at an early stage. A major goal, and a significant achievement, would be to make these observations more useful by improving early warning and prediction systems with the help of cloud computing. Cloud computing technologies have proven their ability to speed up application development in many areas over the past ten years. The cloud unlocks new opportunities for geoscientists by providing access to modern data processing tools and algorithms, including real-time high-performance computing, big data processing, artificial intelligence and others. Emerging lightweight cloud technologies, such as Docker containers, are gaining wide traction in IT because they allow faster and more efficient deployment of applications in a cloud environment. They allow geophysical applications and systems to be deployed and managed in minutes across multiple clouds and data centers, which is of utmost importance for next-generation applications. In this session we will demonstrate how Docker container technology within a multi-cloud environment can accelerate the development of applications specifically designed for geophysical research.
A spread willingness computing-based information dissemination model.
Huang, Haojing; Cui, Zhiming; Zhang, Shukui
2014-01-01
This paper constructs a spread-willingness-computing-based information dissemination model for social networks. The model takes into account the impact of node degree and the dissemination mechanism, combined with complex network theory and the dynamics of infectious diseases, and establishes the dynamical evolution equations. The equations characterize the evolution of the different types of nodes over time. The spread willingness computation combines three factors that affect a user's spreading behavior: the strength of the relationship between nodes, identity of views, and frequency of contact. Simulation results show that nodes of different degrees follow the same trend in the network, and that even if the degree of a node is very small, a large area of information dissemination is still likely. The weaker the relationship between nodes, the higher the probability of view selection and the higher the frequency of contact with information, so that information spreads rapidly and leads to a wide range of dissemination. As the dissemination probability and immune probability change, the speed of information dissemination changes accordingly. The model reflects the features of social networks and can help to characterize the behavior of users and to understand and analyze the characteristics of information dissemination in social networks.
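To make the dissemination dynamics concrete, the sketch below runs an SIR-like mean-field simulation in which the effective spreading rate is scaled by a willingness factor built from the three cited factors; the product form of that factor and all parameter values are assumptions for illustration, not the paper's evolution equations.

```python
import numpy as np

def simulate_dissemination(strength, identity, contact,
                           beta=0.4, mu=0.1, days=60, dt=0.1):
    """SIR-like mean-field simulation where the effective spreading rate is
    scaled by a 'spread willingness' factor built from relationship strength,
    identity of views and contact frequency (all assumed to lie in [0, 1]).
    The product form of the willingness factor is an illustrative assumption."""
    willingness = strength * identity * contact
    s, i, r = 0.99, 0.01, 0.0          # susceptible, spreading, immune fractions
    peak = i
    for _ in range(int(days / dt)):    # forward-Euler time stepping
        new_spread = beta * willingness * s * i
        recover = mu * i
        s -= dt * new_spread
        i += dt * (new_spread - recover)
        r += dt * recover
        peak = max(peak, i)
    return peak, r

for label, w in [("low willingness", (0.5, 0.5, 0.5)),
                 ("high willingness", (0.9, 0.9, 0.9))]:
    peak, final_reach = simulate_dissemination(*w)
    print(f"{label}: peak spreading fraction {peak:.3f}, final reach {final_reach:.3f}")
```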
de la Iglesia, D; Cachau, R E; García-Remesal, M; Maojo, V
2013-11-27
Nanotechnology represents an area of particular promise and significant opportunity across multiple scientific disciplines. Ongoing nanotechnology research ranges from the characterization of nanoparticles and nanomaterials to the analysis and processing of experimental data seeking correlations between nanoparticles and their functionalities and side effects. Due to their special properties, nanoparticles are suitable for cellular-level diagnostics and therapy, offering numerous applications in medicine, e.g. development of biomedical devices, tissue repair, drug delivery systems and biosensors. In nanomedicine, recent studies are producing large amounts of structural and property data, highlighting the role for computational approaches in information management. While in vitro and in vivo assays are expensive, the cost of computing is falling. Furthermore, improvements in the accuracy of computational methods (e.g. data mining, knowledge discovery, modeling and simulation) have enabled effective tools to automate the extraction, management and storage of these vast data volumes. Since this information is widely distributed, one major issue is how to locate and access data where it resides (which also poses data-sharing limitations). The novel discipline of nanoinformatics addresses the information challenges related to nanotechnology research. In this paper, we summarize the needs and challenges in the field and present an overview of extant initiatives and efforts.
A Spread Willingness Computing-Based Information Dissemination Model
Cui, Zhiming; Zhang, Shukui
2014-01-01
This paper constructs a spread-willingness-computing-based information dissemination model for social networks. The model takes into account the impact of node degree and the dissemination mechanism, combined with complex network theory and the dynamics of infectious diseases, and establishes the dynamical evolution equations. The equations characterize the evolution of the different types of nodes over time. The spread willingness computation combines three factors that affect a user's spreading behavior: the strength of the relationship between nodes, identity of views, and frequency of contact. Simulation results show that nodes of different degrees follow the same trend in the network, and that even if the degree of a node is very small, a large area of information dissemination is still likely. The weaker the relationship between nodes, the higher the probability of view selection and the higher the frequency of contact with information, so that information spreads rapidly and leads to a wide range of dissemination. As the dissemination probability and immune probability change, the speed of information dissemination changes accordingly. The model reflects the features of social networks and can help to characterize the behavior of users and to understand and analyze the characteristics of information dissemination in social networks. PMID:25110738
Methods in Symbolic Computation and p-Adic Valuations of Polynomials
NASA Astrophysics Data System (ADS)
Guan, Xiao
Symbolic computation appears widely in many mathematical fields such as combinatorics, number theory and stochastic processes. The techniques created in the area of experimental mathematics provide us with efficient ways of computing symbolically and verifying complicated relations. Part I consists of three problems. The first one focuses on a unimodal sequence derived from a quartic integral. Many of its properties are explored with the help of hypergeometric representations and automatic proofs. The second problem tackles the generating function of the reciprocals of the Catalan numbers. It springs from the closed form given by Mathematica. Furthermore, three methods in special functions are used to justify this result. The third problem addresses closed-form solutions for the moments of products of generalized elliptic integrals, combining experimental mathematics and classical analysis. Part II concentrates on the p-adic valuations of polynomials from the perspective of trees. For a given polynomial f(n) indexed by positive integers, the package developed in Mathematica creates a certain tree structure following a couple of rules. The evolution of such trees is studied both rigorously and experimentally from the viewpoints of field extensions, nonparametric statistics and random matrices.
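The tree-building rule for the p-adic valuations can be illustrated in a few lines: compute v_p(f(n)) on a residue class, keep the class as a leaf if the valuation is constant, and split it otherwise. The sketch below (in Python rather than the Mathematica package described) uses f(n) = n^2 + 1 and p = 5 as a worked example.

```python
def vp(m, p):
    """p-adic valuation of a nonzero integer m: the largest k with p**k dividing m."""
    k = 0
    while m % p == 0:
        m //= p
        k += 1
    return k

def valuation_profile(f, p, cls, depth, samples=6):
    """Sample v_p(f(n)) on the residue class n = cls (mod p**depth). A class on
    which the valuation looks constant becomes a leaf of the valuation tree;
    otherwise it is split into p child classes (an illustration of the rule,
    not the Mathematica package described in the abstract)."""
    vals = [vp(f(cls + k * p**depth), p) for k in range(1, samples + 1)]
    return vals, len(set(vals)) == 1

f = lambda n: n * n + 1          # example polynomial
p = 5
for c in range(p):               # residue classes mod 5
    vals, is_leaf = valuation_profile(f, p, c, 1)
    print(f"n = {c} (mod 5): valuations {vals}, leaf: {is_leaf}")
# Classes 0, 1 and 4 are leaves (constant valuation 0); classes 2 and 3 are not
# constant and would be split into child classes mod 25, and so on.
```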
DOE Office of Scientific and Technical Information (OSTI.GOV)
Winkler, David A., E-mail: dave.winkler@csiro.au
2016-05-15
Nanomaterials research is one of the fastest growing contemporary research areas. The unprecedented properties of these materials have meant that they are being incorporated into products very quickly. Regulatory agencies are concerned they cannot assess the potential hazards of these materials adequately, as data on the biological properties of nanomaterials are still relatively limited and expensive to acquire. Computational modelling methods have much to offer in helping understand the mechanisms by which toxicity may occur, and in predicting the likelihood of adverse biological impacts of materials not yet tested experimentally. This paper reviews the progress these methods, particularly those QSAR-based, have made in understanding and predicting potentially adverse biological effects of nanomaterials, and also the limitations and pitfalls of these methods. - Highlights: • Nanomaterials regulators need good information to make good decisions. • Nanomaterials and their interactions with biology are very complex. • Computational methods use existing data to predict properties of new nanomaterials. • Statistical, data driven modelling methods have been successfully applied to this task. • Much more must be learnt before robust toolkits will be widely usable by regulators.
Biophysical Discovery through the Lens of a Computational Microscope
NASA Astrophysics Data System (ADS)
Amaro, Rommie
With exascale computing power on the horizon, improvements in the underlying algorithms and available structural experimental data are enabling new paradigms for chemical discovery. My work has provided key insights for the systematic incorporation of structural information resulting from state-of-the-art biophysical simulations into protocols for inhibitor and drug discovery. We have shown that many disease targets have druggable pockets that are otherwise "hidden" in high-resolution X-ray structures, and that this is a common theme across a wide range of targets in different disease areas. We continue to push the limits of computational biophysical modeling by expanding the time and length scales accessible to molecular simulation. My sights are set on, ultimately, the development of detailed physical models of cells, as the fundamental unit of life, and two recent achievements highlight our efforts in this arena. First is the development of a molecular and Brownian dynamics multi-scale modeling framework, which allows us to investigate drug binding kinetics in addition to thermodynamics. In parallel, we have made significant progress developing new tools to extend molecular structure to cellular environments. Collectively, these achievements are enabling the investigation of the chemical and biophysical nature of cells at unprecedented scales.
NASA Astrophysics Data System (ADS)
de la Iglesia, D.; Cachau, R. E.; García-Remesal, M.; Maojo, V.
2013-01-01
Nanotechnology represents an area of particular promise and significant opportunity across multiple scientific disciplines. Ongoing nanotechnology research ranges from the characterization of nanoparticles and nanomaterials to the analysis and processing of experimental data seeking correlations between nanoparticles and their functionalities and side effects. Due to their special properties, nanoparticles are suitable for cellular-level diagnostics and therapy, offering numerous applications in medicine, e.g. development of biomedical devices, tissue repair, drug delivery systems and biosensors. In nanomedicine, recent studies are producing large amounts of structural and property data, highlighting the role for computational approaches in information management. While in vitro and in vivo assays are expensive, the cost of computing is falling. Furthermore, improvements in the accuracy of computational methods (e.g. data mining, knowledge discovery, modeling and simulation) have enabled effective tools to automate the extraction, management and storage of these vast data volumes. Since this information is widely distributed, one major issue is how to locate and access data where it resides (which also poses data-sharing limitations). The novel discipline of nanoinformatics addresses the information challenges related to nanotechnology research. In this paper, we summarize the needs and challenges in the field and present an overview of extant initiatives and efforts.
Performance of computer vision in vivo flow cytometry with low fluorescence contrast
NASA Astrophysics Data System (ADS)
Markovic, Stacey; Li, Siyuan; Niedre, Mark
2015-03-01
Detection and enumeration of circulating cells in the bloodstream of small animals are important in many areas of preclinical biomedical research, including cancer metastasis, immunology, and reproductive medicine. Optical in vivo flow cytometry (IVFC) represents a class of technologies that allow noninvasive and continuous enumeration of circulating cells without drawing blood samples. We recently developed a technique termed computer vision in vivo flow cytometry (CV-IVFC) that uses a high-sensitivity fluorescence camera and an automated computer vision algorithm to interrogate relatively large circulating blood volumes in the ear of a mouse. We detected circulating cells at concentrations as low as 20 cells/mL. In the present work, we characterized the performance of CV-IVFC under low-contrast imaging conditions: (1) weak cell fluorescent labeling, simulated with cell-simulating fluorescent microspheres of varying brightness, and (2) high background tissue autofluorescence, obtained by varying the autofluorescence properties of optical phantoms. Our analysis indicates that CV-IVFC can robustly track and enumerate circulating cells with at least 50% sensitivity even under conditions with contrast degraded by two orders of magnitude relative to our previous in vivo work. These results support the significant potential utility of CV-IVFC in a wide range of in vivo biological models.
Struct2Net: a web service to predict protein–protein interactions using a structure-based approach
Singh, Rohit; Park, Daniel; Xu, Jinbo; Hosur, Raghavendra; Berger, Bonnie
2010-01-01
Struct2Net is a web server for predicting interactions between arbitrary protein pairs using a structure-based approach. Prediction of protein–protein interactions (PPIs) is a central area of interest and successful prediction would provide leads for experiments and drug design; however, the experimental coverage of the PPI interactome remains inadequate. We believe that Struct2Net is the first community-wide resource to provide structure-based PPI predictions that go beyond homology modeling. Also, most web-resources for predicting PPIs currently rely on functional genomic data (e.g. GO annotation, gene expression, cellular localization, etc.). Our structure-based approach is independent of such methods and only requires the sequence information of the proteins being queried. The web service allows multiple querying options, aimed at maximizing flexibility. For the most commonly studied organisms (fly, human and yeast), predictions have been pre-computed and can be retrieved almost instantaneously. For proteins from other species, users have the option of getting a quick-but-approximate result (using orthology over pre-computed results) or having a full-blown computation performed. The web service is freely available at http://struct2net.csail.mit.edu. PMID:20513650
Radio Galaxy Zoo: Machine learning for radio source host galaxy cross-identification
NASA Astrophysics Data System (ADS)
Alger, M. J.; Banfield, J. K.; Ong, C. S.; Rudnick, L.; Wong, O. I.; Wolf, C.; Andernach, H.; Norris, R. P.; Shabala, S. S.
2018-05-01
We consider the problem of determining the host galaxies of radio sources by cross-identification. This has traditionally been done manually, which will be intractable for wide-area radio surveys like the Evolutionary Map of the Universe (EMU). Automated cross-identification will be critical for these future surveys, and machine learning may provide the tools to develop such methods. We apply a standard approach from computer vision to cross-identification, introducing one possible way of automating this problem, and explore the pros and cons of this approach. We apply our method to the 1.4 GHz Australian Telescope Large Area Survey (ATLAS) observations of the Chandra Deep Field South (CDFS) and the ESO Large Area ISO Survey South 1 (ELAIS-S1) fields by cross-identifying them with the Spitzer Wide-area Infrared Extragalactic (SWIRE) survey. We train our method with two sets of data: expert cross-identifications of CDFS from the initial ATLAS data release and crowdsourced cross-identifications of CDFS from Radio Galaxy Zoo. We found that a simple strategy of cross-identifying a radio component with the nearest galaxy performs comparably to our more complex methods, though our estimated best-case performance is near 100 per cent. ATLAS contains 87 complex radio sources that have been cross-identified by experts, so there are not enough complex examples to learn how to cross-identify them accurately. Much larger datasets are therefore required for training methods like ours. We also show that training our method on Radio Galaxy Zoo cross-identifications gives comparable results to training on expert cross-identifications, demonstrating the value of crowdsourced training data.
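The nearest-galaxy baseline mentioned above can be sketched in a few lines. The snippet below is a hypothetical illustration using astropy's catalogue matching; the coordinate arrays are invented values roughly in the CDFS region, not data from the ATLAS, SWIRE or Radio Galaxy Zoo catalogues.

import numpy as np
from astropy.coordinates import SkyCoord
import astropy.units as u

def nearest_host(radio_ra, radio_dec, ir_ra, ir_dec):
    # Match each radio component to the closest infrared galaxy on the sky.
    radio = SkyCoord(ra=radio_ra * u.deg, dec=radio_dec * u.deg)
    ir = SkyCoord(ra=ir_ra * u.deg, dec=ir_dec * u.deg)
    idx, sep2d, _ = radio.match_to_catalog_sky(ir)   # index of closest IR galaxy
    return idx, sep2d.arcsec

idx, sep = nearest_host(np.array([52.90, 53.10]), np.array([-27.80, -27.90]),
                        np.array([52.91, 53.05, 53.12]), np.array([-27.79, -27.85, -27.91]))
print(idx, sep)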
Space information technologies: future agenda
NASA Astrophysics Data System (ADS)
Flournoy, Don M.
2005-11-01
Satellites will operate more like wide area broadband computer networks in the 21st Century. Space-based information and communication technologies will therefore be a lot more accessible and functional for the individual user. These developments are the result of earth-based telecommunication and computing innovations being extended to space. The author predicts that the broadband Internet will eventually be available on demand to users of terrestrial networks wherever they are. Earth and space communication assets will be managed as a single network. Space networks will assure that online access is ubiquitous. No matter whether users are located in cities or in remote locations, they will always be within reach of a node on the Internet. Even today, scalable bandwidth can be delivered to active users when moving around in vehicles on the ground, or aboard ships at sea or in the air. Discussion of the innovative technologies produced by NASA's Advanced Communications Technology Satellite (1993-2004) demonstrates future capabilities of satellites that make them uniquely suited to serve as nodes on the broadband Internet.
TauFactor: An open-source application for calculating tortuosity factors from tomographic data
NASA Astrophysics Data System (ADS)
Cooper, S. J.; Bertei, A.; Shearing, P. R.; Kilner, J. A.; Brandon, N. P.
TauFactor is a MATLAB application for efficiently calculating the tortuosity factor, as well as volume fractions, surface areas and triple phase boundary densities, from image-based microstructural data. The tortuosity factor quantifies the apparent decrease in diffusive transport resulting from convolutions of the flow paths through porous media. TauFactor was originally developed to improve the understanding of electrode microstructures for batteries and fuel cells; however, the tortuosity factor has been of interest to a wide range of disciplines for over a century, including geoscience, biology and optics. It is still common practice to use correlations, such as that developed by Bruggeman, to approximate the tortuosity factor, but in recent years the increasing availability of 3D imaging techniques has spurred interest in calculating this quantity more directly. This tool provides a fast and accurate computational platform applicable to the big datasets (>10^8 voxels) typical of modern tomography, without requiring high computational power.
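For readers unfamiliar with the quantity, the tortuosity factor enters through the standard relation D_eff = D_bulk * epsilon / tau. The sketch below (in Python rather than the MATLAB of the actual application, and with a random toy segmentation) shows how the volume fraction and tau follow once an effective diffusivity has been simulated; the diffusion simulation that TauFactor itself performs is not reproduced here.

import numpy as np

def volume_fraction(phase_mask):
    # Fraction of voxels belonging to the transport phase.
    return np.count_nonzero(phase_mask) / phase_mask.size

def tortuosity_factor(epsilon, d_eff, d_bulk=1.0):
    # tau = D_bulk * epsilon / D_eff  (from D_eff = D_bulk * epsilon / tau)
    return d_bulk * epsilon / d_eff

phase = np.random.rand(64, 64, 64) > 0.4      # toy stand-in for a segmented tomogram
eps = volume_fraction(phase)
print(eps, tortuosity_factor(eps, d_eff=0.25))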
Recent developments in computer vision-based analytical chemistry: A tutorial review.
Capitán-Vallvey, Luis Fermín; López-Ruiz, Nuria; Martínez-Olmos, Antonio; Erenas, Miguel M; Palma, Alberto J
2015-10-29
Chemical analysis based on colour changes recorded with imaging devices is attracting increasing interest. This is due to several significant advantages, such as simplicity of use and easy combination with portable, widely distributed imaging devices, resulting in user-friendly analytical procedures in many areas that demand out-of-lab applications for in situ and real-time monitoring. This tutorial review covers computer vision-based analytical chemistry (CVAC) procedures and systems from 2005 to 2015, the period in which 87.5% of the papers on this topic were published. The background regarding colour spaces and recent analytical system architectures of interest in analytical chemistry is presented in the form of a tutorial. Moreover, issues regarding images, such as the influence of illuminants, and the most relevant techniques for processing and analysing digital images are addressed. Some of the most relevant applications are then detailed, highlighting their main characteristics. Finally, our opinion about future perspectives is discussed. Copyright © 2015 Elsevier B.V. All rights reserved.
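As a hypothetical example of the colour-space processing such systems rely on, the sketch below converts an RGB region of interest to HSV and takes the mean hue as the analytical signal; the random image, the region and the choice of hue are illustrative assumptions, not drawn from the review.

import numpy as np
from matplotlib.colors import rgb_to_hsv

def mean_hue(rgb_image, roi):             # roi = (row0, row1, col0, col1)
    r0, r1, c0, c1 = roi
    patch = rgb_image[r0:r1, c0:c1, :].astype(float) / 255.0
    hsv = rgb_to_hsv(patch)
    return hsv[..., 0].mean()             # hue channel, range 0-1

img = (np.random.rand(100, 100, 3) * 255).astype(np.uint8)   # stand-in for a camera frame
print(mean_hue(img, (40, 60, 40, 60)))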
NASA Astrophysics Data System (ADS)
Hu, Yifan; Han, Hao; Zhu, Wei; Li, Lihong; Pickhardt, Perry J.; Liang, Zhengrong
2016-03-01
Feature classification plays an important role in the differentiation or computer-aided diagnosis (CADx) of suspicious lesions. As a widely used ensemble learning algorithm for classification, the random forest (RF) has a distinguished performance record for CADx. Our recent study has shown that the location index (LI), which is derived from the well-known kNN (k nearest neighbor) and wkNN (weighted k nearest neighbor) classifiers [1], also plays a distinguished role in classification for CADx. Therefore, in this paper, based on the property that the LI achieves very high accuracy, we design an algorithm that integrates the LI into RF to improve the AUC (area under the receiver operating characteristic curve). Experiments were performed using a database of 153 lesions (polyps), including 116 neoplastic lesions and 37 hyperplastic lesions, with comparison to the existing RF and wkNN classifiers, respectively. A noticeable gain by the proposed integrated classifier was quantified by the AUC measure.
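A minimal sketch of the integration idea follows; the location index is only approximated here by a cross-validated kNN probability, and the data are random stand-ins for the 153-lesion database, so this is an assumed illustration rather than the authors' algorithm.

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split, cross_val_predict
from sklearn.metrics import roc_auc_score

X = np.random.rand(153, 10)                       # toy stand-in for lesion features
y = np.random.randint(0, 2, 153)                  # 1 = neoplastic, 0 = hyperplastic
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

knn = KNeighborsClassifier(n_neighbors=7, weights="distance")
# kNN-derived score used as the extra LI-like feature; cross-validated on the
# training set so the forest does not see leaked labels.
li_tr = cross_val_predict(knn, X_tr, y_tr, cv=5, method="predict_proba")[:, 1]
li_te = knn.fit(X_tr, y_tr).predict_proba(X_te)[:, 1]

rf_plain = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_tr, y_tr)
rf_li = RandomForestClassifier(n_estimators=300, random_state=0).fit(
    np.column_stack([X_tr, li_tr]), y_tr)

print("RF alone:", roc_auc_score(y_te, rf_plain.predict_proba(X_te)[:, 1]))
print("RF + LI :", roc_auc_score(y_te, rf_li.predict_proba(np.column_stack([X_te, li_te]))[:, 1]))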
Weidel, Philipp; Djurfeldt, Mikael; Duarte, Renato C.; Morrison, Abigail
2016-01-01
In order to properly assess the function and computational properties of simulated neural systems, it is necessary to account for the nature of the stimuli that drive the system. However, providing stimuli that are rich and yet both reproducible and amenable to experimental manipulations is technically challenging, and even more so if a closed-loop scenario is required. In this work, we present a novel approach to solve this problem, connecting robotics and neural network simulators. We implement a middleware solution that bridges the Robotic Operating System (ROS) to the Multi-Simulator Coordinator (MUSIC). This enables any robotic and neural simulators that implement the corresponding interfaces to be efficiently coupled, allowing real-time performance for a wide range of configurations. This work extends the toolset available for researchers in both neurorobotics and computational neuroscience, and creates the opportunity to perform closed-loop experiments of arbitrary complexity to address questions in multiple areas, including embodiment, agency, and reinforcement learning. PMID:27536234
Current Developments in Machine Learning Techniques in Biological Data Mining.
Dumancas, Gerard G; Adrianto, Indra; Bello, Ghalib; Dozmorov, Mikhail
2017-01-01
This supplement is intended to focus on the use of machine learning techniques to generate meaningful information from biological data. This supplement under Bioinformatics and Biology Insights aims to provide scientists and researchers working in this rapidly evolving field with online, open-access articles authored by leading international experts. Advances in the field of biology have generated massive opportunities for the implementation of modern computational and statistical techniques. Machine learning methods in particular, a subfield of computer science, have evolved into an indispensable tool applied to a wide spectrum of bioinformatics applications. They are thus broadly used to investigate the underlying mechanisms leading to specific diseases, as well as in the biomarker discovery process. With the growth of this area of science comes the need for up-to-date, high-quality scholarly articles that will leverage the knowledge of scientists and researchers in the various applications of machine learning techniques to mining biological data.
Korolkov, Victor P; Nasyrov, Ruslan K; Shimansky, Ruslan V
2006-01-01
Enhancing the diffraction efficiency of continuous-relief diffractive optical elements fabricated by direct laser writing is discussed. A new method of zone-boundary optimization is proposed to correct exposure data only in narrow areas along the boundaries of diffractive zones. The optimization decreases the loss of diffraction efficiency related to the convolution of the desired phase profile with the writing-beam intensity distribution. A simplified stepped transition function that describes the optimized exposure data near zone boundaries can be made universal for a wide range of zone periods. The approach permits an increase in diffraction efficiency similar to that of individual-pixel optimization, but with less computational effort. Computer simulations demonstrated that zone-boundary optimization for a 6 microm period grating increases the efficiency by 7% and 14.5% for 0.6 microm and 1.65 microm writing-spot diameters, respectively. Diffraction efficiencies of 65%-90% for 4-10 microm zone periods were obtained experimentally with this method.
A comparative analysis of rawinsonde and NIMBUS 6 and TIROS N satellite profile data
NASA Technical Reports Server (NTRS)
Scoggins, J. R.; Carle, W. E.; Knight, K.; Moyer, V.; Cheng, N. M.
1981-01-01
Comparisons are made between rawinsonde and satellite profiles in seven areas for a wide range of surface and weather conditions. Variables considered include temperature, dewpoint temperature, thickness, precipitable water, lapse rate of temperature, stability, geopotential height, mixing ratio, wind direction, wind speed, and kinematic parameters, including vorticity and the advection of vorticity and temperature. In addition, comparisons are made in the form of cross sections and synoptic fields for selected variables. Sounding data from the NIMBUS 6 and TIROS N satellites were used. Geostrophic wind computed from smoothed geopotential heights provided large scale flow patterns that agreed well with the rawinsonde wind fields. Surface wind patterns as well as magnitudes computed by use of the log law to extrapolate wind to a height of 10 m agreed with observations. Results of this study demonstrate rather conclusively that satellite profile data can be used to determine characteristics of large scale systems but that small scale features, such as frontal zones, cannot yet be resolved.
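Two of the computations mentioned above follow standard textbook formulas; the sketch below shows the geostrophic wind derived from geopotential height gradients and a neutral log-law extrapolation of wind speed to 10 m. The input values and the roughness length are illustrative assumptions, not taken from the study.

import numpy as np

G = 9.80665          # gravitational acceleration, m s^-2
OMEGA = 7.2921e-5    # Earth's angular velocity, rad s^-1

def geostrophic_wind(dZ_dx, dZ_dy, lat_deg):
    """u_g = -(g/f) dZ/dy, v_g = (g/f) dZ/dx, with f = 2*Omega*sin(lat)."""
    f = 2.0 * OMEGA * np.sin(np.radians(lat_deg))
    return -G / f * dZ_dy, G / f * dZ_dx

def log_law_wind(u_ref, z_ref, z=10.0, z0=0.1):
    """Neutral log profile: u(z) = u_ref * ln(z/z0) / ln(z_ref/z0)."""
    return u_ref * np.log(z / z0) / np.log(z_ref / z0)

print(geostrophic_wind(1e-5, -2e-5, 35.0))   # height gradients in m per m
print(log_law_wind(u_ref=12.0, z_ref=50.0))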
Computer applications making rapid advances in high throughput microbial proteomics (HTMP).
Anandkumar, Balakrishna; Haga, Steve W; Wu, Hui-Fen
2014-02-01
The last few decades have seen the rise of widely available proteomics tools. From new data acquisition devices, such as MALDI-MS and 2DE, to new database-searching software, these products have paved the way for high throughput microbial proteomics (HTMP). These tools are enabling researchers to gain new insights into microbial metabolism and are opening up new areas of study, such as the discovery of protein-protein interactions (interactomics). Computer software is a key part of these emerging fields. This current review considers: 1) software tools for identifying the proteome, such as MASCOT or PDQuest, 2) online databases of proteomes, such as SWISS-PROT, Proteome Web, or the Proteomics Facility of the Pathogen Functional Genomics Resource Center, and 3) software tools for applying proteomic data, such as PSI-BLAST or VESPA. These tools allow for research in network biology, protein identification, functional annotation, target identification/validation, protein expression, protein structural analysis, metabolic pathway engineering and drug discovery.
Computer Vision Malaria Diagnostic Systems-Progress and Prospects.
Pollak, Joseph Joel; Houri-Yafin, Arnon; Salpeter, Seth J
2017-01-01
Accurate malaria diagnosis is critical to prevent malaria fatalities, curb overuse of antimalarial drugs, and promote appropriate management of other causes of fever. While several diagnostic tests exist, the need for a rapid and highly accurate malaria assay remains. Microscopy and rapid diagnostic tests are the main diagnostic modalities available, yet they can demonstrate poor performance and accuracy. Automated microscopy platforms have the potential to significantly improve and standardize malaria diagnosis. Based on image recognition and machine learning algorithms, these systems maintain the benefits of light microscopy and provide improvements such as quicker scanning time, greater scanning area, and increased consistency brought by automation. While these applications have been in development for over a decade, recently several commercial platforms have emerged. In this review, we discuss the most advanced computer vision malaria diagnostic technologies and investigate several of their features which are central to field use. Additionally, we discuss the technological and policy barriers to implementing these technologies in low-resource settings world-wide.
A pressure flux-split technique for computation of inlet flow behavior
NASA Technical Reports Server (NTRS)
Pordal, H. S.; Khosla, P. K.; Rubin, S. G.
1991-01-01
A method for calculating the flow field in aircraft engine inlets is presented. The phenomena of inlet unstart and restart are investigated. Solutions of the reduced Navier-Stokes (RNS) equations are obtained with a time consistent direct sparse matrix solver that computes the transient flow field both internal and external to the inlet. Time varying shocks and time varying recirculation regions can be efficiently analyzed. The code is quite general and is suitable for the computation of flow for a wide variety of geometries and over a wide range of Mach and Reynolds numbers.
Performance of the Widely-Used CFD Code OVERFLOW on the Pleiades Supercomputer
NASA Technical Reports Server (NTRS)
Guruswamy, Guru P.
2017-01-01
Computational performance studies were made for NASA's widely used Computational Fluid Dynamics code OVERFLOW on the Pleiades Supercomputer. Two test cases were considered: a full launch vehicle with a grid of 286 million points and a full rotorcraft model with a grid of 614 million points. Computations using up to 8000 cores were run on Sandy Bridge and Ivy Bridge nodes. Performance was monitored using times reported in the day files from the Portable Batch System utility. Results for two grid topologies are presented and compared in detail. Observations and suggestions for future work are made.
Computational methods and software systems for dynamics and control of large space structures
NASA Technical Reports Server (NTRS)
Park, K. C.; Felippa, C. A.; Farhat, C.; Pramono, E.
1990-01-01
Two key areas of crucial importance to the computer-based simulation of large space structures are discussed. The first area involves multibody dynamics (MBD) of flexible space structures, with applications directed to deployment, construction, and maneuvering. The second area deals with advanced software systems, with emphasis on parallel processing. The latest research thrust in the second area involves massively parallel computers.
Essential Goals and Objectives for Computer Education.
ERIC Educational Resources Information Center
Michigan State Board of Education, Lansing.
Developed by the Michigan State Board of Education, this document begins with brief discussions of a philosophy for the integration of computers into all content areas and district planning for computer use in schools. Essential goals and objectives for computer education are then outlined in the following areas: (1) computing and its evolving…
Patoka, Jiří; Vejtrubová, Markéta; Vrabec, Vladimír; Masopustová, Renata
2018-01-01
The aardvark is popular in many zoological gardens in the European Union. These creatures are nocturnal, and aardvarks in the wild are known to walk distances of 4 km to 7 km per night. Despite what is known about their biology, most aardvarks are kept in zoological gardens in indoor enclosures with little space for movement. This lack of space leads to a tendency toward obesity and compromised welfare. With their wide distribution in Sub-Saharan Africa, aardvarks are perceived as thermophilic nonhuman animals. Nevertheless, some records suggest they may be able to adapt to colder climates and can be active outside their burrows when temperatures fall to 2°C. These findings suggest there may be a wild African population that is suitable for partial outdoor keeping under European climatic conditions. Therefore, a climate match was computed between the source area with aardvark occurrence and a target area of the European Union. Data revealed that the Free State, a South African province, was the area with the best climate similarity, and aardvarks from this area are recommended as suitable for the aforementioned purpose.
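The climate-match computation itself is not specified above; as an assumed illustration only, the sketch below scores similarity between a source and a target area by the scaled Euclidean distance between monthly climate normals. The temperature values and scaling are invented, and the study's actual matching algorithm may differ.

import numpy as np

def climate_similarity(source, target, scale):
    """Smaller scores mean a closer climate match; scale holds per-variable spreads."""
    return float(np.linalg.norm((np.asarray(source) - np.asarray(target)) / np.asarray(scale)))

# Toy monthly mean temperatures (deg C), Jan-Dec, for a source and a target area.
source_area = [23, 22, 20, 16, 12, 9, 9, 12, 16, 18, 20, 22]
target_area = [2, 3, 7, 11, 15, 18, 20, 20, 16, 11, 6, 3]
spread = [5] * 12
print(climate_similarity(source_area, target_area, spread))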
Validation of 2D flood models with insurance claims
NASA Astrophysics Data System (ADS)
Zischg, Andreas Paul; Mosimann, Markus; Bernet, Daniel Benjamin; Röthlisberger, Veronika
2018-02-01
Flood impact modelling requires reliable models for the simulation of flood processes. In recent years, flood inundation models have been remarkably improved and widely used for flood hazard simulation and for flood exposure and loss analyses. In this study, we validate a 2D inundation model for the purpose of flood exposure analysis at the river reach scale. We validate the BASEMENT simulation model against insurance claims using conventional validation metrics. The flood model is established on the basis of available high-resolution topographic data for four test cases. The validation metrics were calculated with two different datasets: a dataset of event documentation reporting flooded areas and a dataset of insurance claims. In three of the four test cases, the model fit relative to the insurance claims is slightly lower than the fit computed on the basis of the observed inundation areas. This comparison between two independent validation datasets suggests that validation metrics based on insurance claims are comparable to conventional validation data, such as the flooded area. However, a validation on the basis of insurance claims might be more conservative in cases where model errors are more pronounced in areas with a high density of values at risk.
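One simple way to confront a simulated inundation map with point-type claims data is sketched below with toy inputs: the fraction of claim locations falling inside cells the model marks as wet. This is an illustrative metric under assumed inputs; the study's actual validation metrics may be more elaborate.

import numpy as np

def claim_hit_rate(flood_mask, claim_rows, claim_cols):
    """flood_mask: 2D boolean raster; claim_rows/cols: raster indices of claim locations."""
    hits = flood_mask[claim_rows, claim_cols]
    return hits.mean()

mask = np.zeros((200, 200), dtype=bool)
mask[80:140, 60:120] = True                      # toy simulated flooded area
rows = np.random.randint(0, 200, 50)             # toy claim locations
cols = np.random.randint(0, 200, 50)
print(claim_hit_rate(mask, rows, cols))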
2012-01-01
Background The aim of the paper is to use principal component analysis (PCA) to assess the heavy metal (HM) contamination of soil and of vegetables widely used as food by people who live in areas contaminated by long-lasting mining activities. This chemometric technique allowed us to select the best model for determining the risk that HMs pose to the food chain as well as to people's health. Results Many PCA models were computed with different variables: heavy metal contents and agro-chemical parameters characterizing the soil samples from contaminated and uncontaminated areas, HM contents of different types of vegetables grown and consumed in these areas, and the composite parameter target hazard quotient (THQ). Results were discussed in terms of the principal components. Conclusion There were two major benefits in processing the data with PCA: first, it helped optimize the number and type of data that best describe the HM contamination of the soil and vegetables; second, it was valuable for selecting the vegetable species that present the highest and lowest risk of a negative impact on the food chain and human health. PMID:23234365
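As an assumed illustration of the chemometric step, the sketch below standardizes a sample-by-variable matrix and inspects the variance explained by the leading principal components; the matrix here is random, not the study's soil and vegetable data, and the variable list in the comment is hypothetical.

import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

X = np.random.rand(40, 8)          # 40 samples x 8 variables (e.g. Cd, Pb, Zn, Cu, pH, ..., THQ)
pca = PCA(n_components=3)
Z = pca.fit_transform(StandardScaler().fit_transform(X))   # PC scores for each sample
print(pca.explained_variance_ratio_)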
Irimia, Andrei; Erhart, Matthew J.; Brown, Timothy T.
2014-01-01
Objective To assess the feasibility and appropriateness of magnetoencephalography (MEG) for both adult and pediatric studies, as well as for the developmental comparison of these factors across a wide range of ages. Methods For 45 subjects with ages from 1 to 24 years (infants, toddlers, school-age children and young adults), lead fields (LFs) of MEG sensors are computed using anatomically realistic boundary element models (BEMs) and individually-reconstructed cortical surfaces. Novel metrics are introduced to quantify MEG sensor focality. Results The variability of MEG focality is graphed as a function of brain volume and cortical area. Statistically significant differences in total cerebral volume, cortical area, MEG global sensitivity and LF focality are found between age groups. Conclusions Because MEG focality and sensitivity differ substantially across the age groups studied, the cortical LF maps explored here can provide important insights for the examination and interpretation of MEG signals from early childhood to young adulthood. Significance This is the first study to (1) investigate the relationship between MEG cortical LFs and brain volume as well as cortical area across development, and (2) compare LFs between subjects with different head sizes using detailed cortical reconstructions. PMID:24589347
NASA Astrophysics Data System (ADS)
Bonforte, A.; Casu, F.; de Martino, P.; Guglielmino, F.; Lanari, R.; Manzo, M.; Obrizzo, F.; Puglisi, G.; Sansosti, E.; Tammaro, U.
2009-04-01
Differential Synthetic Aperture Radar Interferometry (DInSAR) is a methodology able to measure ground deformation rates and time series over relatively large areas. Several different approaches have been developed over the past few years: they all have in common the capability to measure deformation over a relatively wide area (say 100 km by 100 km) with a high density of measuring points. For these reasons, DInSAR represents a very useful tool for investigating geophysical phenomena, with particular reference to volcanic areas. As for any measuring technique, knowledge of the attainable accuracy is of fundamental importance. In the case of DInSAR technology, there are several error sources, such as orbital inaccuracies, phase unwrapping errors, atmospheric artifacts, and effects related to the selection of the reference point, making it very difficult to define a theoretical error model. A practical way to assess the accuracy is to compare DInSAR results with independent measurements, such as GPS or levelling. Here we present an in-depth comparison between the deformation measurements obtained by exploiting the DInSAR technique referred to as the Small BAseline Subset (SBAS) algorithm and by continuous GPS stations. The selected volcanic test sites are Etna, Vesuvio and Campi Flegrei, in Italy. From continuous GPS data, solutions are computed on the same days on which SAR data are acquired, for direct comparison. Moreover, three-dimensional GPS displacement vectors are projected along the radar line of sight of both ascending and descending acquisition orbits. GPS data are then compared with the coherent DInSAR pixels closest to the GPS station. Relevant statistics of the differences between the two measurements are computed and correlated with scene parameters that may affect DInSAR accuracy (altitude, terrain slope, etc.).
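Projecting a three-dimensional GPS displacement onto the radar line of sight, as described above, can be written compactly. The sketch below uses one common convention for a right-looking SAR with incidence angle theta and platform heading alpha; sign conventions differ between processing chains, and the numerical values are illustrative only.

import numpy as np

def gps_to_los(d_east, d_north, d_up, incidence_deg, heading_deg):
    theta = np.radians(incidence_deg)
    alpha = np.radians(heading_deg)
    # positive d_los = motion toward the satellite (range decrease)
    return (d_up * np.cos(theta)
            - np.sin(theta) * (d_east * np.cos(alpha) - d_north * np.sin(alpha)))

# Example: ~1 cm of uplift with slight horizontal motion; angles are illustrative.
print(gps_to_los(0.003, 0.001, 0.010, incidence_deg=23.0, heading_deg=193.0))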
ERIC Educational Resources Information Center
Denner, Jill; Werner, Linda; O'Connor, Lisa; Glassman, Jill
2014-01-01
Efforts to increase the number of women who pursue and complete advanced degrees in computer and information sciences (CIS) have been limited, in part, by a lack of research on pathways into and out of community college CIS classes. This longitudinal study tests three widely held beliefs about how to increase the number of CIS majors at 4-year…
Overhauling, updating and augmenting NASA spacelink electronic information system
NASA Technical Reports Server (NTRS)
Blake, Jean A.
1991-01-01
NASA/Spacelink is a collection of NASA information and educational materials stored on a computer at the MSFC. It is provided by the NASA Educational Affairs Division and is operated by the Education Branch of the Marshall Center Public Affairs Office. It is designed to communicate with a wide variety of computers and modems, especially those most commonly found in classrooms and homes. It was made available to the public in February, 1988. The system may be accessed by educators and the public over regular telephone lines. NASA/Spacelink is free except for the cost of long distance calls. Overhauling and updating Spacelink was done to refurbish NASA/Spacelink, a very valuable resource medium. Several new classroom activities and miscellaneous topics were edited and entered into Spacelink. One of the areas that received a major overhaul (under the guidance of Amos Crisp) was the SPINOFFS BENEFITS, the great benefits resulting from America's space explorations. The Spinoff Benefits include information on a variety of topics including agriculture, communication, the computer, consumer, energy, equipment and materials, food, health, home, industry, medicine, natural resources, public services, recreation, safety, sports, and transportation. In addition to the Space Program Spinoff Benefits, the following is a partial list of some of the material updated and introduced: Astronaut Biographies, Miscellaneous Aeronautics Classroom Activities, Miscellaneous Astronomy Classroom Activities, Miscellaneous Rocketry Classroom Activities, Miscellaneous Classroom Activities, NASA and Its Center, NASA Areas of Research, NASA Patents, Licensing, NASA Technology Transfer, Pictures from Space Classroom Activities, Status of Current NASA Projects, Using Art to Teach Science, and Word Puzzles for Use in the Classroom.
NASA Technical Reports Server (NTRS)
1977-01-01
NASTRAN is an offshoot of the computer-design technique used in construction of airplanes and spacecraft. In this technique engineers create a mathematical model of the aeronautical or space vehicle and "fly" it on the ground by means of computer simulation. The technique enables them to study performance and structural behavior of a number of different designs before settling on the final configuration and proceeding with construction. From this base of aerospace experience, NASA-Goddard developed the NASTRAN general purpose computer program, which offers an exceptionally wide range of analytic capability with regard to structures. NASTRAN has been applied to autos, trucks, railroad cars, ships, nuclear power reactors, steam turbines, bridges, and office buildings. NASA-Langley provides program maintenance services regarded as vital by many NASTRAN users. NASTRAN is essentially a predictive tool. It takes an electronic look at a computerized design and reports how the structure will react under a great many different conditions. It can, for example, note areas where high stress levels will occur-potential failure points that need strengthening. Conversely, it can identify over-designed areas where weight and material might be saved safely. NASTRAN can tell how pipes stand up under strong fluid flow, how metals are affected by high temperatures, how a building will fare in an earthquake or how powerful winds will cause a bridge to oscillate. NASTRAN analysis is quick and inexpensive. It minimizes trial-and-error in the design process and makes possible better, safer, lighter structures affording large-scale savings in development time and materials. Some examples of the broad utility NASTRAN is finding among industrial firms are shown on these pages.
NASA Astrophysics Data System (ADS)
Scherstjanoi, M.; Kaplan, J. O.; Lischke, H.
2014-02-01
To simulate climate change effects on forest dynamics over the whole of Switzerland, we adapted the second-generation DGVM LPJ-GUESS to the Alpine environment. We modified model functions, tuned model parameters, and implemented new tree species to represent the potential natural vegetation of Alpine landscapes. Furthermore, we increased the computational efficiency of the model to enable area-covering simulations at a fine resolution (1 km) sufficient for the complex topography of the Alps, which resulted in more than 32 000 simulation grid cells. To this end, we applied the recently developed method GAPPARD (Scherstjanoi et al., 2013) to LPJ-GUESS. GAPPARD derives mean output values from a combination of simulation runs without disturbances and a patch age distribution defined by the disturbance frequency. With this computationally efficient method, which increased the model's speed by approximately a factor of 8, we were able to detect shortcomings of LPJ-GUESS functions and parameters more quickly. We used the adapted LPJ-GUESS together with GAPPARD to assess the influence of one climate change scenario on the dynamics of tree species composition and biomass throughout the 21st century in Switzerland. To allow comparison with the original model, we additionally simulated forest dynamics along a north-south transect through Switzerland. The results from this transect confirmed the high value of the GAPPARD method, despite some limitations with respect to extreme climatic events, and allowed us for the first time to obtain area-wide, detailed, high-resolution LPJ-GUESS simulation results for a large part of the Alpine region.
An image-processing software package: UU and Fig for optical metrology applications
NASA Astrophysics Data System (ADS)
Chen, Lujie
2013-06-01
Modern optical metrology applications are largely supported by computational methods, such as phase shifting [1], the Fourier transform [2], digital image correlation [3], and camera calibration [4], in which image processing is a critical and indispensable component. While it is not difficult to obtain a wide variety of image-processing programs from the internet, few cater to the relatively specialized area of optical metrology. This paper introduces an image-processing software package, UU (data processing) and Fig (data rendering), that incorporates many useful functions for processing optical metrological data. The cross-platform programs UU and Fig are developed on wxWidgets and, at the time of writing, have been tested on Windows, Linux and Mac OS. The user interface is designed to offer precise control of the underlying processing procedures in a scientific manner. The data input/output mechanism is designed to accommodate diverse file formats and to facilitate interaction with other independent programs. In terms of robustness, although the software was initially developed for personal use, it is comparable in stability and accuracy to most commercial software of a similar nature. In addition to functions for optical metrology, the software package has a rich collection of useful tools in the following areas: real-time image streaming from USB and GigE cameras, computational geometry, computer vision, data fitting, 3D image processing, vector image processing, precision device control (rotary stage, PZT stage, etc.), point-cloud-to-surface reconstruction, volume rendering, and batch processing. The software package is currently used in a number of universities for teaching and research.
Computer Supported Cooperative Work in Information Search and Retrieval.
ERIC Educational Resources Information Center
Twidale, Michael B.; Nichols, David M.
1998-01-01
Considers how research in collaborative technologies can inform research and development in library and information science. Topics include computer supported collaborative work; shared drawing; collaborative writing; MUDs; MOOs; workflow; World Wide Web; collaborative learning; computer mediated communication; ethnography; evaluation; remote…
Parallel approach in RDF query processing
NASA Astrophysics Data System (ADS)
Vajgl, Marek; Parenica, Jan
2017-07-01
Parallel processing is nowadays an inexpensive way to increase computational power, thanks to the availability of multithreaded computational units. Such hardware is now a standard part of personal computers and notebooks and is widely available. This contribution presents experiments on how the evaluation of a computationally complex inference algorithm over RDF data can be parallelized on graphics cards to decrease computation time.
Computer vision and augmented reality in gastrointestinal endoscopy
Mahmud, Nadim; Cohen, Jonah; Tsourides, Kleovoulos; Berzin, Tyler M.
2015-01-01
Augmented reality (AR) is an environment-enhancing technology, widely applied in the computer sciences, which has only recently begun to permeate the medical field. Gastrointestinal endoscopy—which relies on the integration of high-definition video data with pathologic correlates—requires endoscopists to assimilate and process a tremendous amount of data in real time. We believe that AR is well positioned to provide computer-guided assistance with a wide variety of endoscopic applications, beginning with polyp detection. In this article, we review the principles of AR, describe its potential integration into an endoscopy set-up, and envisage a series of novel uses. With close collaboration between physicians and computer scientists, AR promises to contribute significant improvements to the field of endoscopy. PMID:26133175
Multilevel model of polycrystalline materials: grain boundary sliding description
NASA Astrophysics Data System (ADS)
Sharifullina, E.; Shveykin, A.; Trusov, P.
2017-12-01
Describing material behavior over a wide range of thermomechanical conditions is one of the topical areas in mathematical modeling. Including grain boundary sliding, an important mechanism of polycrystalline material deformation at elevated temperatures and the predominant deformation mechanism of metals and alloys in structural superplasticity, makes it possible to simulate various deformation regimes and the transitions between them (including switching the superplastic regime on and off). The paper is devoted to the description of grain boundary sliding within the structure of a two-level model based on crystal plasticity, and to relations for determining the contribution of this mechanism to inelastic deformation. Some results of computational experiments on the deformation of a polycrystalline representative volume using the developed model are presented.
NASA Technical Reports Server (NTRS)
Birman, Kenneth; Cooper, Robert; Marzullo, Keith
1990-01-01
The ISIS project has developed a new methodology, virtual synchrony, for writing robust distributed software. High-performance multicast, large-scale applications, and wide area networks are the focus of interest. Several interesting applications that exploit the strengths of ISIS, including an NFS-compatible replicated file system, are being developed. The META project addresses distributed control in a soft real-time environment incorporating feedback. This domain encompasses examples as diverse as monitoring inventory and consumption on a factory floor and performing load balancing on a distributed computing system. One of the first uses of META is for distributed application management: the tasks of configuring a distributed program, dynamically adapting to failures, and monitoring its performance. Recent progress and current plans are reported.
Digital Signal Processing and Control for the Study of Gene Networks
NASA Astrophysics Data System (ADS)
Shin, Yong-Jun
2016-04-01
Thanks to the digital revolution, digital signal processing and control has been widely used in many areas of science and engineering today. It provides practical and powerful tools to model, simulate, analyze, design, measure, and control complex and dynamic systems such as robots and aircrafts. Gene networks are also complex dynamic systems which can be studied via digital signal processing and control. Unlike conventional computational methods, this approach is capable of not only modeling but also controlling gene networks since the experimental environment is mostly digital today. The overall aim of this article is to introduce digital signal processing and control as a useful tool for the study of gene networks.
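As a hypothetical illustration of the approach, the sketch below treats a gene's expression as a first-order discrete-time system and closes the loop with a proportional controller at each sampling step; the model coefficients and gain are invented, and the steady-state offset shown is the familiar limitation of proportional-only control rather than a statement about any specific gene network.

def simulate_gene_control(a=0.9, b=0.5, x0=0.0, reference=1.0, kp=0.8, steps=30):
    x, trace = x0, []
    for _ in range(steps):
        u = kp * (reference - x)      # digital proportional control input
        x = a * x + b * u             # discrete-time gene expression update
        trace.append(x)
    return trace

trace = simulate_gene_control()
print(round(trace[-1], 3))            # converges to 0.8: proportional-only control leaves an offset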
NASA Technical Reports Server (NTRS)
Carlson, H. W.
1978-01-01
Sonic boom overpressures and signature duration may be predicted for the entire affected ground area for a wide variety of supersonic airplane configurations and spacecraft operating at altitudes up to 76 km in level flight or in moderate climbing or descending flight paths. The outlined procedure relies to a great extent on the use of charts to provide generation and propagation factors for use in relatively simple expressions for signature calculation. Computational requirements can be met by hand-held scientific calculators, or even by slide rules. A variety of correlations of predicted and measured sonic-boom data for airplanes and spacecraft serve to demonstrate the applicability of the simplified method.
Grethe, Jeffrey S; Baru, Chaitan; Gupta, Amarnath; James, Mark; Ludaescher, Bertram; Martone, Maryann E; Papadopoulos, Philip M; Peltier, Steven T; Rajasekar, Arcot; Santini, Simone; Zaslavsky, Ilya N; Ellisman, Mark H
2005-01-01
Through support from the National Institutes of Health's National Center for Research Resources, the Biomedical Informatics Research Network (BIRN) is pioneering the use of advanced cyberinfrastructure for medical research. By synchronizing developments in advanced wide area networking, distributed computing, distributed database federation, and other emerging capabilities of e-science, the BIRN has created a collaborative environment that is paving the way for biomedical research and clinical information management. The BIRN Coordinating Center (BIRN-CC) is orchestrating the development and deployment of key infrastructure components for immediate and long-range support of biomedical and clinical research being pursued by domain scientists in three neuroimaging test beds.