Banks, Victoria A; Stanton, Neville A
2016-11-01
To the average driver, the concept of automation in driving implies that they can become completely 'hands and feet free'. This is a common misconception, however, as shown through the application of network analysis to new Cruise Assist technologies that may feature on our roads by 2020. Through the adoption of a systems-theoretic approach, this paper introduces the concept of driver-initiated automation, which reflects the role of the driver in highly automated driving systems. Using a combination of traditional task analysis and quantitative network metrics, this agent-based modelling paper shows how the driver remains an integral part of the driving system, implying that designers must ensure drivers are provided with the tools necessary to remain actively in-the-loop despite being given increasing opportunities to delegate control to the automated subsystems. Practitioner Summary: This paper describes and analyses a driver-initiated command and control system of automation, using representations afforded by task and social networks to understand how drivers remain actively involved in the task. A network analysis of different driver commands suggests that such a strategy does maintain the driver in the control loop.
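A minimal sketch of the kind of quantitative network metrics such an analysis relies on, using the networkx Python library; the agents and links below are hypothetical stand-ins for a driver-automation task network, not the authors' actual model.

```python
# Illustrative sketch (not the authors' model): the kind of quantitative
# network metrics used in task/social network analysis, via networkx.
# Agent names and links are hypothetical.
import networkx as nx

G = nx.Graph()
G.add_edges_from([
    ("driver", "cruise_assist"),
    ("driver", "instrument_cluster"),
    ("instrument_cluster", "cruise_assist"),
    ("cruise_assist", "radar"),
    ("cruise_assist", "brake_actuator"),
])

density = nx.density(G)                # overall interconnectedness
centrality = nx.degree_centrality(G)   # how central each agent is
print(f"network density: {density:.2f}")
print(f"driver centrality: {centrality['driver']:.2f}")
# A persistently high driver centrality across command types would
# support the claim that the driver remains integral to the system.
```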
Automated Run-Time Mission and Dialog Generation
2007-03-01
Keywords: Processing, Social Network Analysis, Simulation, Automated Scenario Generation.
NASA Astrophysics Data System (ADS)
Wang, Hao; Zhong, Guoxin
2018-03-01
Optical communication networks are the mainstream communication technique for distribution automation, and self-healing technologies can significantly improve their reliability. This paper discusses the technical characteristics and application scenarios of several network self-healing technologies in the access, backbone and core layers of optical communication networks for distribution automation. Based on this comparative analysis, application suggestions for these self-healing technologies are given.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Karnowski, Thomas Paul; Giancardo, Luca; Li, Yaquin
2013-01-01
Automated retina image analysis has reached a high level of maturity in recent years, and thus the question of how validation is performed in these systems is beginning to grow in importance. One application of retina image analysis is in telemedicine, where an automated system could enable the automated detection of diabetic retinopathy and other eye diseases as a low-cost method for broad-based screening. In this work we discuss our experiences in developing a telemedical network for retina image analysis, including our progression from a manual diagnosis network to a more fully automated one. We pay special attention to how validations of our algorithm steps are performed, both using data from the telemedicine network and other public databases.
Banks, Victoria A; Stanton, Neville A
2015-01-01
Automated assistance in driving emergencies aims to improve the safety of our roads by avoiding or mitigating the effects of accidents. However, the behavioural implications of such systems remain unknown. This paper introduces the driver decision-making in emergencies (DDMiE) framework to investigate how the level and type of automation may affect driver decision-making and subsequent responses to critical braking events, using network analysis to interrogate retrospective verbalisations. Four DDMiE models were constructed to represent different levels of automation within the driving task and their effects on driver decision-making. Findings suggest that whilst automation does not alter the decision-making pathway (e.g. the processes between hazard detection and response remain similar), it does appear to significantly weaken the links between information-processing nodes. This reflects an unintended yet emergent property within the task network that could mean we are not improving safety in the way we expect. This paper contrasts models of driver decision-making in emergencies at varying levels of automation using the Southampton University Driving Simulator. Network analysis of retrospective verbalisations indicates that increasing the level of automation in driving emergencies weakens the links between information-processing nodes essential for effective decision-making.
ERIC Educational Resources Information Center
Kibirige, Harry M.
1991-01-01
Discussion of the potential effects of fiber optic-based communication technology on information networks and systems design highlights library automation. Topics discussed include computers and telecommunications systems, the importance of information in national economies, microcomputers, local area networks (LANs), national computer networks,…
Automated quantification of neuronal networks and single-cell calcium dynamics using calcium imaging
Patel, Tapan P.; Man, Karen; Firestein, Bonnie L.; Meaney, David F.
2017-01-01
Background: Recent advances in genetically engineered calcium and membrane potential indicators provide the potential to estimate the activation dynamics of individual neurons within larger, mesoscale networks (100s to 1,000+ neurons). However, a fully integrated, automated workflow for the analysis and visualization of neural microcircuits from high-speed fluorescence imaging data is lacking.
New method: Here we introduce FluoroSNNAP, the Fluorescence Single Neuron and Network Analysis Package. FluoroSNNAP is an open-source, interactive software package developed in MATLAB for automated quantification of numerous biologically relevant features of both the calcium dynamics of single cells and network activity patterns. FluoroSNNAP integrates and improves upon existing tools for spike detection, synchronization analysis, and inference of functional connectivity, making it most useful to experimentalists with little or no programming knowledge.
Results: We apply FluoroSNNAP to characterize the activity patterns of neuronal microcircuits undergoing developmental maturation in vitro. Separately, we highlight the utility of single-cell analysis for phenotyping a mixed population of neurons expressing a human mutant variant of the microtubule-associated protein tau and wild-type tau.
Comparison with existing methods: We show the performance of semi-automated cell segmentation using spatiotemporal independent component analysis and significant improvement in detecting calcium transients using a template-based algorithm in comparison to peak-based or wavelet-based detection methods. Our software further enables automated analysis of microcircuits, which is an improvement over existing methods.
Conclusions: We expect the dissemination of this software will facilitate a comprehensive analysis of neuronal networks, promoting the rapid interrogation of circuits in health and disease. PMID:25629800
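As an illustration of the template-based transient detection idea credited above with improved performance, here is a minimal Python sketch (not FluoroSNNAP's MATLAB code); the threshold and template shape are illustrative assumptions.

```python
# Minimal sketch of template-based event detection, in the spirit of a
# calcium-transient detector: a stereotyped transient template is slid
# along the fluorescence trace and events are flagged where the
# normalized correlation is high.
import numpy as np

def detect_transients(trace, template, threshold=0.85):
    """Return start indices where the trace locally matches the template."""
    n = len(template)
    t = (template - template.mean()) / template.std()
    hits = []
    for i in range(len(trace) - n):
        w = trace[i:i + n]
        if w.std() == 0:
            continue
        r = np.dot((w - w.mean()) / w.std(), t) / n   # Pearson correlation
        if r > threshold:
            hits.append(i)
    return hits

# Synthetic example: exponential-decay transient on a noisy baseline.
tmpl = np.exp(-np.arange(20) / 5.0)
trace = np.random.default_rng(0).normal(0, 0.05, 500)
trace[100:120] += tmpl
print(detect_transients(trace, tmpl))   # indices near 100
```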
The Deep Space Network information system in the year 2000
NASA Technical Reports Server (NTRS)
Markley, R. W.; Beswick, C. A.
1992-01-01
The Deep Space Network (DSN), the largest and most sensitive scientific communications and radio navigation network in the world, is considered. Focus is placed on the telemetry processing, monitor and control, and ground data transport architectures of the DSN ground information system envisioned for the year 2000. The telemetry architecture will be unified from the front-end area to the end user. It will provide highly automated monitor and control of the DSN, automated configuration of support activities, and a vastly improved human interface. Automated decision support systems will be in place for DSN resource management, performance analysis, fault diagnosis, and contingency management.
Neural network expert system for X-ray analysis of welded joints
NASA Astrophysics Data System (ADS)
Kozlov, V. V.; Lapik, N. V.; Popova, N. V.
2018-03-01
The use of intelligent technologies for the automated analysis of product quality is one of the main trends in modern machine building. At the same time, methods based on artificial neural networks, which serve as the basis for building automated intelligent diagnostic systems, are developing rapidly in various spheres of human activity. Machine vision technologies make it possible to effectively detect regularities in the analyzed image, including defects of welded joints, from radiography data.
NASA Astrophysics Data System (ADS)
Chiesl, T. N.; Benhabib, M.; Stockton, A. M.; Mathies, R. A.
2010-04-01
We present the Multichannel Mars Organic Analyzer (McMOA) for the analysis of amino acids, PAHs, and oxidized carbon. Microfluidic architectures integrating automated metering, mixing, on-chip reactions, and serial dilutions are also discussed.
Neural networks for structural design - An integrated system implementation
NASA Technical Reports Server (NTRS)
Berke, Laszlo; Hafez, Wassim; Pao, Yoh-Han
1992-01-01
The development of powerful automated procedures to aid the creative designer is becoming increasingly critical for complex design tasks. In the work described here Artificial Neural Nets are applied to acquire structural analysis and optimization domain expertise. Based on initial instructions from the user an automated procedure generates random instances of structural analysis and/or optimization 'experiences' that cover a desired domain. It extracts training patterns from the created instances, constructs and trains an appropriate network architecture and checks the accuracy of net predictions. The final product is a trained neural net that can estimate analysis and/or optimization results instantaneously.
Performance modeling of automated manufacturing systems
NASA Astrophysics Data System (ADS)
Viswanadham, N.; Narahari, Y.
A unified and systematic treatment is presented of modeling methodologies and analysis techniques for performance evaluation of automated manufacturing systems. The book is the first treatment of the mathematical modeling of manufacturing systems. Automated manufacturing systems are surveyed and three principal analytical modeling paradigms are discussed: Markov chains, queues and queueing networks, and Petri nets.
Research of the application of the new communication technologies for distribution automation
NASA Astrophysics Data System (ADS)
Zhong, Guoxin; Wang, Hao
2018-03-01
The communication network is a key component of distribution automation. In recent years, new communication technologies for distribution automation have developed rapidly in China. This paper introduces the traditional communication technologies of distribution automation and analyses the defects of these traditional technologies. It then gives a detailed analysis of some new communication technologies for distribution automation, including wired and wireless communication, and offers application suggestions for these new technologies.
An Automated Data Analysis Tool for Livestock Market Data
ERIC Educational Resources Information Center
Williams, Galen S.; Raper, Kellie Curry
2011-01-01
This article describes an automated data analysis tool that allows Oklahoma Cooperative Extension Service educators to disseminate results in a timely manner. Primary data collected at Oklahoma Quality Beef Network (OQBN) certified calf auctions across the state results in a large amount of data per sale site. Sale summaries for an individual sale…
Automated Analysis of Planktic Foraminifers Part III: Neural Network Classification
NASA Astrophysics Data System (ADS)
Schiebel, R.; Bollmann, J.; Quinn, P.; Vela, M.; Schmidt, D. N.; Thierstein, H. R.
2003-04-01
The abundance and assemblage composition of microplankton, together with the chemical and stable isotopic composition of their shells, are among the most successful methods in paleoceanography and paleoclimatology. However, the manual collection of statistically significant numbers of unbiased, reproducible data is time-consuming. Consequently, automated microfossil analysis and species recognition have been a long-standing goal in micropaleontology. We have developed a Windows-based software package, COGNIS, for the segmentation, preprocessing, and classification of automatically acquired microfossil images (see Part II, Bollmann et al., this volume), using operator-designed neural network structures. With a five-layered convolutional neural network we obtain an average recognition rate of 75% (max. 88%) for 6 taxa (N. dutertrei, N. pachyderma dextral, N. pachyderma sinistral, G. inflata, G. menardii/tumida, O. universa), represented by 50 images each for 20 classes (separation of spiral and umbilical views, and of sinistral and dextral forms). Our investigation indicates that neural networks hold great potential for the automated classification of planktic foraminifers and offer new perspectives in micropaleontology, paleoceanography, and paleoclimatology (see Part I, Schmidt et al., this volume).
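A small convolutional classifier of the general kind described can be sketched in Python with Keras; COGNIS itself is custom software, so the layer sizes and input dimensions below are illustrative assumptions, with only the 20-class output taken from the abstract.

```python
# Hedged sketch of a convolutional microfossil classifier (not COGNIS
# code). Assumes grayscale images resized to 64x64 and 20 output classes
# (taxa x view/coiling direction), as described in the abstract.
import tensorflow as tf
from tensorflow.keras import layers, models

model = models.Sequential([
    layers.Input(shape=(64, 64, 1)),
    layers.Conv2D(16, 5, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(32, 5, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dense(20, activation="softmax"),   # 20 classes as in the study
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(images, labels, epochs=20)   # 50 images per class in the study
```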
Statistical modelling of networked human-automation performance using working memory capacity.
Ahmed, Nisar; de Visser, Ewart; Shaw, Tyler; Mohamed-Ameen, Amira; Campbell, Mark; Parasuraman, Raja
2014-01-01
This study examines the challenging problem of modelling the interaction between individual attentional limitations and decision-making performance in networked human-automation system tasks. Analysis of real experimental data from a task involving networked supervision of multiple unmanned aerial vehicles by human participants shows that both task load and network message quality affect performance, but that these effects are modulated by individual differences in working memory (WM) capacity. These insights were used to assess three statistical approaches for modelling and making predictions with real experimental networked supervisory performance data: classical linear regression, non-parametric Gaussian processes and probabilistic Bayesian networks. It is shown that each of these approaches can help designers of networked human-automated systems cope with various uncertainties in order to accommodate future users by linking expected operating conditions and performance from real experimental data to observable cognitive traits like WM capacity. Practitioner Summary: Working memory (WM) capacity helps account for inter-individual variability in operator performance in networked unmanned aerial vehicle supervisory tasks. This is useful for reliable performance prediction near experimental conditions via linear models; robust statistical prediction beyond experimental conditions via Gaussian process models and probabilistic inference about unknown task conditions/WM capacities via Bayesian network models.
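A hedged sketch of the first two modelling approaches on synthetic data (the study's dataset and exact model specifications are not reproduced here), using scikit-learn:

```python
# Sketch: comparing a linear model and a Gaussian process for predicting
# supervisory performance from task load and working-memory (WM)
# capacity, two predictors discussed in the abstract. Data are synthetic.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.gaussian_process import GaussianProcessRegressor

rng = np.random.default_rng(1)
task_load = rng.uniform(1, 10, 100)
wm_capacity = rng.uniform(2, 6, 100)
# Hypothetical ground truth: load hurts, WM capacity buffers the effect.
performance = (80 - 3 * task_load
               + 2 * wm_capacity * np.log(task_load)
               + rng.normal(0, 2, 100))

X = np.column_stack([task_load, wm_capacity])
lin = LinearRegression().fit(X, performance)
gp = GaussianProcessRegressor(normalize_y=True).fit(X, performance)

x_new = np.array([[7.5, 3.0]])                  # high load, low WM capacity
mean, std = gp.predict(x_new, return_std=True)
print(f"linear: {lin.predict(x_new)[0]:.1f}")
print(f"GP:     {mean[0]:.1f} +/- {std[0]:.1f}")  # GP adds an uncertainty band
```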
Bowsher, Clive G
2011-02-15
Understanding the encoding and propagation of information by biochemical reaction networks and the relationship of such information processing properties to modular network structure is of fundamental importance in the study of cell signalling and regulation. However, a rigorous, automated approach for general biochemical networks has not been available, and high-throughput analysis has therefore been out of reach. Modularization Identification by Dynamic Independence Algorithms (MIDIA) is a user-friendly, extensible R package that performs automated analysis of how information is processed by biochemical networks. An important component is the algorithm's ability to identify exact network decompositions based on both the mass action kinetics and informational properties of the network. These modularizations are visualized using a tree structure from which important dynamic conditional independence properties can be directly read. Only partial stoichiometric information needs to be used as input to MIDIA, and neither simulations nor knowledge of rate parameters are required. When applied to a signalling network, for example, the method identifies the routes and species involved in the sequential propagation of information between its multiple inputs and outputs. These routes correspond to the relevant paths in the tree structure and may be further visualized using the Input-Output Path Matrix tool. MIDIA remains computationally feasible for the largest network reconstructions currently available and is straightforward to use with models written in Systems Biology Markup Language (SBML). The package is distributed under the GNU General Public License and is available, together with a link to browsable Supplementary Material, at http://code.google.com/p/midia. Further information is at www.maths.bris.ac.uk/~macgb/Software.html.
NASA Astrophysics Data System (ADS)
Wang, Ke; Guo, Ping; Luo, A.-Li
2017-03-01
Spectral feature extraction is a crucial procedure in automated spectral analysis. This procedure starts from the spectral data and produces informative and non-redundant features, facilitating the subsequent automated processing and analysis with machine-learning and data-mining techniques. In this paper, we present a new automated feature extraction method for astronomical spectra, with applications in spectral classification and defective-spectra recovery. The basic idea of our approach is to train a deep neural network to extract features of spectra with different levels of abstraction in different layers. The deep neural network is trained with a fast layer-wise learning algorithm in an analytical way, without any iterative optimization procedure. We evaluate the performance of the proposed scheme on real-world spectral data. The results demonstrate that our method is superior in terms of overall performance, and its computational cost is significantly lower than that of other methods. The proposed method can be regarded as a valid new general-purpose feature extraction method for various tasks in spectral data analysis.
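The analytical, layer-wise training idea can be illustrated with a random-projection autoencoder whose readout is solved in closed form by ridge regression; this is a sketch in the spirit of the paper, not the authors' exact algorithm, and all array sizes are made up.

```python
# Illustrative non-iterative, layer-wise feature learning: each layer
# uses a random projection plus an analytically solved ridge-regression
# readout, so no gradient descent is needed.
import numpy as np

def train_layer(X, n_hidden, reg=1e-3, seed=0):
    rng = np.random.default_rng(seed)
    W = rng.normal(0, 1 / np.sqrt(X.shape[1]), (X.shape[1], n_hidden))
    H = np.tanh(X @ W)                        # nonlinear hidden activations
    # Analytic readout: ridge solution mapping H back to X (autoencoding).
    B = np.linalg.solve(H.T @ H + reg * np.eye(n_hidden), H.T @ X)
    return H, W, B

# Stack two layers: features of increasing abstraction, as in the paper.
spectra = np.random.default_rng(42).normal(size=(200, 500))  # fake spectra
H1, _, _ = train_layer(spectra, 128)
H2, _, _ = train_layer(H1, 32)
print(H2.shape)   # (200, 32) compact features for classification/recovery
```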
Nikolaisen, Julie; Nilsson, Linn I. H.; Pettersen, Ina K. N.; Willems, Peter H. G. M.; Lorens, James B.; Koopman, Werner J. H.; Tronstad, Karl J.
2014-01-01
Mitochondrial morphology and function are coupled in healthy cells, during pathological conditions and (adaptation to) endogenous and exogenous stress. In this sense mitochondrial shape can range from small globular compartments to complex filamentous networks, even within the same cell. Understanding how mitochondrial morphological changes (i.e. "mitochondrial dynamics") are linked to cellular (patho)physiology is currently the subject of intense study and requires detailed quantitative information. During the last decade, various computational approaches have been developed for automated 2-dimensional (2D) analysis of mitochondrial morphology and number in microscopy images. Although these strategies are well suited for analysis of adhering cells with a flat morphology, they are not applicable to thicker cells, which require a three-dimensional (3D) image acquisition and analysis procedure. Here we developed and validated an automated image analysis algorithm allowing simultaneous 3D quantification of mitochondrial morphology and network properties in human endothelial cells (HUVECs). Cells expressing a mitochondria-targeted green fluorescence protein (mitoGFP) were visualized by 3D confocal microscopy and mitochondrial morphology was quantified using both the established 2D method and the new 3D strategy. We demonstrate that both analyses can be used to characterize and discriminate between various mitochondrial morphologies and network properties. However, the results from 2D and 3D analysis were not equivalent when filamentous mitochondria in normal HUVECs were compared with circular/spherical mitochondria in metabolically stressed HUVECs treated with rotenone (ROT). 2D quantification suggested that metabolic stress induced mitochondrial fragmentation and loss of biomass. In contrast, 3D analysis revealed that the mitochondrial network structure was dissolved without affecting the amount and size of the organelles. Thus, our results demonstrate that 3D imaging and quantification are crucial for proper understanding of mitochondrial shape and topology in non-flat cells. In summary, we here present an integrative method for unbiased 3D quantification of mitochondrial shape and network properties in mammalian cells. PMID:24988307
Artificial neural network-aided image analysis system for cell counting.
Sjöström, P J; Frydel, B R; Wahlberg, L U
1999-05-01
In histological preparations containing debris and synthetic materials, it is difficult to automate cell counting using standard image analysis tools, i.e., systems that rely on boundary contours, histogram thresholding, etc. In an attempt to mimic manual cell recognition, an automated cell counter was constructed using a combination of artificial intelligence and standard image analysis methods. Artificial neural network (ANN) methods were applied to digitized microscopy fields without pre-ANN feature extraction. A three-layer feed-forward network with extensive weight sharing in the first hidden layer was employed and trained on 1,830 examples using the error back-propagation algorithm on a Power Macintosh 7300/180 desktop computer. The optimal number of hidden neurons was determined and the trained system was validated by comparison with blinded human counts. System performance at 50x and 100x magnification was evaluated. The correlation index at 100x magnification neared person-to-person variability, while 50x magnification was not useful. The system was approximately six times faster than an experienced human. ANN-based automated cell counting in noisy histological preparations is feasible. Consistent histology and computer power are crucial for system performance. The system provides several benefits, such as speed of analysis and consistency, and frees up personnel for other tasks.
Automated screening of propulsion system test data by neural networks, phase 1
NASA Technical Reports Server (NTRS)
Hoyt, W. Andes; Whitehead, Bruce A.
1992-01-01
The evaluation of propulsion system test and flight performance data involves reviewing an extremely large volume of sensor data generated by each test. An automated system that screens large volumes of data and identifies propulsion system parameters which appear unusual or anomalous will increase the productivity of data analysis. Data analysts may then focus on a smaller subset of anomalous data for further evaluation of propulsion system tests. Such an automated data screening system would give NASA the benefit of a reduction in the manpower and time required to complete a propulsion system data evaluation. A phase 1 effort to develop a prototype data screening system is reported. Neural networks will detect anomalies based on nominal propulsion system data only. It appears that a reasonable goal for an operational system would be to screen out 95 pct. of the nominal data, leaving less than 5 pct. needing further analysis by human experts.
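A toy version of the screening principle, learning thresholds from nominal data only and flagging whatever falls outside them; the sensor values and the 2-sigma rule below are illustrative assumptions, not the prototype's actual logic.

```python
# Sketch of nominal-data-only screening: learn the envelope of nominal
# sensor behavior, then flag parameters that leave it for human review.
import numpy as np

rng = np.random.default_rng(0)
nominal = rng.normal(1000, 15, (500, 8))       # 500 past tests x 8 sensors
mu, sigma = nominal.mean(axis=0), nominal.std(axis=0)

def screen(test_data, z_max=2.0):
    """Return indices of sensors whose readings look anomalous."""
    z = np.abs((test_data - mu) / sigma)
    return np.where(z > z_max)[0]

new_test = np.array([1002, 998, 1060, 1001, 995, 1003, 999, 1000])
print("sensors flagged for human review:", screen(new_test))   # sensor 2
# Most nominal readings pass the screen; only the small anomalous subset
# reaches the human analysts, which is the stated goal of the prototype.
```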
Toward the automated generation of genome-scale metabolic networks in the SEED.
DeJongh, Matthew; Formsma, Kevin; Boillot, Paul; Gould, John; Rycenga, Matthew; Best, Aaron
2007-04-26
Current methods for the automated generation of genome-scale metabolic networks focus on genome annotation and preliminary biochemical reaction network assembly, but do not adequately address the process of identifying and filling gaps in the reaction network, and verifying that the network is suitable for systems level analysis. Thus, current methods are only sufficient for generating draft-quality networks, and refinement of the reaction network is still largely a manual, labor-intensive process. We have developed a method for generating genome-scale metabolic networks that produces substantially complete reaction networks, suitable for systems level analysis. Our method partitions the reaction space of central and intermediary metabolism into discrete, interconnected components that can be assembled and verified in isolation from each other, and then integrated and verified at the level of their interconnectivity. We have developed a database of components that are common across organisms, and have created tools for automatically assembling appropriate components for a particular organism based on the metabolic pathways encoded in the organism's genome. This focuses manual efforts on that portion of an organism's metabolism that is not yet represented in the database. We have demonstrated the efficacy of our method by reverse-engineering and automatically regenerating the reaction network from a published genome-scale metabolic model for Staphylococcus aureus. Additionally, we have verified that our method capitalizes on the database of common reaction network components created for S. aureus, by using these components to generate substantially complete reconstructions of the reaction networks from three other published metabolic models (Escherichia coli, Helicobacter pylori, and Lactococcus lactis). We have implemented our tools and database within the SEED, an open-source software environment for comparative genome annotation and analysis. Our method sets the stage for the automated generation of substantially complete metabolic networks for over 400 complete genome sequences currently in the SEED. With each genome that is processed using our tools, the database of common components grows to cover more of the diversity of metabolic pathways. This increases the likelihood that components of reaction networks for subsequently processed genomes can be retrieved from the database, rather than assembled and verified manually.
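One gap-identification step described above, finding dead-end metabolites, reduces to a degree check on a directed reaction-metabolite graph. A minimal sketch with hypothetical names (not drawn from the SEED database):

```python
# Sketch: flag "dead-end" metabolites (produced but never consumed, or
# consumed but never produced) in a draft reaction network; such species
# mark gaps that need filling before systems-level analysis.
import networkx as nx

G = nx.DiGraph()   # bipartite: reaction -> metabolite -> reaction
G.add_edge("R1", "glucose6P"); G.add_edge("glucose6P", "R2")
G.add_edge("R2", "fructose6P")           # produced, never consumed
G.add_edge("orphan_met", "R3")           # consumed, never produced

metabolites = {"glucose6P", "fructose6P", "orphan_met"}
dead_ends = [m for m in metabolites
             if G.in_degree(m) == 0 or G.out_degree(m) == 0]
print("gap-filling candidates:", dead_ends)
```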
Automated X-ray image analysis for cargo security: Critical review and future promise.
Rogers, Thomas W; Jaccard, Nicolas; Morton, Edward J; Griffin, Lewis D
2017-01-01
We review the relatively immature field of automated image analysis for X-ray cargo imagery. There is increasing demand for automated analysis methods that can assist in the inspection and selection of containers, due to the ever-growing volumes of traded cargo and the increasing concerns that customs- and security-related threats are being smuggled across borders by organised crime and terrorist networks. We split the field into the classical pipeline of image preprocessing and image understanding. Preprocessing includes: image manipulation; quality improvement; Threat Image Projection (TIP); and material discrimination and segmentation. Image understanding includes: Automated Threat Detection (ATD); and Automated Contents Verification (ACV). We identify several gaps in the literature that need to be addressed and propose ideas for future research. Where the current literature is sparse we borrow from the single-view, multi-view, and CT X-ray baggage domains, which have some characteristics in common with X-ray cargo.
Performance Analysis of Wireless Networks for Industrial Automation-Process Automation (WIA-PA)
2017-09-01
Sectors discussed include electrical, water and wastewater, oil and natural gas, chemical, transportation, pharmaceutical, pulp and paper, food and beverage, and discrete...
Sequence-of-events-driven automation of the deep space network
NASA Technical Reports Server (NTRS)
Hill, R., Jr.; Fayyad, K.; Smyth, C.; Santos, T.; Chen, R.; Chien, S.; Bevan, R.
1996-01-01
In February 1995, sequence-of-events (SOE)-driven automation technology was demonstrated for a Voyager telemetry downlink track at DSS 13. This demonstration entailed automated generation of an operations procedure (in the form of a temporal dependency network) from project SOE information using artificial intelligence planning technology and automated execution of the temporal dependency network using the link monitor and control operator assistant system. This article describes the overall approach to SOE-driven automation that was demonstrated, identifies gaps in SOE definitions and project profiles that hamper automation, and provides detailed measurements of the knowledge engineering effort required for automation.
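The core execution idea, running operations in an order that respects a temporal dependency network, can be sketched as a topological sort; the procedure blocks below are hypothetical, not those of the DSS 13 demonstration.

```python
# Sketch of executing a temporal dependency network (TDN): each edge
# says the first step must complete before the second can start.
import networkx as nx

tdn = nx.DiGraph()
tdn.add_edges_from([
    ("configure_antenna", "acquire_spacecraft"),
    ("configure_receiver", "acquire_spacecraft"),
    ("acquire_spacecraft", "lock_telemetry"),
    ("lock_telemetry", "record_data"),
])

for step in nx.topological_sort(tdn):   # order respects every dependency
    print("executing:", step)           # a real system would dispatch
                                        # monitor-and-control directives
```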
Automation of the longwall mining system
NASA Technical Reports Server (NTRS)
Zimmerman, W.; Aster, R. W.; Harris, J.; High, J.
1982-01-01
Cost effective, safe, and technologically sound applications of automation technology to underground coal mining were identified. The longwall analysis commenced with a general search for government and industry experience of mining automation technology. A brief industry survey was conducted to identify longwall operational, safety, and design problems. The prime automation candidates resulting from the industry experience and survey were: (1) the shearer operation, (2) shield and conveyor pan line advance, (3) a management information system to allow improved mine logistics support, and (4) component fault isolation and diagnostics to reduce untimely maintenance delays. A system network analysis indicated that a 40% improvement in productivity was feasible if system delays associated with all of the above four areas were removed. A technology assessment and conceptual system design of each of the four automation candidate areas showed that state of the art digital computer, servomechanism, and actuator technologies could be applied to automate the longwall system.
Li, Lin; Cazzell, Mary; Babawale, Olajide; Liu, Hanli
2016-10-01
Atlas-guided diffuse optical tomography (atlas-DOT) is a computational means to image changes in cortical hemodynamic signals during human brain activities. Graph theory analysis (GTA) is a network analysis tool commonly used in functional neuroimaging to study brain networks. Atlas-DOT has not been analyzed with GTA to derive large-scale brain connectivity/networks based on near-infrared spectroscopy (NIRS) measurements. We introduced an automated voxel classification (AVC) method that facilitated the use of GTA with atlas-DOT images by grouping unequal-sized finite element voxels into anatomically meaningful regions of interest within the human brain. The overall approach included volume segmentation, AVC, and cross-correlation. To demonstrate the usefulness of AVC, we applied reproducibility analysis to resting-state functional connectivity measurements conducted from 15 young adults in a two-week period. We also quantified and compared changes in several brain network metrics between young and older adults, which were in agreement with those reported by a previous positron emission tomography study. Overall, this study demonstrated that AVC is a useful means for facilitating integration or combination of atlas-DOT with GTA and thus for quantifying NIRS-based, voxel-wise resting-state functional brain networks.
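A compact sketch of the last two stages of such a pipeline, correlating ROI-averaged signals and computing graph metrics; the signals below are synthetic stand-ins for AVC output.

```python
# Sketch: build a functional connectivity network from ROI-averaged
# time series, then compute a graph-theory metric. Data are synthetic.
import numpy as np
import networkx as nx

rng = np.random.default_rng(0)
n_rois, n_time = 10, 300
shared = rng.normal(size=n_time)                  # common network drive
roi_signals = 0.7 * shared + 0.7 * rng.normal(size=(n_rois, n_time))

corr = np.corrcoef(roi_signals)                   # functional connectivity
G = nx.Graph()
for i in range(n_rois):
    for j in range(i + 1, n_rois):
        if corr[i, j] > 0.3:                      # threshold the network
            G.add_edge(i, j, weight=corr[i, j])

print("edges retained:", G.number_of_edges())
print("clustering coefficient:", nx.average_clustering(G))
```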
Fast automated analysis of strong gravitational lenses with convolutional neural networks.
Hezaveh, Yashar D; Levasseur, Laurence Perreault; Marshall, Philip J
2017-08-30
Quantifying image distortions caused by strong gravitational lensing-the formation of multiple images of distant sources due to the deflection of their light by the gravity of intervening structures-and estimating the corresponding matter distribution of these structures (the 'gravitational lens') has primarily been performed using maximum likelihood modelling of observations. This procedure is typically time- and resource-consuming, requiring sophisticated lensing codes, several data preparation steps, and finding the maximum likelihood model parameters in a computationally expensive process with downhill optimizers. Accurate analysis of a single gravitational lens can take up to a few weeks and requires expert knowledge of the physical processes and methods involved. Tens of thousands of new lenses are expected to be discovered with the upcoming generation of ground and space surveys. Here we report the use of deep convolutional neural networks to estimate lensing parameters in an extremely fast and automated way, circumventing the difficulties that are faced by maximum likelihood methods. We also show that the removal of lens light can be made fast and automated using independent component analysis of multi-filter imaging data. Our networks can recover the parameters of the 'singular isothermal ellipsoid' density profile, which is commonly used to model strong lensing systems, with an accuracy comparable to the uncertainties of sophisticated models but about ten million times faster: 100 systems in approximately one second on a single graphics processing unit. These networks can provide a way for non-experts to obtain estimates of lensing parameters for large samples of data.
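Conceptually, the estimation step is a CNN regression from a lens image to the five singular-isothermal-ellipsoid parameters (Einstein radius, two ellipticity components, and the lens-centre coordinates). A hedged Keras sketch, with an architecture and input size that are illustrative assumptions rather than the authors' networks:

```python
# Sketch of a CNN that maps a lens image directly to SIE parameters;
# the authors' architectures are more sophisticated than this.
import tensorflow as tf
from tensorflow.keras import layers, models

model = models.Sequential([
    layers.Input(shape=(96, 96, 1)),
    layers.Conv2D(32, 5, activation="relu"), layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"), layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),
    layers.Dense(5),                      # regression: 5 SIE parameters
])
model.compile(optimizer="adam", loss="mse")
# Trained on simulated lenses; inference is a single forward pass, which
# is what makes the ~10^7 speed-up over likelihood fitting possible.
```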
Extracting microtubule networks from superresolution single-molecule localization microscopy data
Zhang, Zhen; Nishimura, Yukako; Kanchanawong, Pakorn
2017-01-01
Microtubule filaments form ubiquitous networks that specify spatial organization in cells. However, quantitative analysis of microtubule networks is hampered by their complex architecture, limiting insights into the interplay between their organization and cellular functions. Although superresolution microscopy has greatly facilitated high-resolution imaging of microtubule filaments, extraction of complete filament networks from such data sets is challenging. Here we describe a computational tool for automated retrieval of microtubule filaments from single-molecule-localization–based superresolution microscopy images. We present a user-friendly, graphically interfaced implementation and a quantitative analysis of microtubule network architecture phenotypes in fibroblasts. PMID:27852898
Cyber Security Research Frameworks For Coevolutionary Network Defense
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rush, George D.; Tauritz, Daniel Remy
Several architectures have been created for developing and testing systems used in network security, but most are meant to provide a platform for running cyber security experiments as opposed to automating experiment processes. In the first paper, we propose a framework termed Distributed Cyber Security Automation Framework for Experiments (DCAFE) that enables experiment automation and control in a distributed environment. Predictive analysis of adversaries is another thorny issue in cyber security. Game theory can be used to mathematically analyze adversary models, but its scalability limitations restrict its use. Computational game theory allows us to scale classical game theory to larger, more complex systems. In the second paper, we propose a framework termed Coevolutionary Agent-based Network Defense Lightweight Event System (CANDLES) that can coevolve attacker and defender agent strategies and capabilities and evaluate potential solutions with a custom network defense simulation. The third paper is a continuation of the CANDLES project in which we rewrote key parts of the framework. Attackers and defenders have been redesigned to evolve pure strategy, and a new network security simulation is devised which specifies network architecture and adds a temporal aspect. We also add a hill climber algorithm to evaluate the search space and justify the use of a coevolutionary algorithm.
DOT National Transportation Integrated Search
2014-09-01
The concept of Automated Transit Networks (ATN) - in which fully automated vehicles on exclusive, grade-separated guideways provide on-demand, primarily non-stop, origin-to-destination service over an area network - has been around since the 1950...
2008-09-01
This work examines the use of Compendium software to facilitate targeting problem understanding, and the network analysis tool Palantir as an efficient and tailored semi-automated means of hot target prioritization and development.
A fully-automated neural network analysis of AFM force-distance curves for cancer tissue diagnosis
NASA Astrophysics Data System (ADS)
Minelli, Eleonora; Ciasca, Gabriele; Sassun, Tanya Enny; Antonelli, Manila; Palmieri, Valentina; Papi, Massimiliano; Maulucci, Giuseppe; Santoro, Antonio; Giangaspero, Felice; Delfini, Roberto; Campi, Gaetano; De Spirito, Marco
2017-10-01
Atomic Force Microscopy (AFM) has the unique capability of probing the nanoscale mechanical properties of biological systems that affect and are affected by the occurrence of many pathologies, including cancer. This capability has triggered growing interest in the translational process of AFM from physics laboratories to clinical practice. A factor still hindering the current use of AFM in diagnostics is related to the complexity of AFM data analysis, which is time-consuming and needs highly specialized personnel with a strong physical and mathematical background. In this work, we demonstrate an operator-independent neural-network approach for the analysis of surgically removed brain cancer tissues. This approach allowed us to distinguish—in a fully automated fashion—cancer from healthy tissues with high accuracy, also highlighting the presence and the location of infiltrating tumor cells.
A novel tracing method for the segmentation of cell wall networks.
De Vylder, Jonas; Rooms, Filip; Dhondt, Stijn; Inze, Dirk; Philips, Wilfried
2013-01-01
Cell wall networks are a common subject of research in biology, being important for plant growth analysis, organ studies, etc. In order to automate the detection of individual cells in such cell wall networks, we propose a new segmentation algorithm. The proposed method is a network tracing algorithm that exploits prior knowledge of the network structure. The method is applicable to multiple microscopy modalities such as fluorescence, but also to images captured using non-invasive microscopes such as differential interference contrast (DIC) microscopes.
2015-01-01
Background: Sufficient knowledge of molecular and genetic interactions, which comprise the entire basis of the functioning of living systems, is one of the necessary requirements for successfully answering almost any research question in the fields of biology and medicine. To date, more than 24 million scientific papers can be found in PubMed, many of them containing descriptions of a wide range of biological processes. The analysis of such tremendous amounts of data requires the use of automated text-mining approaches. Although a handful of tools have recently been developed to meet this need, none of them provide error-free extraction of highly detailed information.
Results: The ANDSystem package was developed for the reconstruction and analysis of molecular genetic networks based on an automated text-mining technique. It provides a detailed description of the various types of interactions between genes, proteins, microRNAs, metabolites, cellular components, pathways and diseases, taking into account the specificity of cell lines and organisms. Although the accuracy of ANDSystem is comparable to other well-known text-mining tools, such as Pathway Studio and STRING, it outperforms them in its ability to identify an increased number of interaction types.
Conclusion: The use of ANDSystem, in combination with Pathway Studio and STRING, can improve the quality of the automated reconstruction of molecular and genetic networks. ANDSystem should provide a useful tool for researchers working in a number of different fields, including biology, biotechnology, pharmacology and medicine. PMID:25881313
Jang, Min Jee; Nam, Yoonkey
2015-01-01
Optical recording facilitates monitoring the activity of a large neural network at the cellular scale, but the analysis and interpretation of the collected data remain challenging. Here, we present a MATLAB-based toolbox, named NeuroCa, for the automated processing and quantitative analysis of large-scale calcium imaging data. Our tool includes several computational algorithms to extract the calcium spike trains of individual neurons from the calcium imaging data in an automatic fashion. Two algorithms were developed to decompose the imaging data into the activity of individual cells and subsequently detect calcium spikes from each neuronal signal. Applying our method to dense networks in dissociated cultures, we were able to obtain the calcium spike trains of ∼1000 neurons in a few minutes. Further analyses using these data permitted the quantification of neuronal responses to chemical stimuli as well as functional mapping of spatiotemporal patterns in neuronal firing within the spontaneous, synchronous activity of a large network. These results demonstrate that our method not only automates time-consuming, labor-intensive tasks in the analysis of neural data obtained using optical recording techniques but also provides a systematic way to visualize and quantify the collective dynamics of a network in terms of its cellular elements. PMID:26229973
Using satellite communications for a mobile computer network
NASA Technical Reports Server (NTRS)
Wyman, Douglas J.
1993-01-01
The topics discussed include the following: patrol car automation, mobile computer network, network requirements, network design overview, MCN mobile network software, MCN hub operation, mobile satellite software, hub satellite software, the benefits of patrol car automation, the benefits of satellite mobile computing, and national law enforcement satellite.
NASA Astrophysics Data System (ADS)
Porto, C. D. N.; Costa Filho, C. F. F.; Macedo, M. M. G.; Gutierrez, M. A.; Costa, M. G. F.
2017-03-01
Studies in intravascular optical coherence tomography (IV-OCT) have demonstrated the importance of coronary bifurcation regions in intravascular medical imaging analysis, as plaques are more likely to accumulate in these regions, leading to coronary disease. A typical IV-OCT pullback acquires hundreds of frames, so developing an automated tool to classify OCT frames as bifurcation or non-bifurcation can be an important step to speed up OCT pullback analysis and assist automated methods for atherosclerotic plaque quantification. In this work, we evaluate the performance of two state-of-the-art classifiers, SVM and neural networks, in the bifurcation classification task. The study included IV-OCT frames from 9 patients. In order to improve classification performance, we trained and tested the SVM with different parameters by means of a grid search, and different stopping criteria were applied to the neural network classifier: mean square error, early stopping and regularization. Different sets of features were tested using feature selection techniques: PCA, LDA and scalar feature selection with correlation. Training and testing were performed on sets with a maximum of 1460 OCT frames. We quantified our results in terms of false positive rate, true positive rate, accuracy, specificity, precision, false alarm, f-measure and area under the ROC curve. Neural networks obtained the best classification accuracy, 98.83%, surpassing the results found in the literature. Our methods appear to offer a robust and reliable automated classification of OCT frames that might assist physicians by indicating potential frames to analyze. Methods for improving neural network generalization increased the classification performance.
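The SVM grid search described above corresponds to a standard scikit-learn pattern; the parameter grid and stand-in features below are illustrative, not the study's values.

```python
# Sketch of SVM hyperparameter tuning by grid search with cross-validation.
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC
from sklearn.datasets import make_classification

# Stand-in for IV-OCT frame features (bifurcation vs non-bifurcation).
X, y = make_classification(n_samples=600, n_features=20, random_state=0)

grid = GridSearchCV(
    SVC(),
    param_grid={"C": [0.1, 1, 10, 100], "gamma": [1e-3, 1e-2, 1e-1]},
    cv=5, scoring="accuracy",
)
grid.fit(X, y)
print(grid.best_params_, f"CV accuracy: {grid.best_score_:.3f}")
```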
Reasoning and Knowledge Acquisition Framework for 5G Network Analytics.
Sotelo Monge, Marco Antonio; Maestre Vidal, Jorge; García Villalba, Luis Javier
2017-10-21
Autonomic self-management is a key challenge for next-generation networks. This paper proposes an automated analysis framework to infer knowledge in 5G networks, with the aim of understanding the network status and predicting potential situations that might disrupt network operability. The framework is based on the Endsley situational awareness model, and integrates automated capabilities for metrics discovery, pattern recognition, prediction techniques and rule-based reasoning to infer anomalous situations in the current operational context. Those situations should then be mitigated, either proactively or reactively, by a more complex decision-making process. The framework is driven by a use case methodology, where the network administrator is able to customize the knowledge inference rules and operational parameters. The proposal has also been instantiated to prove its adaptability to a real use case. To this end, a reference network traffic dataset was used to identify suspicious patterns and to predict the behavior of the monitored data volume. The preliminary results suggest a good level of accuracy in the inference of anomalous traffic volumes based on a simple configuration. PMID:29065473
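A minimal sketch of the volume-monitoring idea, a rolling-statistics rule flagging anomalous traffic volumes; the data and the 3-sigma rule are illustrative assumptions, far simpler than the framework's Endsley-based pipeline.

```python
# Sketch: flag traffic volumes that deviate from a rolling baseline,
# then let a rule decide whether the situation is anomalous.
import numpy as np

rng = np.random.default_rng(0)
volume = rng.normal(100, 8, 200)           # MB/s on a synthetic link
volume[150:160] *= 3                       # injected burst (e.g., a flood)

window = 24
for t in range(window, len(volume)):
    hist = volume[t - window:t]
    if abs(volume[t] - hist.mean()) > 3 * hist.std():   # simple rule
        print(f"t={t}: anomalous volume {volume[t]:.0f} MB/s")
```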
Social networks to biological networks: systems biology of Mycobacterium tuberculosis.
Vashisht, Rohit; Bhardwaj, Anshu; Osdd Consortium; Brahmachari, Samir K
2013-07-01
Contextualizing relevant information to construct a network that represents a given biological process presents a fundamental challenge in the network science of biology. The quality of the network for the organism of interest depends critically on the extent of functional annotation of its genome. Most automated annotation pipelines do not account for the unstructured information present in volumes of literature, and hence a large fraction of the genome remains poorly annotated. However, if used, this information could substantially enhance the functional annotation of a genome, aiding the development of a more comprehensive network. Mining unstructured information buried in volumes of literature often requires manual intervention to a great extent and thus becomes a bottleneck for most automated pipelines. In this review, we discuss the potential of scientific social networking as a solution for systematic manual mining of data. Focusing on Mycobacterium tuberculosis as a case study, we discuss our open innovative approach for the functional annotation of its genome. Furthermore, we highlight the strength of such collated structured data in the context of drug target prediction based on systems-level analysis of the pathogen.
Regional Seismic Arrays and Nuclear Test Ban Verification
1990-12-01
Estimation has been difficult to automate, at least for regional and teleseismic signals; a neural network approach might be applicable here. Of the 95 events examined, 66 were selected for the classification study based on high signal-to-noise ratio.
Code of Federal Regulations, 2012 CFR
2012-04-01
... funds at an automated teller machine, or to obtain a cash advance or loan against the cardholder's... transactions with B exceeds 200 (as provided in paragraph (c)(4) of this section). Example 3. Automated clearinghouse network. A operates an automated clearinghouse (“ACH”) network that merely processes electronic...
Code of Federal Regulations, 2013 CFR
2013-04-01
... funds at an automated teller machine, or to obtain a cash advance or loan against the cardholder's... transactions with B exceeds 200 (as provided in paragraph (c)(4) of this section). Example 3. Automated clearinghouse network. A operates an automated clearinghouse (“ACH”) network that merely processes electronic...
Automated and comprehensive link engineering supporting branched, ring, and mesh network topologies
NASA Astrophysics Data System (ADS)
Farina, J.; Khomchenko, D.; Yevseyenko, D.; Meester, J.; Richter, A.
2016-02-01
Link design, while relatively easy in the past, can become quite cumbersome with complex channel plans and equipment configurations. The task of designing optical transport systems and selecting equipment is often performed by an applications or sales engineer using simple tools, such as custom Excel spreadsheets. Eventually, every individual has their own version of the spreadsheet as well as their own methodology for building the network. This approach becomes unmanageable very quickly and leads to mistakes, bending of the engineering rules and installations that do not perform as expected. We demonstrate a comprehensive planning environment, which offers an efficient approach to unify, control and expedite the design process by controlling libraries of equipment and engineering methodologies, automating the process and providing the analysis tools necessary to predict system performance throughout the system and for all channels. In addition to the placement of EDFAs and DCEs, performance analysis metrics are provided at every step of the way. Metrics that can be tracked include power, CD and OSNR, SPM, XPM, FWM and SBS. Automated routine steps assist in design aspects such as equalization, padding and gain setting for EDFAs, the placement of ROADMs and transceivers, and creating regeneration points. DWDM networks consisting of a large number of nodes and repeater huts, interconnected in linear, branched, mesh and ring network topologies, can be designed much faster when compared with conventional design methods. Using flexible templates for all major optical components, our technology-agnostic planning approach supports the constant advances in optical communications.
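As a worked example of the per-channel bookkeeping such a tool automates, the end-of-link OSNR of a chain of identical amplified spans can be approximated with the standard 58-dB formula; all numeric values below are illustrative, not from any product library.

```python
# Worked OSNR budget using the standard per-span approximation
#   OSNR[dB, 0.1 nm ref.] ~= 58 + P_ch - L_span - NF - 10*log10(N_spans)
# for N identical spans, each fully compensated by an EDFA.
import math

def cascade_osnr(p_ch_dbm, span_loss_db, nf_db, n_spans):
    """Approximate end-of-link OSNR for N identical amplified spans."""
    return 58 + p_ch_dbm - span_loss_db - nf_db - 10 * math.log10(n_spans)

for n in (5, 10, 20):
    osnr = cascade_osnr(p_ch_dbm=0.0, span_loss_db=22.0, nf_db=5.5, n_spans=n)
    print(f"{n:2d} spans: OSNR = {osnr:.1f} dB")
# A planning tool would compare these values against each transceiver's
# required OSNR margin and flag links that need regeneration points.
```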
North Carolina: Statewide Automation and Connectivity Efforts.
ERIC Educational Resources Information Center
Christian, Elaine J., Ed.
1996-01-01
Describes statewide information automation and connectivity efforts in North Carolina. Highlights include Triangle Research Libraries Network Document Delivery System; cooperative networking projects; public library connectivity to the state library; rural access projects; community college automation; K-12 technology plans; electronic government…
Myneni, Sahiti; Cobb, Nathan K; Cohen, Trevor
2016-01-01
Analysis of user interactions in online communities could improve our understanding of health-related behaviors and inform the design of technological solutions that support behavior change. However, to achieve this we would need methods that provide granular perspective, yet are scalable. In this paper, we present a methodology for high-throughput semantic and network analysis of large social media datasets, combining semi-automated text categorization with social network analytics. We apply this method to derive content-specific network visualizations of 16,492 user interactions in an online community for smoking cessation. Performance of the categorization system was reasonable (average F-measure of 0.74, with system-rater reliability approaching rater-rater reliability). The resulting semantically specific network analysis of user interactions reveals content- and behavior-specific network topologies. Implications for socio-behavioral health and wellness platforms are also discussed.
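As a minimal sketch of the network-construction step described above, assuming the semi-automated classifier has already tagged each interaction with a theme (the user names and theme labels below are hypothetical), content-specific sub-networks can be induced per theme:

```python
import networkx as nx

# (sender, receiver, theme) triples as produced by the text classifier;
# names and themes are hypothetical.
interactions = [
    ("u1", "u2", "cravings"),
    ("u2", "u3", "social_support"),
    ("u1", "u3", "cravings"),
    ("u3", "u1", "cravings"),
]

def theme_network(triples, theme):
    """Sub-network induced by one content theme; edge weights count how
    often a sender messaged a receiver on that theme."""
    g = nx.DiGraph()
    for sender, receiver, t in triples:
        if t == theme:
            w = g.get_edge_data(sender, receiver, {"weight": 0})["weight"]
            g.add_edge(sender, receiver, weight=w + 1)
    return g

craving_net = theme_network(interactions, "cravings")
print(nx.in_degree_centrality(craving_net))  # theme-specific opinion leaders
```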
Performance Analysis of Automated Attack Graph Generation Software
2006-12-01
MIT Lincoln Laboratory – NetSPA ... Skybox – Skybox View ... Skybox View is a commercially available tool developed by Skybox Security that can automatically generate...each host. It differs from CAULDRON because it requires that Skybox View probe live networks and must be connected to live networks during its
CHARACTERIZATION OF THE COMPLETE FIBER NETWORK TOPOLOGY OF PLANAR FIBROUS TISSUES AND SCAFFOLDS
D'Amore, Antonio; Stella, John A.; Wagner, William R.; Sacks, Michael S.
2010-01-01
Understanding how engineered tissue scaffold architecture affects cell morphology, metabolism, phenotypic expression, as well as predicting material mechanical behavior have recently received increased attention. In the present study, an image-based analysis approach that provides an automated tool to characterize engineered tissue fiber network topology is presented. Micro-architectural features that fully defined fiber network topology were detected and quantified, which include fiber orientation, connectivity, intersection spatial density, and diameter. Algorithm performance was tested using scanning electron microscopy (SEM) images of electrospun poly(ester urethane)urea (ES-PEUU) scaffolds. SEM images of rabbit mesenchymal stem cell (MSC) seeded collagen gel scaffolds and decellularized rat carotid arteries were also analyzed to further evaluate the ability of the algorithm to capture fiber network morphology regardless of scaffold type and the evaluated size scale. The image analysis procedure was validated qualitatively and quantitatively, comparing fiber network topology manually detected by human operators (n=5) with that automatically detected by the algorithm. Correlation values between manual detected and algorithm detected results for the fiber angle distribution and for the fiber connectivity distribution were 0.86 and 0.93 respectively. Algorithm detected fiber intersections and fiber diameter values were comparable (within the mean ± standard deviation) with those detected by human operators. This automated approach identifies and quantifies fiber network morphology as demonstrated for three relevant scaffold types and provides a means to: (1) guarantee objectivity, (2) significantly reduce analysis time, and (3) potentiate broader analysis of scaffold architecture effects on cell behavior and tissue development both in vitro and in vivo. PMID:20398930
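The abstract's micro-architectural features can be illustrated with a simplified skeleton-based sketch: this is not the authors' algorithm, just the common trick of classifying skeleton pixels by their neighbor counts to find endpoints and fiber intersections (scikit-image is assumed).

```python
import numpy as np
from skimage.morphology import skeletonize

def fiber_topology(mask):
    """Skeletonize a binary fiber mask, then classify skeleton pixels by
    their number of skeleton neighbors: 1 = endpoint, >= 3 = intersection."""
    skel = skeletonize(mask)
    padded = np.pad(skel, 1, constant_values=False)
    intersections, endpoints = [], []
    for y, x in zip(*np.nonzero(skel)):
        n = padded[y:y + 3, x:x + 3].sum() - 1  # 8-neighborhood, minus center
        if n >= 3:
            intersections.append((y, x))
        elif n == 1:
            endpoints.append((y, x))
    return skel, intersections, endpoints

cross = np.zeros((7, 7), dtype=bool)
cross[3, :] = cross[:, 3] = True
_, inters, ends = fiber_topology(cross)
print(inters, ends)  # one central intersection, four endpoints
```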
Myneni, Sahiti; Cobb, Nathan K; Cohen, Trevor
2013-01-01
Unhealthy behaviors increase individual health risks and are a socioeconomic burden. Harnessing social influence is perceived as fundamental for interventions to influence health-related behaviors. However, the mechanisms through which social influence occurs are poorly understood. Online social networks provide the opportunity to understand these mechanisms as they digitally archive communication between members. In this paper, we present a methodology for content-based social network analysis, combining qualitative coding, automated text analysis, and formal network analysis such that network structure is determined by the content of messages exchanged between members. We apply this approach to characterize the communication between members of QuitNet, an online social network for smoking cessation. Results indicate that the method identifies meaningful theme-based social sub-networks. Modeling social network data using this method can provide us with theme-specific insights such as the identities of opinion leaders and sub-community clusters. Implications for design of targeted social interventions are discussed.
Biblio-MetReS: A bibliometric network reconstruction application and server
2011-01-01
Background Reconstruction of genes and/or protein networks from automated analysis of the literature is one of the current targets of text mining in biomedical research. Some user-friendly tools already perform this analysis on precompiled databases of abstracts of scientific papers. Other tools allow expert users to elaborate and analyze the full content of a corpus of scientific documents. However, to our knowledge, no user-friendly tool that simultaneously analyzes the latest set of scientific documents available online and reconstructs the set of genes referenced in those documents is available. Results This article presents such a tool, Biblio-MetReS, and compares its functioning and results to those of other user-friendly applications (iHOP, STRING) that are widely used. Under similar conditions, Biblio-MetReS creates networks that are comparable to those of other user-friendly tools. Furthermore, analysis of full-text documents provides more complete reconstructions than those that result from using only the abstract of the document. Conclusions Literature-based automated network reconstruction is still far from providing complete reconstructions of molecular networks. However, its value as an auxiliary tool is high and it will increase as standards for reporting biological entities and relationships become more widely accepted and enforced. Biblio-MetReS is an application that can be downloaded from http://metres.udl.cat/. It provides an easy-to-use environment for researchers to reconstruct their networks of interest from an always up-to-date set of scientific documents. PMID:21975133
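A toy sketch of the underlying idea, a gene co-occurrence network built from the sets of gene symbols detected in each document; the gene names below are placeholders, and the real tool's text-mining layer is far richer.

```python
from itertools import combinations
import networkx as nx

# Sets of gene symbols detected per document; placeholders only.
docs = [
    {"recA", "lexA", "umuD"},
    {"recA", "ruvA"},
    {"lexA", "recA"},
]

g = nx.Graph()
for genes in docs:
    for a, b in combinations(sorted(genes), 2):
        w = g.get_edge_data(a, b, {"weight": 0})["weight"]
        g.add_edge(a, b, weight=w + 1)   # weight = number of co-mentions

print(list(g.edges(data=True)))
```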
Automated image-based phenotypic analysis in zebrafish embryos
Vogt, Andreas; Cholewinski, Andrzej; Shen, Xiaoqiang; Nelson, Scott; Lazo, John S.; Tsang, Michael; Hukriede, Neil A.
2009-01-01
Presently, the zebrafish is the only vertebrate model compatible with contemporary paradigms of drug discovery. Zebrafish embryos are amenable to the automation necessary for high-throughput chemical screens, and optical transparency makes them potentially suited for image-based screening. However, the lack of tools for automated analysis of complex images presents an obstacle to utilizing the zebrafish as a high-throughput screening model. We have developed an automated system for imaging and analyzing zebrafish embryos in multi-well plates regardless of embryo orientation and without user intervention. Images of fluorescent embryos were acquired on a high-content reader and analyzed using an artificial intelligence-based image analysis method termed Cognition Network Technology (CNT). CNT reliably detected transgenic fluorescent embryos (Tg(fli1:EGFP)y1) arrayed in 96-well plates and quantified intersegmental blood vessel development in embryos treated with small molecule inhibitors of angiogenesis. The results demonstrate it is feasible to adapt image-based high-content screening methodology to measure complex whole organism phenotypes. PMID:19235725
Automated Analysis of Fluorescence Microscopy Images to Identify Protein-Protein Interactions
Venkatraman, S.; Doktycz, M. J.; Qi, H.; ...
2006-01-01
The identification of protein interactions is important for elucidating biological networks. One obstacle in comprehensive interaction studies is the analyses of large datasets, particularly those containing images. Development of an automated system to analyze an image-based protein interaction dataset is needed. Such an analysis system is described here, to automatically extract features from fluorescence microscopy images obtained from a bacterial protein interaction assay. These features are used to relay quantitative values that aid in the automated scoring of positive interactions. Experimental observations indicate that identifying at least 50% positive cells in an image is sufficient to detect a protein interaction. Based on this criterion, the automated system presents 100% accuracy in detecting positive interactions for a dataset of 16 images. Algorithms were implemented using MATLAB and the software developed is available on request from the authors.
DOE Program on Seismic Characterization for Regions of Interest to CTBT Monitoring,
1995-08-14
processing of the monitoring network data). While developing and testing the corrections and other parameters needed by the automated processing systems...the secondary network. Parameters tabulated in the knowledge base must be appropriate for routine automated processing of network data, and must also...operation of the PNDC, as well as to results of investigations of "special events" (i.e., those events that fail to locate or discriminate during automated
Spanish Museum Libraries Network.
ERIC Educational Resources Information Center
Lopez de Prado, Rosario
This paper describes the creation of an automated network of museum libraries in Spain. The only way in which the world's specialized libraries today can remain active and continue to offer valid information is to automate the services they offer and to create library networks with cooperative plans. The network can be configured with different…
NASA Astrophysics Data System (ADS)
Kapulin, D. V.; Chemidov, I. V.; Kazantsev, M. A.
2017-01-01
In this paper, aspects of the design, development and implementation of an automated warehousing control system for the manufacturing process at the radio-electronic enterprise JSC «Radiosvyaz» are discussed. The architecture proposed in the paper consists of a server connected to two physically separated information networks: one with a database server, which stores information about picking orders, and one with the automated storage and retrieval system. This principle satisfies the requirements for differentiation of access and for information safety and security. The efficiency of the developed automated solutions in terms of optimizing the warehouse's logistic characteristics is also investigated.
The Australian Bibliographic Network--An Introduction.
ERIC Educational Resources Information Center
Hannan, Chris
1982-01-01
Focuses on the centralized, online, shared-cataloging facility of the Australian Bibliographic Network operated by the National Library of Australia. Existing automated library systems in Australia, selection and implementation of Washington Library Network software, comparison of Australian and U.S. automation, and projections for future…
2014-01-01
Background Network-based learning algorithms for automated function prediction (AFP) are negatively affected by the limited coverage of experimental data and limited a priori known functional annotations. As a consequence, their application to model organisms is often restricted to well characterized biological processes and pathways, and their effectiveness with poorly annotated species is relatively limited. A possible solution to this problem might consist in the construction of big networks including multiple species, but this in turn poses challenging computational problems, due to the scalability limitations of existing algorithms and the main memory requirements induced by the construction of big networks. Distributed computation or the usage of big computers could in principle respond to these issues, but raises further algorithmic problems and requires resources not satisfiable with simple off-the-shelf computers. Results We propose a novel framework for scalable network-based learning of multi-species protein functions based on both a local implementation of existing algorithms and the adoption of innovative technologies: we solve the AFP problem “locally”, by designing “vertex-centric” implementations of network-based algorithms, but we do not give up thinking “globally” by exploiting the overall topology of the network. This is made possible by the adoption of secondary memory-based technologies that allow the efficient use of the large memory available on disks, thus overcoming the main memory limitations of modern off-the-shelf computers. This approach has been applied to the analysis of a large multi-species network including more than 300 species of bacteria and to a network with more than 200,000 proteins belonging to 13 Eukaryotic species. To our knowledge this is the first work where secondary-memory based network analysis has been applied to multi-species function prediction using biological networks with hundreds of thousands of proteins. Conclusions The combination of these algorithmic and technological approaches makes feasible the analysis of large multi-species networks using ordinary computers with limited speed and primary memory, and in perspective could enable the analysis of huge networks (e.g. the whole proteomes available in SwissProt), using well-equipped stand-alone machines. PMID:24843788
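A small sketch of the "vertex-centric" flavor of computation the authors describe, here as a plain in-memory label-propagation loop in which each vertex updates from only its neighbors; the secondary-memory machinery that makes this scale is the paper's contribution and is not shown.

```python
import networkx as nx

def propagate_scores(g, seeds, n_iters=20):
    """Vertex-centric score propagation for one functional label: each
    unannotated vertex repeatedly averages its neighbors' scores, touching
    only local state (the property that makes vertex-centric schemes
    amenable to out-of-core execution, one vertex at a time)."""
    score = {v: seeds.get(v, 0.0) for v in g}
    for _ in range(n_iters):
        new = {}
        for v in g:
            if v in seeds:                    # annotated proteins stay clamped
                new[v] = seeds[v]
            else:
                nbrs = list(g.neighbors(v))
                new[v] = sum(score[u] for u in nbrs) / len(nbrs) if nbrs else 0.0
        score = new
    return score

g = nx.path_graph(5)                          # toy protein network
print(propagate_scores(g, seeds={0: 1.0, 4: 0.0}))
```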
The role of human-automation consensus in multiple unmanned vehicle scheduling.
Cummings, M L; Clare, Andrew; Hart, Christin
2010-02-01
This study examined the impact of increasing automation replanning rates on operator performance and workload when supervising a decentralized network of heterogeneous unmanned vehicles. Futuristic unmanned vehicle systems will invert the operator-to-vehicle ratio so that one operator can control multiple dissimilar vehicles connected through a decentralized network. Significant human-automation collaboration will be needed because of automation brittleness, but such collaboration could cause high workload. Three increasing levels of replanning were tested on an existing multiple unmanned vehicle simulation environment that leverages decentralized algorithms for vehicle routing and task allocation in conjunction with human supervision. Rapid replanning can cause high operator workload, ultimately resulting in poorer overall system performance. Poor performance was associated with a lack of operator consensus for when to accept the automation's suggested prompts for new plan consideration as well as negative attitudes toward unmanned aerial vehicles in general. Participants with video game experience tended to collaborate more with the automation, which resulted in better performance. In decentralized unmanned vehicle networks, operators who ignore the automation's requests for new plan consideration and impose rapid replans both increase their own workload and reduce the ability of the vehicle network to operate at its maximum capacity. These findings have implications for personnel selection and training for futuristic systems involving human collaboration with decentralized algorithms embedded in networks of autonomous systems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wantuck, P. J.; Hollen, R. M.
2002-01-01
This paper provides an overview of some design and automation-related projects ongoing within the Applied Engineering Technologies (AET) Group at Los Alamos National Laboratory. AET uses a diverse set of technical capabilities to develop and apply processes and technologies to applications for a variety of customers both internal and external to the Laboratory. The Advanced Recovery and Integrated Extraction System (ARIES) represents a new paradigm for the processing of nuclear material from retired weapon systems in an environment that seeks to minimize the radiation dose to workers. To achieve this goal, ARIES relies upon automation-based features to handle and process the nuclear material. Our Chemical Process Development Team specializes in fuzzy logic and intelligent control systems. Neural network technology has been utilized in some advanced control systems developed by team members. Genetic algorithms and neural networks have often been applied for data analysis. Enterprise modeling, or discrete event simulation, as well as chemical process simulation has been employed for chemical process plant design. Fuel cell research and development has historically been an active effort within the AET organization. Under the principal sponsorship of the Department of Energy, the Fuel Cell Team is now focusing on technologies required to produce fuel cell compatible feed gas from reformation of a variety of conventional fuels (e.g., gasoline, natural gas), principally for automotive applications. This effort involves chemical reactor design and analysis, process modeling, catalyst analysis, as well as full scale system characterization and testing. The group's Automation and Robotics team has at its foundation many years of experience delivering automated and robotic systems for nuclear, analytical chemistry, and bioengineering applications. As an integrator of commercial systems and a developer of unique custom-made systems, the team currently supports the automation needs of many Laboratory programs.
Automated distress surveys : analysis of network-level data.
DOT National Transportation Integrated Search
2017-04-01
TxDOT Project 0-6663. Phase 1 (Rutting): Applus, Dynatest, Fugro, Pathway and TxDOT; reference: detailed project level (24 550-ft sections). Phase 2 (Distresses): Dynatest, Fugro, WayLink-OSU and TxDOT; reference: detailed proj...
Neural network approach in multichannel auditory event-related potential analysis.
Wu, F Y; Slater, J D; Ramsay, R E
1994-04-01
Even though there are presently no clearly defined criteria for the assessment of P300 event-related potential (ERP) abnormality, it is strongly indicated through statistical analysis that such criteria exist for classifying control subjects and patients with diseases resulting in neuropsychological impairment such as multiple sclerosis (MS). We have demonstrated the feasibility of artificial neural network (ANN) methods in classifying ERP waveforms measured at a single channel (Cz) from control subjects and MS patients. In this paper, we report the results of multichannel ERP analysis and a modified network analysis methodology to enhance automation of the classification rule extraction process. The proposed methodology significantly reduces the work of statistical analysis. It also helps to standardize the criteria of P300 ERP assessment and facilitate the computer-aided analysis on neuropsychological functions.
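For readers unfamiliar with the setup, a modern stand-in for this kind of classifier, an MLP cross-validated on per-subject multichannel ERP feature vectors, might look like the following; the data here are synthetic placeholders, not ERP measurements, and the architecture is not the paper's.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier

# Synthetic placeholder data: one row per subject, columns standing in for
# multichannel P300 features (e.g. amplitude/latency per electrode);
# labels: 0 = control, 1 = MS patient.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 32))
y = rng.integers(0, 2, size=60)

clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
print(cross_val_score(clf, X, y, cv=5).mean())  # chance-level on random data
```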
Quantifying autonomous vehicles national fuel consumption impacts: A data-rich approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Yuche; Gonder, Jeffrey; Young, Stanley
Autonomous vehicles are drawing significant attention from governments, manufacturers and consumers. Experts predict them to be the primary means of transportation by the middle of this century. Recent literature shows that vehicle automation has the potential to alter traffic patterns, vehicle ownership, and land use, which may affect fuel consumption from the transportation sector. In this paper, we developed a data-rich analytical framework to quantify system-wide fuel impacts of automation in the United States by integrating (1) a dynamic vehicle sales, stock, and usage model, (2) an historical transportation network-level vehicle miles traveled (VMT)/vehicle activity database, and (3) estimates of automation's impacts on fuel efficiency and travel demand. The vehicle model considers dynamics in vehicle fleet turnover and fuel efficiency improvements of conventional and advanced vehicle fleet. The network activity database contains VMT, free-flow speeds, and historical speeds of road links that can help us accurately identify fuel-savings opportunities of automation. Based on the model setup and assumptions, we found that the impacts of automation on fuel consumption are quite wide-ranging - with the potential to reduce fuel consumption by 45% in our 'Optimistic' case or increase it by 30% in our 'Pessimistic' case. Second, implementing automation on urban roads could potentially result in larger fuel savings compared with highway automation because of the driving features of urban roads. Lastly, through scenario analysis, we showed that the proposed framework can be used for refined assessments as better data on vehicle-level fuel efficiency and travel demand impacts of automation become available.
Quantifying autonomous vehicles national fuel consumption impacts: A data-rich approach
Chen, Yuche; Gonder, Jeffrey; Young, Stanley; ...
2017-11-06
Autonomous vehicles are drawing significant attention from governments, manufacturers and consumers. Experts predict them to be the primary means of transportation by the middle of this century. Recent literature shows that vehicle automation has the potential to alter traffic patterns, vehicle ownership, and land use, which may affect fuel consumption from the transportation sector. In this paper, we developed a data-rich analytical framework to quantify system-wide fuel impacts of automation in the United States by integrating (1) a dynamic vehicle sales, stock, and usage model, (2) an historical transportation network-level vehicle miles traveled (VMT)/vehicle activity database, and (3) estimates of automation's impacts on fuel efficiency and travel demand. The vehicle model considers dynamics in vehicle fleet turnover and fuel efficiency improvements of conventional and advanced vehicle fleet. The network activity database contains VMT, free-flow speeds, and historical speeds of road links that can help us accurately identify fuel-savings opportunities of automation. Based on the model setup and assumptions, we found that the impacts of automation on fuel consumption are quite wide-ranging - with the potential to reduce fuel consumption by 45% in our 'Optimistic' case or increase it by 30% in our 'Pessimistic' case. Second, implementing automation on urban roads could potentially result in larger fuel savings compared with highway automation because of the driving features of urban roads. Lastly, through scenario analysis, we showed that the proposed framework can be used for refined assessments as better data on vehicle-level fuel efficiency and travel demand impacts of automation become available.
32 CFR 2001.50 - Telecommunications automated information systems and network security.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 32 National Defense 6 2011-07-01 2011-07-01 false Telecommunications automated information systems and network security. 2001.50 Section 2001.50 National Defense Other Regulations Relating to National... network security. Each agency head shall ensure that classified information electronically accessed...
32 CFR 2001.50 - Telecommunications automated information systems and network security.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 32 National Defense 6 2010-07-01 2010-07-01 false Telecommunications automated information systems and network security. 2001.50 Section 2001.50 National Defense Other Regulations Relating to National... network security. Each agency head shall ensure that classified information electronically accessed...
Connecting Land-Based Networks to Ships
2013-06-01
multipoint wireless broadband systems, and WiMAX networks were initially deployed for fixed and nomadic (portable) applications. These standards...CAPABILITIES OF SHIP-TO-SHORE COMMUNICATIONS A. US Navy Automated Digital Network System (ADNS) The U.S. Navy’s Automated Digital Network System (ADNS...submit digitally any necessary documents to the terminal operators, contact their logistics providers, access tidal information and receive
Network Meta-Analysis Using R: A Review of Currently Available Automated Packages
Neupane, Binod; Richer, Danielle; Bonner, Ashley Joel; Kibret, Taddele; Beyene, Joseph
2014-01-01
Network meta-analysis (NMA) – a statistical technique that allows comparison of multiple treatments in the same meta-analysis simultaneously – has become increasingly popular in the medical literature in recent years. The statistical methodology underpinning this technique and software tools for implementing the methods are evolving. Both commercial and freely available statistical software packages have been developed to facilitate the statistical computations using NMA with varying degrees of functionality and ease of use. This paper aims to introduce the reader to three R packages, namely, gemtc, pcnetmeta, and netmeta, which are freely available software tools implemented in R. Each automates the process of performing NMA so that users can perform the analysis with minimal computational effort. We present, compare and contrast the availability and functionality of different important features of NMA in these three packages so that clinical investigators and researchers can determine which R packages to implement depending on their analysis needs. Four summary tables detailing (i) data input and network plotting, (ii) modeling options, (iii) assumption checking and diagnostic testing, and (iv) inference and reporting tools, are provided, along with an analysis of a previously published dataset to illustrate the outputs available from each package. We demonstrate that each of the three packages provides a useful set of tools, and combined provide users with nearly all functionality that might be desired when conducting a NMA. PMID:25541687
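The core identity underlying any NMA, regardless of package, is the adjusted indirect comparison: if treatments A and C have each been compared with a common comparator B, their relative effect can be estimated without a head-to-head trial. A minimal sketch follows (the packages above implement far more general frequentist and Bayesian models; the effect sizes below are illustrative).

```python
import math

def indirect_comparison(d_ab, se_ab, d_cb, se_cb):
    """Bucher adjusted indirect comparison: effect of A vs C estimated
    through the common comparator B, d_AC = d_AB - d_CB, with variances
    adding. Returns the point estimate and a 95% confidence interval."""
    d_ac = d_ab - d_cb
    se_ac = math.sqrt(se_ab ** 2 + se_cb ** 2)
    return d_ac, (d_ac - 1.96 * se_ac, d_ac + 1.96 * se_ac)

# Illustrative log-odds ratios: A vs B = -0.50 (SE 0.15), C vs B = -0.20 (SE 0.12).
print(indirect_comparison(-0.50, 0.15, -0.20, 0.12))
```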
Network meta-analysis using R: a review of currently available automated packages.
Neupane, Binod; Richer, Danielle; Bonner, Ashley Joel; Kibret, Taddele; Beyene, Joseph
2014-01-01
Network meta-analysis (NMA)--a statistical technique that allows comparison of multiple treatments in the same meta-analysis simultaneously--has become increasingly popular in the medical literature in recent years. The statistical methodology underpinning this technique and software tools for implementing the methods are evolving. Both commercial and freely available statistical software packages have been developed to facilitate the statistical computations using NMA with varying degrees of functionality and ease of use. This paper aims to introduce the reader to three R packages, namely, gemtc, pcnetmeta, and netmeta, which are freely available software tools implemented in R. Each automates the process of performing NMA so that users can perform the analysis with minimal computational effort. We present, compare and contrast the availability and functionality of different important features of NMA in these three packages so that clinical investigators and researchers can determine which R packages to implement depending on their analysis needs. Four summary tables detailing (i) data input and network plotting, (ii) modeling options, (iii) assumption checking and diagnostic testing, and (iv) inference and reporting tools, are provided, along with an analysis of a previously published dataset to illustrate the outputs available from each package. We demonstrate that each of the three packages provides a useful set of tools, and combined provide users with nearly all functionality that might be desired when conducting a NMA.
Analysis of Delays in Transmitting Time Code Using an Automated Computer Time Distribution System
1999-12-01
jlevine@clock.bldrdoc.gov Abstract An automated computer time distribution system broadcasts standard time to users using computers and modems via...contributed to delays - software platform (50% of the delay), transmission speed of time-codes (25%), telephone network (15%), modem and others (10%). The... modems, and telephone lines. Users dial the ACTS server to receive time traceable to the national time scale of Singapore, UTC(PSB). The users can in
Integrated workflows for spiking neuronal network simulations
Antolík, Ján; Davison, Andrew P.
2013-01-01
The increasing availability of computational resources is enabling more detailed, realistic modeling in computational neuroscience, resulting in a shift toward more heterogeneous models of neuronal circuits, and employment of complex experimental protocols. This poses a challenge for existing tool chains, as the set of tools involved in a typical modeler's workflow is expanding concomitantly, with growing complexity in the metadata flowing between them. For many parts of the workflow, a range of tools is available; however, numerous areas lack dedicated tools, while integration of existing tools is limited. This forces modelers to either handle the workflow manually, leading to errors, or to write substantial amounts of code to automate parts of the workflow, in both cases reducing their productivity. To address these issues, we have developed Mozaik: a workflow system for spiking neuronal network simulations written in Python. Mozaik integrates model, experiment and stimulation specification, simulation execution, data storage, data analysis and visualization into a single automated workflow, ensuring that all relevant metadata are available to all workflow components. It is based on several existing tools, including PyNN, Neo, and Matplotlib. It offers a declarative way to specify models and recording configurations using hierarchically organized configuration files. Mozaik automatically records all data together with all relevant metadata about the experimental context, allowing automation of the analysis and visualization stages. Mozaik has a modular architecture, and the existing modules are designed to be extensible with minimal programming effort. Mozaik increases the productivity of running virtual experiments on highly structured neuronal networks by automating the entire experimental cycle, while increasing the reliability of modeling studies by relieving the user from manual handling of the flow of metadata between the individual workflow stages. PMID:24368902
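A schematic, hypothetical illustration of the declarative pattern described here, a single configuration driving simulation, storage and analysis so that metadata stays attached to results; this is not Mozaik's actual API, and the stub functions stand in for real simulator (e.g. PyNN) and analysis calls.

```python
import json

def simulate(model_cfg, experiment_cfg):
    # Stand-in for the real simulator call (e.g. a PyNN network run).
    return {"n_neurons": model_cfg["n_neurons"],
            "duration_ms": experiment_cfg["duration_ms"]}

def analyze(step, record):
    print(f"{step}: ran on {record['config']['experiment']['stimulus']}")

# Hierarchical, declarative configuration; all keys are hypothetical.
config = json.loads("""
{
  "model":      {"n_neurons": 100, "synaptic_weight": 0.002},
  "experiment": {"stimulus": "drifting_grating", "duration_ms": 1000.0},
  "analysis":   ["firing_rates", "raster_plot"]
}
""")

def run_workflow(cfg):
    # The config travels with the results, so every downstream analysis
    # step sees the full experimental context (the metadata-flow idea).
    record = {"config": cfg, "results": simulate(cfg["model"], cfg["experiment"])}
    for step in cfg["analysis"]:
        analyze(step, record)
    return record

run_workflow(config)
```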
Integrated workflows for spiking neuronal network simulations.
Antolík, Ján; Davison, Andrew P
2013-01-01
The increasing availability of computational resources is enabling more detailed, realistic modeling in computational neuroscience, resulting in a shift toward more heterogeneous models of neuronal circuits, and employment of complex experimental protocols. This poses a challenge for existing tool chains, as the set of tools involved in a typical modeler's workflow is expanding concomitantly, with growing complexity in the metadata flowing between them. For many parts of the workflow, a range of tools is available; however, numerous areas lack dedicated tools, while integration of existing tools is limited. This forces modelers to either handle the workflow manually, leading to errors, or to write substantial amounts of code to automate parts of the workflow, in both cases reducing their productivity. To address these issues, we have developed Mozaik: a workflow system for spiking neuronal network simulations written in Python. Mozaik integrates model, experiment and stimulation specification, simulation execution, data storage, data analysis and visualization into a single automated workflow, ensuring that all relevant metadata are available to all workflow components. It is based on several existing tools, including PyNN, Neo, and Matplotlib. It offers a declarative way to specify models and recording configurations using hierarchically organized configuration files. Mozaik automatically records all data together with all relevant metadata about the experimental context, allowing automation of the analysis and visualization stages. Mozaik has a modular architecture, and the existing modules are designed to be extensible with minimal programming effort. Mozaik increases the productivity of running virtual experiments on highly structured neuronal networks by automating the entire experimental cycle, while increasing the reliability of modeling studies by relieving the user from manual handling of the flow of metadata between the individual workflow stages.
NASA Astrophysics Data System (ADS)
Galvagno, Marta; Gamon, John; Cremonese, Edoardo; Garrity, Steven; Huemmrich, K. Fred; Filippa, Gianluca; Morra di Cella, Umberto; Rossini, Micol
2017-04-01
Automated canopy-level optical sampling in tandem with ecosystem-atmosphere flux observations is carried out continuously at a variety of ecosystems through the Specnet network (http://specnet.info/). Specifically, 9 sites within the US and Europe have been selected since 2015 to investigate the use of novel low-cost NDVI and PRI sensors for the analysis of ecosystem functioning and phenology. Different plant functional types, such as grasslands, deciduous forests, and evergreen forests, belong to the network; here we present specific data from the Italian larch (Larix decidua Mill.) forest site. Three automated NDVI and three automated PRI spectral reflectance sensors (Decagon Devices Inc.) were installed in 2015 on top of the 20-meter eddy covariance tower, pointing toward the west, north, and east orientations. An additional system, composed of one NDVI and one PRI sensor, was installed to monitor the understory component. The objective of this analysis is the comparison between these inexpensive in-situ sensors, independent NDVI and PRI sensors (Skye Instruments) previously installed on the 20-meter tower, and satellite-derived NDVI. Both MODIS and Sentinel NDVI data were used for the comparison. Moreover, the newly derived chlorophyll/carotenoid index (CCI, Gamon et al. 2016), computed as the normalized difference between the NDVI red band and the PRI 532 nm band, was tested to estimate the seasonal pattern of daily Gross Primary Productivity (GPP) of the larch forest. Results showed that the seasonality of NDVI was comparable among in-situ sensors and satellite data, though orientation-specific differences were observed. Both NDVI and CCI tracked daily GPP, but with different sensitivity to its seasonality. Future analysis will be directed toward a comparison between these site-based results and the other sites within the Specnet network.
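The three indices named above are all normalized differences of band reflectances; under the band choices stated in the abstract (the reflectance values below are illustrative, not site data), they can be computed as:

```python
def norm_diff(a, b):
    """Normalized difference index."""
    return (a - b) / (a + b)

# Illustrative canopy reflectances (not site data).
r_nir, r_red, r_532, r_570 = 0.45, 0.06, 0.08, 0.09

ndvi = norm_diff(r_nir, r_red)   # (NIR - red) / (NIR + red)
pri = norm_diff(r_532, r_570)    # physiological reflectance index (~531/570 nm)
cci = norm_diff(r_532, r_red)    # CCI per the abstract: PRI 532 nm band vs NDVI red band
print(round(ndvi, 3), round(pri, 3), round(cci, 3))
```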
1982-02-23
segregate the computer and storage from the outside world 2. Administrative security to control access to secure computer facilities 3. Network security to... [Figure A-2. Dedicated Switching Architecture] ...communications protocol with the network and GENSER message transmission to the I-S/A AMPE processor. 7. DSSCS TPU - Handles communications protocol with
Hardware Realization of an Ethernet Packet Analyzer Search Engine
2000-06-30
specific to the home automation industry. This analyzer will sit at the gateway of a network and analyze Ethernet packets as they go by. It will keep... home automation and not the computer network. This system is a stand-alone real-time network analyzer capable of decoding Ethernet protocols. The
Pastore, Vito Paolo; Godjoski, Aleksandar; Martinoia, Sergio; Massobrio, Paolo
2018-01-01
We implemented an automated and efficient open-source software for the analysis of multi-site neuronal spike signals. The software package, named SPICODYN, has been developed as a standalone Windows GUI application, using the C# programming language with Microsoft Visual Studio based on the .NET Framework 4.5 development environment. Accepted input data formats are HDF5, level 5 MAT and text files, containing recorded or generated time series spike signals data. SPICODYN processes such electrophysiological signals focusing on spiking and bursting dynamics and functional-effective connectivity analysis. In particular, for inferring network connectivity, a new implementation of the transfer entropy method is presented, dealing with multiple time delays (temporal extension) and with multiple binary patterns (high order extension). SPICODYN is specifically tailored to process data coming from different Multi-Electrode Array setups, guaranteeing, in those specific cases, automated processing. The optimized implementation of the Delayed Transfer Entropy and the High-Order Transfer Entropy algorithms allows performing accurate and rapid analysis on multiple spike trains from thousands of electrodes.
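A compact reference implementation of pairwise transfer entropy for binary spike trains, with the single time delay parameter that the abstract's "temporal extension" generalizes; this sketch omits SPICODYN's high-order (multi-bin pattern) extension and its optimizations.

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y, delay=1):
    """TE (bits) from binary train x to y at one delay:
    sum over states of p(y_t, y_{t-1}, x_{t-delay}) *
    log2[ p(y_t | y_{t-1}, x_{t-delay}) / p(y_t | y_{t-1}) ]."""
    triples, pairs, sources, past = Counter(), Counter(), Counter(), Counter()
    for t in range(delay, len(y)):
        yt, yp, xp = y[t], y[t - 1], x[t - delay]
        triples[(yt, yp, xp)] += 1
        sources[(yp, xp)] += 1
        pairs[(yt, yp)] += 1
        past[yp] += 1
    total = sum(triples.values())
    te = 0.0
    for (yt, yp, xp), c in triples.items():
        p_joint = c / total
        p_cond_full = c / sources[(yp, xp)]
        p_cond_past = pairs[(yt, yp)] / past[yp]
        te += p_joint * np.log2(p_cond_full / p_cond_past)
    return te

rng = np.random.default_rng(0)
x = rng.integers(0, 2, 5000)
y = np.roll(x, 1)                       # y is x delayed by one bin
print(transfer_entropy(x, y, delay=1))  # close to 1 bit
```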
Optimization-based method for automated road network extraction
DOT National Transportation Integrated Search
2001-09-18
Automated road information extraction has significant applicability in transportation. It provides a means for creating, maintaining, and updating transportation network databases that are needed for purposes ranging from traffic management to au...
Image segmentation and dynamic lineage analysis in single-cell fluorescence microscopy.
Wang, Quanli; Niemi, Jarad; Tan, Chee-Meng; You, Lingchong; West, Mike
2010-01-01
An increasingly common component of studies in synthetic and systems biology is analysis of dynamics of gene expression at the single-cell level, a context that is heavily dependent on the use of time-lapse movies. Extracting quantitative data on the single-cell temporal dynamics from such movies remains a major challenge. Here, we describe novel methods for automating key steps in the analysis of single-cell, fluorescent images-segmentation and lineage reconstruction-to recognize and track individual cells over time. The automated analysis iteratively combines a set of extended morphological methods for segmentation, and uses a neighborhood-based scoring method for frame-to-frame lineage linking. Our studies with bacteria, budding yeast and human cells, demonstrate the portability and usability of these methods, whether using phase, bright field or fluorescent images. These examples also demonstrate the utility of our integrated approach in facilitating analyses of engineered and natural cellular networks in diverse settings. The automated methods are implemented in freely available, open-source software.
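A stripped-down sketch of the frame-to-frame linking step, here using a greedy nearest-centroid score rather than the paper's richer neighborhood-based scoring; the centroid coordinates are illustrative.

```python
import numpy as np

def link_frames(prev_pts, next_pts, max_dist=20.0):
    """Greedy frame-to-frame linking: score every candidate pair by centroid
    distance and accept the lowest-scoring matches first."""
    scored = sorted(
        (np.hypot(*(p - q)), i, j)
        for i, p in enumerate(prev_pts)
        for j, q in enumerate(next_pts)
    )
    links, used_prev, used_next = {}, set(), set()
    for d, i, j in scored:
        if d <= max_dist and i not in used_prev and j not in used_next:
            links[i] = j
            used_prev.add(i)
            used_next.add(j)
    return links  # unmatched cells are candidates for division, birth or loss

prev_pts = np.array([[10.0, 10.0], [40.0, 12.0]])
next_pts = np.array([[12.0, 11.0], [41.0, 15.0], [80.0, 30.0]])
print(link_frames(prev_pts, next_pts))  # {0: 0, 1: 1}
```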
Fuzzy Emotional Semantic Analysis and Automated Annotation of Scene Images
Cao, Jianfang; Chen, Lichao
2015-01-01
With the advances in electronic and imaging techniques, the production of digital images has rapidly increased, and the extraction and automated annotation of emotional semantics implied by images have become issues that must be urgently addressed. To better simulate human subjectivity and ambiguity for understanding scene images, the current study proposes an emotional semantic annotation method for scene images based on fuzzy set theory. A fuzzy membership degree was calculated to describe the emotional degree of a scene image and was implemented using the Adaboost algorithm and a back-propagation (BP) neural network. The automated annotation method was trained and tested using scene images from the SUN Database. The annotation results were then compared with those based on artificial annotation. Our method showed an annotation accuracy rate of 91.2% for basic emotional values and 82.4% after extended emotional values were added, which correspond to increases of 5.5% and 8.9%, respectively, compared with the results from using a single BP neural network algorithm. Furthermore, the retrieval accuracy rate based on our method reached approximately 89%. This study attempts to lay a solid foundation for the automated emotional semantic annotation of more types of images and therefore is of practical significance. PMID:25838818
2012-08-01
disruptions in the resting state networks and neurocognitive pathologies such as schizophrenia, Alzheimer’s disease and attention deficit hyperactive ... deficits associated with TBI [8,9,10]. However, there is a growing body of evidence suggesting that TBI could induce thalamic injury, classically...Inventory (NBSI), Post Traumatic Stress Disorder (PCL-C) and the Automated Neuropsychological Assessment Metrics (ANAM). C. Data Acquisition and
Method of evaluating, expanding, and collapsing connectivity regions within dynamic systems
Bailey, David A [Schenectady, NY
2004-11-16
An automated process defines and maintains connectivity regions within a dynamic network. The automated process requires an initial input of a network component around which a connectivity region will be defined. The process automatically and autonomously generates a region around the initial input, stores the region's definition, and monitors the network for a change. Upon detecting a change in the network, the effect is evaluated, and if necessary the regions are adjusted and redefined to accommodate the change. Only those regions of the network affected by the change will be updated. This process eliminates the need for an operator to manually evaluate connectivity regions within a network. Since the automated process maintains the network, the reliance on an operator is minimized; thus, reducing the potential for operator error. This combination of region maintenance and reduced operator reliance, results in a reduction of overall error.
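A minimal sketch of the claimed behavior using connected components as the "connectivity regions" (node names are hypothetical): define a region around a seed component, then, on a network change, re-evaluate only the regions the change touches.

```python
import networkx as nx

def region_around(g, seed):
    """A connectivity region here is simply the connected component
    containing the seed node."""
    return nx.node_connected_component(g, seed)

g = nx.Graph([("bus1", "bus2"), ("bus2", "bus3"), ("bus4", "bus5")])
regions = {"R1": region_around(g, "bus1"), "R2": region_around(g, "bus4")}

# Change event: a tie between bus3 and bus4 closes, merging two regions.
g.add_edge("bus3", "bus4")
changed = {"bus3", "bus4"}
for name, nodes in regions.items():
    if nodes & changed:                      # only affected regions are redefined
        regions[name] = region_around(g, next(iter(nodes)))
print(regions)
```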
NASA Technical Reports Server (NTRS)
Decker, Arthur J.; Krasowski, Michael J.
1991-01-01
The goal is to develop an approach to automating the alignment and adjustment of optical measurement, visualization, inspection, and control systems. Classical controls, expert systems, and neural networks are three approaches to automating the alignment of an optical system. Neural networks were chosen for this project and the judgements that led to this decision are presented. Neural networks were used to automate the alignment of the ubiquitous laser-beam-smoothing spatial filter. The results and future plans of the project are presented.
A holistic approach to ZigBee performance enhancement for home automation networks.
Betzler, August; Gomez, Carles; Demirkol, Ilker; Paradells, Josep
2014-08-14
Wireless home automation networks are gaining importance for smart homes. In this ambit, ZigBee networks play an important role. The ZigBee specification defines a default set of protocol stack parameters and mechanisms that is further refined by the ZigBee Home Automation application profile. In a holistic approach, we analyze how the network performance is affected with the tuning of parameters and mechanisms across multiple layers of the ZigBee protocol stack and investigate possible performance gains by implementing and testing alternative settings. The evaluations are carried out in a testbed of 57 TelosB motes. The results show that considerable performance improvements can be achieved by using alternative protocol stack configurations. From these results, we derive two improved protocol stack configurations for ZigBee wireless home automation networks that are validated in various network scenarios. In our experiments, these improved configurations yield a relative packet delivery ratio increase of up to 33.6%, a delay decrease of up to 66.6% and an improvement of the energy efficiency for battery powered devices of up to 48.7%, obtainable without incurring any overhead to the network.
A Holistic Approach to ZigBee Performance Enhancement for Home Automation Networks
Betzler, August; Gomez, Carles; Demirkol, Ilker; Paradells, Josep
2014-01-01
Wireless home automation networks are gaining importance for smart homes. In this ambit, ZigBee networks play an important role. The ZigBee specification defines a default set of protocol stack parameters and mechanisms that is further refined by the ZigBee Home Automation application profile. In a holistic approach, we analyze how the network performance is affected with the tuning of parameters and mechanisms across multiple layers of the ZigBee protocol stack and investigate possible performance gains by implementing and testing alternative settings. The evaluations are carried out in a testbed of 57 TelosB motes. The results show that considerable performance improvements can be achieved by using alternative protocol stack configurations. From these results, we derive two improved protocol stack configurations for ZigBee wireless home automation networks that are validated in various network scenarios. In our experiments, these improved configurations yield a relative packet delivery ratio increase of up to 33.6%, a delay decrease of up to 66.6% and an improvement of the energy efficiency for battery powered devices of up to 48.7%, obtainable without incurring any overhead to the network. PMID:25196004
Dynamic Communication Resource Negotiations
NASA Technical Reports Server (NTRS)
Chow, Edward; Vatan, Farrokh; Paloulian, George; Frisbie, Steve; Srostlik, Zuzana; Kalomiris, Vasilios; Apgar, Daniel
2012-01-01
Today's advanced network management systems can automate many aspects of tactical networking operations within a military domain. However, automation of joint and coalition tactical networking across multiple domains remains challenging. Due to potentially conflicting goals and priorities, human agreement is often required before changes are implemented in network operations. This is further complicated by incompatible network management systems and security policies, rendering it difficult to implement automatic network management and thus requiring manual human intervention in the communication protocols used at various network routers and endpoints. This process of manual human intervention is tedious, error-prone, and slow. In order to facilitate a better solution, we are pursuing a technology which makes network management automated, reliable, and fast. Automating the negotiation of the common network communication parameters between different parties is the subject of this paper. We present the technology that enables inter-force dynamic communication resource negotiations, allowing ad-hoc inter-operation in the field between force domains without pre-planning. It will also enable a dynamic response to changing conditions within the area of operations. Our solution enables the rapid blending of intra-domain policies so that the forces involved are able to inter-operate effectively without overwhelming each other's networks with inappropriate or unwarranted traffic. It will evaluate the policy rules and configuration data for each of the domains, then generate a compatible inter-domain policy and configuration that will update the gateway systems between the two domains.
MONGKIE: an integrated tool for network analysis and visualization for multi-omics data.
Jang, Yeongjun; Yu, Namhee; Seo, Jihae; Kim, Sun; Lee, Sanghyuk
2016-03-18
Network-based integrative analysis is a powerful technique for extracting biological insights from multilayered omics data such as somatic mutations, copy number variations, and gene expression data. However, integrated analysis of multi-omics data is quite complicated and can hardly be done in an automated way. Thus, a powerful interactive visual mining tool supporting diverse analysis algorithms for identification of driver genes and regulatory modules is much needed. Here, we present a software platform that integrates network visualization with omics data analysis tools seamlessly. The visualization unit supports various options for displaying multi-omics data as well as unique network models for describing sophisticated biological networks such as complex biomolecular reactions. In addition, we implemented diverse in-house algorithms for network analysis including network clustering and over-representation analysis. Novel functions include facile definition and optimized visualization of subgroups, comparison of a series of data sets in an identical network by data-to-visual mapping and subsequent overlaying function, and management of custom interaction networks. Utility of MONGKIE for network-based visual data mining of multi-omics data was demonstrated by analysis of the TCGA glioblastoma data. MONGKIE was developed in Java based on the NetBeans plugin architecture, thus being OS-independent with intrinsic support of module extension by third-party developers. We believe that MONGKIE would be a valuable addition to network analysis software by supporting many unique features and visualization options, especially for analysing multi-omics data sets in cancer and other diseases.
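Of the in-house algorithms listed, over-representation analysis is the most self-contained to sketch: given a gene universe, a pathway, and a network module, the enrichment p-value is a hypergeometric tail probability (all counts below are illustrative).

```python
from scipy.stats import hypergeom

def ora_p_value(n_universe, n_pathway, n_module, n_overlap):
    """Over-representation p-value: chance of seeing at least n_overlap
    pathway genes when drawing n_module genes from the universe."""
    return hypergeom.sf(n_overlap - 1, n_universe, n_pathway, n_module)

# Illustrative counts: 20,000-gene universe, 150-gene pathway,
# 80-gene network module, 12 genes shared.
print(ora_p_value(20000, 150, 80, 12))
```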
Automated monitor and control for deep space network subsystems
NASA Technical Reports Server (NTRS)
Smyth, P.
1989-01-01
The problem of automating monitor and control loops for Deep Space Network (DSN) subsystems is considered and an overview of currently available automation techniques is given. The use of standard numerical models, knowledge-based systems, and neural networks is considered. It is argued that none of these techniques alone possess sufficient generality to deal with the demands imposed by the DSN environment. However, it is shown that schemes that integrate the better aspects of each approach and are referenced to a formal system model show considerable promise, although such an integrated technology is not yet available for implementation. Frequent reference is made to the receiver subsystem since this work was largely motivated by experience in developing an automated monitor and control loop for the advanced receiver.
Automated Video Quality Assessment for Deep-Sea Video
NASA Astrophysics Data System (ADS)
Pirenne, B.; Hoeberechts, M.; Kalmbach, A.; Sadhu, T.; Branzan Albu, A.; Glotin, H.; Jeffries, M. A.; Bui, A. O. V.
2015-12-01
Video provides a rich source of data for geophysical analysis, often supplying detailed information about the environment when other instruments may not. This is especially true of deep-sea environments, where direct visual observations cannot be made. As computer vision techniques improve and volumes of video data increase, automated video analysis is emerging as a practical alternative to labor-intensive manual analysis. Automated techniques can be much more sensitive to video quality than their manual counterparts, so performing quality assessment before doing full analysis is critical to producing valid results. Ocean Networks Canada (ONC), an initiative of the University of Victoria, operates cabled ocean observatories that supply continuous power and Internet connectivity to a broad suite of subsea instruments from the coast to the deep sea, including video and still cameras. This network of ocean observatories has produced almost 20,000 hours of video (about 38 hours are recorded each day) and an additional 8,000 hours of logs from remotely operated vehicle (ROV) dives. We begin by surveying some ways in which deep-sea video poses challenges for automated analysis, including: 1. Non-uniform lighting: Single, directional light sources produce uneven luminance distributions and shadows; remotely operated lighting equipment is also susceptible to technical failures. 2. Particulate noise: Turbidity and marine snow are often present in underwater video; particles in the water column can have sharper focus and higher contrast than the objects of interest due to their proximity to the light source and can also influence the camera's autofocus and auto white-balance routines. 3. Color distortion (low contrast): The rate of absorption of light in water varies by wavelength, and is higher overall than in air, altering apparent colors and lowering the contrast of objects at a distance. We also describe measures under development at ONC for detecting and mitigating these effects. These steps include filtering out unusable data, color and luminance balancing, and choosing the most appropriate image descriptors. We apply these techniques to generate automated quality assessment of video data and illustrate their utility with an example application where we perform vision-based substrate classification.
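A crude sketch of the "filtering out unusable data" step, scoring a grayscale frame by mean luminance and RMS contrast; the thresholds are illustrative guesses, not ONC's actual quality criteria.

```python
import numpy as np

def frame_quality(gray):
    """Crude usability screen for an 8-bit grayscale frame: reject frames
    that are too dark, blown out, or nearly contrast-free."""
    mean, std = float(gray.mean()), float(gray.std())
    usable = 20.0 < mean < 235.0 and std > 10.0
    return {"mean_luminance": mean, "rms_contrast": std, "usable": usable}

rng = np.random.default_rng(1)
dark_frame = rng.integers(0, 12, size=(480, 640)).astype(np.uint8)
print(frame_quality(dark_frame))  # flagged unusable: too dark
```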
Automated visual inspection system based on HAVNET architecture
NASA Astrophysics Data System (ADS)
Burkett, K.; Ozbayoglu, Murat A.; Dagli, Cihan H.
1994-10-01
In this study, the HAusdorff-Voronoi NETwork (HAVNET) developed at the UMR Smart Engineering Systems Lab is tested in the recognition of mounted circuit components commonly used in printed circuit board assembly systems. The automated visual inspection system used consists of a CCD camera, neural network based image processing software, and a data acquisition card connected to a PC. The experiments are run in the Smart Engineering Systems Lab in the Engineering Management Dept. of the University of Missouri-Rolla. The performance analysis shows that the vision system is capable of recognizing different components under uncontrolled lighting conditions without being affected by rotation or scale differences. The results obtained are promising and the system can be used in real manufacturing environments. Currently the system is being customized for a specific manufacturing application.
DOE R&D Accomplishments Database
Russell, J. A. G.; Alexoff, D. L.; Wolf, A. P.
1984-09-01
This presentation describes an evolving distributed microprocessor network for automating the routine production synthesis of radiotracers used in Positron Emission Tomography. We first present a brief overview of the PET method for measuring biological function, and then outline the general procedure for producing a radiotracer. The paper identifies several reasons for our automating the syntheses of these compounds. There is a description of the distributed microprocessor network architecture chosen and the rationale for that choice. Finally, we speculate about how this network may be exploited to extend the power of the PET method from the large university or National Laboratory to the biomedical research and clinical community at large. (DT)
Jurrus, Elizabeth; Paiva, Antonio R C; Watanabe, Shigeki; Anderson, James R; Jones, Bryan W; Whitaker, Ross T; Jorgensen, Erik M; Marc, Robert E; Tasdizen, Tolga
2010-12-01
Study of nervous systems via the connectome, the map of connectivities of all neurons in that system, is a challenging problem in neuroscience. Towards this goal, neurobiologists are acquiring large electron microscopy datasets. However, the sheer volume of these datasets renders manual analysis infeasible. Hence, automated image analysis methods are required for reconstructing the connectome from these very large image collections. Segmentation of neurons in these images, an essential step of the reconstruction pipeline, is challenging because of noise, anisotropic shapes and brightness, and the presence of confounding structures. The method described in this paper uses a series of artificial neural networks (ANNs) in a framework combined with a feature vector that is composed of image intensities sampled over a stencil neighborhood. Several ANNs are applied in series, allowing each ANN to use the classification context provided by the previous network to improve detection accuracy. We develop the method of serial ANNs and show that the learned context does improve detection over traditional ANNs. We also demonstrate advantages over previous membrane detection methods. The results are a significant step towards an automated system for the reconstruction of the connectome.
The Use of AMET and Automated Scripts for Model Evaluation
The Atmospheric Model Evaluation Tool (AMET) is a suite of software designed to facilitate the analysis and evaluation of meteorological and air quality models. AMET matches the model output for particular locations to the corresponding observed values from one or more networks ...
Jockeying for Supremacy in a Networked World.
ERIC Educational Resources Information Center
Barry, Jeff; And Others
1996-01-01
Provides an overview of the 1995 automated system marketplace, compiled by analysis of data gathered from 27 vendors. Topics include sales data; dominant vendors in different library sectors; company restructuring and consolidation; plans for technological upgrades, especially Internet connectivity, and improved customer support, service, and…
Automation of route identification and optimisation based on data-mining and chemical intuition.
Lapkin, A A; Heer, P K; Jacob, P-M; Hutchby, M; Cunningham, W; Bull, S D; Davidson, M G
2017-09-21
Data-mining of Reaxys and network analysis of the combined literature and in-house reactions set were used to generate multiple possible reaction routes to convert a bio-waste feedstock, limonene, into a pharmaceutical API, paracetamol. The network analysis of data provides a rich knowledge-base for generation of the initial reaction screening and development programme. Based on the literature and the in-house data, an overall flowsheet for the conversion of limonene to paracetamol was proposed. Each individual reaction-separation step in the sequence was simulated as a combination of the continuous flow and batch steps. The linear model generation methodology allowed us to identify the reaction steps requiring further chemical optimisation. The generated model can be used for global optimisation and generation of environmental and other performance indicators, such as cost indicators. However, the identified further challenge is to automate model generation to evolve optimal multi-step chemical routes and optimal process configurations.
Green, Walton A.; Little, Stefan A.; Price, Charles A.; Wing, Scott L.; Smith, Selena Y.; Kotrc, Benjamin; Doria, Gabriela
2014-01-01
The reticulate venation that is characteristic of a dicot leaf has excited interest from systematists for more than a century, and from physiological and developmental botanists for decades. The tools of digital image acquisition and computer image analysis, however, are only now approaching the sophistication needed to quantify aspects of the venation network found in real leaves quickly, easily, accurately, and reliably enough to produce biologically meaningful data. In this paper, we examine 120 leaves distributed across vascular plants (representing 118 genera and 80 families) using two approaches: a semiquantitative scoring system called “leaf ranking,” devised by the late Leo Hickey, and an automated image-analysis protocol. In the process of comparing these approaches, we review some methodological issues that arise in trying to quantify a vein network, and discuss the strengths and weaknesses of automatic data collection and human pattern recognition. We conclude that subjective leaf rank provides a relatively consistent, semiquantitative measure of areole size among other variables; that modal areole size is generally consistent across large sections of a leaf lamina; and that both approaches—semiquantitative, subjective scoring; and fully quantitative, automated measurement—have appropriate places in the study of leaf venation. PMID:25202646
Neural-network-based state of health diagnostics for an automated radioxenon sampler/analyzer
NASA Astrophysics Data System (ADS)
Keller, Paul E.; Kangas, Lars J.; Hayes, James C.; Schrom, Brian T.; Suarez, Reynold; Hubbard, Charles W.; Heimbigner, Tom R.; McIntyre, Justin I.
2009-05-01
Artificial neural networks (ANNs) are used to determine the state-of-health (SOH) of the Automated Radioxenon Analyzer/Sampler (ARSA). ARSA is a gas collection and analysis system used for non-proliferation monitoring in detecting radioxenon released during nuclear tests. SOH diagnostics are important for automated, unmanned sensing systems so that remote detection and identification of problems can be made without onsite staff. Both recurrent and feed-forward ANNs are presented. The recurrent ANN is trained to predict sensor values based on current valve states, which control air flow, so that with only valve states the normal SOH sensor values can be predicted. Deviation between the modeled and actual values indicates a potential problem. The feed-forward ANN acts as a nonlinear version of principal components analysis (PCA) and is trained to replicate the normal SOH sensor values. Because of ARSA's complexity, this nonlinear PCA is better able to capture the relationships among the sensors than standard linear PCA and is applicable to both sensor validation and recognizing off-normal operating conditions. Both models provide valuable information for detecting impending malfunctions before they occur, avoiding unscheduled shutdowns. Finally, the ability of ANN methods to predict the system state is presented.
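As a rough illustration of the feed-forward model acting as nonlinear PCA, the hedged sketch below trains an autoassociative (bottleneck) network on normal SOH sensor vectors and flags off-normal operation by reconstruction error; the sensor count, network shape, and threshold are placeholders, not ARSA values.

```python
# Hedged sketch: a bottleneck network trained to reproduce normal SOH sensor
# vectors (nonlinear PCA); large reconstruction error flags off-normal states.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
normal = rng.normal(size=(500, 12))          # 12 hypothetical SOH sensors
ae = MLPRegressor(hidden_layer_sizes=(8, 3, 8), max_iter=2000)  # 3-unit bottleneck
ae.fit(normal, normal)                       # autoassociative training

def soh_alarm(x, threshold=2.0):             # threshold is a placeholder
    residual = np.linalg.norm(ae.predict(x.reshape(1, -1))[0] - x)
    return residual > threshold              # True -> potential malfunction

print(soh_alarm(normal[0]), soh_alarm(normal[0] + 5.0))  # likely False, True
```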
A Fully Automated High-Throughput Zebrafish Behavioral Ototoxicity Assay.
Todd, Douglas W; Philip, Rohit C; Niihori, Maki; Ringle, Ryan A; Coyle, Kelsey R; Zehri, Sobia F; Zabala, Leanne; Mudery, Jordan A; Francis, Ross H; Rodriguez, Jeffrey J; Jacob, Abraham
2017-08-01
Zebrafish animal models lend themselves to behavioral assays that can facilitate rapid screening of ototoxic, otoprotective, and otoregenerative drugs. Structurally similar to human inner ear hair cells, the mechanosensory hair cells on their lateral line allow the zebrafish to sense water flow and orient head-to-current in a behavior called rheotaxis. This rheotaxis behavior deteriorates in a dose-dependent manner with increased exposure to the ototoxin cisplatin, thereby establishing itself as an excellent biomarker for anatomic damage to lateral line hair cells. Building on work by our group and others, we have built a new, fully automated high-throughput behavioral assay system that uses automated image analysis techniques to quantify rheotaxis behavior. This novel system consists of a custom-designed swimming apparatus and an imaging system built from network-controlled Raspberry Pi microcomputers capturing infrared video. Automated analysis techniques detect individual zebrafish, compute their orientation, and quantify the rheotaxis behavior of a zebrafish test population, producing a powerful, high-throughput behavioral assay. Using our fully automated biological assay to test a standardized ototoxic dose of cisplatin against varying doses of compounds that protect or regenerate hair cells may facilitate rapid translation of candidate drugs into preclinical mammalian models of hearing loss.
Topology Analysis of Social Networks Extracted from Literature
2015-01-01
In a world where complex networks are an increasingly important part of science, it is interesting to question how the new reading of social realities they provide applies to our cultural background and in particular, popular culture. Are authors of successful novels able to reproduce social networks faithful to the ones found in reality? Is there any common trend connecting an author’s oeuvre, or a genre of fiction? Such an analysis could provide new insight on how we, as a culture, perceive human interactions and consume media. The purpose of the work presented in this paper is to define the signature of a novel’s story based on the topological analysis of its social network of characters. For this purpose, an automated tool was built that analyses the dialogs in novels, identifies characters and computes their relationships in a time-dependent manner in order to assess the network’s evolution over the course of the story. PMID:26039072
New methodologies for multi-scale time-variant reliability analysis of complex lifeline networks
NASA Astrophysics Data System (ADS)
Kurtz, Nolan Scot
The cost of maintaining existing civil infrastructure is enormous. Since the livelihood of the public depends on such infrastructure, its state must be managed appropriately using quantitative approaches. Practitioners must consider not only which components are most fragile to hazard, e.g. seismicity, storm surge, hurricane winds, etc., but also how they participate at the network level using network analysis. Focusing on particularly damaged components does not necessarily increase network functionality, which is what matters most to the people who depend on such infrastructure. Several network analyses, e.g. S-RDA, LP-bounds, and crude-MCS, and performance metrics, e.g. disconnection bounds and component importance, are available for such purposes. Since these networks already exist, their state over time is also important. If networks are close to chloride sources, deterioration may be a major issue. Information from field inspections may also have large impacts on quantitative models. To address such issues, hazard risk analysis methodologies for deteriorating networks subjected to seismicity, i.e. earthquakes, have been created analytically. A bridge component model has been constructed for these methodologies. The bridge fragilities, which were constructed from data, required a deeper level of analysis as these were relevant for specific structures. Furthermore, the network effects of chloride-induced deterioration were investigated. Depending on how mathematical models incorporate new information, many approaches are available, such as Bayesian model updating. To make such procedures more flexible, an adaptive importance sampling scheme was created for structural reliability problems. Additionally, such a method handles many kinds of system and component problems with singular or multiple important regions of the limit state function. These and previously developed analysis methodologies were found to be strongly sensitive to the network size. Special network topologies may be more or less computationally difficult, and the resolution of the network also has large effects. To take advantage of some types of topologies, network hierarchical structures with super-link representation have been used in the literature to increase computational efficiency by analyzing smaller, densely connected networks; however, such structures were based on user input and were at times subjective. To address this, algorithms must be automated and reliable. These hierarchical structures may indicate the structure of the network itself. This risk analysis methodology has been expanded to larger networks using such automated hierarchical structures. Component importance is the most important output of such network analysis; however, it may only indicate which bridges to inspect or repair earliest and little else. High correlations adversely influence such component importance measures. Additionally, a regional approach is not appropriately modelled. To investigate a more regional view, group importance measures based on hierarchical structures have been created. Such structures may also be used to create regional inspection/repair approaches. Using these analytical, quantitative risk approaches, the next generation of decision makers may make optimal decisions at both the component and regional levels, using information from both network function and further effects of infrastructure deterioration.
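The crude-MCS style of network reliability analysis mentioned above can be illustrated in a few lines. The sketch below (assuming networkx, with placeholder fragilities rather than values from the dissertation) estimates a disconnection probability by Monte Carlo sampling of edge failures and derives a simple component-importance measure from the effect of hardening one edge at a time.

```python
# Illustrative crude Monte Carlo disconnection estimate plus a simple
# component-importance measure; fragilities are invented placeholders.
import networkx as nx
import numpy as np

def mc_disconnection(G, p_fail, source, sink, n_samples=2000, seed=0):
    rng = np.random.default_rng(seed)
    failures = 0
    for _ in range(n_samples):
        up = [e for e in G.edges if rng.random() > p_fail[e]]  # surviving links
        H = nx.Graph(up)
        if source not in H or sink not in H or not nx.has_path(H, source, sink):
            failures += 1
    return failures / n_samples

G = nx.cycle_graph(6)                        # toy lifeline network
p_fail = {e: 0.1 for e in G.edges}           # hypothetical seismic fragilities
base = mc_disconnection(G, p_fail, 0, 3)
for e in G.edges:                            # importance: gain from hardening e
    print(e, base - mc_disconnection(G, {**p_fail, e: 0.0}, 0, 3))
```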
Modeling Multiple Human-Automation Distributed Systems using Network-form Games
NASA Technical Reports Server (NTRS)
Brat, Guillaume
2012-01-01
The paper describes at a high-level the network-form game framework (based on Bayes net and game theory), which can be used to model and analyze safety issues in large, distributed, mixed human-automation systems such as NextGen.
Automation in School Library Media Centers.
ERIC Educational Resources Information Center
Driver, Russell W.; Driver, Mary Anne
1982-01-01
Surveys the historical development of automated technical processing in schools and notes the impact of this automation in a number of cases. Speculations about the future involvement of school libraries in automated processing and networking are included. Thirty references are listed. (BBM)
Looking backward, 1984-1959: twenty-five years of library automation--a personal view.
Pizer, I H
1984-01-01
A brief profile of Janet Doe is given. Twenty-five years of library automation are reviewed from the author's point of view. Major projects such as the SUNY Biomedical Communication Network and the Regional Online Union Catalog of the Greater Midwest Regional Medical Library Network are discussed. Important figures in medical library automation are considered, as is the major role played by the National Library of Medicine. PMID:6388691
Automated tracking of the Florida manatee (Trichechus manatus)
NASA Technical Reports Server (NTRS)
Michelson, R. C.; Breedlove, J.; Jenkins, H. H.
1978-01-01
The electronic, physical, biological and environmental factors involved in the automated remote tracking of the Florida manatee (Trichechus manatus) are identified. The current status of the manatee as an endangered species is provided. Brief descriptions of existing tracking and position locating systems are presented to identify the state of the art in these fields. An analysis of energy media is conducted to identify those with the highest probability of success for this application. Logistic questions such as the means of attachment and position of any equipment to be placed on the manatee are also investigated. Power sources and manatee-borne electronics encapsulation techniques are studied, and the results of a computer-generated DF network analysis are summarized.
High-power transmitter automation. [deep space network
NASA Technical Reports Server (NTRS)
Gosline, R.
1980-01-01
The current status of the transmitter automation development applicable to all transmitters in the deep space network is described. Interface and software designs are described that improve reliability and reduce the time required for subsystem turn-on and klystron saturation to less than 10 minutes.
Automated Induction Of Rule-Based Neural Networks
NASA Technical Reports Server (NTRS)
Smyth, Padhraic J.; Goodman, Rodney M.
1994-01-01
Prototype expert systems, implemented in software and functionally equivalent to neural networks, are set up automatically and placed into operation within minutes, following an information-theoretic approach to the automated acquisition of knowledge from large example databases. The approach is based largely on the use of the ITRULE computer program.
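ITRULE-style induction ranks candidate rules by an information-theoretic score, the J-measure of Smyth and Goodman. A minimal sketch of that scoring for a single rule "if Y=y then X=x" follows; the probabilities are toy numbers, not data from the paper.

```python
# Sketch of the J-measure used for information-theoretic rule ranking.
import math

def j_measure(p_y, p_x, p_x_given_y):
    """Average information content (bits) of the rule 'if Y=y then X=x'."""
    def term(p, q):
        return 0.0 if p == 0 else p * math.log2(p / q)
    return p_y * (term(p_x_given_y, p_x) + term(1 - p_x_given_y, 1 - p_x))

# Example: y occurs 30% of the time; x holds 40% overall but 90% when y holds.
print(j_measure(p_y=0.3, p_x=0.4, p_x_given_y=0.9))  # higher = more informative rule
```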
Security-Enhanced Autonomous Network Management
NASA Technical Reports Server (NTRS)
Zeng, Hui
2015-01-01
Ensuring reliable communication in next-generation space networks requires a novel network management system to support greater levels of autonomy and greater awareness of the environment and assets. Intelligent Automation, Inc., has developed a security-enhanced autonomous network management (SEANM) approach for space networks through cross-layer negotiation and network monitoring, analysis, and adaptation. The underlying technology is bundle-based delay/disruption-tolerant networking (DTN). The SEANM scheme allows a system to adaptively reconfigure its network elements based on awareness of network conditions, policies, and mission requirements. Although SEANM is generically applicable to any radio network, for validation purposes it has been prototyped and evaluated on two specific networks: a commercial off-the-shelf hardware test-bed using Institute of Electrical and Electronics Engineers (IEEE) 802.11 Wi-Fi devices and a military hardware test-bed using AN/PRC-154 Rifleman Radio platforms. Testing has demonstrated that SEANM provides autonomous network management resulting in reliable communications in delay/disruptive-prone environments.
Library Automation in the Netherlands and Pica.
ERIC Educational Resources Information Center
Bossers, Anton; Van Muyen, Martin
1984-01-01
Describes the Pica Library Automation Network (originally the Project for Integrated Catalogue Automation), which is based on a centralized bibliographic database. Highlights include the Pica conception of library automation, online shared cataloging system, circulation control system, acquisition system, and online Dutch union catalog with…
Automating risk analysis of software design models.
Frydman, Maxime; Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P
2014-01-01
The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance.
Radar targets reveal all to automated tester
NASA Astrophysics Data System (ADS)
Hartman, R. E.
1985-09-01
Technological developments in the field of automated test equipment for low radar-cross-section (RCS) systems are reviewed. Emphasis is given to an Automated Digital Analysis and Measurement (ADAM) system for measuring and evaluating RCS scattering using a minicomputer in combination with a vector network analyzer and a positioner programmer. ADAM incorporates a stepped-CW measurement technique to obtain RCS as a function of both range and frequency at a fixed aspect angle. The operating characteristics and calibration procedures of the ADAM system are described and estimates of RCS sensitivity are obtained. The response resolution of the ADAM system is estimated to be 36 cm per measurement bandwidth (in GHz) for a minimum window. A block diagram of the error checking routine of the ADAM system is provided.
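The stepped-CW technique can be sketched compactly: the target return is measured at N discrete frequencies across a bandwidth B, and an inverse FFT of the complex returns gives a down-range profile with resolution on the order of c/(2B). The frequencies, ranges, and step count below are illustrative, not ADAM's parameters.

```python
# Hedged stepped-CW illustration: IFFT of returns measured at N frequencies
# over bandwidth B yields a down-range profile (all parameters invented).
import numpy as np

c = 3e8
N, B = 256, 2e9                              # 256 steps over a 2 GHz span
freqs = 10e9 + np.arange(N) * (B / N)        # stepped carrier frequencies
ranges_m = [5.0, 6.2]                        # two point scatterers (metres)
returns = sum(np.exp(-1j * 4 * np.pi * freqs * R / c) for R in ranges_m)

profile = np.abs(np.fft.ifft(returns))
range_axis = np.arange(N) * c / (2 * B)      # bin spacing = c / (2B)
print(np.sort(range_axis[np.argsort(profile)[-2:]]))  # peaks near 5.0 and 6.2 m
```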
Bayesian ISOLA: new tool for automated centroid moment tensor inversion
NASA Astrophysics Data System (ADS)
Vackář, Jiří; Burjánek, Jan; Gallovič, František; Zahradník, Jiří; Clinton, John
2017-08-01
We have developed a new, fully automated tool for the centroid moment tensor (CMT) inversion in a Bayesian framework. It includes automated data retrieval, data selection where station components with various instrumental disturbances are rejected and full-waveform inversion in a space-time grid around a provided hypocentre. A data covariance matrix calculated from pre-event noise yields an automated weighting of the station recordings according to their noise levels and also serves as an automated frequency filter suppressing noisy frequency ranges. The method is tested on synthetic and observed data. It is applied on a data set from the Swiss seismic network and the results are compared with the existing high-quality MT catalogue. The software package programmed in Python is designed to be as versatile as possible in order to be applicable in various networks ranging from local to regional. The method can be applied either to the everyday network data flow, or to process large pre-existing earthquake catalogues and data sets.
Performance evaluation of the NASA/KSC CAD/CAE and office automation LAN's
NASA Technical Reports Server (NTRS)
Zobrist, George W.
1994-01-01
This study's objective is the performance evaluation of the existing CAD/CAE (Computer Aided Design/Computer Aided Engineering) network at NASA/KSC. This evaluation also includes a similar study of the Office Automation network, since it is being planned to integrate this network into the CAD/CAE network. The Microsoft mail facility which is presently on the CAD/CAE network was monitored to determine its present usage. This performance evaluation of the various networks will aid the NASA/KSC network managers in planning for the integration of future workload requirements into the CAD/CAE network and determining the effectiveness of the planned FDDI (Fiber Distributed Data Interface) migration.
Optimizing a Drone Network to Deliver Automated External Defibrillators.
Boutilier, Justin J; Brooks, Steven C; Janmohamed, Alyf; Byers, Adam; Buick, Jason E; Zhan, Cathy; Schoellig, Angela P; Cheskes, Sheldon; Morrison, Laurie J; Chan, Timothy C Y
2017-06-20
Public access defibrillation programs can improve survival after out-of-hospital cardiac arrest, but automated external defibrillators (AEDs) are rarely available for bystander use at the scene. Drones are an emerging technology that can deliver an AED to the scene of an out-of-hospital cardiac arrest for bystander use. We hypothesize that a drone network designed with the aid of a mathematical model combining both optimization and queuing can reduce the time to AED arrival. We applied our model to 53 702 out-of-hospital cardiac arrests that occurred in the 8 regions of the Toronto Regional RescuNET between January 1, 2006, and December 31, 2014. Our primary analysis quantified the drone network size required to deliver an AED 1, 2, or 3 minutes faster than historical median 911 response times for each region independently. A secondary analysis quantified the reduction in drone resources required if RescuNET was treated as a large coordinated region. The region-specific analysis determined that 81 bases and 100 drones would be required to deliver an AED ahead of median 911 response times by 3 minutes. In the most urban region, the 90th percentile of the AED arrival time was reduced by 6 minutes and 43 seconds relative to historical 911 response times in the region. In the most rural region, the 90th percentile was reduced by 10 minutes and 34 seconds. A single coordinated drone network across all regions required 39.5% fewer bases and 30.0% fewer drones to achieve similar AED delivery times. An optimized drone network designed with the aid of a novel mathematical model can substantially reduce the AED delivery time to an out-of-hospital cardiac arrest event. © 2017 American Heart Association, Inc.
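The flavor of the base-placement problem can be conveyed with a toy greedy set-cover heuristic; the paper's actual model combines facility-location optimization with queuing, which this hedged sketch does not reproduce. The coordinates, drone speed, and response-time budget are invented.

```python
# Toy greedy coverage sketch for drone base placement (not the paper's model).
import numpy as np

rng = np.random.default_rng(1)
arrests = rng.uniform(0, 50, size=(200, 2))     # historical OHCA sites (km)
candidates = rng.uniform(0, 50, size=(30, 2))   # candidate base sites (km)
speed, budget = 1.0, 6.0                        # km/min drone speed, minutes

time = np.linalg.norm(arrests[:, None] - candidates[None, :], axis=2) / speed
covered = np.zeros(len(arrests), bool)
bases = []
while not covered.all():
    gains = ((time <= budget) & ~covered[:, None]).sum(axis=0)
    best = int(gains.argmax())
    if gains[best] == 0:
        break                                   # remaining sites unreachable
    bases.append(best)
    covered |= time[:, best] <= budget
print(f"{len(bases)} bases cover {covered.mean():.0%} of events")
```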
Ghafoorian, Mohsen; Karssemeijer, Nico; Heskes, Tom; Bergkamp, Mayra; Wissink, Joost; Obels, Jiri; Keizer, Karlijn; de Leeuw, Frank-Erik; Ginneken, Bram van; Marchiori, Elena; Platel, Bram
2017-01-01
Lacunes of presumed vascular origin (lacunes) are associated with an increased risk of stroke, gait impairment, and dementia and are a primary imaging feature of small vessel disease. Quantification of lacunes may be of great importance to elucidate the mechanisms behind neurodegenerative disorders and is recommended as part of study standards for small vessel disease research. However, due to the different appearance of lacunes in various brain regions and the existence of other similar-looking structures, such as perivascular spaces, manual annotation is a difficult, laborious and subjective task, which can potentially be greatly improved by reliable and consistent computer-aided detection (CAD) routines. In this paper, we propose an automated two-stage method using deep convolutional neural networks (CNN). We show that this method has good performance and can considerably benefit readers. We first use a fully convolutional neural network to detect initial candidates. In the second step, we employ a 3D CNN as a false positive reduction tool. As the location information is important to the analysis of candidate structures, we further equip the network with contextual information using multi-scale analysis and integration of explicit location features. We trained, validated and tested our networks on a large dataset of 1075 cases obtained from two different studies. Subsequently, we conducted an observer study with four trained observers and compared our method with them using a free-response operating characteristic analysis. Shown on a test set of 111 cases, the resulting CAD system exhibits performance similar to the trained human observers and achieves a sensitivity of 0.974 with 0.13 false positives per slice. A feasibility study also showed that a trained human observer would considerably benefit once aided by the CAD system.
Automated Melanoma Recognition in Dermoscopy Images via Very Deep Residual Networks.
Yu, Lequan; Chen, Hao; Dou, Qi; Qin, Jing; Heng, Pheng-Ann
2017-04-01
Automated melanoma recognition in dermoscopy images is a very challenging task due to the low contrast of skin lesions, the huge intraclass variation of melanomas, the high degree of visual similarity between melanoma and non-melanoma lesions, and the existence of many artifacts in the image. In order to meet these challenges, we propose a novel method for melanoma recognition by leveraging very deep convolutional neural networks (CNNs). Compared with existing methods employing either low-level hand-crafted features or CNNs with shallower architectures, our substantially deeper networks (more than 50 layers) can acquire richer and more discriminative features for more accurate recognition. To take full advantage of very deep networks, we propose a set of schemes to ensure effective training and learning under limited training data. First, we apply residual learning to cope with the degradation and overfitting problems when a network goes deeper. This technique can ensure that our networks benefit from the performance gains achieved by increasing network depth. Then, we construct a fully convolutional residual network (FCRN) for accurate skin lesion segmentation, and further enhance its capability by incorporating a multi-scale contextual information integration scheme. Finally, we seamlessly integrate the proposed FCRN (for segmentation) and other very deep residual networks (for classification) to form a two-stage framework. This framework enables the classification network to extract more representative and specific features based on segmented results instead of the whole dermoscopy images, further alleviating the insufficiency of training data. The proposed framework is extensively evaluated on the ISBI 2016 Skin Lesion Analysis Towards Melanoma Detection Challenge dataset. Experimental results demonstrate the significant performance gains of the proposed framework, ranking first in classification and second in segmentation among 25 and 28 teams, respectively. This study corroborates that very deep CNNs with effective training mechanisms can be employed to solve complicated medical image analysis tasks, even with limited training data.
NMRNet: A deep learning approach to automated peak picking of protein NMR spectra.
Klukowski, Piotr; Augoff, Michal; Zieba, Maciej; Drwal, Maciej; Gonczarek, Adam; Walczak, Michal J
2018-03-14
Automated selection of signals in protein NMR spectra, known as peak picking, has been studied for over 20 years; nevertheless, existing peak-picking methods are still largely deficient. Accurate and precise automated peak picking would accelerate the structure calculation, and analysis of dynamics and interactions of macromolecules. Recent advancement in handling big data, together with an outburst of machine learning techniques, offers an opportunity to tackle the peak-picking problem substantially faster than manual picking and on par with human accuracy. In particular, deep learning has proven to systematically achieve human-level performance in various recognition tasks, and thus emerges as an ideal tool to address automated identification of NMR signals. We have applied a convolutional neural network for visual analysis of multidimensional NMR spectra. A comprehensive test on 31 manually annotated spectra has demonstrated top-tier average precision (AP) of 0.9596, 0.9058 and 0.8271 for backbone, side-chain and NOESY spectra, respectively. Furthermore, a combination of extracted peak lists with the automated assignment routine FLYA outperformed other methods, including the manual one, and led to correct resonance assignment at the levels of 90.40%, 89.90% and 90.20% for three benchmark proteins. The proposed model is part of the Dumpling software (a platform for protein NMR data analysis) and is available at https://dumpling.bio/. michaljerzywalczak@gmail.com, piotr.klukowski@pwr.edu.pl. Supplementary data are available at Bioinformatics online.
NASA Technical Reports Server (NTRS)
Chien, S. A.; Hill, R. W., Jr.; Govindjee, A.; Wang, X.; Estlin, T.; Griesel, M. A.; Lam, R.; Fayyad, K. V.
1996-01-01
This paper describes a hierarchical scheduling, planning, control, and execution monitoring architecture for automating operations of a worldwide network of communications antennas, with the aim of automating the process of capturing spacecraft data.
Collection Assessment: The Florida Community College Experience
ERIC Educational Resources Information Center
Perrault, Anna H.; Dixon, Jeannie
2007-01-01
Beginning in 1994, a series of collection analysis and assessment projects of community college library/LRC collections in Florida has been conducted by the College Center for Library Automation (CCLA). The purpose of the assessments conducted through LINCC, the network for Florida community colleges, was to provide data for improvement of…
Behavior-dependent Routing: Responding to Anomalies with Automated Low-cost Measures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oehmen, Christopher S.; Carroll, Thomas E.; Paulson, Patrick R.
2015-10-12
This is a conference paper submission describing research and software implementation of a cybersecurity concept that uses behavior models to trigger changes in routing of network traffic. As user behavior deviates more and more from baseline models, traffic is routed through more elevated layers of analysis and control.
First Annual Workshop on Space Operations Automation and Robotics (SOAR 87)
NASA Technical Reports Server (NTRS)
Griffin, Sandy (Editor)
1987-01-01
Several topics relative to automation and robotics technology are discussed. Automation of checkout, ground support, and logistics; automated software development; man-machine interfaces; neural networks; systems engineering and distributed/parallel processing architectures; and artificial intelligence/expert systems are among the topics covered.
Metabolic Network Modeling of Microbial Communities
Biggs, Matthew B.; Medlock, Gregory L.; Kolling, Glynis L.
2015-01-01
Genome-scale metabolic network reconstructions and constraint-based analysis are powerful methods that have the potential to make functional predictions about microbial communities. Current use of genome-scale metabolic networks to characterize the metabolic functions of microbial communities includes species compartmentalization, separating species-level and community-level objectives, dynamic analysis, the “enzyme-soup” approach, multi-scale modeling, and others. There are many challenges inherent to the field, including a need for tools that accurately assign high-level omics signals to individual community members, new automated reconstruction methods that rival manual curation, and novel algorithms for integrating omics data and engineering communities. As technologies and modeling frameworks improve, we expect that there will be proportional advances in the fields of ecology, health science, and microbial community engineering. PMID:26109480
Automated road marking recognition system
NASA Astrophysics Data System (ADS)
Ziyatdinov, R. R.; Shigabiev, R. R.; Talipov, D. N.
2017-09-01
The development of automated road marking recognition systems for existing and future vehicle control systems is an urgent task. One way to implement such systems is the use of neural networks. To test this possibility, software based on a single-layer perceptron has been developed. The resulting neural-network-based system successfully coped with the task both when driving in the daytime and at night.
Automated Structure Annotation and Curation for MassBank: Potential and Pitfalls
The European MassBank server (www.massbank.eu) was founded in 2012 by the NORMAN Network (www.norman-network.net) to provide open access to mass spectra of substances of environmental interest contributed by NORMAN members. The automated workflow RMassBank was developed as a part...
ST-Segment Analysis Using Wireless Technology in Acute Myocardial Infarction (STAT-MI) trial.
Dhruva, Vivek N; Abdelhadi, Samir I; Anis, Ather; Gluckman, William; Hom, David; Dougan, William; Kaluski, Edo; Haider, Bunyad; Klapholz, Marc
2007-08-07
Our goal was to examine the effects of implementing a fully automated wireless network to reduce door-to-intervention times (D2I) in ST-segment elevation myocardial infarction (STEMI). Wireless technologies used to transmit prehospital electrocardiograms (ECGs) have helped to decrease D2I times but have unrealized potential. A fully automated wireless network that facilitates simultaneous 12-lead ECG transmission from emergency medical services (EMS) personnel in the field to the emergency department (ED) and offsite cardiologists via smartphones was developed. The system is composed of preconfigured Bluetooth devices, preprogrammed receiving/transmitting stations, dedicated e-mail servers, and smartphones. The network facilitates direct communication between offsite cardiologists and EMS personnel, allowing for patient triage directly to the cardiac catheterization laboratory from the field. Demographic, laboratory, and time interval data were prospectively collected and compared with calendar year 2005 data. From June to December 2006, 80 ECGs with suspected STEMI were transmitted via the network. Twenty patients with ECGs consistent with STEMI were triaged to the catheterization laboratory. Improvement was seen in mean door-to-cardiologist notification (-14.6 vs. 61.4 min, p < 0.001), door-to-arterial access (47.6 vs. 108.1 min, p < 0.001), time-to-first angiographic injection (52.8 vs. 119.2 min, p < 0.001), and D2I times (80.1 vs. 145.6 min, p < 0.001) compared with 2005 data. A fully automated wireless network that transmits ECGs simultaneously to the ED and offsite cardiologists for the early evaluation and triage of patients with suspected STEMI can decrease D2I times to <90 min and has the potential to be broadly applied in clinical practice.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aykac, Deniz; Chaum, Edward; Fox, Karen
A telemedicine network with retina cameras and automated quality control, physiological feature location, and lesion/anomaly detection is a low-cost way of achieving broad-based screening for diabetic retinopathy (DR) and other eye diseases. In the process of a routine eye-screening examination, other non-image data is often available which may be useful in automated diagnosis of disease. In this work, we report on the results of combining this non-image data with image data, using the protocol and processing steps of a prototype system for automated disease diagnosis of retina examinations from a telemedicine network. The system includes quality assessments, automated physiology detection, and automated lesion detection to create an archive of known cases. Non-image data such as diabetes onset date and hemoglobin A1c (HgA1c) for each patient examination are included as well, and the system is used to create a content-based image retrieval engine capable of automated diagnosis of disease into 'normal' and 'abnormal' categories. The system achieves a sensitivity and specificity of 91.2% and 71.6% using hold-one-out validation testing.
3D Filament Network Segmentation with Multiple Active Contours
NASA Astrophysics Data System (ADS)
Xu, Ting; Vavylonis, Dimitrios; Huang, Xiaolei
2014-03-01
Fluorescence microscopy is frequently used to study two and three dimensional network structures formed by cytoskeletal polymer fibers such as actin filaments and microtubules. While these cytoskeletal structures are often dilute enough to allow imaging of individual filaments or bundles of them, quantitative analysis of these images is challenging. To facilitate quantitative, reproducible and objective analysis of the image data, we developed a semi-automated method to extract actin networks and retrieve their topology in 3D. Our method uses multiple Stretching Open Active Contours (SOACs) that are automatically initialized at image intensity ridges and then evolve along the centerlines of filaments in the network. SOACs can merge, stop at junctions, and reconfigure with others to allow smooth crossing at junctions of filaments. The proposed approach is generally applicable to images of curvilinear networks with low SNR. We demonstrate its potential by extracting the centerlines of synthetic meshwork images, actin networks in 2D TIRF Microscopy images, and 3D actin cable meshworks of live fission yeast cells imaged by spinning disk confocal microscopy.
NASA Astrophysics Data System (ADS)
Ali, Salah M.; Hui, K. H.; Hee, L. M.; Salman Leong, M.; Al-Obaidi, M. A.; Ali, Y. H.; Abdelrhman, Ahmed M.
2018-03-01
Acoustic emission (AE) analysis has become a vital tool for initiating maintenance tasks in many industries. However, the analysis process and interpretation have been found to be highly dependent on experts. Therefore, an automated monitoring method is required to reduce the cost and time consumed in the interpretation of the AE signal. This paper investigates the application of two of the most common machine learning approaches, namely artificial neural networks (ANN) and support vector machines (SVM), to automate the diagnosis of valve faults in reciprocating compressors based on AE signal parameters. Since accuracy is an essential factor in any automated diagnostic system, this paper also provides a comparative study of the predictive performance of ANN and SVM. AE parameter data were acquired from a single-stage reciprocating air compressor with different operational and valve conditions. ANN and SVM diagnosis models were subsequently devised by combining AE parameters of different conditions. Results demonstrate that the ANN and SVM models achieve the same prediction accuracy. However, the SVM model is recommended for automated diagnosis of valve condition due to its ability to handle a high number of input features with small data sets.
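A minimal sketch of the ANN-versus-SVM comparison, assuming scikit-learn; the four feature columns stand in for AE signal parameters (e.g. energy, counts, amplitude, duration) and the three-class labels stand in for valve conditions, since the paper's measured data are not reproduced here.

```python
# Hedged ANN-vs-SVM comparison on stand-in AE features via cross-validation.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(2)
X = rng.normal(size=(300, 4))                      # stand-in AE parameters
y = rng.integers(0, 3, size=300)                   # stand-in valve conditions

for name, clf in [("ANN", MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000)),
                  ("SVM", SVC(kernel="rbf", C=1.0))]:
    acc = cross_val_score(make_pipeline(StandardScaler(), clf), X, y, cv=5)
    print(name, acc.mean())
```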
NASA Astrophysics Data System (ADS)
Tomori, Zoltan; Keša, Peter; Nikorovič, Matej; Kaůka, Jan; Zemánek, Pavel
2016-12-01
We propose improved control software for holographic optical tweezers (HOT) suitable for simple semi-automated sorting. The controller receives data from both the human interface sensors and the HOT microscope camera and processes them. As a result, the new positions of active laser traps are calculated, packed into the network format and sent to the remote HOT. Using the photo-polymerization technique, we created a sorting container consisting of two parallel horizontal walls, where one wall contains "gates" marking the places where a trapped particle enters the container. The positions of particles and gates are obtained by image analysis, which can be exploited to achieve a higher level of automation. Sorting is demonstrated both in a computer-game-style simulation and in a real experiment.
Providing security for automated process control systems at hydropower engineering facilities
NASA Astrophysics Data System (ADS)
Vasiliev, Y. S.; Zegzhda, P. D.; Zegzhda, D. P.
2016-12-01
This article suggests the concept of a cyberphysical system to manage computer security of automated process control systems at hydropower engineering facilities. According to the authors, this system consists of a set of information processing tools and computer-controlled physical devices. Examples of cyber attacks on power engineering facilities are provided, and a strategy of improving cybersecurity of hydropower engineering systems is suggested. The architecture of the multilevel protection of the automated process control system (APCS) of power engineering facilities is given, including security systems, control systems, access control, encryption, secure virtual private network of subsystems for monitoring and analysis of security events. The distinctive aspect of the approach is consideration of interrelations and cyber threats, arising when SCADA is integrated with the unified enterprise information system.
Automatic brain MR image denoising based on texture feature-based artificial neural networks.
Chang, Yu-Ning; Chang, Herng-Hua
2015-01-01
Noise is one of the main sources of quality deterioration not only for visual inspection but also in computerized processing in brain magnetic resonance (MR) image analysis such as tissue classification, segmentation and registration. Accordingly, noise removal in brain MR images is important for a wide variety of subsequent processing applications. However, most existing denoising algorithms require laborious tuning of parameters that are often sensitive to specific image features and textures. Automation of these parameters through artificial intelligence techniques will be highly beneficial. In the present study, an artificial neural network associated with image texture feature analysis is proposed to establish a predictable parameter model and automate the denoising procedure. In the proposed approach, a total of 83 image attributes were extracted based on four categories: 1) basic image statistics, 2) gray-level co-occurrence matrix (GLCM), 3) gray-level run-length matrix (GLRLM) and 4) Tamura texture features. To obtain the ranking of discrimination in these texture features, a paired-samples t-test was applied to each individual image feature computed in every image. Subsequently, the sequential forward selection (SFS) method was used to select the best texture features according to the ranking of discrimination. The selected optimal features were further incorporated into a back propagation neural network to establish a predictable parameter model. A wide variety of MR images with various scenarios were adopted to evaluate the performance of the proposed framework. Experimental results indicated that this new automation system accurately predicted the bilateral filtering parameters and effectively removed the noise in a number of MR images. Compared to the manually tuned filtering process, our approach not only produced better denoised results but also saved significant processing time.
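One of the paper's four feature categories, GLCM texture descriptors, can be wired to a small parameter-predicting regression network as follows. This is a hedged sketch assuming scikit-image and scikit-learn; the training targets (filter widths) are synthetic, whereas the paper derives them from manually tuned denoising.

```python
# Hedged sketch: GLCM texture features feeding a filter-parameter regressor.
import numpy as np
from skimage.feature import graycomatrix, graycoprops   # scikit-image >= 0.19
from sklearn.neural_network import MLPRegressor

def glcm_features(img8):
    g = graycomatrix(img8, distances=[1], angles=[0, np.pi / 2],
                     levels=256, symmetric=True, normed=True)
    return np.hstack([graycoprops(g, p).ravel()
                      for p in ("contrast", "homogeneity", "energy", "correlation")])

rng = np.random.default_rng(3)
imgs = rng.integers(0, 256, size=(40, 64, 64), dtype=np.uint8)  # stand-in MR slices
X = np.array([glcm_features(im) for im in imgs])
sigma = rng.uniform(1.0, 5.0, size=40)          # hypothetical tuned filter widths
model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000).fit(X, sigma)
```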
Tschmelak, Jens; Proll, Guenther; Riedt, Johannes; Kaiser, Joachim; Kraemmer, Peter; Bárzaga, Luis; Wilkinson, James S; Hua, Ping; Hole, J Patrick; Nudd, Richard; Jackson, Michael; Abuknesha, Ram; Barceló, Damià; Rodriguez-Mozaz, Sara; de Alda, Maria J López; Sacher, Frank; Stien, Jan; Slobodník, Jaroslav; Oswald, Peter; Kozmenko, Helena; Korenková, Eva; Tóthová, Lívia; Krascsenits, Zoltan; Gauglitz, Guenter
2005-02-15
A novel analytical system, AWACSS (automated water analyser computer-supported system), based on immunochemical technology has been developed that can measure several organic pollutants at the low nanogram-per-litre level in a single few-minute analysis without any prior sample pre-concentration or pre-treatment steps. With the actual needs of water-sector managers related to the implementation of the Drinking Water Directive (DWD) (98/83/EC, 1998) and Water Framework Directive (WFD) (2000/60/EC, 2000) in mind, drinking, ground, surface, and waste waters were the major media used for the evaluation of the system performance. The instrument was equipped with remote control and surveillance facilities. The system's software allows for internet-based networking between the measurement and control stations, global management, trend analysis, and early-warning applications. The experience of water laboratories was utilised in the design of the instrument's hardware and software in order to make the system rugged and user-friendly. Several market surveys were conducted during the project to assess the applicability of the final system. A web-based AWACSS database was created for automated evaluation and storage of the obtained data in a format compatible with major databases of environmental organic pollutants in Europe. This first article gives the reader an overview of the aims and scope of the AWACSS project as well as details about the basic technology, immunoassays, software, and networking developed and utilised within the research project. The second article reports on the system performance, first real-sample measurements, and an international collaborative trial (inter-laboratory tests) comparing the biosensor with conventional analytical methods.
Project Planning and Reporting
NASA Technical Reports Server (NTRS)
1982-01-01
Project Planning Analysis and Reporting System (PPARS) is an automated aid for monitoring and scheduling activities within a project. The PPARS system consists of the PPARS Batch Program, five preprocessor programs, and two postprocessor programs. The PPARS Batch Program is a full CPM (Critical Path Method) scheduling program with resource capabilities, and can process networks with up to 10,000 activities.
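The core CPM computation that PPARS performs at scale fits in a short sketch. Assuming networkx, the forward pass gives each activity's earliest start, the backward pass its latest start, and zero-slack activities form the critical path; the four-activity network is illustrative.

```python
# Compact CPM illustration: forward/backward passes over an activity network.
import networkx as nx

acts = {"A": 3, "B": 2, "C": 4, "D": 1}           # activity durations
deps = [("A", "B"), ("A", "C"), ("B", "D"), ("C", "D")]

G = nx.DiGraph(deps)
order = list(nx.topological_sort(G))
early = {n: 0 for n in order}
for n in order:                                   # forward pass: earliest starts
    for m in G.successors(n):
        early[m] = max(early[m], early[n] + acts[n])
finish = max(early[n] + acts[n] for n in order)
late = {n: finish - acts[n] for n in order}
for n in reversed(order):                         # backward pass: latest starts
    for m in G.successors(n):
        late[n] = min(late[n], late[m] - acts[n])
print([n for n in order if early[n] == late[n]])  # critical path: ['A', 'C', 'D']
```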
Automated diagnosis of rolling bearings using MRA and neural networks
NASA Astrophysics Data System (ADS)
Castejón, C.; Lara, O.; García-Prada, J. C.
2010-01-01
Any industry needs an efficient predictive maintenance plan in order to optimize the management of resources and improve the economy of the plant by reducing unnecessary costs and increasing the level of safety. A great percentage of breakdowns in productive processes are caused by bearings. They begin to deteriorate from early stages of their functional life, also called the incipient level. This manuscript develops an automated diagnosis of rolling bearings based on the analysis and classification of vibration signatures. The novelty of this work is the application of the proposed methodology to data collected from a quasi-real industrial machine, where rolling bearings support the radial and axial loads they are designed for. Multiresolution analysis (MRA) is used in a first stage in order to extract the most interesting features from the signals. The features are then used in a second stage as inputs to a supervised neural network (NN) for classification purposes. Experimental results carried out on a real system show the soundness of the method, which detects four bearing conditions (normal, inner race fault, outer race fault and ball fault) at a very incipient stage.
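The MRA feature-extraction stage can be sketched with a discrete wavelet decomposition: per-band energies of the vibration signal become the inputs of the supervised classifier. The wavelet family, decomposition depth, and stand-in signals below are assumptions, not the paper's settings (assuming PyWavelets and scikit-learn).

```python
# Hedged sketch: wavelet multiresolution energies as bearing-fault features.
import numpy as np
import pywt
from sklearn.neural_network import MLPClassifier

def mra_energies(signal, wavelet="db4", level=4):
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    return np.array([np.sum(c ** 2) for c in coeffs])   # energy per band

rng = np.random.default_rng(4)
signals = rng.normal(size=(120, 2048))     # stand-in vibration records
labels = rng.integers(0, 4, size=120)      # normal / inner race / outer race / ball
X = np.array([mra_energies(s) for s in signals])
clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=1000).fit(X, labels)
```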
Automated retinal image quality assessment on the UK Biobank dataset for epidemiological studies.
Welikala, R A; Fraz, M M; Foster, P J; Whincup, P H; Rudnicka, A R; Owen, C G; Strachan, D P; Barman, S A
2016-04-01
Morphological changes in the retinal vascular network are associated with future risk of many systemic and vascular diseases. However, uncertainty over the presence and nature of some of these associations exists. Analysis of data from large population based studies will help to resolve these uncertainties. The QUARTZ (QUantitative Analysis of Retinal vessel Topology and siZe) retinal image analysis system allows automated processing of large numbers of retinal images. However, an image quality assessment module is needed to achieve full automation. In this paper, we propose such an algorithm, which uses the segmented vessel map to determine the suitability of retinal images for use in the creation of vessel morphometric data suitable for epidemiological studies. This includes an effective 3-dimensional feature set and support vector machine classification. A random subset of 800 retinal images from UK Biobank (a large prospective study of 500,000 middle aged adults; where 68,151 underwent retinal imaging) was used to examine the performance of the image quality algorithm. The algorithm achieved a sensitivity of 95.33% and a specificity of 91.13% for the detection of inadequate images. The strong performance of this image quality algorithm will make rapid automated analysis of vascular morphometry feasible on the entire UK Biobank dataset (and other large retinal datasets), with minimal operator involvement, and at low cost. Copyright © 2016 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Gong, X.; Wu, Q.
2017-12-01
Network virtual instruments (VIs) are a new direction in automated test. Based on LabVIEW, a software and hardware VI system for measuring the emission spectrum of pulsed high-voltage direct current (DC) discharge was developed and applied to investigate pulsed high-voltage DC discharge in nitrogen. The system provides several functions, including real-time collection of the nitrogen emission spectrum, monitoring of instrument operating states, and real-time analysis and processing of data. Using shared variables and DataSocket technology in LabVIEW, a network VI system based on the field VI was established. The system can acquire the emission spectrum of nitrogen at the test site, monitor the operating states of field instruments, support real-time communication between the two sites, and analyze data remotely from the network terminal. Using the network VI system, staff at the two sites acquired the same nitrogen emission spectrum and communicated in real time. Comparison with previous results shows that the experimental data obtained with the system are highly precise, implying that the system offers reliable network stability and safety and satisfies the requirements for studying the emission spectrum of pulsed high-voltage discharge in high-precision settings or at network terminals. The proposed architecture is described and should be instructive for many applications, including remote engineering users, specifically in control- and automation-related tasks.
NASA Astrophysics Data System (ADS)
Anitha, J.; Vijila, C. Kezi Selva; Hemanth, D. Jude
2010-02-01
Diabetic retinopathy (DR) is a chronic eye disease for which early detection is highly essential to avoid any fatal results. Image processing of retinal images emerges as a feasible tool for this early diagnosis. Digital image processing techniques involve image classification, which is a significant technique to detect the abnormality in the eye. Various automated classification systems have been developed in recent years, but most of them lack high classification accuracy. Artificial neural networks are the widely preferred artificial intelligence technique since they yield superior results in terms of classification accuracy. In this work, a Radial Basis Function (RBF) neural network based bi-level classification system is proposed to differentiate abnormal DR images and normal retinal images. The results are analyzed in terms of classification accuracy, sensitivity and specificity. A comparative analysis is performed with the results of a probabilistic classifier, namely the Bayesian classifier, to show the superior nature of the neural classifier. Experimental results show promising results for the neural classifier in terms of the performance measures.
A Low Cost Micro-Computer Based Local Area Network for Medical Office and Medical Center Automation
Epstein, Mel H.; Epstein, Lynn H.; Emerson, Ron G.
1984-01-01
A low-cost microcomputer-based local area network for medical office automation is described which makes use of an array of multiple and different personal computers interconnected by a local area network. Each computer on the network functions as a fully capable workstation for data entry and report generation. The network allows each workstation complete access to the entire database. Additionally, designated computers may serve as access ports for remote terminals. Through "Gateways" the network may serve as a front end for a large mainframe, or may interface with another network. The system provides for the medical office environment the expandability and flexibility of a multi-terminal mainframe system at a far lower cost without sacrifice of performance.
Marcon, Luciano; Diego, Xavier; Sharpe, James; Müller, Patrick
2016-01-01
The Turing reaction-diffusion model explains how identical cells can self-organize to form spatial patterns. It has been suggested that extracellular signaling molecules with different diffusion coefficients underlie this model, but the contribution of cell-autonomous signaling components is largely unknown. We developed an automated mathematical analysis to derive a catalog of realistic Turing networks. This analysis reveals that in the presence of cell-autonomous factors, networks can form a pattern with equally diffusing signals and even for any combination of diffusion coefficients. We provide a software (available at http://www.RDNets.com) to explore these networks and to constrain topologies with qualitative and quantitative experimental data. We use the software to examine the self-organizing networks that control embryonic axis specification and digit patterning. Finally, we demonstrate how existing synthetic circuits can be extended with additional feedbacks to form Turing reaction-diffusion systems. Our study offers a new theoretical framework to understand multicellular pattern formation and enables the wide-spread use of mathematical biology to engineer synthetic patterning systems. DOI: http://dx.doi.org/10.7554/eLife.14022.001 PMID:27058171
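The automated screen for Turing networks rests on linear stability analysis: a steady state described by reaction Jacobian J and diffusion matrix D is a Turing system if it is stable for wavenumber q = 0 but unstable for some q > 0. The two-component values below are textbook placeholders, not output of the authors' catalog.

```python
# Illustrative linear stability test for a Turing (diffusion-driven) instability.
import numpy as np

J = np.array([[0.5, -1.0],
              [1.0, -1.5]])          # reaction Jacobian at the steady state
D = np.diag([1.0, 10.0])             # diffusion coefficients

def max_growth(q):
    return np.linalg.eigvals(J - q**2 * D).real.max()

qs = np.linspace(0, 2, 400)
growth = np.array([max_growth(q) for q in qs])
print("stable without diffusion:", max_growth(0) < 0)       # expect True
print("Turing instability:", growth[1:].max() > 0,          # expect True
      "at q =", qs[growth.argmax()])
```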
NASA Astrophysics Data System (ADS)
Stack, J. R.; Guthrie, R. S.; Cramer, M. A.
2009-05-01
The purpose of this paper is to outline the requisite technologies and enabling capabilities for network-centric sensor data analysis within the mine warfare community. The focus includes both automated processing and the traditional human-centric post-mission analysis (PMA) of tactical and environmental sensor data. This is motivated by first examining the high-level network-centric guidance and noting the breakdown in the process of distilling actionable requirements from this guidance. Examples are provided that illustrate the intuitive and substantial capability improvement resulting from processing sensor data jointly in a network-centric fashion. Several candidate technologies are introduced, including the ability to fully process multi-sensor data given only partial overlap in sensor coverage and the ability to incorporate target identification information in stride. Finally, the critical enabling capabilities are outlined, including open architecture, open business, and a concept of operations. This ability to process multi-sensor data in a network-centric fashion is a core enabler of the Navy's vision and will become a necessity with the increasing number of manned and unmanned sensor systems and the requirement for their simultaneous use.
Operations automation using temporal dependency networks
NASA Technical Reports Server (NTRS)
Cooper, Lynne P.
1991-01-01
Precalibration activities for the Deep Space Network are time- and work force-intensive. Significant gains in availability and efficiency could be realized by intelligently incorporating automation techniques. An approach is presented to automation based on the use of Temporal Dependency Networks (TDNs). A TDN represents an activity by breaking it down into its component pieces and formalizing the precedence and other constraints associated with lower level activities. The representations are described which are used to implement a TDN and the underlying system architecture needed to support its use. The commercial applications of this technique are numerous. It has potential for application in any system which requires real-time, system-level control, and accurate monitoring of health, status, and configuration in an asynchronous environment.
Automated Information System (AIS) Alarm System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hunteman, W.
1997-05-01
The Automated Information Alarm System is a joint effort between Los Alamos National Laboratory, Lawrence Livermore National Laboratory, and Sandia National Laboratory to demonstrate and implement, on a small-to-medium sized local area network, an automated system that detects and automatically responds to attacks that use readily available tools and methodologies. The Alarm System will sense or detect, assess, and respond to suspicious activities that may be detrimental to information on the network or to continued operation of the network. The responses will allow stopping, isolating, or ejecting the suspicious activities. The number of sensors, the sensitivity of the sensors, the assessment criteria, and the desired responses may be set by the using organization to meet their local security policies.
Distributed dynamic simulations of networked control and building performance applications.
Yahiaoui, Azzedine
2018-02-01
The use of computer-based automation and control systems for smart sustainable buildings, often so-called Automated Buildings (ABs), has become an effective way to automatically control, optimize, and supervise a wide range of building performance applications over a network while achieving the minimum energy consumption possible, and in doing so generally refers to Building Automation and Control Systems (BACS) architecture. Instead of costly and time-consuming experiments, this paper focuses on using distributed dynamic simulations to analyze the real-time performance of network-based building control systems in ABs and improve the functions of the BACS technology. The paper also presents the development and design of a distributed dynamic simulation environment with the capability of representing the BACS architecture in simulation by run-time coupling two or more different software tools over a network. The application and capability of this new dynamic simulation environment are demonstrated by an experimental design in this paper.
Protecting Networks Via Automated Defense of Cyber Systems
2016-09-01
This report examines an approach called Automated Defense of Cyber Systems, built upon three core technological components: sensors, autonomics, and artificial intelligence. Our conclusion is that automation is the future of cyber defense, and that advances are being made in each of these areas. Subject terms: Internet of Things, autonomics, sensors, artificial intelligence, cyber defense, active cyber defense, automated indicator sharing.
NASA Astrophysics Data System (ADS)
Baars, Holger; Althausen, Dietrich; Engelmann, Ronny; Heese, Birgit; Ansmann, Albert; Wandinger, Ulla; Hofer, Julian; Skupin, Annett; Komppula, Mika; Giannakaki, Eleni; Filioglou, Maria; Bortoli, Daniele; Silva, Ana Maria; Pereira, Sergio; Stachlewska, Iwona S.; Kumala, Wojciech; Szczepanik, Dominika; Amiridis, Vassilis; Marinou, Eleni; Kottas, Michail; Mattis, Ina; Müller, Gerhard
2018-04-01
PollyNET is a network of portable, automated, and continuously measuring Raman polarization lidars of type Polly operated by several institutes worldwide. The data from permanent and temporary measurement sites are automatically processed in terms of optical aerosol profiles and displayed in near-real time at polly.tropos.de. According to current schedules, the network will grow by 3-4 systems during the upcoming 2-3 years and will then comprise 11 permanent stations and 2 mobile platforms.
Space station data system analysis/architecture study. Task 3: Trade studies, DR-5, volume 1
NASA Technical Reports Server (NTRS)
1985-01-01
The primary objective of Task 3 is to provide additional analysis and insight necessary to support key design/programmatic decision for options quantification and selection for system definition. This includes: (1) the identification of key trade study topics; (2) the definition of a trade study procedure for each topic (issues to be resolved, key inputs, criteria/weighting, methodology); (3) conduct tradeoff and sensitivity analysis; and (4) the review/verification of results within the context of evolving system design and definition. The trade study topics addressed in this volume include space autonomy and function automation, software transportability, system network topology, communications standardization, onboard local area networking, distributed operating system, software configuration management, and the software development environment facility.
CIAN - Cell Imaging and Analysis Network at the Biology Department of McGill University
Lacoste, J.; Lesage, G.; Bunnell, S.; Han, H.; Küster-Schöck, E.
2010-01-01
CF-31 The Cell Imaging and Analysis Network (CIAN) provides services and tools to researchers in the field of cell biology from within or outside Montreal's McGill University community. CIAN is composed of six scientific platforms: Cell Imaging (confocal and fluorescence microscopy), Proteomics (2-D protein gel electrophoresis and DiGE, fluorescent protein analysis), Automation and High throughput screening (Pinning robot and liquid handler), Protein Expression for Antibody Production, Genomics (real-time PCR), and Data storage and analysis (cluster, server, and workstations). Users submit project proposals, and can obtain training and consultation in any aspect of the facility, or initiate projects with the full-service platforms. CIAN is designed to facilitate training, enhance interactions, as well as share and maintain resources and expertise.
A Baseline for the Multivariate Comparison of Resting-State Networks
Allen, Elena A.; Erhardt, Erik B.; Damaraju, Eswar; Gruner, William; Segall, Judith M.; Silva, Rogers F.; Havlicek, Martin; Rachakonda, Srinivas; Fries, Jill; Kalyanam, Ravi; Michael, Andrew M.; Caprihan, Arvind; Turner, Jessica A.; Eichele, Tom; Adelsheim, Steven; Bryan, Angela D.; Bustillo, Juan; Clark, Vincent P.; Feldstein Ewing, Sarah W.; Filbey, Francesca; Ford, Corey C.; Hutchison, Kent; Jung, Rex E.; Kiehl, Kent A.; Kodituwakku, Piyadasa; Komesu, Yuko M.; Mayer, Andrew R.; Pearlson, Godfrey D.; Phillips, John P.; Sadek, Joseph R.; Stevens, Michael; Teuscher, Ursina; Thoma, Robert J.; Calhoun, Vince D.
2011-01-01
As the size of functional and structural MRI datasets expands, it becomes increasingly important to establish a baseline from which diagnostic relevance may be determined, a processing strategy that efficiently prepares data for analysis, and a statistical approach that identifies important effects in a manner that is both robust and reproducible. In this paper, we introduce a multivariate analytic approach that optimizes sensitivity and reduces unnecessary testing. We demonstrate the utility of this mega-analytic approach by identifying the effects of age and gender on the resting-state networks (RSNs) of 603 healthy adolescents and adults (mean age: 23.4 years, range: 12–71 years). Data were collected on the same scanner, preprocessed using an automated analysis pipeline based in SPM, and studied using group independent component analysis. RSNs were identified and evaluated in terms of three primary outcome measures: time course spectral power, spatial map intensity, and functional network connectivity. Results revealed robust effects of age on all three outcome measures, largely indicating decreases in network coherence and connectivity with increasing age. Gender effects were of smaller magnitude but suggested stronger intra-network connectivity in females and more inter-network connectivity in males, particularly with regard to sensorimotor networks. These findings, along with the analysis approach and statistical framework described here, provide a useful baseline for future investigations of brain networks in health and disease. PMID:21442040
Decision support systems and methods for complex networks
Huang, Zhenyu [Richland, WA; Wong, Pak Chung [Richland, WA; Ma, Jian [Richland, WA; Mackey, Patrick S [Richland, WA; Chen, Yousu [Richland, WA; Schneider, Kevin P [Seattle, WA
2012-02-28
Methods and systems for automated decision support in analyzing operation data from a complex network. Embodiments of the present invention utilize these algorithms and techniques not only to characterize the past and present condition of a complex network, but also to predict future conditions to help operators anticipate deteriorating and/or problem situations. In particular, embodiments of the present invention characterize network conditions from operation data using a state estimator. Contingency scenarios can then be generated based on those network conditions. For at least a portion of all of the contingency scenarios, risk indices are determined that describe the potential impact of each of those scenarios. Contingency scenarios with risk indices are presented visually as graphical representations in the context of a visual representation of the complex network. Analysis of the historical risk indices based on the graphical representations can then provide trends that allow for prediction of future network conditions.
Theodosiou, Theodosios; Efstathiou, Georgios; Papanikolaou, Nikolas; Kyrpides, Nikos C; Bagos, Pantelis G; Iliopoulos, Ioannis; Pavlopoulos, Georgios A
2017-07-14
Due to technological advances in high-throughput techniques, Systems Biology has seen tremendous growth in data generation. Network analysis, which examines biological systems at a higher level to better understand a system, its topology and the relationships between its components, is of great importance. Gene expression, signal transduction, protein/chemical interactions, and biomedical literature co-occurrences are a few of the examples captured in biological network representations, where nodes represent certain bioentities and edges represent the connections between them. Today, many tools for network visualization and analysis are available. Nevertheless, most of them are standalone applications that often (i) burden users with computing and calculation time depending on the network's size and (ii) focus on handling, editing and exploring a network interactively. While such functionality is of great importance, limited efforts have been made towards the comparison of the topological analysis of multiple networks. Network Analysis Provider (NAP) is a comprehensive web tool to automate network profiling and intra/inter-network topology comparison. It is designed to bridge the gap between network analysis, statistics, graph theory and partially visualization in a user-friendly way. It is freely available and aims to become a very appealing tool for the broader community. It hosts a great plethora of topological analysis methods such as node and edge rankings. A few of its powerful characteristics are: its ability to enable easy profile comparisons across multiple networks, find their intersection and provide users with simplified, high quality plots of any of the offered topological characteristics against any other within the same network. It is written in R and Shiny, it is based on the igraph library and it is able to handle medium-scale weighted/unweighted, directed/undirected and bipartite graphs. NAP is available at http://bioinformatics.med.uoc.gr/NAP .
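NAP itself is implemented in R and Shiny on top of igraph; purely as an illustration of the intra/inter-network profiling it automates, the following minimal Python sketch (using networkx, with hypothetical stand-in graphs and an arbitrary choice of metrics) computes a small topological profile per network and an edge intersection across networks.

```python
# Minimal sketch of cross-network topology profiling in the spirit of NAP.
# NAP itself is R/Shiny on igraph; networkx is used here only for
# illustration, and the two example graphs are hypothetical stand-ins.
import networkx as nx

def topology_profile(g):
    """Compute a small profile of topological metrics for one network."""
    return {
        "nodes": g.number_of_nodes(),
        "edges": g.number_of_edges(),
        "density": nx.density(g),
        "clustering": nx.average_clustering(g),
        "degree_top5": sorted(dict(g.degree()).values(), reverse=True)[:5],
    }

g1 = nx.erdos_renyi_graph(100, 0.05, seed=1)    # stand-ins for two loaded networks
g2 = nx.barabasi_albert_graph(100, 3, seed=1)

for name, g in [("net1", g1), ("net2", g2)]:
    print(name, topology_profile(g))

# Intersection of the two networks' edges, as in NAP's multi-network comparison.
common = nx.intersection(g1, g2)
print("shared edges:", common.number_of_edges())
```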
Automated analysis of Physarum network structure and dynamics
NASA Astrophysics Data System (ADS)
Fricker, Mark D.; Akita, Dai; Heaton, Luke LM; Jones, Nick; Obara, Boguslaw; Nakagaki, Toshiyuki
2017-06-01
We evaluate different ridge-enhancement and segmentation methods to automatically extract the network architecture from time-series of Physarum plasmodia withdrawing from an arena via a single exit. Whilst all methods gave reasonable results, judged by precision-recall analysis against a ground-truth skeleton, the mean phase angle (Feature Type) from intensity-independent, phase-congruency edge enhancement and watershed segmentation was the most robust to variation in threshold parameters. The resultant single pixel-wide segmented skeleton was converted to a graph representation as a set of weighted adjacency matrices containing the physical dimensions of each vein, and the inter-vein regions. We encapsulate the complete image processing and network analysis pipeline in a downloadable software package, and provide an extensive set of metrics that characterise the network structure, including hierarchical loop decomposition to analyse the nested structure of the developing network. In addition, the change in volume for each vein and intervening plasmodial sheet was used to predict the net flow across the network. The scaling relationships between predicted current, speed and shear force with vein radius were consistent with predictions from Murray’s law. This work was presented at PhysNet 2015.
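The paper ships its own downloadable image-processing and network-analysis package; the sketch below only illustrates, under simplifying assumptions, the step of converting a single-pixel-wide segmented skeleton into a graph representation whose edges could later carry vein dimensions. The synthetic mask stands in for a real segmented plasmodium.

```python
# Sketch: turn a segmented, single-pixel-wide skeleton into a graph,
# loosely following the pipeline (segmentation -> skeleton -> weighted
# adjacency). The binary test image is synthetic; the real pipeline uses
# phase-congruency enhancement and watershed segmentation.
import numpy as np
import networkx as nx
from skimage.morphology import skeletonize

mask = np.zeros((64, 64), dtype=bool)
mask[32, 5:60] = True          # a horizontal "vein"
mask[10:55, 32] = True         # a crossing vertical "vein"
skel = skeletonize(mask)

g = nx.Graph()
ys, xs = np.nonzero(skel)
pixels = set(zip(ys.tolist(), xs.tolist()))
for (y, x) in pixels:
    # Link each skeleton pixel to its 8-connected skeleton neighbours.
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            if (dy, dx) != (0, 0) and (y + dy, x + dx) in pixels:
                g.add_edge((y, x), (y + dy, x + dx), weight=np.hypot(dy, dx))

print(g.number_of_nodes(), "skeleton pixels,", g.number_of_edges(), "links")
```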
McKendrick, Ryan; Shaw, Tyler; de Visser, Ewart; Saqer, Haneen; Kidwell, Brian; Parasuraman, Raja
2014-05-01
Assess team performance within a networked supervisory control setting while manipulating automated decision aids and monitoring team communication and working memory ability. Networked systems such as multi-unmanned air vehicle (UAV) supervision have complex properties that make prediction of human-system performance difficult. Automated decision aids can provide valuable information to operators, individual abilities can limit or facilitate team performance, and team communication patterns can alter how effectively individuals work together. We hypothesized that reliable automation, higher working memory capacity, and increased communication rates of task-relevant information would offset performance decrements attributed to high task load. Two-person teams performed a simulated air defense task with two levels of task load and three levels of automated aid reliability. Teams communicated and received decision aid messages via chat window text messages. Task Load × Automation effects were significant across all performance measures. Reliable automation limited the decline in team performance with increasing task load. Average team spatial working memory was a stronger predictor than other measures of team working memory. Frequency of team rapport and enemy location communications was positively related to team performance, and word count was negatively related to team performance. Reliable decision aiding mitigated team performance decline under increased task load during multi-UAV supervisory control. Team spatial working memory, communication of spatial information, and team rapport predicted team success. An automated decision aid can improve team performance under high task load. Assessment of spatial working memory and the communication of task-relevant information can help in operator and team selection in supervisory control systems.
ERIC Educational Resources Information Center
Borgman, Christine L.
1996-01-01
Reports on a survey of 70 research libraries in Croatia, Czech Republic, Hungary, Poland, Slovakia, and Slovenia. Results show that libraries are rapidly acquiring automated processing systems, CD-ROM databases, and connections to computer networks. Discusses specific data on system implementation and network services by country and by type of…
The Development of Design Tools for Fault Tolerant Quantum Dot Cellular Automata Based Logic
NASA Technical Reports Server (NTRS)
Armstrong, Curtis D.; Humphreys, William M.
2003-01-01
We are developing software to explore the fault tolerance of quantum dot cellular automata gate architectures in the presence of manufacturing variations and device defects. The Topology Optimization Methodology using Applied Statistics (TOMAS) framework extends the capabilities of the A Quantum Interconnected Network Array Simulator (AQUINAS) by adding front-end and back-end software and creating an environment that integrates all of these components. The front-end tools establish all simulation parameters, configure the simulation system, automate the Monte Carlo generation of simulation files, and execute the simulation of these files. The back-end tools perform automated data parsing, statistical analysis and report generation.
NASA Astrophysics Data System (ADS)
Nuzhnaya, Tatyana; Bakic, Predrag; Kontos, Despina; Megalooikonomou, Vasileios; Ling, Haibin
2012-02-01
This work is part of our ongoing study aimed at understanding the relation between the topology of anatomical branching structures and the underlying image texture. Morphological variability of the breast ductal network is associated with subsequent development of abnormalities in patients with nipple discharge, such as papilloma, breast cancer and atypia. In this work, we investigate complex dependence among ductal components to perform segmentation, the first step for analyzing the topology of ductal lobes. Our automated framework is based on incorporating a conditional random field with texture descriptors of skewness, coarseness, contrast, energy and fractal dimension. These features are selected to capture the architectural variability of the enhanced ducts by encoding spatial variations between pixel patches in galactographic images. The segmentation algorithm was applied to a dataset of 20 x-ray galactograms obtained at the Hospital of the University of Pennsylvania. We compared the performance of the proposed approach with fully and semi-automated segmentation algorithms based on neural network classification, fuzzy-connectedness, vesselness filter and graph cuts. Global consistency error and confusion matrix analysis were used as accuracy measurements. For the proposed approach, the true positive rate was higher and the false negative rate was significantly lower compared to other fully automated methods. This indicates that segmentation based on CRF incorporated with texture descriptors has potential to efficiently support the analysis of complex topology of the ducts and aid in development of realistic breast anatomy phantoms.
Using deep neural networks to augment NIF post-shot analysis
NASA Astrophysics Data System (ADS)
Humbird, Kelli; Peterson, Luc; McClarren, Ryan; Field, John; Gaffney, Jim; Kruse, Michael; Nora, Ryan; Spears, Brian
2017-10-01
Post-shot analysis of National Ignition Facility (NIF) experiments is the process of determining which simulation inputs yield results consistent with experimental observations. This analysis is typically accomplished by running suites of manually adjusted simulations, or Monte Carlo sampling surrogate models that approximate the response surfaces of the physics code. These approaches are expensive and often find simulations that match only a small subset of observables simultaneously. We demonstrate an alternative method for performing post-shot analysis using inverse models, which map directly from experimental observables to simulation inputs with quantified uncertainties. The models are created using a novel machine learning algorithm which automates the construction and initialization of deep neural networks to optimize predictive accuracy. We show how these neural networks, trained on large databases of post-shot simulations, can rigorously quantify the agreement between simulation and experiment. This work performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bobyshev, A.; Lamore, D.; Demar, P.
2004-12-01
In a large campus network, such as Fermilab's, with tens of thousands of nodes, scanning initiated from either outside of or within the campus network raises security concerns. This scanning may have a very serious impact on network performance, and even disrupt normal operation of many services. In this paper we introduce a system for detecting and automatically blocking excessive traffic from different kinds of scanning, DoS attacks, and virus-infected computers. The system, called AutoBlocker, is a distributed computing system based on quasi-real-time analysis of network flow data collected from the border router and core switches. AutoBlocker also has an interface to accept alerts from IDS systems (e.g. BRO, SNORT) that are based on other technologies. The system has multiple configurable alert levels for the detection of anomalous behavior and configurable trigger criteria for automated blocking of scans at the core or border routers. It has been in use at Fermilab for about 2 years, and has become a very valuable tool to curtail scan activity within the Fermilab campus network.
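AutoBlocker's internals are not spelled out in the abstract; as a hedged illustration of the underlying idea of flagging scan-like behavior from flow data, here is a toy Python sketch in which the record format, window handling, and fan-out threshold are all hypothetical.

```python
# Illustrative sketch of flow-based scan detection: count distinct
# destination hosts contacted per source within one analysis window and
# flag sources that exceed a configurable threshold. The real system
# works on router/switch flow data and triggers automated blocking.
from collections import defaultdict

FLOWS = [                       # hypothetical (src, dst) pairs from one window
    ("10.0.0.5", f"10.0.1.{i}") for i in range(200)
] + [("10.0.0.7", "10.0.1.1"), ("10.0.0.7", "10.0.1.2")]

FANOUT_THRESHOLD = 100          # distinct destinations per window (placeholder)

contacts = defaultdict(set)
for src, dst in FLOWS:
    contacts[src].add(dst)

for src, dsts in contacts.items():
    if len(dsts) > FANOUT_THRESHOLD:
        print(f"ALERT: {src} contacted {len(dsts)} hosts -> candidate block")
```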
Automated Meteor Detection by All-Sky Digital Camera Systems
NASA Astrophysics Data System (ADS)
Suk, Tomáš; Šimberová, Stanislava
2017-12-01
We have developed a set of methods to detect meteor light traces captured by all-sky CCD cameras. Operating at small automatic observatories (stations), these cameras create a network spread over a large territory. Image data coming from these stations are merged in one central node. Since a vast amount of data is collected by the stations in a single night, robotic storage and analysis are essential to processing. The proposed methodology is adapted to data from a network of automatic stations equipped with digital fish-eye cameras and includes data capturing, preparation, pre-processing, analysis, and finally recognition of objects in time sequences. In our experiments we utilized real observed data from two stations.
SISSY: An example of a multi-threaded, networked, object-oriented databased application
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scipioni, B.; Liu, D.; Song, T.
1993-05-01
The Systems Integration Support SYstem (SISSY) is presented and its capabilities and techniques are discussed. It is a fully automated data collection and analysis system supporting the SSCL's systems analysis activities as they relate to the Physics Detector and Simulation Facility (PDSF). SISSY itself is a paradigm of effective computing on the PDSF. It uses home-grown code (C++), network programming (RPC, SNMP), relational (SYBASE) and object-oriented (ObjectStore) DBMSs, UNIX operating system services (IRIX threads, cron, system utilities, shell scripts, etc.), and third party software applications (NetCentral Station, Wingz, DataLink), all of which act together as a single application to monitor and analyze the PDSF.
Li, Tim M H; Chau, Michael; Wong, Paul W C; Lai, Eliza S Y; Yip, Paul S F
2013-05-15
Internet-based learning programs provide people with massive amounts of health care information and self-help guidelines on improving their health. The advent of Web 2.0 and social networks provides significant flexibility for embedding highly interactive components, such as games, to foster learning processes. The effectiveness of game-based learning on social networks has not yet been fully evaluated. The aim of this study was to assess the effectiveness of a fully automated, Web-based, social network electronic game on enhancing mental health knowledge and problem-solving skills of young people. We investigated potential motivational constructs directly affecting the learning outcome. Gender differences in learning outcome and motivation were also examined. A pre/posttest design was used to evaluate the fully automated Web-based intervention. Participants, recruited from a closed online user group, self-assessed their mental health literacy and motivational constructs before and after completing the game within a 3-week period. The electronic game was designed according to cognitive-behavioral approaches. Completers and intent-to-treat analyses, using multiple imputation for missing data, were performed. Regression analysis with backward selection was employed when examining the relationship between knowledge enhancement and motivational constructs. The sample included 73 undergraduates (42 females) for completers analysis. The gaming approach was effective in enhancing young people's mental health literacy (d=0.65). The finding was also consistent with the intent-to-treat analysis, which included 127 undergraduates (75 females). No gender differences were found in learning outcome (P=.97). Intrinsic goal orientation was the primary factor in learning motivation, whereas test anxiety was successfully alleviated in the game setting. No gender differences were found on any learning motivation subscales (P>.10). We also found that participants' self-efficacy for learning and performance, as well as test anxiety, significantly affected their learning outcomes, whereas other motivational subscales were statistically nonsignificant. Electronic games implemented through social networking sites appear to effectively enhance users' mental health literacy.
Post-processing of seismic parameter data based on valid seismic event determination
McEvilly, Thomas V.
1985-01-01
An automated seismic processing system and method are disclosed, including an array of CMOS microprocessors for unattended battery-powered processing of a multi-station network. According to a characterizing feature of the invention, each channel of the network is independently operable to automatically detect, measure times and amplitudes, and compute and fit Fast Fourier transforms (FFT's) for both P- and S- waves on analog seismic data after it has been sampled at a given rate. The measured parameter data from each channel are then reviewed for event validity by a central controlling microprocessor and if determined by preset criteria to constitute a valid event, the parameter data are passed to an analysis computer for calculation of hypocenter location, running b-values, source parameters, event count, P- wave polarities, moment-tensor inversion, and Vp/Vs ratios. The in-field real-time analysis of data maximizes the efficiency of microearthquake surveys allowing flexibility in experimental procedures, with a minimum of traditional labor-intensive postprocessing. A unique consequence of the system is that none of the original data (i.e., the sensor analog output signals) are necessarily saved after computation, but rather, the numerical parameters generated by the automatic analysis are the sole output of the automated seismic processor.
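The patent's exact detector and parameter fits are not reproduced in the abstract; the following Python sketch illustrates the per-channel flavor of the processing with a standard STA/LTA trigger as a stand-in detector, followed by an FFT of the triggered window. The sample rate, thresholds, and synthetic trace are assumptions.

```python
# Sketch of per-channel processing: detect an event on sampled data and
# compute its FFT. A classic STA/LTA trigger is used here purely for
# illustration; the patented detector is not reproduced.
import numpy as np

fs = 100.0                                  # assumed sample rate (Hz)
t = np.arange(0, 30, 1 / fs)
trace = 0.1 * np.random.randn(t.size)       # synthetic noise
trace[1500:1700] += np.sin(2 * np.pi * 8 * t[1500:1700]) * np.hanning(200)

def sta_lta(x, n_sta=50, n_lta=500):
    """Ratio of short-term to long-term average energy."""
    e = x ** 2
    sta = np.convolve(e, np.ones(n_sta) / n_sta, mode="same")
    lta = np.convolve(e, np.ones(n_lta) / n_lta, mode="same") + 1e-12
    return sta / lta

ratio = sta_lta(trace)
onset = int(np.argmax(ratio > 5.0))         # first trigger above threshold
window = trace[onset:onset + 256]
spectrum = np.abs(np.fft.rfft(window))      # amplitude spectrum of the event
print("trigger at", onset / fs, "s; dominant frequency:",
      np.fft.rfftfreq(256, 1 / fs)[np.argmax(spectrum)], "Hz")
```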
NASA Technical Reports Server (NTRS)
Broderick, Ron
1997-01-01
The ultimate goal of this report was to integrate the powerful tools of artificial intelligence into the traditional process of software development. To maintain the US aerospace competitive advantage, traditional aerospace and software engineers need to more easily incorporate the technology of artificial intelligence into the advanced aerospace systems being designed today. The future goal was to transition artificial intelligence from an emerging technology to a standard technology that is considered early in the life cycle process to develop state-of-the-art aircraft automation systems. This report addressed the future goal in two ways. First, it provided a matrix that identified typical aircraft automation applications conducive to various artificial intelligence methods. The purpose of this matrix was to provide top-level guidance to managers contemplating the possible use of artificial intelligence in the development of aircraft automation. Second, the report provided a methodology to formally evaluate neural networks as part of the traditional process of software development. The matrix was developed by organizing the discipline of artificial intelligence into the following six methods: logical, object representation-based, distributed, uncertainty management, temporal and neurocomputing. Next, a study of existing aircraft automation applications that have been conducive to artificial intelligence implementation resulted in the following five categories: pilot-vehicle interface, system status and diagnosis, situation assessment, automatic flight planning, and aircraft flight control. The resulting matrix provided management guidance to understand artificial intelligence as it applied to aircraft automation. The approach taken to develop a methodology to formally evaluate neural networks as part of the software engineering life cycle was to start with the existing software quality assurance standards and to change these standards to include neural network development. The changes were to include evaluation tools that can be applied to neural networks at each phase of the software engineering life cycle. The result was a formal evaluation approach to increase the product quality of systems that use neural networks for their implementation.
Contemporary data communications and local networking principles
NASA Astrophysics Data System (ADS)
Chartrand, G. A.
1982-08-01
The most important issue in data communications today is networking, which can be roughly divided into two categories: local networking and distributed processing. The most sought after aspect of local networking is office automation. Office automation really is the grand unification of all local communications and not a new type of business office as the name might imply. This unification is the ability to have voice, data, and video carried by the same medium and managed by the same network resources. There are many different ways this unification can be done, and many manufacturers are designing systems to accomplish the task. Distributed processing attempts to share resources between computer systems and peripheral subsystems from the same or different manufacturers. There are several companies that are trying to solve both networking problems with the same network architecture.
CerebralWeb: a Cytoscape.js plug-in to visualize networks stratified by subcellular localization.
Frias, Silvia; Bryan, Kenneth; Brinkman, Fiona S L; Lynn, David J
2015-01-01
CerebralWeb is a light-weight JavaScript plug-in that extends Cytoscape.js to enable fast and interactive visualization of molecular interaction networks stratified based on subcellular localization or other user-supplied annotation. The application is designed to be easily integrated into any website and is configurable to support customized network visualization. CerebralWeb also supports the automatic retrieval of Cerebral-compatible localizations for human, mouse and bovine genes via a web service and enables the automated parsing of Cytoscape compatible XGMML network files. CerebralWeb currently supports embedded network visualization on the InnateDB (www.innatedb.com) and Allergy and Asthma Portal (allergen.innatedb.com) database and analysis resources. Database tool URL: http://www.innatedb.com/CerebralWeb © The Author(s) 2015. Published by Oxford University Press.
GIS as a tool for efficient management of transport streams
NASA Astrophysics Data System (ADS)
Zatserkovnyi, V. I.; Kobrin, O. V.
2015-10-01
The transport network, which is an ideal object for automation and efficiency improvement using geographic information systems (GIS), is considered. Transport problems, for which many mathematical models of traffic flow exist, are enumerated. GIS analysis tools that allow one to build optimal routes in the real road network, with its capabilities and limitations, are presented. They can help address an extremely important problem of modern Ukraine: the rapid increase in the number of cars and the resulting glut of vehicles on the road network. Intelligent transport systems, which are created and developed on the basis of GPS, GIS, and modern communications and telecommunications facilities, are considered.
Bluetooth Low Power Modes Applied to the Data Transportation Network in Home Automation Systems.
Etxaniz, Josu; Aranguren, Gerardo
2017-04-30
Even though home automation is a well-known research and development area, recent technological improvements in different areas such as context recognition, sensing, wireless communications or embedded systems have boosted wireless smart homes. This paper focuses on some of those areas related to home automation. The paper draws attention to wireless communications issues on embedded systems. Specifically, the paper discusses multi-hop networking together with Bluetooth technology and latency, as a quality of service (QoS) metric. Bluetooth is a worldwide standard that provides low power multi-hop networking. It is a radio license free technology and establishes point-to-point and point-to-multipoint links, known as piconets, or multi-hop networks, known as scatternets. This way, many Bluetooth nodes can be interconnected to deploy ambient intelligent networks. This paper introduces the research on multi-hop latency done with park and sniff low power modes of Bluetooth over the test platform developed. Besides, an empirical model is obtained to calculate the latency of Bluetooth multi-hop communications over asynchronous links when links in scatternets are always in sniff or park mode. Designers of smart home devices and networks can take advantage of these models and the delay estimates they provide for communications along Bluetooth multi-hop networks.
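The empirical model's coefficients are not given in the abstract, so the sketch below only shows the generic shape such a latency model plausibly takes: a fixed overhead plus a per-hop cost tied to the sniff/park interval. Every number in it is a placeholder, not the paper's fit.

```python
# Placeholder shape of a multi-hop Bluetooth latency model: per-hop cost
# dominated by the low-power-mode interval, plus a fixed overhead.
# Coefficients are illustrative, not the paper's empirical values.
def multihop_latency_ms(hops, interval_ms=50.0, overhead_ms=10.0):
    """Estimated latency for a message crossing `hops` sniff-mode links."""
    return overhead_ms + hops * interval_ms

for n in (1, 2, 4, 8):
    print(f"{n} hops -> ~{multihop_latency_ms(n):.0f} ms (placeholder model)")
```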
Technology Infusion of CodeSonar into the Space Network Ground Segment (RII07)
NASA Technical Reports Server (NTRS)
Benson, Markland
2008-01-01
The NASA Software Assurance Research Program (in part) performs studies as to the feasibility of technologies for improving the safety, quality, reliability, cost, and performance of NASA software. This study considers the application of commercial automated source code analysis tools to mission critical ground software that is in the operations and sustainment portion of the product lifecycle.
ERIC Educational Resources Information Center
Malybaev, Saken K.; Malaybaev, Nurlan S.; Isina, Botakoz M.; Kenzhekeeva, Akbope R.; Khuangan, Nurbol
2016-01-01
The article presents the results of researches aimed at the creation of automated workplaces for railway transport specialists with the help of intelligent information systems. The analysis of tendencies of information technologies development in the transport network was conducted. It was determined that the most effective approach is to create…
Microvessel prediction in H&E Stained Pathology Images using fully convolutional neural networks.
Yi, Faliu; Yang, Lin; Wang, Shidan; Guo, Lei; Huang, Chenglong; Xie, Yang; Xiao, Guanghua
2018-02-27
Pathological angiogenesis has been identified in many malignancies as a potential prognostic factor and target for therapy. In most cases, angiogenic analysis is based on the measurement of microvessel density (MVD) detected by immunostaining of CD31 or CD34. However, most retrievable public data is generally composed of Hematoxylin and Eosin (H&E)-stained pathology images, for which it is difficult to get the corresponding immunohistochemistry images. The role of microvessels in H&E stained images has not been widely studied due to their complexity and heterogeneity. Furthermore, identifying microvessels manually for study is a labor-intensive task for pathologists, with high inter- and intra-observer variation. Therefore, it is important to develop automated microvessel-detection algorithms in H&E stained pathology images for clinical association analysis. In this paper, we propose a microvessel prediction method using fully convolutional neural networks. The feasibility of our proposed algorithm is demonstrated through experimental results on H&E stained images. Furthermore, the identified microvessel features were significantly associated with patient clinical outcomes. This is the first study to develop an algorithm for automated microvessel detection in H&E stained pathology images.
Cortina, Montserrat; Ecker, Christina; Calvo, Daniel; del Valle, Manuel
2008-01-22
An automated electronic tongue consisting of an array of potentiometric sensors and an artificial neural network (ANN) has been developed to resolve mixtures of anionic surfactants. The sensor array was formed by five different flow-through sensors for anionic surfactants, based on poly(vinyl chloride) membranes having cross-sensitivity features. Feedforward multilayer neural networks were used to predict surfactant concentrations. As a great amount of information is required for the correct modelling of the sensors' response, a sequential injection analysis (SIA) system was used to provide it automatically. Dodecylsulfate (DS(-)), dodecylbenzenesulfonate (DBS(-)) and alpha-alkene sulfonate (ALF(-)) formed the three-analyte study case resolved in this work. Their concentrations varied from 0.2 to 4 mM for ALF(-) and DBS(-) and from 0.2 to 5 mM for DS(-). Good prediction ability was obtained, with correlation coefficients better than 0.933 when the obtained values were compared with those expected for a set of 16 external test samples not used for training.
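As a hedged sketch of the calibration idea (a feedforward network mapping five cross-sensitive potentiometric readings to three concentrations), the following example trains a small multilayer perceptron on synthetic stand-in data; the response model and network size are assumptions, not the paper's.

```python
# Sketch of the calibration step: a feedforward net mapping five-sensor
# potentiometric readings to three surfactant concentrations. Data are
# synthetic stand-ins for the SIA-generated training set.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
conc = rng.uniform(0.2, 5.0, size=(200, 3))         # DS-, DBS-, ALF- (mM)
mixing = rng.uniform(0.5, 1.5, size=(3, 5))         # fake cross-sensitivities
potentials = np.log(conc) @ mixing + 0.01 * rng.standard_normal((200, 5))

model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
model.fit(potentials[:160], conc[:160])             # train on 160 mixtures
print("R^2 on held-out mixtures:", model.score(potentials[160:], conc[160:]))
```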
NASA Astrophysics Data System (ADS)
Okumura, Hiroshi; Suezaki, Masashi; Sueyasu, Hideki; Arai, Kohei
2003-03-01
An automated method that can select corresponding point candidates is developed. This method has the following three features: 1) employment of the RIN-net for corresponding point candidate selection; 2) employment of multi-resolution analysis with Haar wavelet transformation for improvement of selection accuracy and noise tolerance; 3) employment of context information about corresponding point candidates for screening of selected candidates. Here, 'RIN-net' refers to a back-propagation-trained, feed-forward, 3-layer artificial neural network that takes rotation invariants as input data. In our system, pseudo-Zernike moments are employed as the rotation invariants. The RIN-net has an N x N pixel field of view (FOV). Some experiments are conducted to evaluate the corresponding point candidate selection capability of the proposed method by using various kinds of remotely sensed images. The experimental results show the proposed method achieves fewer training patterns, less training time, and higher selection accuracy than the conventional method.
NASA Astrophysics Data System (ADS)
Hiramatsu, Yuya; Muramatsu, Chisako; Kobayashi, Hironobu; Hara, Takeshi; Fujita, Hiroshi
2017-03-01
Breast cancer screening with mammography and ultrasonography is expected to improve sensitivity compared with mammography alone, especially for women with dense breasts. An automated breast volume scanner (ABVS) provides operator-independent whole-breast data which facilitate double reading and comparison with past exams, the contralateral breast, and multimodality images. However, large volumetric data in screening practice increase radiologists' workload. Therefore, our goal is to develop a computer-aided detection scheme of breast masses in ABVS data for assisting radiologists' diagnosis and comparison with mammographic findings. In this study, a false positive (FP) reduction scheme using a deep convolutional neural network (DCNN) was investigated. For training the DCNN, true positive and FP samples were obtained from the result of our initial mass detection scheme using the vector convergence filter. Regions of interest including the detected regions were extracted from the multiplanar reconstruction slices. We investigated methods to select effective FP samples for training the DCNN. Based on free-response receiver operating characteristic analysis, simple random sampling from the entire candidates was most effective in this study. Using the DCNN, the number of FPs could be reduced by 60%, while retaining 90% of true masses. The result indicates the potential usefulness of DCNN for FP reduction in automated mass detection on ABVS images.
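A minimal sketch of the FP-reduction stage follows, assuming a small binary CNN over candidate ROIs; the architecture, ROI size, and data are illustrative and not those of the paper.

```python
# Minimal sketch of FP reduction: a small CNN classifies candidate ROIs
# (from an initial detector) as mass vs. false positive. Architecture
# and ROI size are illustrative only.
import numpy as np
from tensorflow import keras

rois = np.random.rand(32, 64, 64, 1).astype("float32")   # fake ROI batch
labels = np.random.randint(0, 2, size=(32,))             # 1 = true mass

model = keras.Sequential([
    keras.layers.Conv2D(16, 3, activation="relu", input_shape=(64, 64, 1)),
    keras.layers.MaxPooling2D(),
    keras.layers.Conv2D(32, 3, activation="relu"),
    keras.layers.MaxPooling2D(),
    keras.layers.Flatten(),
    keras.layers.Dense(1, activation="sigmoid"),          # P(true mass)
])
model.compile(optimizer="adam", loss="binary_crossentropy")
model.fit(rois, labels, epochs=1, verbose=0)

# Candidates whose predicted probability falls below a chosen operating
# point are discarded, trading FP rate against retained sensitivity.
print(model.predict(rois[:4], verbose=0).ravel())
```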
Pore network quantification of sandstones under experimental CO2 injection using image analysis
NASA Astrophysics Data System (ADS)
Berrezueta, Edgar; González-Menéndez, Luís; Ordóñez-Casado, Berta; Olaya, Peter
2015-04-01
Automated image identification and quantification of minerals, pores and textures, together with petrographic analysis, can be applied to improve pore system characterization in sedimentary rocks. Our case study is focused on the application of these techniques to study the evolution of the rock pore network subjected to supercritical CO2 injection. We have proposed a Digital Image Analysis (DIA) protocol that guarantees measurement reproducibility and reliability. This can be summarized in the following stages: (i) detailed description of mineralogy and texture (before and after CO2 injection) by optical and scanning electron microscopy (SEM) techniques using thin sections; (ii) adjustment and calibration of DIA tools; (iii) data acquisition protocol based on image capture with different polarization conditions (synchronized movement of polarizers); (iv) study and quantification by DIA that allow (a) identification and isolation of pixels that belong to the same category: minerals vs. pores in each sample, and (b) measurement of changes in the pore network after the samples have been exposed to new conditions (in our case: SC-CO2 injection). Finally, the petrography and the measured data were interpreted using an automated approach. In our applied study, the DIA results highlight the changes observed by SEM and microscopic techniques, which consisted of a porosity increase when CO2 treatment occurred. Other additional changes were minor: variations in the roughness and roundness of pore edges, and in pore aspect ratio, shown in the bigger pore population. Additionally, statistical tests of the measured pore parameters were applied to verify that the differences observed between samples before and after CO2 injection were significant.
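A minimal sketch of the pixel-classification and quantification step, assuming a simple Otsu threshold separates pores from minerals; the real protocol uses calibrated multi-polarization captures, and the images here are synthetic.

```python
# Sketch of DIA quantification: classify pixels as pore vs. mineral by
# thresholding and report the porosity fraction before and after
# treatment. Images are synthetic placeholders for thin-section captures.
import numpy as np
from skimage.filters import threshold_otsu

def porosity(img):
    """Fraction of pixels darker than the Otsu threshold (assumed pores)."""
    t = threshold_otsu(img)
    return float((img < t).mean())

rng = np.random.default_rng(1)
before = rng.normal(0.6, 0.1, (256, 256)).clip(0, 1)
before[60:90, 60:90] = 0.1                     # a pore region
after = before.copy()
after[60:100, 60:100] = 0.1                    # enlarged after CO2 injection

print(f"porosity before: {porosity(before):.3f}, after: {porosity(after):.3f}")
```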
NASA Astrophysics Data System (ADS)
Lösler, Michael; Haas, Rüdiger; Eschelbach, Cornelia
2013-08-01
The Global Geodetic Observing System (GGOS) requires sub-mm accuracy, automated and continual determinations of the so-called local tie vectors at co-location stations. Co-location stations host instrumentation for several space geodetic techniques and the local tie surveys involve the relative geometry of the reference points of these instruments. Thus, these reference points need to be determined in a common coordinate system, which is a particular challenge for rotating equipment like radio telescopes for geodetic Very Long Baseline Interferometry. In this work we describe a concept to achieve automated and continual determinations of radio telescope reference points with sub-mm accuracy. We developed a monitoring system, including Java-based sensor communication for automated surveys, network adjustment and further data analysis. This monitoring system was tested during a monitoring campaign performed at the Onsala Space Observatory in the summer of 2012. The results obtained in this campaign show that it is possible to perform automated determination of a radio telescope reference point during normal operations of the telescope. Accuracies on the sub-mm level can be achieved, and continual determinations can be realized by repeated determinations and recursive estimation methods.
Automated analysis of high-content microscopy data with deep learning.
Kraus, Oren Z; Grys, Ben T; Ba, Jimmy; Chong, Yolanda; Frey, Brendan J; Boone, Charles; Andrews, Brenda J
2017-04-18
Existing computational pipelines for quantitative analysis of high-content microscopy data rely on traditional machine learning approaches that fail to accurately classify more than a single dataset without substantial tuning and training, requiring extensive analysis. Here, we demonstrate that the application of deep learning to biological image data can overcome the pitfalls associated with conventional machine learning classifiers. Using a deep convolutional neural network (DeepLoc) to analyze yeast cell images, we show improved performance over traditional approaches in the automated classification of protein subcellular localization. We also demonstrate the ability of DeepLoc to classify highly divergent image sets, including images of pheromone-arrested cells with abnormal cellular morphology, as well as images generated in different genetic backgrounds and in different laboratories. We offer an open-source implementation that enables updating DeepLoc on new microscopy datasets. This study highlights deep learning as an important tool for the expedited analysis of high-content microscopy data. © 2017 The Authors. Published under the terms of the CC BY 4.0 license.
2012-11-02
Applied Actant-Network Theory: Toward the Automated Detection of Technoscientific Emergence from Full-Text Publications and Patents. David C. Brock, Olga Babko-Malaya, James Pustejovsky, Patrick Thomas (BAE Systems Advanced Information Technologies; David C. Brock Consulting).
Integration of Hierarchical Goal Network Planning and Autonomous Path Planning
2016-03-01
Automated planning has ... world robotic systems. This report documents work to integrate a hierarchical goal network planning algorithm with low-level path planning. The system ...
The State of Planning of Automation Projects in the Libraries of Canada.
ERIC Educational Resources Information Center
Clement, Hope E. A.
Library automation in Canada is complicated by the large size, dispersed population, and cultural diversity of the country. The National Library of Canada is actively planning a Canadian library network based on national bibliographic services for which the library is now developing automated systems. Canadian libraries are involved in the…
The Benefits of Office Automation: A Casebook
1986-04-01
Personal author: Warren, Hoyt M., Jr., Major, USAF. The casebook's contents include a section on selected office automation experiences, covering a laboratory office network experiment (background and scope; experiment objectives).
Library Automation: A Critical Review.
ERIC Educational Resources Information Center
Overmyer, LaVahn
This report has two main purposes: (1) To give an account of the use of automation in selected libraries throughout the country and in the development of networks; and (2) To discuss some of the fundamental considerations relevant to automation and the implications for library education, library research and the library profession. The first part…
Automated Age-related Macular Degeneration screening system using fundus images.
Kunumpol, P; Umpaipant, W; Kanchanaranya, N; Charoenpong, T; Vongkittirux, S; Kupakanjana, T; Tantibundhit, C
2017-07-01
This work proposed an automated screening system for Age-related Macular Degeneration (AMD), and distinguishing between wet or dry types of AMD using fundus images to assist ophthalmologists in eye disease screening and management. The algorithm employs contrast-limited adaptive histogram equalization (CLAHE) in image enhancement. Subsequently, discrete wavelet transform (DWT) and locality sensitivity discrimination analysis (LSDA) were used to extract features for a neural network model to classify the results. The results showed that the proposed algorithm was able to distinguish between normal eyes, dry AMD, or wet AMD with 98.63% sensitivity, 99.15% specificity, and 98.94% accuracy, suggesting promising potential as a medical support system for faster eye disease screening at lower costs.
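As a hedged illustration of the named pipeline stages (CLAHE enhancement, DWT features, neural-network classification; the LSDA step is omitted for brevity), here is a compact sketch on synthetic stand-in fundus data.

```python
# Sketch of the screening pipeline stages named in the abstract: CLAHE
# enhancement, wavelet-domain features, and a neural-network classifier.
# Fundus data and labels are synthetic placeholders; LSDA is omitted.
import numpy as np
import cv2
import pywt
from sklearn.neural_network import MLPClassifier

def features(gray):
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    enhanced = clahe.apply(gray)
    ll, (lh, hl, hh) = pywt.dwt2(enhanced.astype(float), "haar")
    # Simple sub-band energy features; the paper applies LSDA on top.
    return [np.mean(np.abs(b)) for b in (ll, lh, hl, hh)]

rng = np.random.default_rng(0)
imgs = rng.integers(0, 256, size=(60, 128, 128), dtype=np.uint8)
labels = rng.integers(0, 3, size=60)           # 0 normal, 1 dry, 2 wet AMD

x = np.array([features(im) for im in imgs])
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(x[:45], labels[:45])
print("toy accuracy:", clf.score(x[45:], labels[45:]))
```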
A Whale of a Tale: Creating Spacecraft Telemetry Data Analysis Products for the Deep Impact Mission
NASA Technical Reports Server (NTRS)
Sturdevant, Kathryn
2006-01-01
A description of the Whale product generation utility and its means of analyzing project data for Deep Impact Missions is presented. The topics include: 1) Whale Definition; 2) Whale Overview; 3) Whale Challenges; 4) Network Configuration; 5) Network Diagram; 6) Whale Data Flow: Design Decisions; 7) Whale Data Flow Diagram; 8) Whale Data Flow; 9) Whale Team and Users; 10) Creeping Requirements; 11) Whale Competition; 12) Statistics: Processing Time; 13) CPU and Disk Usage; 14) The Ripple Effect of More Data; and 15) Data Validation and the Automation Challenge.
The automated ground network system
NASA Technical Reports Server (NTRS)
Smith, Miles T.; Militch, Peter N.
1993-01-01
The primary goal of the Automated Ground Network System (AGNS) project is to reduce Ground Network (GN) station life-cycle costs. To accomplish this goal, the AGNS project will employ an object-oriented approach to develop a new infrastructure that will permit continuous application of new technologies and methodologies to the Ground Network's class of problems. The AGNS project is a Total Quality (TQ) project. Through use of an open collaborative development environment, developers and users will have equal input into the end-to-end design and development process. This will permit direct user input and feedback and will enable rapid prototyping for requirements clarification. This paper describes the AGNS objectives, operations concept, and proposed design.
Physical explosion analysis in heat exchanger network design
NASA Astrophysics Data System (ADS)
Pasha, M.; Zaini, D.; Shariff, A. M.
2016-06-01
The failure of shell-and-tube heat exchangers is extensively experienced by the chemical process industries. Such failures can cause a loss of production for long durations. Moreover, loss of containment from a heat exchanger could potentially lead to a credible event such as a fire, explosion, or toxic release. There is a need to analyse the possible worst-case effects originating from the loss of containment of a heat exchanger at the early design stage. Physical explosion analysis during heat exchanger network design is presented in this work. The Baker and Prugh explosion models are deployed for assessing the explosion effect. Microsoft Excel was integrated with a process design simulator through object linking and embedding (OLE) automation for this analysis. Aspen HYSYS v8.0 was used as the simulation platform in this work. A typical heat exchanger network for a steam reforming and shift conversion process was presented as a case study. The analysis shows that the overpressure generated by the physical explosion of each heat exchanger can be estimated more precisely using the Prugh model. The present work could potentially assist design engineers in identifying the critical heat exchanger in the network at the preliminary design stage.
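The Baker and Prugh models themselves are not reproduced in the abstract; as a generic illustration of how such a worst-case overpressure estimate is structured, the sketch below converts stored energy to a TNT-equivalent mass, forms the Hopkinson-scaled distance Z = R / W^(1/3), and evaluates the Kinney and Graham airblast correlation as a stand-in. The energy figure and distance are placeholders.

```python
# Hedged illustration of a physical-explosion overpressure estimate:
# TNT equivalence -> Hopkinson scaling -> Kinney & Graham correlation.
# This is a stand-in for the Baker/Prugh models used in the paper.
E_TNT = 4.68e6          # J per kg TNT (commonly used equivalence)

def peak_overpressure_ratio(energy_j, r_m):
    w = energy_j / E_TNT                     # TNT-equivalent mass (kg)
    z = r_m / w ** (1.0 / 3.0)               # scaled distance (m/kg^(1/3))
    num = 808.0 * (1.0 + (z / 4.5) ** 2)
    den = ((1.0 + (z / 0.048) ** 2)
           * (1.0 + (z / 0.32) ** 2)
           * (1.0 + (z / 1.35) ** 2)) ** 0.5
    return num / den                         # peak overpressure / ambient

# e.g. 500 MJ of stored expansion energy, receptor 30 m away (placeholders):
print(f"dP/P0 ~ {peak_overpressure_ratio(5e8, 30.0):.3f}")
```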
Bae, Youngoh; Yoo, Byeong Wook; Lee, Jung Chan; Kim, Hee Chan
2017-05-01
Detection and diagnosis based on feature extraction and classification of electroencephalography (EEG) signals are being studied vigorously. Network analysis of time-series EEG data is one of many techniques that could help study brain function. In this study, we analyze EEG to diagnose alcoholism. We propose a novel methodology to estimate differences in brain status between normal subjects and alcoholics by computing many parameters derived from an effective network constructed using Granger causality. Among the many parameters, only ten were chosen as final candidates. Combining the ten graph-based parameters, our results demonstrate predictable differences between alcoholics and normal subjects. The best-performing support vector machine classifier had 90% accuracy, with a sensitivity of 95.3% and a specificity of 82.4%, for differentiating between the two groups.
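A minimal sketch of the effective-network construction follows, assuming pairwise Granger tests with an illustrative lag and significance level; graph-level parameters of the resulting directed network are the kind of features the paper feeds to its SVM.

```python
# Sketch: pairwise Granger tests between EEG channels, a directed edge
# where the test is significant, then graph-level parameters as
# classifier features. Data, lag, and alpha are illustrative.
import numpy as np
import networkx as nx
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(0)
n_ch, n_t = 4, 500
eeg = rng.standard_normal((n_ch, n_t))
eeg[1, 1:] += 0.8 * eeg[0, :-1]               # channel 0 drives channel 1

g = nx.DiGraph()
g.add_nodes_from(range(n_ch))
for i in range(n_ch):
    for j in range(n_ch):
        if i == j:
            continue
        # Column order: series being predicted first, candidate cause second.
        res = grangercausalitytests(
            np.column_stack([eeg[j], eeg[i]]), maxlag=2, verbose=False)
        p = res[2][0]["ssr_ftest"][1]         # p-value at lag 2
        if p < 0.01:
            g.add_edge(i, j)                  # i Granger-causes j

# Graph-based parameters like these feed the SVM classifier.
print("edges:", list(g.edges()), "density:", nx.density(g))
```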
ANNA: A Convolutional Neural Network Code for Spectroscopic Analysis
NASA Astrophysics Data System (ADS)
Lee-Brown, Donald; Anthony-Twarog, Barbara J.; Twarog, Bruce A.
2018-01-01
We present ANNA, a Python-based convolutional neural network code for the automated analysis of stellar spectra. ANNA provides a flexible framework that allows atmospheric parameters such as temperature and metallicity to be determined with accuracies comparable to those of established but less efficient techniques. ANNA performs its parameterization extremely quickly; typically several thousand spectra can be analyzed in less than a second. Additionally, the code incorporates features which greatly speed up the training process necessary for the neural network to measure spectra accurately, resulting in a tool that can easily be run on a single desktop or laptop computer. Thus, ANNA is useful in an era when spectrographs increasingly have the capability to collect dozens to hundreds of spectra each night. This talk will cover the basic features included in ANNA and demonstrate its performance in two use cases: an open cluster abundance analysis involving several hundred spectra, and a metal-rich field star study. Applicability of the code to large survey datasets will also be discussed.
Segmentation and feature extraction of retinal vascular morphology
NASA Astrophysics Data System (ADS)
Leopold, Henry A.; Orchard, Jeff; Zelek, John; Lakshminarayanan, Vasudevan
2017-02-01
Analysis of retinal fundus images is essential for physicians, optometrists and ophthalmologists in the diagnosis, care and treatment of patients. The first step of almost all forms of automated fundus analysis begins with the segmentation and subtraction of the retinal vasculature, while analysis of that same structure can aid in the diagnosis of certain retinal and cardiovascular conditions, such as diabetes or stroke. This paper investigates the use of a convolutional neural network as a multi-channel classifier of retinal vessels using DRIVE, a database of fundus images. With the application of a confidence threshold, the result of the network was slightly below the second observer and the gold standard, with an accuracy of 0.9419 and an ROC of 0.9707. The output of the network with no post-processing boasted the highest sensitivity found in the literature, with a score of 0.9568, and a good ROC score of 0.9689. The high sensitivity of the system makes it suitable for longitudinal morphology assessments, disease detection and other similar tasks.
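For context, the reported accuracy and ROC figures come from comparing thresholded per-pixel vessel probabilities against the manual gold standard; a minimal sketch of that evaluation (variable names are illustrative):

import numpy as np
from sklearn.metrics import roc_auc_score

def vessel_metrics(prob_map, truth, threshold=0.5):
    """prob_map: per-pixel vessel probabilities; truth: manual labels."""
    pred = prob_map.ravel() >= threshold
    true = truth.ravel().astype(bool)
    tp = np.sum(pred & true);  tn = np.sum(~pred & ~true)
    fp = np.sum(pred & ~true); fn = np.sum(~pred & true)
    return {"accuracy": (tp + tn) / true.size,
            "sensitivity": tp / (tp + fn),
            "specificity": tn / (tn + fp),
            "auc": roc_auc_score(true, prob_map.ravel())}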
Clustered Numerical Data Analysis Using Markov Lie Monoid Based Networks
NASA Astrophysics Data System (ADS)
Johnson, Joseph
2016-03-01
We have designed and built an optimal numerical standardization algorithm that links numerical values with their associated units, error level, and defining metadata, thus supporting automated data exchange and new levels of artificial intelligence (AI). The software manages all dimensional and error analysis and computational tracing. Tables of entities versus properties of these generalized numbers (called "metanumbers") support a transformation of each table into a network among the entities and another network among their properties, where the network connection matrix is based upon a proximity metric between the two items. We previously proved that every network is isomorphic to the Lie algebra that generates continuous Markov transformations. We have also shown that the eigenvectors of these Markov matrices provide an agnostic clustering of the underlying patterns. We will present this methodology and show how our new work on conversion of scientific numerical data through this process can reveal underlying information clusters ordered by the eigenvalues. We will also show how the linking of clusters from different tables can be used to form a "supernet" of all numerical information, supporting new initiatives in AI.
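A minimal numerical sketch of the stated construction follows: a nonnegative proximity matrix is taken as the off-diagonal part of a generator whose columns sum to zero, the matrix exponential then yields a column-stochastic Markov matrix, and its eigendecomposition orders the entities. The generator convention used here is the standard one for continuous Markov transformations and is an assumption about the paper's exact formulation.

import numpy as np
from scipy.linalg import expm

def markov_from_network(A, t=1.0):
    """A: nonnegative proximity matrix with zero diagonal. The generator
    L has columns summing to zero, so expm(t*L) is column-stochastic."""
    L = A - np.diag(A.sum(axis=0))
    return expm(t * L)

A = np.array([[0, 2, 2, 0],        # a small four-entity example network
              [2, 0, 0, 0],
              [2, 0, 0, 3],
              [0, 0, 3, 0]], float)
M = markov_from_network(A)
vals, vecs = np.linalg.eig(M)
order = np.argsort(-vals.real)      # sort eigenvalues in decreasing order
print(vals.real[order])             # leading eigenvalue 1 (stationarity)
print(vecs[:, order[1]].real)       # sub-leading eigenvector groups entities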
Automated road network extraction from high spatial resolution multi-spectral imagery
NASA Astrophysics Data System (ADS)
Zhang, Qiaoping
For the last three decades, the Geomatics Engineering and Computer Science communities have considered automated road network extraction from remotely-sensed imagery to be a challenging and important research topic. The main objective of this research is to investigate the theory and methodology of automated feature extraction for image-based road database creation, refinement or updating, and to develop a series of algorithms for road network extraction from high resolution multi-spectral imagery. The proposed framework for road network extraction from multi-spectral imagery begins with an image segmentation using the k-means algorithm. This step mainly concerns the exploitation of the spectral information for feature extraction. The road cluster is automatically identified using a fuzzy classifier based on a set of predefined road surface membership functions. These membership functions are established based on the general spectral signature of road pavement materials and the corresponding normalized digital numbers on each multi-spectral band. Shape descriptors of the Angular Texture Signature are defined and used to reduce the misclassifications between roads and other spectrally similar objects (e.g., crop fields, parking lots, and buildings). An iterative and localized Radon transform is developed for the extraction of road centerlines from the classified images. The purpose of the transform is to accurately and completely detect the road centerlines. It is able to find short, long, and even curvilinear lines. The input image is partitioned into a set of subset images called road component images. An iterative Radon transform is locally applied to each road component image. At each iteration, road centerline segments are detected based on an accurate estimation of the line parameters and line widths. Three localization approaches are implemented and compared using qualitative and quantitative methods. Finally, the road centerline segments are grouped into a road network. The extracted road network is evaluated against a reference dataset using a line segment matching algorithm. The entire process is unsupervised and fully automated. Based on extensive experimentation on a variety of remotely-sensed multi-spectral images, the proposed methodology achieves moderate success in automating road network extraction from high spatial resolution multi-spectral imagery.
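Two of the stages described translate directly into short code: spectral k-means clustering of the multi-spectral pixels and Radon-based detection of the dominant line in a road-component image. The scikit-learn/scikit-image calls below are stand-ins for the thesis's own implementation, and the parameters are illustrative.

import numpy as np
from sklearn.cluster import KMeans
from skimage.transform import radon

def kmeans_segment(img, k=8):
    """img: (rows, cols, bands) multi-spectral image -> cluster labels."""
    h, w, b = img.shape
    labels = KMeans(n_clusters=k, n_init=10).fit_predict(img.reshape(-1, b))
    return labels.reshape(h, w)

def dominant_line(component):
    """component: binary road-component image. The brightest sinogram
    point gives the angle/offset of the strongest line; iterating after
    removing each detected line mimics the iterative Radon transform."""
    theta = np.arange(180.0)
    sinogram = radon(component.astype(float), theta=theta)
    offset_idx, angle_idx = np.unravel_index(np.argmax(sinogram),
                                             sinogram.shape)
    return theta[angle_idx], offset_idx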
A Petri Net model for distributed energy system
NASA Astrophysics Data System (ADS)
Konopko, Joanna
2015-12-01
Electrical networks need to evolve to become more intelligent, more flexible and less costly. The smart grid, the next generation of the power grid, uses two-way flows of electricity and information to create a distributed, automated energy delivery network. Building a comprehensive smart grid is a challenge for system protection, optimization and energy efficiency. Proper modeling and analysis are needed to build an extensive distributed energy system and intelligent electricity infrastructure. In this paper, a complete model of a smart grid is proposed using Generalized Stochastic Petri Nets (GSPN). Simulation of the created model is also explored; it has allowed an analysis of how closely the behavior of the model matches the usage of a real smart grid.
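A GSPN combines immediate and exponentially timed transitions; as a toy illustration of how such a timed Petri net is simulated (a race between enabled transitions with exponential firing times), the two-place model below is an assumption for demonstration, not the paper's smart grid model.

import random

places = {"supply": 3, "demand": 0}
# transition name -> (input places, output places, exponential rate)
transitions = {"deliver": ({"supply": 1}, {"demand": 1}, 2.0),
               "consume": ({"demand": 1}, {}, 1.0)}

def enabled(inputs):
    return all(places[p] >= n for p, n in inputs.items())

t = 0.0
while t < 10.0:
    live = {name: rate for name, (ins, _, rate) in transitions.items()
            if enabled(ins)}
    if not live:
        break                                  # dead marking, stop
    total = sum(live.values())
    t += random.expovariate(total)             # time to the next firing
    r, acc = random.uniform(0, total), 0.0     # pick a transition by rate
    for name, rate in live.items():
        acc += rate
        if r <= acc:
            ins, outs, _ = transitions[name]
            for p, n in ins.items():
                places[p] -= n
            for p, n in outs.items():
                places[p] += n
            break
    print(f"t={t:.2f}  {places}")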
Planning for Information Resource Management by OSD.
1987-12-01
A Neural Network Based Workstation for Automated Cell Proliferation Analysis
2001-10-25
…proliferation analysis of cytological microscope images. The software of the system assists the expert biotechnologist during cell proliferation and…
NASA Astrophysics Data System (ADS)
Omran, Adel; Dietrich, Schröder; Abouelmagd, Abdou; Michael, Märker
2016-09-01
Damage caused by flash flood hazards is an increasing phenomenon, especially in arid and semi-arid areas. Thus, the need to evaluate these areas based on their flash flood risk using maps and hydrological models is also becoming more important. For ungauged watersheds, a tentative analysis can be carried out based on the geomorphometric characteristics of the terrain. To process regions with larger watersheds, where perhaps hundreds of watersheds have to be delineated, processed and classified, the overall process needs to be automated. GIS packages such as ESRI's ArcGIS offer a number of sophisticated tools that help with such analyses. Yet there are still gaps and pitfalls that need to be considered if the tools are combined into a geoprocessing model to automate the complete assessment workflow. These gaps include issues such as i) assigning stream order according to Strahler's theory, ii) calculating the threshold value for stream network extraction, and iii) determining the pour points for each of the nodes of the Strahler-ordered stream network. In this study, a complete automated workflow based on ArcGIS Model Builder using standard tools is introduced and discussed. Some additional tools have been implemented to complete the overall workflow; these tools have been programmed using Python and Java in the context of ArcObjects. The workflow has been applied to digital data from the southwestern Sinai Peninsula, Egypt. An optimum threshold value has been selected to optimize drainage configuration by statistically comparing all of the stream configurations extracted from the DEM with the available reference data from topographic maps. The code has succeeded in estimating the correct ranking of specific stream orders automatically, without additional manual steps. As a result, the code has proven to save time and effort; hence it is considered a very useful tool for processing large catchment basins.
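Outside ArcGIS, the first of those gaps, Strahler ordering, reduces to a simple recursion over the extracted stream network once edges are directed downstream; a sketch with networkx (the toy network is illustrative):

import networkx as nx

def strahler_orders(G, outlet):
    """G: directed tree with edges pointing downstream toward `outlet`."""
    order = {}
    def visit(node):
        children = list(G.predecessors(node))    # upstream reaches
        if not children:
            order[node] = 1                       # headwater stream
            return 1
        child_orders = sorted((visit(c) for c in children), reverse=True)
        top = child_orders[0]
        # the order rises only where two equal-order streams join
        order[node] = top + 1 if child_orders.count(top) >= 2 else top
        return order[node]
    visit(outlet)
    return order

G = nx.DiGraph([("a", "c"), ("b", "c"), ("c", "e"), ("d", "e")])
print(strahler_orders(G, "e"))   # {'a': 1, 'b': 1, 'c': 2, 'd': 1, 'e': 2}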
NASA Astrophysics Data System (ADS)
Vrettaros, John; Vouros, George; Drigas, Athanasios S.
This article studies the expediency of using neural network technology, and the development of back-propagation network (BPN) models, for modeling the automated evaluation of the answers and progress of deaf students who possess basic knowledge of the English language and computer skills, within a virtual e-learning environment. The performance of the developed neural models is evaluated using the correlation factor between the networks' response values and the real data, as well as the percentage error between the networks' estimates and the real data, both during the training process and afterwards with unknown data that were not used in training.
Integrated Evaluation of Reliability and Power Consumption of Wireless Sensor Networks.
Dâmaso, Antônio; Rosa, Nelson; Maciel, Paulo
2017-11-05
Power consumption is a primary concern in Wireless Sensor Networks (WSNs), and a large number of strategies have been proposed to evaluate it. However, those approaches usually consider neither reliability issues nor the power consumption of applications executing in the network. A central concern is the lack of consolidated solutions that enable us to evaluate the power consumption of applications and of the network stack while also considering their reliabilities. To solve this problem, we introduce a fully automatic solution for designing power-consumption-aware WSN applications and communication protocols. The solution presented in this paper comprises a methodology to evaluate power consumption based on the integration of formal models, a set of power consumption and reliability models, a sensitivity analysis strategy for selecting WSN configurations, and a toolbox named EDEN to fully support the proposed methodology. This solution allows the power consumption of WSN applications and the network stack to be estimated accurately in an automated way.
Effects of Gain/Loss Framing in Cyber Defense Decision-Making
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bos, Nathan; Paul, Celeste; Gersh, John
Cyber defense requires decision making under uncertainty. Yet this critical area has not been a strong focus of research in judgment and decision-making. Future defense systems, which will rely on software-defined networks and may employ 'moving target' defenses, will increasingly automate lower level detection and analysis, but will still require humans in the loop for higher level judgment. We studied the decision making process and outcomes of 17 experienced network defense professionals who worked through a set of realistic network defense scenarios. We manipulated gain versus loss framing in a cyber defense scenario, and found significant effects in one of two focal problems. Defenders that began with a network already in quarantine (gain framing) used a quarantine system more than those that did not (loss framing). We also found some difference in perceived workload and efficacy. Alternate explanations of these findings and implications for network defense are discussed.
Electrochemical Detection in Stacked Paper Networks.
Liu, Xiyuan; Lillehoj, Peter B
2015-08-01
Paper-based electrochemical biosensors are a promising technology that enables rapid, quantitative measurements on an inexpensive platform. However, the control of liquids in paper networks is generally limited to a single sample delivery step. Here, we propose a simple method to automate the loading and delivery of liquid samples to sensing electrodes on paper networks by stacking multiple layers of paper. Using these stacked paper devices (SPDs), we demonstrate a unique strategy to fully immerse planar electrodes in aqueous liquids via capillary flow. Amperometric measurements of xanthine oxidase revealed that electrochemical sensors on four-layer SPDs generated detection signals up to 75% higher compared with those on single-layer paper devices. Furthermore, measurements could be performed with minimal user involvement and completed within 30 min. Due to its simplicity, enhanced automation, and capability for quantitative measurements, stacked paper electrochemical biosensors can be useful tools for point-of-care testing in resource-limited settings. © 2015 Society for Laboratory Automation and Screening.
Automated EEG artifact elimination by applying machine learning algorithms to ICA-based features.
Radüntz, Thea; Scouten, Jon; Hochmuth, Olaf; Meffert, Beate
2017-08-01
Biological and non-biological artifacts cause severe problems when dealing with electroencephalogram (EEG) recordings. Independent component analysis (ICA) is a widely used method for eliminating various artifacts from recordings. However, evaluating and classifying the calculated independent components (ICs) as artifact or EEG is not fully automated at present. In this study, we propose a new approach for automated artifact elimination, which applies machine learning algorithms to ICA-based features. We compared the performance of our classifiers with the visual classification results given by experts. The best result, with an accuracy rate of 95%, was achieved using features obtained by range filtering of the topoplots and IC power spectra combined with an artificial neural network. Compared with existing automated solutions, our proposed method is not limited to specific types of artifacts, electrode configurations, or numbers of EEG channels. The main advantage of the proposed method is that it provides an automatic, reliable, real-time capable, and practical tool, which avoids the need for the time-consuming manual selection of ICs during artifact removal.
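As an illustration of the approach, the sketch below decomposes multichannel EEG with FastICA, derives simple range-based features per independent component, and classifies components with a small neural network. The feature set is a crude stand-in for the paper's range filtering of topoplots and IC power spectra, and all parameters are assumptions.

import numpy as np
from sklearn.decomposition import FastICA
from sklearn.neural_network import MLPClassifier

def ic_features(eeg, n_components=20):
    """eeg: (n_samples, n_channels). Returns one feature row per IC."""
    ica = FastICA(n_components=n_components, random_state=0)
    sources = ica.fit_transform(eeg)       # (n_samples, n_components)
    mixing = ica.mixing_                   # (n_channels, n_components)
    feats = []
    for k in range(n_components):
        topo, src = mixing[:, k], sources[:, k]
        spectrum = np.abs(np.fft.rfft(src))
        feats.append([np.ptp(topo),        # spatial range ("topoplot")
                      np.ptp(spectrum),    # spectral range
                      np.mean(src ** 2)])  # component power
    return np.array(feats), sources, ica

# With expert labels (1 = artifact, 0 = brain) for training components:
# clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000)
# clf.fit(train_feats, train_labels)
# keep = clf.predict(feats) == 0           # zero out artifact components
# cleaned = ica.inverse_transform(sources * keep)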
Tropical Cyclone Intensity Estimation Using Deep Convolutional Neural Networks
NASA Technical Reports Server (NTRS)
Maskey, Manil; Cecil, Dan; Ramachandran, Rahul; Miller, Jeffrey J.
2018-01-01
Estimating tropical cyclone intensity using only satellite imagery is a challenging problem. Having been applied successfully for more than 30 years, with some modifications and improvements, the Dvorak technique is still used worldwide for tropical cyclone intensity estimation. A number of semi-automated techniques have been derived from the original Dvorak technique. However, these techniques suffer from subjective bias, as evident from the most recent estimations on October 10, 2017 at 1500 UTC for Tropical Storm Ophelia: the Dvorak intensity estimates ranged from T2.3/33 kt (Tropical Cyclone Number 2.3/33 knots) from UW-CIMSS (University of Wisconsin-Madison - Cooperative Institute for Meteorological Satellite Studies) to T3.0/45 kt from TAFB (the National Hurricane Center's Tropical Analysis and Forecast Branch) to T4.0/65 kt from SAB (NOAA/NESDIS Satellite Analysis Branch). In this particular case, two human experts at TAFB and SAB differed by 20 knots in their Dvorak analyses, and the automated version at the University of Wisconsin was 12 knots lower than either of them. The National Hurricane Center (NHC) estimates about 10-20 percent uncertainty in its post analysis when only satellite-based estimates are available. The success of the Dvorak technique proves that spatial patterns in infrared (IR) imagery relate strongly to tropical cyclone intensity. This study aims to utilize deep learning, the current state of the art in pattern recognition and image recognition, to address the need for an automated and objective tropical cyclone intensity estimation. Deep learning is a multi-layer neural network consisting of several layers of simple computational units. It learns discriminative features without relying on a human expert to identify which features are important. Our study mainly focuses on the convolutional neural network (CNN), a deep learning algorithm, to develop an objective tropical cyclone intensity estimation. A CNN is a supervised learning algorithm requiring a large amount of training data. Since archives of intensity data and tropical-cyclone-centric satellite images are openly available, the training data are easily created by combining the two. Results, case studies, prototypes, and advantages of this approach will be discussed.
Automation of multi-agent control for complex dynamic systems in heterogeneous computational network
NASA Astrophysics Data System (ADS)
Oparin, Gennady; Feoktistov, Alexander; Bogdanova, Vera; Sidorov, Ivan
2017-01-01
The rapid progress of high-performance computing entails new challenges related to solving large scientific problems in various subject domains in a heterogeneous distributed computing environment (e.g., a network, Grid system, or Cloud infrastructure). Specialists in the field of parallel and distributed computing pay special attention to the scalability of applications for problem solving. Effective management of a scalable application in a heterogeneous distributed computing environment is still a non-trivial issue, and control systems that operate in networks are especially affected. We propose a new approach to multi-agent management of scalable applications in a heterogeneous computational network. The fundamentals of our approach are the integrated use of conceptual programming, simulation modeling, network monitoring, multi-agent management, and service-oriented programming. We developed a special framework for automating problem solving. The advantages of the proposed approach are demonstrated on the example of parametric synthesis of a static linear regulator for complex dynamic systems. Benefits of the scalable application for solving this problem include automation of multi-agent control for the systems in a parallel mode with various degrees of detail.
An ethernet/IP security review with intrusion detection applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Laughter, S. A.; Williams, R. D.
2006-07-01
Supervisory Control and Data Acquisition (SCADA) and automation networks, used throughout utility and manufacturing applications, have their own specific set of operational and security requirements when compared to corporate networks. The modern climate of heightened national security and awareness of terrorist threats has made the security of these systems of prime concern. There is a need to understand the vulnerabilities of these systems and how to monitor and protect them. Ethernet/IP is a member of a family of protocols based on the Control and Information Protocol (CIP). Ethernet/IP allows automation systems to be utilized on and integrated with traditional TCP/IP networks, facilitating integration of these networks with corporate systems and even the Internet. A review of the CIP protocol and the additions Ethernet/IP makes to it has been done to reveal the kind of attacks made possible through the protocol. A set of rules for the SNORT Intrusion Detection software is developed based on the results of the security review. These can be used to monitor, and possibly actively protect, a SCADA or automation network that utilizes Ethernet/IP in its infrastructure. (authors)
Resting-State Functional Magnetic Resonance Imaging for Language Preoperative Planning
Branco, Paulo; Seixas, Daniela; Deprez, Sabine; Kovacs, Silvia; Peeters, Ronald; Castro, São L.; Sunaert, Stefan
2016-01-01
Functional magnetic resonance imaging (fMRI) is a well-known non-invasive technique for the study of brain function. One of its most common clinical applications is preoperative language mapping, essential for the preservation of function in neurosurgical patients. Typically, fMRI is used to track task-related activity, but poor task performance and movement artifacts can be critical limitations in clinical settings. Recent advances in resting-state protocols open new possibilities for pre-surgical mapping of language potentially overcoming these limitations. To test the feasibility of using resting-state fMRI instead of conventional active task-based protocols, we compared results from fifteen patients with brain lesions while performing a verb-to-noun generation task and while at rest. Task-activity was measured using a general linear model analysis and independent component analysis (ICA). Resting-state networks were extracted using ICA and further classified in two ways: manually by an expert and by using an automated template matching procedure. The results revealed that the automated classification procedure correctly identified language networks as compared to the expert manual classification. We found a good overlay between task-related activity and resting-state language maps, particularly within the language regions of interest. Furthermore, resting-state language maps were as sensitive as task-related maps, and had higher specificity. Our findings suggest that resting-state protocols may be suitable to map language networks in a quick and clinically efficient way. PMID:26869899
Efficient large-scale graph data optimization for intelligent video surveillance
NASA Astrophysics Data System (ADS)
Shang, Quanhong; Zhang, Shujun; Wang, Yanbo; Sun, Chen; Wang, Zepeng; Zhang, Luming
2017-08-01
Society is rapidly adopting cameras in a wide variety of locations and applications: traffic monitoring, parking lot surveillance, in-vehicle use and smart spaces. These cameras provide data every day that must be analyzed in an effective way. Recent advances in sensor manufacturing, communications and computing are stimulating the development of new applications that can transform traditional vision systems into pervasive smart camera networks. The analysis of visual cues in multi-camera networks enables a wide range of applications, from smart home and office automation to large-area surveillance and traffic monitoring. Dense camera networks, in which most cameras have large overlapping fields of view, are well researched; here we focus on sparse camera networks. A sparse camera network performs large-area surveillance using as few cameras as possible, so most cameras do not overlap each other's fields of view. This task is challenging because of the lack of knowledge of the network topology, the changes in target appearance and motion across different views, and the difficulty of understanding complex events in a network. In this review paper, we present a comprehensive survey of recent research results addressing topology learning, object appearance modeling and global activity understanding in sparse camera networks. In addition, some current open research issues are discussed.
NASA Astrophysics Data System (ADS)
Stanley, Kieran M.; Grant, Aoife; O'Doherty, Simon; Young, Dickon; Manning, Alistair J.; Stavert, Ann R.; Spain, T. Gerard; Salameh, Peter K.; Harth, Christina M.; Simmonds, Peter G.; Sturges, William T.; Oram, David E.; Derwent, Richard G.
2018-03-01
A network of three tall tower measurement stations was set up in 2012 across the United Kingdom to expand measurements made at the long-term background northern hemispheric site, Mace Head, Ireland. Reliable and precise in situ greenhouse gas (GHG) analysis systems were developed and deployed at three sites in the UK with automated instrumentation measuring a suite of GHGs. The UK Deriving Emissions linked to Climate Change (UK DECC) network uses tall (165-230 m) open-lattice telecommunications towers, which provide a convenient platform for boundary layer trace gas sampling. In this paper we describe the automated measurement system and first results from the UK DECC network for CO2, CH4, N2O, SF6, CO and H2. CO2 and CH4 are measured at all of the UK DECC sites by cavity ring-down spectroscopy (CRDS) with multiple inlet heights at two of the three tall tower sites to assess boundary layer stratification. The short-term precision (1σ on 1 min means) of CRDS measurements at background mole fractions for January 2012 to September 2015 is < 0.05 µmol mol-1 for CO2 and < 0.3 nmol mol-1 for CH4. Repeatability of standard injections (1σ) is < 0.03 µmol mol-1 for CO2 and < 0.3 nmol mol-1 for CH4 for the same time period. N2O and SF6 are measured at three of the sites, and CO and H2 measurements are made at two of the sites, from a single inlet height using gas chromatography (GC) with an electron capture detector (ECD), flame ionisation detector (FID) or reduction gas analyser (RGA). Repeatability of individual injections (1σ) on GC and RGA instruments between January 2012 and September 2015 for CH4, N2O, SF6, CO and H2 measurements was < 2.8 nmol mol-1, < 0.4 nmol mol-1, < 0.07 pmol mol-1, < 2 nmol mol-1 and < 3 nmol mol-1, respectively. Instrumentation in the network is fully automated and includes sensors for measuring a variety of instrumental parameters such as flow, pressures, and sampling temperatures. Automated alerts are generated and emailed to site operators when instrumental parameters are not within defined set ranges. Automated instrument shutdowns occur for critical errors such as carrier gas flow rate deviations. Results from the network give good spatial and temporal coverage of atmospheric mixing ratios within the UK since early 2012. Results also show that all measured GHGs are increasing in mole fraction over the selected reporting period and, except for SF6, exhibit a seasonal trend. CO2 and CH4 also show strong diurnal cycles, with night-time maxima and daytime minima in mole fractions.
Automated Guideway Network Traffic Modeling
DOT National Transportation Integrated Search
1972-02-01
In the literature concerning automated guideway transportation systems, such as dual mode, a great deal of effort has been expended on the use of deterministic reservation schemes and the problem of merging streams of vehicles. However, little attent...
Automated Guideway Ground Transportation Network Simulation
DOT National Transportation Integrated Search
1975-08-01
The report discusses some automated guideway management problems relating to ground transportation systems and provides an outline of the types of models and algorithms that could be used to develop simulation tools for evaluating system performance....
Research Libraries--Automation and Cooperation.
ERIC Educational Resources Information Center
McDonald, David R.; Hurowitz, Robert
1982-01-01
Description of Research Libraries Information Network, an automated technical processing and information retrieval system, notes subsystems (acquisitions, cataloging, message, print, tables), functions, design, and benefits to participating libraries. (Request complimentary subscription on institution letterhead from Editor, "Perspectives in…
NET: a new framework for the vectorization and examination of network data.
Lasser, Jana; Katifori, Eleni
2017-01-01
The analysis of complex networks both in general and in particular as pertaining to real biological systems has been the focus of intense scientific attention in the past and present. In this paper we introduce two tools that provide fast and efficient means for the processing and quantification of biological networks like Drosophila tracheoles or leaf venation patterns: the Network Extraction Tool (NET) to extract data and the Graph-edit-GUI (GeGUI) to visualize and modify networks. NET is especially designed for high-throughput semi-automated analysis of biological datasets containing digital images of networks. The framework starts with the segmentation of the image and then proceeds to vectorization using methodologies from optical character recognition. After a series of steps to clean and improve the quality of the extracted data the framework produces a graph in which the network is represented only by its nodes and neighborhood-relations. The final output contains information about the adjacency matrix of the graph, the width of the edges and the positions of the nodes in space. NET also provides tools for statistical analysis of the network properties, such as the number of nodes or total network length. Other, more complex metrics can be calculated by importing the vectorized network to specialized network analysis packages. GeGUI is designed to facilitate manual correction of non-planar networks as these may contain artifacts or spurious junctions due to branches crossing each other. It is tailored for but not limited to the processing of networks from microscopy images of Drosophila tracheoles. The networks extracted by NET closely approximate the network depicted in the original image. NET is fast, yields reproducible results and is able to capture the full geometry of the network, including curved branches. Additionally GeGUI allows easy handling and visualization of the networks.
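NET itself is the reference implementation; as a much-reduced sketch of the same idea (segmentation assumed done, skeletonization standing in for the vectorization step), the following builds a pixel-level graph whose node count and total length correspond to NET's simplest summary statistics:

import numpy as np
import networkx as nx
from skimage.morphology import skeletonize

def skeleton_graph(binary_img):
    """binary_img: segmented network mask -> 8-connected skeleton graph."""
    skel = skeletonize(binary_img)
    pts = {tuple(p) for p in np.argwhere(skel)}
    G = nx.Graph()
    G.add_nodes_from(pts)
    for (r, c) in pts:
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                if (dr or dc) and (r + dr, c + dc) in pts:
                    G.add_edge((r, c), (r + dr, c + dc),
                               length=float(np.hypot(dr, dc)))
    return G

# G = skeleton_graph(mask)
# print(G.number_of_nodes(), G.size(weight="length"))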
Deep machine learning provides state-of-the-art performance in image-based plant phenotyping.
Pound, Michael P; Atkinson, Jonathan A; Townsend, Alexandra J; Wilson, Michael H; Griffiths, Marcus; Jackson, Aaron S; Bulat, Adrian; Tzimiropoulos, Georgios; Wells, Darren M; Murchie, Erik H; Pridmore, Tony P; French, Andrew P
2017-10-01
In plant phenotyping, it has become important to be able to measure many features on large image sets in order to aid genetic discovery. The size of the datasets, now often captured robotically, often precludes manual inspection, hence the motivation for finding a fully automated approach. Deep learning is an emerging field that promises unparalleled results on many data analysis problems. Building on artificial neural networks, deep approaches have many more hidden layers in the network, and hence have greater discriminative and predictive power. We demonstrate the use of such approaches as part of a plant phenotyping pipeline. We show the success offered by such techniques when applied to the challenging problem of image-based plant phenotyping and demonstrate state-of-the-art results (>97% accuracy) for root and shoot feature identification and localization. We use fully automated trait identification using deep learning to identify quantitative trait loci in root architecture datasets. The majority (12 out of 14) of manually identified quantitative trait loci were also discovered using our automated approach based on deep learning detection to locate plant features. We have shown deep learning-based phenotyping to have very good detection and localization accuracy in validation and testing image sets. We have shown that such features can be used to derive meaningful biological traits, which in turn can be used in quantitative trait loci discovery pipelines. This process can be completely automated. We predict a paradigm shift in image-based phenotyping brought about by such deep learning approaches, given sufficient training sets. © The Authors 2017. Published by Oxford University Press.
Wang, Zhiwei; Liu, Chaoyue; Cheng, Danpeng; Wang, Liang; Yang, Xin; Cheng, Kwang-Ting
2018-05-01
Automated methods for detecting clinically significant (CS) prostate cancer (PCa) in multi-parameter magnetic resonance images (mp-MRI) are in high demand. Existing methods typically employ several separate steps, each of which is optimized individually without considering the error tolerance of the other steps. As a result, they can either involve unnecessary computational cost or suffer from errors accumulated over steps. In this paper, we present an automated CS PCa detection system in which all steps are optimized jointly in an end-to-end trainable deep neural network. The proposed network consists of concatenated subnets: 1) a novel tissue deformation network (TDN) for automated prostate detection and multimodal registration and 2) a dual-path convolutional neural network (CNN) for CS PCa detection. Three types of loss functions, i.e., classification loss, inconsistency loss, and overlap loss, are employed for optimizing all parameters of the proposed TDN and CNN. In the training phase, the two nets mutually affect each other and effectively guide registration and the extraction of representative CS PCa-relevant features to achieve results with sufficient accuracy. The entire network is trained in a weakly supervised manner by providing only image-level annotations (i.e., presence/absence of PCa) without exact priors of lesion locations. Compared with most existing systems, which require supervised labels, e.g., manual delineation of PCa lesions, it is much more convenient for clinical usage. Comprehensive evaluation based on fivefold cross-validation using data from 360 patients demonstrates that our system achieves high accuracy for CS PCa detection, i.e., a sensitivity of 0.6374 and 0.8978 at 0.1 and 1 false positives per normal/benign patient.
Automated general temperature correction method for dielectric soil moisture sensors
NASA Astrophysics Data System (ADS)
Kapilaratne, R. G. C. Jeewantinie; Lu, Minjiao
2017-08-01
An effective temperature correction method for dielectric sensors is important to ensure the accuracy of soil water content (SWC) measurements in local to regional-scale soil moisture monitoring networks. These networks make extensive use of highly temperature-sensitive dielectric sensors due to their low cost, ease of use and low power consumption. Yet there is no general temperature correction method for dielectric sensors; instead, sensor- or site-dependent correction algorithms are employed. Such methods become ineffective for soil moisture monitoring networks with different sensor setups and those that cover diverse climatic conditions and soil types. This study attempted to develop a general temperature correction method for dielectric sensors that can be used regardless of differences in sensor type, climatic conditions and soil type, without rainfall data. In this work an automated general temperature correction method was developed by adapting previously developed temperature correction algorithms based on time domain reflectometry (TDR) measurements to ThetaProbe ML2X, Stevens Hydra Probe II and Decagon Devices EC-TM sensor measurements. The procedure for removing rainy-day effects from SWC data was automated by incorporating a statistical inference technique into the temperature correction algorithms. The temperature correction method was evaluated using 34 stations from the International Soil Moisture Monitoring Network and another nine stations from a local soil moisture monitoring network in Mongolia. The soil moisture monitoring networks used in this study cover four major climates and six major soil types. Results indicated that the automated temperature correction algorithms developed in this study can successfully eliminate temperature effects from dielectric sensor measurements even without on-site rainfall data. Furthermore, it was found that the actual daily average SWC is altered by the temperature effects of dielectric sensors, with a significant error comparable to the manufacturer's ±1% accuracy.
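The functional form of the correction is not given in the abstract; as a deliberately crude sketch of the underlying idea, the code below removes a linear dependence of apparent soil water content on the temperature anomaly. The linear form and the 25 °C reference are assumptions for illustration.

import numpy as np

def temperature_correct(swc, temp, t_ref=25.0):
    """swc, temp: 1-D arrays of raw SWC readings and soil temperature."""
    dT = temp - t_ref
    slope = np.polyfit(dT, swc, 1)[0]   # apparent SWC change per degree
    return swc - slope * dT

# Rainy periods would be screened out first (the paper automates this
# with a statistical-inference step), since rainfall changes true SWC.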
SONG-China Project: A Global Automated Observation Network
NASA Astrophysics Data System (ADS)
Yang, Z. Z.; Lu, X. M.; Tian, J. F.; Zhuang, C. G.; Wang, K.; Deng, L. C.
2017-09-01
Driven by advancements in technology and scientific objectives, data acquisition in observational astronomy has changed greatly in recent years. Fully automated or even autonomous ground-based networks of telescopes have now become a trend for time-domain observational projects. The Stellar Observations Network Group (SONG) is an international collaboration with the participation and contribution of the Chinese astronomy community. The scientific goal of SONG is time-domain astrophysics such as asteroseismology and open cluster research. The SONG project aims to build a global network of 1 m telescopes equipped with high-precision, high-resolution spectrographs and two-channel lucky-imaging cameras. The Chinese initiative is to install a 50 cm binocular photometric telescope at each SONG node, sharing the network platform and infrastructure. This work focuses on the design and implementation, in technology and methodology, of SONG/50BiN, a typical ground-based network composed of multiple sites and a variety of instruments.
uFarm: a smart farm management system based on RFID
NASA Astrophysics Data System (ADS)
Kim, Hyoungsuk; Lee, Moonsup; Jung, Jonghyuk; Lee, Hyunwook; Kim, Taehyoun
2007-12-01
Recently, the livestock industry in Korea has been threatened by many challenges, such as low productivity due to labor intensiveness, global competition compelled by the Free Trade Agreement (FTA), and emerging animal disease issues such as BSE and foot-and-mouth disease. In this paper, we propose a smart farm management system, called uFarm, which addresses these challenges by automating farm management. First, we automate labor-intensive jobs using equipment based on sensors and actuators; the automation subsystem can be controlled by a remote user through a wireless network. Second, we provide real-time traceability of information on farm animals using radio-frequency identification (RFID) and an embedded data server with network connectivity.
Structural brain network analysis in families multiply affected with bipolar I disorder.
Forde, Natalie J; O'Donoghue, Stefani; Scanlon, Cathy; Emsell, Louise; Chaddock, Chris; Leemans, Alexander; Jeurissen, Ben; Barker, Gareth J; Cannon, Dara M; Murray, Robin M; McDonald, Colm
2015-10-30
Disrupted structural connectivity is associated with psychiatric illnesses including bipolar disorder (BP). Here we use structural brain network analysis to investigate connectivity abnormalities in multiply affected BP type I families, to assess the utility of dysconnectivity as a biomarker and its endophenotypic potential. Magnetic resonance diffusion images for 19 BP type I patients in remission, 21 of their first-degree unaffected relatives, and 18 unrelated healthy controls underwent tractography. With the automated anatomical labelling atlas used to define nodes, a connectivity matrix was generated for each subject. Network metrics were extracted with the Brain Connectivity Toolbox and then analysed for group differences, accounting for potential confounding effects of age, gender and familial association. Whole-brain analysis revealed no differences between groups. Analysis of specific, mainly frontal regions previously implicated as potentially endophenotypic by functional magnetic resonance imaging analysis of the same cohort revealed a significant effect of group in the right medial superior frontal gyrus and left middle frontal gyrus, driven by reduced organisation in patients compared with controls. The organisation of whole-brain networks of those affected with BP I does not differ from their unaffected relatives or healthy controls. In discrete frontal regions, however, anatomical connectivity is disrupted in patients but not in their unaffected relatives. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
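The nodal metrics reported (degree, nodal efficiency) follow directly from a binarized connectivity matrix; a sketch with networkx, assuming the 90-region automated anatomical labelling parcellation and an illustrative edge density:

import numpy as np
import networkx as nx

def nodal_metrics(conn, density=0.15):
    """conn: symmetric (90, 90) structural connectivity matrix."""
    n = conn.shape[0]
    triu = conn[np.triu_indices(n, k=1)]
    thresh = np.quantile(triu, 1.0 - density)   # keep the strongest edges
    G = nx.from_numpy_array((conn >= thresh) & ~np.eye(n, dtype=bool))
    degree = dict(G.degree())
    # nodal efficiency: mean inverse shortest-path length from each node
    eff = {v: np.mean([1.0 / d for u, d in
                       nx.single_source_shortest_path_length(G, v).items()
                       if u != v] or [0.0])
           for v in G}
    return degree, eff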
Automated Clean Chemistry for Bulk Analysis of Environmental Swipe Samples - FY17 Year End Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ticknor, Brian W.; Metzger, Shalina C.; McBay, Eddy H.
Sample preparation methods for mass spectrometry are being automated using commercial-off-the-shelf (COTS) equipment to shorten lengthy and costly manual chemical purification procedures. This development addresses a serious need in the International Atomic Energy Agency’s Network of Analytical Laboratories (IAEA NWAL) to increase efficiency in the Bulk Analysis of Environmental Samples for Safeguards program with a method that allows unattended, overnight operation. In collaboration with Elemental Scientific Inc., the prepFAST-MC2 was designed based on COTS equipment. It was modified for uranium/plutonium separations using renewable columns packed with Eichrom TEVA and UTEVA resins, with a chemical separation method based on the Oak Ridge National Laboratory (ORNL) NWAL chemical procedure. The newly designed prepFAST-SR has had several upgrades compared with the original prepFAST-MC2. Both systems are currently installed in the Ultra-Trace Forensics Science Center at ORNL.
ERIC Educational Resources Information Center
Karpisek, Marian; And Others
1995-01-01
Presents five articles and a company resource directory to help librarians successfully incorporate technology into school libraries. Discusses actual situations, examines student needs, and gives advice to help librarians with library automation systems, videodiscs, library security systems, media retrieval, networking CD-ROMs, and locating…
NASA Astrophysics Data System (ADS)
Baumgartner, D. J.; Pötzi, W.; Freislich, H.; Strutzmann, H.; Veronig, A. M.; Foelsche, U.; Rieder, H. E.
2017-06-01
In recent decades, automated sensors for sunshine duration (SD) measurements have been introduced in meteorological networks, thereby replacing traditional instruments, most prominently the Campbell-Stokes (CS) sunshine recorder. Parallel records of automated and traditional SD recording systems are rare. Nevertheless, such records are important to understand the differences/similarities in SD totals obtained with different instruments and how changes in monitoring device type affect the homogeneity of SD records. This study investigates the differences/similarities in parallel SD records obtained with a CS and two automated SD sensors between 2007 and 2016 at the Kanzelhöhe Observatory, Austria. Comparing individual records of daily SD totals, we find differences of both positive and negative sign, with smallest differences between the automated sensors. The larger differences between CS-derived SD totals and those from automated sensors can be attributed (largely) to the higher sensitivity threshold of the CS instrument. Correspondingly, the closest agreement among all sensors is found during summer, the time of year when sensitivity thresholds are least critical. Furthermore, we investigate the performance of various models to create the so-called sensor-type-equivalent (STE) SD records. Our analysis shows that regression models including all available data on daily (or monthly) time scale perform better than simple three- (or four-) point regression models. Despite general good performance, none of the considered regression models (of linear or quadratic form) emerges as the "optimal" model. Although STEs prove useful for relating SD records of individual sensors on daily/monthly time scales, this does not ensure that STE (or joint) records can be used for trend analysis.
10 CFR 727.2 - What are the definitions of the terms used in this part?
Code of Federal Regulations, 2012 CFR
2012-01-01
... information. Computer means desktop computers, portable computers, computer networks (including the DOE network and local area networks at or controlled by DOE organizations), network devices, automated.... DOE means the Department of Energy, including the National Nuclear Security Administration. DOE...
10 CFR 727.2 - What are the definitions of the terms used in this part?
Code of Federal Regulations, 2014 CFR
2014-01-01
... information. Computer means desktop computers, portable computers, computer networks (including the DOE network and local area networks at or controlled by DOE organizations), network devices, automated.... DOE means the Department of Energy, including the National Nuclear Security Administration. DOE...
10 CFR 727.2 - What are the definitions of the terms used in this part?
Code of Federal Regulations, 2013 CFR
2013-01-01
... information. Computer means desktop computers, portable computers, computer networks (including the DOE network and local area networks at or controlled by DOE organizations), network devices, automated.... DOE means the Department of Energy, including the National Nuclear Security Administration. DOE...
10 CFR 727.2 - What are the definitions of the terms used in this part?
Code of Federal Regulations, 2011 CFR
2011-01-01
... information. Computer means desktop computers, portable computers, computer networks (including the DOE network and local area networks at or controlled by DOE organizations), network devices, automated.... DOE means the Department of Energy, including the National Nuclear Security Administration. DOE...
10 CFR 727.2 - What are the definitions of the terms used in this part?
Code of Federal Regulations, 2010 CFR
2010-01-01
... information. Computer means desktop computers, portable computers, computer networks (including the DOE network and local area networks at or controlled by DOE organizations), network devices, automated.... DOE means the Department of Energy, including the National Nuclear Security Administration. DOE...
ERIC Educational Resources Information Center
Husby, Ole
1990-01-01
The challenges and potential benefits of automating university libraries are reviewed, with special attention given to cooperative systems. Aspects discussed include database size, the role of the university computer center, storage modes, multi-institutional systems, resource sharing, cooperative system management, networking, and intelligent…
NASA Technical Reports Server (NTRS)
Worrall, Diana M. (Editor); Biemesderfer, Chris (Editor); Barnes, Jeannette (Editor)
1992-01-01
Consideration is given to a definition of a distribution format for X-ray data, the Einstein on-line system, the NASA/IPAC extragalactic database, COBE astronomical databases, Cosmic Background Explorer astronomical databases, the ADAM software environment, the Groningen Image Processing System, the search for a common data model for astronomical data analysis systems, deconvolution for real and synthetic apertures, pitfalls in image reconstruction, a direct method for spectral and image restoration, and a description of a Poisson imagery super-resolution algorithm. Also discussed are multivariate statistics on HI and IRAS images, faint object classification using neural networks, a matched filter for improving the SNR of radio maps, automated aperture photometry of CCD images, an interactive graphics interpreter, the ROSAT extreme ultraviolet sky survey, a quantitative study of optimal extraction, automated analysis of spectra, applications of synthetic photometry, an algorithm for extra-solar planet system detection, and data reduction facilities for the William Herschel telescope.
An automated calibration laboratory - Requirements and design approach
NASA Technical Reports Server (NTRS)
O'Neil-Rood, Nora; Glover, Richard D.
1990-01-01
NASA's Dryden Flight Research Facility (Ames-Dryden) operates a diverse fleet of research aircraft which are heavily instrumented to provide both real-time data for in-flight monitoring and recorded data for postflight analysis. Ames-Dryden's existing automated calibration (AUTOCAL) laboratory is a computerized facility which tests aircraft sensors to certify accuracy for anticipated harsh flight environments. Recently, a major AUTOCAL lab upgrade was initiated; the goal of this modernization is to enhance productivity and improve configuration management for both software and test data. The new system will have multiple testing stations employing distributed processing linked by a local area network to a centralized database. The baseline requirements for the new AUTOCAL lab and the design approach being taken for its mechanization are described.
UCLA High Speed, High Volume Laboratory Network for Infectious Diseases. Addendum
2009-08-01
Design: Because of current public health and national security threats, influenza surveillance and analysis will be the initial focus. In the upcoming…throughput and automated systems will enable processing of tens of thousands of samples and provide critical laboratory capacity. Its overall design and…
Mazzaferri, Javier; Larrivée, Bruno; Cakir, Bertan; Sapieha, Przemyslaw; Costantino, Santiago
2018-03-02
Preclinical studies of vascular retinal diseases rely on the assessment of developmental dystrophies in the oxygen-induced retinopathy rodent model. The quantification of vessel tufts and avascular regions is typically computed manually from flat-mounted retinas imaged using fluorescent probes that highlight the vascular network. Such manual measurements are time-consuming and hampered by user variability and bias; thus, a rapid and objective method is needed. Here, we introduce a machine learning approach to segment and characterize vascular tufts, delineate the whole vasculature network, and identify and analyze avascular regions. Our quantitative retinal vascular assessment (QuRVA) technique uses a simple machine learning method and morphological analysis to provide reliable computations of vascular density and pathological vascular tuft regions, without user intervention, within seconds. We demonstrate the high degree of error and variability of manual segmentations, and we designed, coded, and implemented a set of algorithms to perform this task in a fully automated manner. We benchmark and validate the results of our analysis pipeline using the consensus of several manually curated segmentations produced with commonly used computer tools. The source code of our implementation is released under version 3 of the GNU General Public License ( https://www.mathworks.com/matlabcentral/fileexchange/65699-javimazzaf-qurva ).
Aguzzi, Jacopo; Costa, Corrado; Robert, Katleen; Matabos, Marjolaine; Antonucci, Francesca; Juniper, S. Kim; Menesatti, Paolo
2011-01-01
The development and deployment of sensors for undersea cabled observatories is presently biased toward the measurement of habitat variables, while sensor technologies for biological community characterization through species identification and individual counting are less common. The VENUS cabled multisensory network (Vancouver Island, Canada) deploys seafloor camera systems at several sites. Our objective in this study was to implement new automated image analysis protocols for the recognition and counting of benthic decapods (i.e., the galatheid squat lobster, Munida quadrispina), as well as for the evaluation of changes in bacterial mat coverage (i.e., Beggiatoa spp.), using a camera deployed in Saanich Inlet (103 m depth). For the counting of Munida we remotely acquired 100 digital photos at hourly intervals from 2 to 6 December 2009. In the case of bacterial mat coverage estimation, images were taken from 2 to 8 December 2009 at the same time frequency. The automated image analysis protocols for both study cases were created in MatLab 7.1. Automation for Munida counting incorporated the combination of both filtering and background correction (Median and Top-Hat Filters) with Euclidean Distances (ED) on Red-Green-Blue (RGB) channels. The Scale-Invariant Feature Transform (SIFT) features and Fourier Descriptors (FD) of tracked objects were then extracted. Animal classifications were carried out with the tools of morphometric multivariate statistics (i.e., Partial Least Squares Discriminant Analysis; PLSDA) on Mean RGB (RGBv) values for each object and Fourier Descriptor (RGBv+FD) matrices, plus SIFT and ED. The SIFT approach returned the best results: higher percentages of images were correctly classified and fewer misclassification errors (an animal is present but not detected) occurred. In contrast, RGBv+FD and ED resulted in a high incidence of records being generated for animals that were not present. Bacterial mat coverage was estimated in terms of Percent Coverage and Fractal Dimension. A constant Region of Interest (ROI) was defined and background extraction by a Gaussian Blurring Filter was performed. Image subtraction within the ROI was followed by the sum of the RGB channel matrices. Percent Coverage was calculated on the resulting image. Fractal Dimension was estimated using the box-counting method. The images were then resized to a dimension in pixels equal to a power of 2, allowing subdivision into sub-multiple quadrants. In comparisons of manual and automated Percent Coverage and Fractal Dimension estimates, the former showed a tendency to overestimate both parameters. The primary limitations on the automatic analysis of benthic images were habitat variations in sediment texture and water column turbidity. The application of filters for background correction is a required preliminary step for the efficient recognition of animals and bacterial mat patches. PMID:22346657
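Of the two coverage descriptors, the box-counting estimate of fractal dimension is easy to state in code; the sketch below assumes a binary mat-coverage image whose side is a power of two, matching the resizing step described:

import numpy as np

def box_count_dimension(mask):
    """mask: square boolean image with side a power of two."""
    n = mask.shape[0]
    sizes, counts = [], []
    size = n
    while size >= 1:
        s = n // size
        # count boxes of side `size` containing at least one mat pixel
        boxes = mask.reshape(s, size, s, size).any(axis=(1, 3)).sum()
        sizes.append(size)
        counts.append(max(int(boxes), 1))
        size //= 2
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(sizes)),
                          np.log(counts), 1)
    return slope        # ~1 for a line, ~2 for full coverage

mask = np.zeros((256, 256), dtype=bool)
mask[128, :] = True                         # a straight line of mat pixels
print(round(box_count_dimension(mask), 2))  # ~1.0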
Koenig, Helen C; Finkel, Barbara B; Khalsa, Satjeet S; Lanken, Paul N; Prasad, Meeta; Urbani, Richard; Fuchs, Barry D
2011-01-01
Lung protective ventilation reduces mortality in patients with acute lung injury, but underrecognition of acute lung injury has limited its use. We recently validated an automated electronic acute lung injury surveillance system in patients with major trauma in a single intensive care unit. In this study, we assessed the system's performance as a prospective acute lung injury screening tool in a diverse population of intensive care unit patients. Patients were screened prospectively for acute lung injury over 21 wks by the automated system and by an experienced research coordinator who manually screened subjects for enrollment in Acute Respiratory Distress Syndrome Clinical Trials Network (ARDSNet) trials. Performance of the automated system was assessed by comparing its results with the manual screening process. Discordant results were adjudicated blindly by two physician reviewers. In addition, a sensitivity analysis using a range of assumptions was conducted to better estimate the system's performance. Setting: The Hospital of the University of Pennsylvania, an academic medical center and ARDSNet center (1994-2006). Patients: Intubated patients in medical and surgical intensive care units. Interventions: None. Of 1270 patients screened, 84 were identified with acute lung injury (incidence of 6.6%). The automated screening system had a sensitivity of 97.6% (95% confidence interval, 96.8-98.4%) and a specificity of 97.6% (95% confidence interval, 96.8-98.4%). The manual screening algorithm had a sensitivity of 57.1% (95% confidence interval, 54.5-59.8%) and a specificity of 99.7% (95% confidence interval, 99.4-100%). Sensitivity analysis demonstrated a range for sensitivity of 75.0-97.6% of the automated system under varying assumptions. Under all assumptions, the automated system demonstrated higher sensitivity than and comparable specificity to the manual screening method. An automated electronic system identified patients with acute lung injury with high sensitivity and specificity in diverse intensive care units of a large academic medical center. Further studies are needed to evaluate the effect of automated prompts that such a system can initiate on the use of lung protective ventilation in patients with acute lung injury.
32 CFR 2001.50 - Telecommunications automated information systems and network security.
Code of Federal Regulations, 2014 CFR
2014-07-01
... and network security. 2001.50 Section 2001.50 National Defense Other Regulations Relating to National Defense INFORMATION SECURITY OVERSIGHT OFFICE, NATIONAL ARCHIVES AND RECORDS ADMINISTRATION CLASSIFIED... network security. Each agency head shall ensure that classified information electronically accessed...
32 CFR 2001.50 - Telecommunications automated information systems and network security.
Code of Federal Regulations, 2013 CFR
2013-07-01
... and network security. 2001.50 Section 2001.50 National Defense Other Regulations Relating to National Defense INFORMATION SECURITY OVERSIGHT OFFICE, NATIONAL ARCHIVES AND RECORDS ADMINISTRATION CLASSIFIED... network security. Each agency head shall ensure that classified information electronically accessed...
32 CFR 2001.50 - Telecommunications automated information systems and network security.
Code of Federal Regulations, 2012 CFR
2012-07-01
... and network security. 2001.50 Section 2001.50 National Defense Other Regulations Relating to National Defense INFORMATION SECURITY OVERSIGHT OFFICE, NATIONAL ARCHIVES AND RECORDS ADMINISTRATION CLASSIFIED... network security. Each agency head shall ensure that classified information electronically accessed...
Report of the panel on international programs
NASA Technical Reports Server (NTRS)
Anderson, Allen Joel; Fuchs, Karl W.; Ganeka, Yasuhiro; Gaur, Vinod; Green, Andrew A.; Siegfried, W.; Lambert, Anthony; Rais, Jacub; Reighber, Christopher; Seeger, Herman
1991-01-01
The panel recommends that NASA participate and take an active role in the continuous monitoring of existing regional networks, the realization of high resolution geopotential and topographic missions, the establishment of interconnection of the reference frames as defined by different space techniques, the development and implementation of automation for all ground-to-space observing systems, calibration and validation experiments for measuring techniques and data, the establishment of international space-based networks for real-time transmission of high density space data in standardized formats, tracking and support for non-NASA missions, and the extension of state-of-the art observing and analysis techniques to developing nations.
A Petri Net model for distributed energy system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Konopko, Joanna
2015-12-31
Electrical networks need to evolve to become more intelligent, more flexible and less costly. The smart grid, the next generation of the power network, uses two-way flows of electricity and information to create a distributed, automated energy delivery network. Building a comprehensive smart grid is a challenge for system protection, optimization and energy efficiency. Proper modeling and analysis are needed to build an extensive distributed energy system and intelligent electricity infrastructure. In this paper, a whole model of the smart grid is proposed using Generalized Stochastic Petri Nets (GSPN), and simulation of the created model is explored. The simulation has allowed analysis of how close the behavior of the model is to the usage of the real smart grid.
Baek, K; Morris, L S; Kundu, P; Voon, V
2017-03-01
The efficient organization and communication of brain networks underlie cognitive processing and their disruption can lead to pathological behaviours. Few studies have focused on whole-brain networks in obesity and binge eating disorder (BED). Here we used multi-echo resting-state functional magnetic resonance imaging (rsfMRI) along with a data-driven graph theory approach to assess brain network characteristics in obesity and BED. Multi-echo rsfMRI scans were collected from 40 obese subjects (including 20 BED patients) and 40 healthy controls and denoised using multi-echo independent component analysis (ME-ICA). We constructed a whole-brain functional connectivity matrix with normalized correlation coefficients between regional mean blood oxygenation level-dependent (BOLD) signals from 90 brain regions in the Automated Anatomical Labeling atlas. We computed global and regional network properties in the binarized connectivity matrices with an edge density of 5%-25%. We also verified our findings using a separate parcellation, the Harvard-Oxford atlas parcellated into 470 regions. Obese subjects exhibited significantly reduced global and local network efficiency as well as decreased modularity compared with healthy controls, showing disruption in small-world and modular network structures. In regional metrics, the putamen, pallidum and thalamus exhibited significantly decreased nodal degree and efficiency in obese subjects. Obese subjects also showed decreased connectivity of cortico-striatal/cortico-thalamic networks associated with putaminal and cortical motor regions. These findings were significant with ME-ICA with limited group differences observed with conventional denoising or single-echo analysis. Using this data-driven analysis of multi-echo rsfMRI data, we found disruption in global network properties and motor cortico-striatal networks in obesity consistent with habit formation theories. Our findings highlight the role of network properties in pathological food misuse as possible biomarkers and therapeutic targets.
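The global metrics named in this record can be reproduced with standard tools. The following sketch, assuming networkx and a random stand-in for the 90-region correlation matrix, binarizes at a fixed edge density and computes global efficiency, local efficiency, and modularity; the 10% density mirrors the 5%-25% range mentioned, but all data here are placeholders.

```python
# Binarize a regional correlation matrix at a target edge density, then
# compute small-world/modularity-style graph metrics with networkx.
import numpy as np
import networkx as nx
from networkx.algorithms import community

rng = np.random.default_rng(0)
n = 90
corr = np.corrcoef(rng.normal(size=(n, 200)))       # placeholder BOLD correlations
np.fill_diagonal(corr, 0)

density = 0.10                                      # keep top 10% of possible edges
iu = np.triu_indices(n, k=1)
k = int(density * len(iu[0]))
cut = np.sort(corr[iu])[-k]                         # threshold for top-k correlations
adj = (corr >= cut).astype(int)
np.fill_diagonal(adj, 0)

G = nx.from_numpy_array(adj)
parts = community.greedy_modularity_communities(G)
print("global efficiency:", nx.global_efficiency(G))
print("local efficiency:", nx.local_efficiency(G))
print("modularity:", community.modularity(G, parts))
```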
Cao, Weidong; Bean, Brian; Corey, Scott; Coursey, Johnathan S; Hasson, Kenton C; Inoue, Hiroshi; Isano, Taisuke; Kanderian, Sami; Lane, Ben; Liang, Hongye; Murphy, Brian; Owen, Greg; Shinoda, Nobuhiko; Zeng, Shulin; Knight, Ivor T
2016-06-01
We report the development of an automated genetic analyzer for human sample testing based on microfluidic rapid polymerase chain reaction (PCR) with high-resolution melting analysis (HRMA). The integrated DNA microfluidic cartridge was used on a platform designed with a robotic pipettor system that works by sequentially picking up different test solutions from a 384-well plate, mixing them in the tips, and delivering mixed fluids to the DNA cartridge. A novel image feedback flow control system based on a Canon 5D Mark II digital camera was developed for controlling fluid movement through a complex microfluidic branching network without the use of valves. The same camera was used for measuring the high-resolution melt curve of DNA amplicons that were generated in the microfluidic chip. Owing to fast heating and cooling as well as sensitive temperature measurement in the microfluidic channels, the time frame for PCR and HRMA was dramatically reduced from hours to minutes. Preliminary testing results demonstrated that rapid serial PCR and HRMA are possible while still achieving high data quality that is suitable for human sample testing. © 2015 Society for Laboratory Automation and Screening.
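As a small illustration of the HRMA step such an analyzer automates, the sketch below computes the negative derivative melt curve (-dF/dT) from a fluorescence-versus-temperature trace and reads the melting temperature off its peak; the synthetic sigmoid and smoothing window are assumptions, not instrument data.

```python
# Derivative melt-curve analysis: smooth fluorescence, take -dF/dT, and
# report the peak as the melting temperature (Tm).
import numpy as np

def melt_curve(temps, fluorescence, smooth=5):
    f = np.convolve(fluorescence, np.ones(smooth) / smooth, mode="same")  # moving average
    dfdt = -np.gradient(f, temps)                                         # -dF/dT
    tm = temps[np.argmax(dfdt)]                                           # peak = Tm
    return dfdt, tm

temps = np.linspace(70, 95, 251)
f = 1.0 / (1.0 + np.exp((temps - 84.0) / 0.6))      # synthetic sigmoidal melt, Tm ~ 84 C
dfdt, tm = melt_curve(temps, f)
print(f"estimated Tm: {tm:.2f} C")
```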
Li, Tim MH; Wong, Paul WC; Lai, Eliza SY; Yip, Paul SF
2013-01-01
Background: Internet-based learning programs provide people with massive health care information and self-help guidelines on improving their health. The advent of Web 2.0 and social networks renders significant flexibility to embedding highly interactive components, such as games, to foster learning processes. The effectiveness of game-based learning on social networks has not yet been fully evaluated. Objectives: The aim of this study was to assess the effectiveness of a fully automated, Web-based, social network electronic game on enhancing mental health knowledge and problem-solving skills of young people. We investigated potential motivational constructs directly affecting the learning outcome. Gender differences in learning outcome and motivation were also examined. Methods: A pre/posttest design was used to evaluate the fully automated Web-based intervention. Participants, recruited from a closed online user group, self-assessed their mental health literacy and motivational constructs before and after completing the game within a 3-week period. The electronic game was designed according to cognitive-behavioral approaches. Completers and intent-to-treat analyses, using multiple imputation for missing data, were performed. Regression analysis with backward selection was employed when examining the relationship between knowledge enhancement and motivational constructs. Results: The sample included 73 undergraduates (42 females) for completers analysis. The gaming approach was effective in enhancing young people’s mental health literacy (d=0.65). The finding was also consistent with the intent-to-treat analysis, which included 127 undergraduates (75 females). No gender differences were found in learning outcome (P=.97). Intrinsic goal orientation was the primary factor in learning motivation, whereas test anxiety was successfully alleviated in the game setting. No gender differences were found on any learning motivation subscales (P>.10). We also found that participants’ self-efficacy for learning and performance, as well as test anxiety, significantly affected their learning outcomes, whereas other motivational subscales were statistically nonsignificant. Conclusions: Electronic games implemented through social networking sites appear to effectively enhance users’ mental health literacy. PMID:23676714
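The effect size reported here (d=0.65) is a standardized mean difference; as a reminder of the computation, a minimal pre/post Cohen's d sketch follows, with invented score arrays.

```python
# Minimal Cohen's d for a pre/post design: mean gain divided by the pooled
# standard deviation. Score arrays are invented for illustration.
import numpy as np

def cohens_d(pre, post):
    pre, post = np.asarray(pre, float), np.asarray(post, float)
    pooled = np.sqrt((pre.var(ddof=1) + post.var(ddof=1)) / 2.0)
    return (post.mean() - pre.mean()) / pooled

rng = np.random.default_rng(10)
pre = rng.normal(60, 10, 73)                         # pretest literacy scores
post = pre + rng.normal(6, 8, 73)                    # posttest after the game
print(f"d = {cohens_d(pre, post):.2f}")
```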
Parametric analysis of parameters for electrical-load forecasting using artificial neural networks
NASA Astrophysics Data System (ADS)
Gerber, William J.; Gonzalez, Avelino J.; Georgiopoulos, Michael
1997-04-01
Accurate total system electrical load forecasting is a necessary part of resource management for power generation companies. The better the hourly load forecast, the more closely the power generation assets of the company can be configured to minimize the cost. Automating this process is a profitable goal and neural networks should provide an excellent means of doing the automation. However, prior to developing such a system, the optimal set of input parameters must be determined. The approach of this research was to determine what those inputs should be through a parametric study of potentially good inputs. Input parameters tested were ambient temperature, total electrical load, the day of the week, humidity, dew point temperature, daylight savings time, length of daylight, season, forecast light index and forecast wind velocity. For testing, a limited number of temperatures and total electrical loads were used as a basic reference input parameter set. Most parameters showed some forecasting improvement when added individually to the basic parameter set. Significantly, major improvements were exhibited with the day of the week, dew point temperatures, additional temperatures and loads, forecast light index and forecast wind velocity.
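A minimal sketch of the parametric comparison this abstract describes, assuming scikit-learn and synthetic hourly data: train a neural network on a basic reference feature set (temperature, prior load), then add a candidate input (day of week) and compare forecast error.

```python
# Compare load-forecast error for a basic feature set vs. the same set plus
# one candidate input. Data are synthetic; the real study used measured loads.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(1)
hours = 24 * 365
temp = 20 + 10 * np.sin(np.arange(hours) * 2 * np.pi / 24) + rng.normal(0, 2, hours)
dow = (np.arange(hours) // 24) % 7
load = 500 + 8 * temp + 40 * (dow < 5) + rng.normal(0, 10, hours)   # weekday bump
prev = np.roll(load, 24)                                            # load 24 h earlier

y = load[24:]
feature_sets = {
    "basic (temp, prior load)": np.column_stack([temp, prev])[24:],
    "basic + day of week": np.column_stack([temp, prev, dow])[24:],
}
for name, X in feature_sets.items():
    Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.25, random_state=0)
    model = make_pipeline(StandardScaler(),
                          MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000,
                                       random_state=0)).fit(Xtr, ytr)
    print(name, "MAE:", round(mean_absolute_error(yte, model.predict(Xte)), 2))
```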
Lubbock, Alexander L. R.; Katz, Elad; Harrison, David J.; Overton, Ian M.
2013-01-01
Tissue microarrays (TMAs) allow multiplexed analysis of tissue samples and are frequently used to estimate biomarker protein expression in tumour biopsies. TMA Navigator (www.tmanavigator.org) is an open access web application for analysis of TMA data and related information, accommodating categorical, semi-continuous and continuous expression scores. Non-biological variation, or batch effects, can hinder data analysis and may be mitigated using the ComBat algorithm, which is incorporated with enhancements for automated application to TMA data. Unsupervised grouping of samples (patients) is provided according to Gaussian mixture modelling of marker scores, with cardinality selected by Bayesian information criterion regularization. Kaplan–Meier survival analysis is available, including comparison of groups identified by mixture modelling using the Mantel-Cox log-rank test. TMA Navigator also supports network inference approaches useful for TMA datasets, which often constitute comparatively few markers. Tissue and cell-type specific networks derived from TMA expression data offer insights into the molecular logic underlying pathophenotypes, towards more effective and personalized medicine. Output is interactive, and results may be exported for use with external programs. Private anonymous access is available, and user accounts may be generated for easier data management. PMID:23761446
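The cardinality-selection step described here (Gaussian mixtures regularized by the Bayesian information criterion) has a compact scikit-learn expression; marker scores below are synthetic stand-ins for TMA data.

```python
# Fit Gaussian mixtures over marker scores for several component counts and
# pick the cardinality that minimizes BIC.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(2)
scores = np.vstack([rng.normal(0.2, 0.05, (60, 3)),   # placeholder marker scores,
                    rng.normal(0.7, 0.08, (40, 3))])  # two latent patient groups

fits = [GaussianMixture(n_components=k, random_state=0).fit(scores)
        for k in range(1, 7)]
bics = [m.bic(scores) for m in fits]
best = fits[int(np.argmin(bics))]
groups = best.predict(scores)
print("selected components:", best.n_components,
      "| group sizes:", np.bincount(groups))
```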
Automated System Marketplace 1993. Part I: Focus on Minicomputers.
ERIC Educational Resources Information Center
Bridge, Frank R.
1993-01-01
The first part of the annual automation marketplace survey examines minicomputer systems in libraries. Highlights include vendor consolidation and acquisitions; system interconnection; networked databases; products related to the Americans with Disabilities Act; multimedia; vendor installations worldwide; academic versus public library…
Choosing the Right Integrator for Your Building Automation Project.
ERIC Educational Resources Information Center
Podgorski, Will
2002-01-01
Examines the prevailing definitions and responsibilities of product, network, and system integrators for building automation systems; offers a novel approach to system integration; and sets realistic expectations for the owner in terms of benefits, outcomes, and overall values. (EV)
Using Bayesian networks to support decision-focused information retrieval
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lehner, P.; Elsaesser, C.; Seligman, L.
This paper has described an approach to controlling the process of pulling data/information from distributed databases in a way that is specific to a person's decision-making context. Our prototype implementation of this approach uses a knowledge-based planner to generate a plan, an automatically constructed Bayesian network to evaluate the plan, specialized processing of the network to derive key information items that would substantially impact the evaluation of the plan (e.g., determine that replanning is needed), and automated construction of Standing Requests for Information (SRIs), which are automated functions that monitor changes and trends in distributed databases that are relevant to the key information items. The emphasis of this paper is on how Bayesian networks are used.
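As a toy illustration of the decision-focused idea, the hand-rolled sketch below scores a plan with a tiny Bayesian network and measures which unobserved variable would most change that score, a stand-in for deriving "key information items"; the three-node structure and all probabilities are invented, not the paper's automatically constructed network.

```python
# Tiny Bayesian network: weather -> route_open -> plan_success. Enumerate to
# get P(success), then compare outcomes under each weather observation to see
# which information item most impacts the plan evaluation.
from itertools import product

p_clear = 0.7                                        # P(weather clear)
p_open_given = {True: 0.9, False: 0.4}               # P(route open | weather)
p_success_given = {True: 0.8, False: 0.2}            # P(success | route open)

def p_success(weather_evidence=None):
    """Marginal P(success), optionally with weather fixed as evidence."""
    total = 0.0
    for clear, open_ in product([True, False], repeat=2):
        if weather_evidence is not None and clear != weather_evidence:
            continue
        pw = 1.0 if weather_evidence is not None else (p_clear if clear else 1 - p_clear)
        po = p_open_given[clear] if open_ else 1 - p_open_given[clear]
        total += pw * po * p_success_given[open_]
    return total

baseline = p_success()
impact = abs(p_success(True) - p_success(False))     # spread across weather outcomes
print(f"P(success) = {baseline:.3f}; weather impact = {impact:.3f} -> monitor weather")
```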
Automation of Some Operations of a Wind Tunnel Using Artificial Neural Networks
NASA Technical Reports Server (NTRS)
Decker, Arthur J.; Buggele, Alvin E.
1996-01-01
Artificial neural networks were used successfully to sequence operations in a small, recently modernized, supersonic wind tunnel at NASA-Lewis Research Center. The neural nets generated correct estimates of shadowgraph patterns, pressure sensor readings and mach numbers for conditions occurring shortly after startup and extending to fully developed flow. Artificial neural networks were trained and tested for estimating: sensor readings from shadowgraph patterns, shadowgraph patterns from shadowgraph patterns and sensor readings from sensor readings. The 3.81 by 10 in. (0.0968 by 0.254 m) tunnel was operated with its mach 2.0 nozzle, and shadowgraph was recorded near the nozzle exit. These results support the thesis that artificial neural networks can be combined with current workstation technology to automate wind tunnel operations.
Concurrent white matter bundles and grey matter networks using independent component analysis.
O'Muircheartaigh, Jonathan; Jbabdi, Saad
2018-04-15
Developments in non-invasive diffusion MRI tractography techniques have permitted the investigation of both the anatomy of white matter pathways connecting grey matter regions and their structural integrity. In parallel, there has been an expansion in automated techniques aimed at parcellating grey matter into distinct regions based on functional imaging. Here we apply independent component analysis to whole-brain tractography data to automatically extract brain networks based on their associated white matter pathways. This method decomposes the tractography data into components that consist of paired grey matter 'nodes' and white matter 'edges', and automatically separates major white matter bundles, including known cortico-cortical and cortico-subcortical tracts. We show how this framework can be used to investigate individual variations in brain networks (in terms of both nodes and edges) as well as their associations with individual differences in behaviour and anatomy. Finally, we investigate correspondences between tractography-based brain components and several canonical resting-state networks derived from functional MRI. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
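The decomposition step can be illustrated generically: ICA applied to a (voxels x seed-regions) tractography visitation matrix yields paired spatial maps and loadings, analogous to the node/edge components described. The matrix below is random noise and sklearn's FastICA stands in for the authors' implementation.

```python
# ICA on a placeholder tractography matrix: each component pairs a voxel map
# (white matter "edge") with seed-region loadings (grey matter "nodes").
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(11)
tracto = rng.random((5000, 200))                     # voxels x seed-regions, placeholder
ica = FastICA(n_components=20, random_state=0, max_iter=500)
sources = ica.fit_transform(tracto)                  # voxel maps of 20 components
mixing = ica.mixing_                                 # seed-region loadings per component
print(sources.shape, mixing.shape)                   # (5000, 20) (200, 20)
```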
NASA Technical Reports Server (NTRS)
Connell, Andrea M.
2011-01-01
The Deep Space Network (DSN) has three communication facilities which handle telemetry, commands, and other data relating to spacecraft missions. The network requires these three sites to share data with each other and with the Jet Propulsion Laboratory for processing and distribution. Many database management systems have replication capabilities built in, which means that data updates made at one location will be automatically propagated to other locations. This project examines multiple replication solutions, looking for stability, automation, flexibility, performance, and cost. After comparing these features, Oracle Streams is chosen for closer analysis. Two Streams environments are configured - one with a Master/Slave architecture, in which a single server is the source for all data updates, and the second with a Multi-Master architecture, in which updates originating from any of the servers will be propagated to all of the others. These environments are tested for data type support, conflict resolution, performance, changes to the data structure, and behavior during and after network or server outages. Through this experimentation, it is determined which requirements of the DSN can be met by Oracle Streams and which cannot.
Rapid test for the detection of hazardous microbiological material
NASA Astrophysics Data System (ADS)
Mordmueller, Mario; Bohling, Christian; John, Andreas; Schade, Wolfgang
2009-09-01
Since the anthrax attacks committed around the world beginning in 2001, the fast detection and determination of biological samples has attracted interest. A very promising method for a rapid test is Laser Induced Breakdown Spectroscopy (LIBS). LIBS is an optical method which uses time-resolved or time-integrated spectral analysis of optical plasma emission after pulsed laser excitation. Even though LIBS is well established for the determination of metals and other inorganic materials, the analysis of microbiological organisms is difficult due to their very similar stoichiometric composition. Computer-assisted chemometrics is a very useful approach for analyzing such similar LIBS spectra. In this paper we report on first results of developing a compact and fully automated rapid test for the detection of hazardous microbiological material. Experiments have been carried out with two setups: a bulky one composed of standard laboratory components and a compact one consisting of miniaturized industrial components. Both setups work at an excitation wavelength of λ = 1064 nm (Nd:YAG). Data analysis is done by Principal Component Analysis (PCA) with an adjacent neural network for fully automated sample identification.
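A compact stand-in for the PCA-plus-neural-network chemometric chain, assuming scikit-learn; the two classes of "spectra" are synthetic, so only the pipeline shape, not the accuracy, carries over to real LIBS data.

```python
# Reduce spectra with PCA, then classify with a small neural network.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(9)
n_per_class, n_channels = 40, 300
classA = rng.normal(0, 1, (n_per_class, n_channels)) + np.sin(np.linspace(0, 6, n_channels))
classB = rng.normal(0, 1, (n_per_class, n_channels)) + np.cos(np.linspace(0, 6, n_channels))
X = np.vstack([classA, classB])
y = np.array([0] * n_per_class + [1] * n_per_class)

clf = make_pipeline(PCA(n_components=10),
                    MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                                  random_state=0))
print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```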
Automated detection of videotaped neonatal seizures based on motion segmentation methods.
Karayiannis, Nicolaos B; Tao, Guozhi; Frost, James D; Wise, Merrill S; Hrachovy, Richard A; Mizrahi, Eli M
2006-07-01
This study was aimed at the development of a seizure detection system by training neural networks using quantitative motion information extracted by motion segmentation methods from short video recordings of infants monitored for seizures. The motion of the infants' body parts was quantified by temporal motion strength signals extracted from video recordings by motion segmentation methods based on optical flow computation. The area of each frame occupied by the infants' moving body parts was segmented by direct thresholding, by clustering of the pixel velocities, and by clustering the motion parameters obtained by fitting an affine model to the pixel velocities. The computational tools and procedures developed for automated seizure detection were tested and evaluated on 240 short video segments selected and labeled by physicians from a set of video recordings of 54 patients exhibiting myoclonic seizures (80 segments), focal clonic seizures (80 segments), and random infant movements (80 segments). The experimental study described in this paper provided the basis for selecting the most effective strategy for training neural networks to detect neonatal seizures as well as the decision scheme used for interpreting the responses of the trained neural networks. Depending on the decision scheme used for interpreting the responses of the trained neural networks, the best neural networks exhibited sensitivity above 90% or specificity above 90%. The best among the motion segmentation methods developed in this study produced quantitative features that constitute a reliable basis for detecting myoclonic and focal clonic neonatal seizures. The performance targets of this phase of the project may be achieved by combining the quantitative features described in this paper with those obtained by analyzing motion trajectory signals produced by motion tracking methods. A video system based upon automated analysis potentially offers a number of advantages. Infants who are at risk for seizures could be monitored continuously using relatively inexpensive and non-invasive video techniques that supplement direct observation by nursery personnel. This would represent a major advance in seizure surveillance and offers the possibility for earlier identification of potential neurological problems and subsequent intervention.
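One way to realize the motion-strength signal described above, assuming OpenCV's Farneback dense optical flow: compute per-pixel velocities between consecutive frames, segment moving pixels by direct thresholding, and take the mean magnitude as the temporal signal. The frames here are random noise, so the output only demonstrates the computation.

```python
# Temporal motion-strength signal from dense optical flow with direct
# thresholding of pixel velocities.
import numpy as np
import cv2

def motion_strength(frames, mag_thresh=1.0):
    signal = []
    for prev, cur in zip(frames, frames[1:]):
        flow = cv2.calcOpticalFlowFarneback(prev, cur, None,
                                            0.5, 3, 15, 3, 5, 1.2, 0)
        mag = np.linalg.norm(flow, axis=2)           # per-pixel velocity magnitude
        moving = mag > mag_thresh                    # direct thresholding of velocities
        signal.append(mag[moving].mean() if moving.any() else 0.0)
    return np.array(signal)

rng = np.random.default_rng(3)
frames = [(rng.random((120, 160)) * 255).astype(np.uint8) for _ in range(10)]
print(motion_strength(frames))
```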
Forecasting Flare Activity Using Deep Convolutional Neural Networks
NASA Astrophysics Data System (ADS)
Hernandez, T.
2017-12-01
Current operational flare forecasting relies on human morphological analysis of active regions and the persistence of solar flare activity through time (i.e. that the Sun will continue to do what it is doing right now: flaring or remaining calm). In this talk we present the results of applying deep Convolutional Neural Networks (CNNs) to the problem of solar flare forecasting. CNNs operate by training a set of tunable spatial filters that, in combination with neural layer interconnectivity, allow CNNs to automatically identify significant spatial structures predictive for classification and regression problems. We will start by discussing the applicability and success rate of the approach, the advantages it has over non-automated forecasts, and how mining our trained neural network provides a fresh look into the mechanisms behind magnetic energy storage and release.
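A minimal CNN sketch in the spirit described, with tunable spatial filters over magnetogram-like patches ending in a flare / no-flare score; the input size and architecture are illustrative assumptions, not the talk's actual model.

```python
# Small convolutional classifier for active-region patches (PyTorch).
import torch
import torch.nn as nn

class FlareCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 16 * 16, 64), nn.ReLU(),
            nn.Linear(64, 2),                        # flare vs. no-flare logits
        )

    def forward(self, x):
        return self.head(self.features(x))

model = FlareCNN()
patch = torch.randn(4, 1, 64, 64)                    # batch of 64x64 patches
print(model(patch).shape)                            # torch.Size([4, 2])
```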
Information security system quality assessment through the intelligent tools
NASA Astrophysics Data System (ADS)
Trapeznikov, E. V.
2018-04-01
The technology development has shown the automated system information security comprehensive analysis necessity. The subject area analysis indicates the study relevance. The research objective is to develop the information security system quality assessment methodology based on the intelligent tools. The basis of the methodology is the information security assessment model in the information system through the neural network. The paper presents the security assessment model, its algorithm. The methodology practical implementation results in the form of the software flow diagram are represented. The practical significance of the model being developed is noted in conclusions.
NASA Technical Reports Server (NTRS)
Pham, Timothy T.; Machuzak, Richard J.; Bedrossian, Alina; Kelly, Richard M.; Liao, Jason C.
2012-01-01
This software provides an automated capability to measure and qualify the frequency stability performance of the Deep Space Network (DSN) ground system, using daily spacecraft tracking data. The results help to verify if the DSN performance is meeting its specification, therefore ensuring commitments to flight missions; in particular, the radio science investigations. The rich set of data also helps the DSN Operations and Maintenance team to identify the trends and patterns, allowing them to identify the antennas of lower performance and implement corrective action in a timely manner. Unlike the traditional approach where the performance can only be obtained from special calibration sessions that are both time-consuming and require manual setup, the new method taps into the daily spacecraft tracking data. This new approach significantly increases the amount of data available for analysis, roughly by two orders of magnitude, making it possible to conduct trend analysis with good confidence. The software is built with automation in mind for end-to-end processing. From the inputs gathering to computation analysis and later data visualization of the results, all steps are done automatically, making the data production at near zero cost. This allows the limited engineering resource to focus on high-level assessment and to follow up with the exceptions/deviations. To make it possible to process the continual stream of daily incoming data without much effort, and to understand the results quickly, the processing needs to be automated and the data summarized at a high level. Special attention needs to be given to data gathering, input validation, handling anomalous conditions, computation, and presenting the results in a visual form that makes it easy to spot items of exception/ deviation so that further analysis can be directed and corrective actions followed.
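The abstract does not name the statistic used; the Allan deviation is the customary measure of frequency stability, so the sketch below implements a generic overlapping Allan deviation over a synthetic phase record as an illustration, not as the DSN software's actual computation.

```python
# Overlapping Allan deviation at averaging time m*tau0 from phase samples.
import numpy as np

def overlapping_adev(phase, tau0, m):
    """Allan deviation from second differences of phase data (seconds)."""
    x = np.asarray(phase, dtype=float)
    n = len(x)
    if n < 2 * m + 1:
        raise ValueError("not enough samples for this averaging factor")
    d = x[2 * m:] - 2 * x[m:n - m] + x[:n - 2 * m]   # second differences of phase
    avar = (d ** 2).sum() / (2.0 * (m * tau0) ** 2 * (n - 2 * m))
    return np.sqrt(avar)

rng = np.random.default_rng(4)
phase = np.cumsum(rng.normal(0, 1e-12, 100_000))     # synthetic white-FM phase record
for m in (1, 10, 100, 1000):
    print(m, overlapping_adev(phase, tau0=1.0, m=m))
```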
Vlaic, Sebastian; Hoffmann, Bianca; Kupfer, Peter; Weber, Michael; Dräger, Andreas
2013-09-01
GRN2SBML automatically encodes gene regulatory networks derived from several inference tools in the Systems Biology Markup Language (SBML). Providing a graphical user interface, the networks can be annotated via the Simple Object Access Protocol (SOAP)-based application programming interface of BioMart Central Portal and the Minimum Information Required In the Annotation of Models (MIRIAM) registry. Additionally, we provide an R-package, which processes the output of supported inference algorithms and automatically passes all required parameters to GRN2SBML. Therefore, GRN2SBML closes a gap in the processing pipeline between the inference of gene regulatory networks and their subsequent analysis, visualization and storage. GRN2SBML is freely available under the GNU Public License version 3 and can be downloaded from http://www.hki-jena.de/index.php/0/2/490. General information on GRN2SBML, examples and tutorials are available at the tool's web page.
Biomorphic networks: approach to invariant feature extraction and segmentation for ATR
NASA Astrophysics Data System (ADS)
Baek, Andrew; Farhat, Nabil H.
1998-10-01
Invariant features in two-dimensional binary images are extracted in a single-layer network of locally coupled spiking (pulsating) model neurons with prescribed synapto-dendritic response. The feature vector for an image is represented as invariant structure in the aggregate histogram of interspike intervals, obtained by computing time intervals between successive spikes produced by each neuron over a given period of time and combining such intervals from all neurons in the network into a histogram. Simulation results show that the feature vectors are more pattern-specific and invariant under translation, rotation, and change in scale or intensity than achieved in earlier work. We also describe an application of such networks to segmentation of line (edge-enhanced or silhouette) images. The biomorphic spiking network's capabilities in segmentation and invariant feature extraction may, when combined, prove valuable in Automated Target Recognition (ATR) and other automated object recognition systems.
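The aggregate interspike-interval histogram is simple to compute; a sketch with synthetic spike trains follows (neuron count, bin count and range are assumptions).

```python
# Pool interspike intervals from every neuron into one normalized aggregate
# histogram and use it as the feature vector.
import numpy as np

def isi_feature_vector(spike_times_per_neuron, bins=32, t_max=100.0):
    intervals = []
    for times in spike_times_per_neuron:
        t = np.sort(np.asarray(times))
        intervals.extend(np.diff(t))                 # intervals between successive spikes
    hist, _ = np.histogram(intervals, bins=bins, range=(0.0, t_max))
    return hist / max(hist.sum(), 1)                 # normalized aggregate ISI histogram

rng = np.random.default_rng(5)
spikes = [np.cumsum(rng.exponential(5.0, size=50)) for _ in range(64)]  # 64 neurons
print(isi_feature_vector(spikes)[:8])
```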
Cytopathological image analysis using deep-learning networks in microfluidic microscopy.
Gopakumar, G; Hari Babu, K; Mishra, Deepak; Gorthi, Sai Siva; Sai Subrahmanyam, Gorthi R K
2017-01-01
Cytopathologic testing is one of the most critical steps in the diagnosis of diseases, including cancer. However, the task is laborious and demands skill. Associated high cost and low throughput drew considerable interest in automating the testing process. Several neural network architectures were designed to provide human expertise to machines. In this paper, we explore and propose the feasibility of using deep-learning networks for cytopathologic analysis by performing the classification of three important unlabeled, unstained leukemia cell lines (K562, MOLT, and HL60). The cell images used in the classification are captured using a low-cost, high-throughput cell imaging technique: microfluidics-based imaging flow cytometry. We demonstrate that without any conventional fine segmentation followed by explicit feature extraction, the proposed deep-learning algorithms effectively classify the coarsely localized cell lines. We show that the designed deep belief network as well as the deeply pretrained convolutional neural network outperform the conventionally used decision systems and are important in the medical domain, where the availability of labeled data is limited for training. We hope that our work enables the development of a clinically significant high-throughput microfluidic microscopy-based tool for disease screening/triaging, especially in resource-limited settings.
Automated Interpretation of Blood Culture Gram Stains by Use of a Deep Convolutional Neural Network.
Smith, Kenneth P; Kang, Anthony D; Kirby, James E
2018-03-01
Microscopic interpretation of stained smears is one of the most operator-dependent and time-intensive activities in the clinical microbiology laboratory. Here, we investigated application of an automated image acquisition and convolutional neural network (CNN)-based approach for automated Gram stain classification. Using an automated microscopy platform, uncoverslipped slides were scanned with a 40× dry objective, generating images of sufficient resolution for interpretation. We collected 25,488 images from positive blood culture Gram stains prepared during routine clinical workup. These images were used to generate 100,213 crops containing Gram-positive cocci in clusters, Gram-positive cocci in chains/pairs, Gram-negative rods, or background (no cells). These categories were targeted for proof-of-concept development as they are associated with the majority of bloodstream infections. Our CNN model achieved a classification accuracy of 94.9% on a test set of image crops. Receiver operating characteristic (ROC) curve analysis indicated a robust ability to differentiate between categories with an area under the curve of >0.98 for each. After training and validation, we applied the classification algorithm to new images collected from 189 whole slides without human intervention. Sensitivity and specificity were 98.4% and 75.0% for Gram-positive cocci in chains and pairs, 93.2% and 97.2% for Gram-positive cocci in clusters, and 96.3% and 98.1% for Gram-negative rods. Taken together, our data support a proof of concept for a fully automated classification methodology for blood-culture Gram stains. Importantly, the algorithm was highly adept at identifying image crops with organisms and could be used to present prescreened, classified crops to technologists to accelerate smear review. This concept could potentially be extended to all Gram stain interpretive activities in the clinical laboratory. Copyright © 2018 American Society for Microbiology.
Automated Network Mapping and Topology Verification
2016-06-01
The current military reliance on computer networks for operational missions and administrative duties makes network mapping and topology verification essential. The collection of information includes amplifying data about the networked devices, such as hardware details, logical addressing schemes, and operating systems.
Autoscoring Essays Based on Complex Networks
ERIC Educational Resources Information Center
Ke, Xiaohua; Zeng, Yongqiang; Luo, Haijiao
2016-01-01
This article presents a novel method, the Complex Dynamics Essay Scorer (CDES), for automated essay scoring using complex network features. Texts produced by college students in China were represented as scale-free networks (e.g., a word adjacency model) from which typical network features, such as the in-/out-degrees, clustering coefficient (CC),…
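CDES's exact feature set is not spelled out in this truncated record; the sketch below builds a word-adjacency network from a toy sentence and reads off two typical features the abstract names (degree and clustering coefficient), using networkx.

```python
# Word-adjacency network features for essay scoring: nodes are words, edges
# link adjacent words; degree and clustering summarize the text's structure.
import networkx as nx

def adjacency_network_features(text):
    words = text.lower().split()
    G = nx.DiGraph()
    for a, b in zip(words, words[1:]):               # edge between adjacent words
        G.add_edge(a, b)
    out_degrees = [d for _, d in G.out_degree()]
    cc = nx.average_clustering(G.to_undirected())    # clustering coefficient (CC)
    return {
        "nodes": G.number_of_nodes(),
        "mean_out_degree": sum(out_degrees) / len(out_degrees),
        "clustering": cc,
    }

essay = "the quick brown fox jumps over the lazy dog and the fox runs"
print(adjacency_network_features(essay))
```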
NASA Technical Reports Server (NTRS)
Ramamoorthy, P. A.; Huang, Song; Govind, Girish
1991-01-01
In fault diagnosis, control and real-time monitoring, both timing and accuracy are critical for operators or machines to reach proper solutions or appropriate actions. Expert systems are becoming more popular in the manufacturing community for dealing with such problems. In recent years, neural networks have seen a revival and their applications have spread to many areas of science and engineering. A method of using neural networks to implement rule-based expert systems for time-critical applications is discussed here. This method can convert a given rule-based system into a neural network with fixed weights and thresholds. The rules governing the translation are presented along with some examples. We also present the results of automated machine implementation of such networks from the given rule base. This significantly simplifies the translation process from conventional rule-based systems to neural network expert systems. Results comparing the performance of the proposed neural network approach with the classical approach are given. The possibility of very large scale integration (VLSI) realization of such neural network expert systems is also discussed.
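The paper's specific translation rules are not reproduced here; the sketch uses the standard encoding of AND/OR rules as fixed-weight threshold units to make the rule-to-network idea concrete.

```python
# IF-AND and IF-OR rules as threshold units with fixed weights: the rule base
# maps directly onto a network with no training required.
import numpy as np

def threshold_unit(inputs, weights, threshold):
    return int(np.dot(inputs, weights) >= threshold)

# Rule 1: IF overheat AND vibration THEN shutdown  -> AND unit (threshold 2)
# Rule 2: IF shutdown OR operator_stop THEN alarm  -> OR unit (threshold 1)
def expert_net(overheat, vibration, operator_stop):
    shutdown = threshold_unit([overheat, vibration], weights=[1, 1], threshold=2)
    alarm = threshold_unit([shutdown, operator_stop], weights=[1, 1], threshold=1)
    return shutdown, alarm

for case in [(1, 1, 0), (1, 0, 0), (0, 0, 1)]:
    print(case, "->", expert_net(*case))
```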
Reconstruction of biological pathways and metabolic networks from in silico labeled metabolites.
Hadadi, Noushin; Hafner, Jasmin; Soh, Keng Cher; Hatzimanikatis, Vassily
2017-01-01
Reaction atom mappings track the positional changes of all of the atoms between the substrates and the products as they undergo the biochemical transformation. However, information on atom transitions in the context of metabolic pathways is not widely available in the literature. The understanding of metabolic pathways at the atomic level is of great importance as it can deconvolute the overlapping catabolic/anabolic pathways resulting in the observed metabolic phenotype. The automated identification of atom transitions within a metabolic network is a very challenging task since the degree of complexity of metabolic networks dramatically increases when we transit from metabolite-level studies to atom-level studies. Despite being studied extensively in various approaches, the field of atom mapping of metabolic networks is lacking an automated approach, which (i) accounts for the information of reaction mechanism for atom mapping and (ii) is extendable from individual atom-mapped reactions to atom-mapped reaction networks. Hereby, we introduce a computational framework, iAM.NICE (in silico Atom Mapped Network Integrated Computational Explorer), for the systematic atom-level reconstruction of metabolic networks from in silico labelled substrates. iAM.NICE is to our knowledge the first automated atom-mapping algorithm that is based on the underlying enzymatic biotransformation mechanisms, and its application goes beyond individual reactions and it can be used for the reconstruction of atom-mapped metabolic networks. We illustrate the applicability of our method through the reconstruction of atom-mapped reactions of the KEGG database and we provide an example of an atom-level representation of the core metabolic network of E. coli. Copyright © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Computer technology applications in industrial and organizational psychology.
Crespin, Timothy R; Austin, James T
2002-08-01
This article reviews computer applications developed and utilized by industrial-organizational (I-O) psychologists, both in practice and in research. A primary emphasis is on applications developed for Internet usage, because this "network of networks" changes the way I-O psychologists work. The review focuses on traditional and emerging topics in I-O psychology. The first topic involves information technology applications in measurement, defined broadly across levels of analysis (persons, groups, organizations) and domains (abilities, personality, attitudes). Discussion then focuses on individual learning at work, both in formal training and in coping with continual automation of work. A section on job analysis follows, illustrating the role of computers and the Internet in studying jobs. Shifting focus to the group level of analysis, we briefly review how information technology is being used to understand and support cooperative work. Finally, special emphasis is given to the emerging "third discipline" in I-O psychology research-computational modeling of behavioral events in organizations. Throughout this review, themes of innovation and dissemination underlie a continuum between research and practice. The review concludes by setting a framework for I-O psychology in a computerized and networked world.
High-power transmitter automation, part 2
NASA Technical Reports Server (NTRS)
Gregg, M. A.
1981-01-01
The current status of the transmitter automation development is reported. The work described is applicable to all transmitters in the Deep Space Network. New interface and software designs are described which improve reliability and reduce the time required for subsystem turn on and klystron saturation.
1993-12-10
…applied to the 3-component IRIS/IDA data under simulated operational conditions. The result was a reduction in the number of false alarms produced by the automated processing and interpretation system by about 60%.
NASA Astrophysics Data System (ADS)
Nakagawa, M.; Akano, K.; Kobayashi, T.; Sekiguchi, Y.
2017-09-01
Image-based virtual reality (VR) is a virtual space generated with panoramic images projected onto a primitive model. In image-based VR, realistic VR scenes can be generated with lower rendering cost, and network data can be described as relationships among VR scenes. The camera network data are generated manually or by an automated procedure using camera position and rotation data. When panoramic images are acquired in indoor environments, network data should be generated without Global Navigation Satellite Systems (GNSS) positioning data. Thus, we focused on image-based VR generation using a panoramic camera in indoor environments. We propose a methodology to automate network data generation using panoramic images for an image-based VR space. We verified and evaluated our methodology through five experiments in indoor environments, including a corridor, elevator hall, room, and stairs. We confirmed that our methodology can automatically reconstruct network data using panoramic images for image-based VR in indoor environments without GNSS position data.
Using Bayesian Networks and Decision Theory to Model Physical Security
2003-02-01
Home automation technologies allow a person to monitor and control various activities within a home or office setting. Cameras, sensors and other...components used along with the simple rules in the home automation software provide an environment where the lights, security and other appliances can be...monitored and controlled. These home automation technologies, however, lack the power to reason under uncertain conditions and thus the system can
Automated Power-Distribution System
NASA Technical Reports Server (NTRS)
Thomason, Cindy; Anderson, Paul M.; Martin, James A.
1990-01-01
Automated power-distribution system monitors and controls electrical power to modules in network. Handles both 208-V, 20-kHz single-phase alternating current and 120- to 150-V direct current. Power distributed to load modules from power-distribution control units (PDCU's) via subsystem distributors. Ring busses carry power to PDCU's from power source. Needs minimal attention. Detects faults and also protects against them. Potential applications include autonomous land vehicles and automated industrial process systems.
Stanislawski, Larry V.; Survila, Kornelijus; Wendel, Jeffrey; Liu, Yan; Buttenfield, Barbara P.
2018-01-01
This paper describes a workflow for automating the extraction of elevation-derived stream lines using open source tools with parallel computing support and testing the effectiveness of procedures in various terrain conditions within the conterminous United States. Drainage networks are extracted from the US Geological Survey 1/3 arc-second 3D Elevation Program elevation data having a nominal cell size of 10 m. This research demonstrates the utility of open source tools with parallel computing support for extracting connected drainage network patterns and handling depressions in 30 subbasins distributed across humid, dry, and transitional climate regions and in terrain conditions exhibiting a range of slopes. Special attention is given to low-slope terrain, where network connectivity is preserved by generating synthetic stream channels through lake and waterbody polygons. Conflation analysis compares the extracted streams with a 1:24,000-scale National Hydrography Dataset flowline network and shows that similarities are greatest for second- and higher-order tributaries.
Integrated Evaluation of Reliability and Power Consumption of Wireless Sensor Networks
Dâmaso, Antônio; Maciel, Paulo
2017-01-01
Power consumption is a primary interest in Wireless Sensor Networks (WSNs), and a large number of strategies have been proposed to evaluate it. However, those approaches usually neither consider reliability issues nor the power consumption of applications executing in the network. A central concern is the lack of consolidated solutions that enable us to evaluate the power consumption of applications and the network stack also considering their reliabilities. To solve this problem, we introduce a fully automatic solution to design power consumption aware WSN applications and communication protocols. The solution presented in this paper comprises a methodology to evaluate the power consumption based on the integration of formal models, a set of power consumption and reliability models, a sensitivity analysis strategy to select WSN configurations and a toolbox named EDEN to fully support the proposed methodology. This solution allows accurately estimating the power consumption of WSN applications and the network stack in an automated way. PMID:29113078
Vadigepalli, Rajanikanth; Chakravarthula, Praveen; Zak, Daniel E; Schwaber, James S; Gonye, Gregory E
2003-01-01
We have developed a bioinformatics tool named PAINT that automates the promoter analysis of a given set of genes for the presence of transcription factor binding sites. Based on coincidence of regulatory sites, this tool produces an interaction matrix that represents a candidate transcriptional regulatory network. This tool currently consists of (1) a database of promoter sequences of known or predicted genes in the Ensembl annotated mouse genome database, (2) various modules that can retrieve and process the promoter sequences for binding sites of known transcription factors, and (3) modules for visualization and analysis of the resulting set of candidate network connections. This information provides a substantially pruned list of genes and transcription factors that can be examined in detail in further experimental studies on gene regulation. Also, the candidate network can be incorporated into network identification methods in the form of constraints on feasible structures in order to render the algorithms tractable for large-scale systems. The tool can also produce output in various formats suitable for use in external visualization and analysis software. In this manuscript, PAINT is demonstrated in two case studies involving analysis of differentially regulated genes chosen from two microarray data sets. The first set is from a neuroblastoma N1E-115 cell differentiation experiment, and the second set is from neuroblastoma N1E-115 cells at different time intervals following exposure to neuropeptide angiotensin II. PAINT is available for use as an agent in BioSPICE simulation and analysis framework (www.biospice.org), and can also be accessed via a WWW interface at www.dbi.tju.edu/dbi/tools/paint/.
DOT National Transportation Integrated Search
2016-01-01
State highway agencies (SHAs) routinely employ semi-automated and automated image-based methods for network-level pavement-cracking data collection, and there are different types of pavement-cracking data collected by SHAs for reporting and manag...
Ionospheric Modeling: Development, Verification and Validation
2005-09-01
…facilitate the automated processing of a large network of GPS receiver data. Section 4: Calibration and Validation of Ionospheric Sensors. Reference: W. Rideout, A. Coster, P. Doherty, MIT Haystack, Automated Processing of GPS Data to Produce Worldwide TEC, NOFS Workshop, Estes Park, CO, January 2005.
NASA Technical Reports Server (NTRS)
Decker, Arthur J.; Krasowski, Michael J.; Weiland, Kenneth E.
1993-01-01
This report describes an effort at NASA Lewis Research Center to use artificial neural networks to automate the alignment and control of optical measurement systems. Specifically, it addresses the use of commercially available neural network software and hardware to direct alignments of the common laser-beam-smoothing spatial filter. The report presents a general approach for designing alignment records and combining these into training sets to teach optical alignment functions to neural networks and discusses the use of these training sets to train several types of neural networks. Neural network configurations used include the adaptive resonance network, the back-propagation-trained network, and the counter-propagation network. This work shows that neural networks can be used to produce robust sequencers. These sequencers can learn by example to execute the step-by-step procedures of optical alignment and also can learn adaptively to correct for environmentally induced misalignment. The long-range objective is to use neural networks to automate the alignment and operation of optical measurement systems in remote, harsh, or dangerous aerospace environments. This work also shows that when neural networks are trained by a human operator, training sets should be recorded, training should be executed, and testing should be done in a manner that does not depend on intellectual judgments of the human operator.
Ruan, W; Bürkle, T; Dudeck, J
2000-01-01
In this paper we present a data dictionary server for the automated navigation of information sources. The underlying knowledge is represented within a medical data dictionary. The mapping between medical terms and information sources is based on a semantic network. The key aspect of implementing the dictionary server is how to represent the semantic network in a way that is easy to navigate and to operate, i.e. how to abstract the semantic network and represent it in memory for various operations. This paper describes an object-oriented design based on Java that represents the semantic network as a group of objects. A node and its relationships to its neighbors are encapsulated in one object. Based on such a representation model, several operations have been implemented. They comprise the extraction of parts of the semantic network which can be reached from a given node, as well as finding all paths between a start node and a predefined destination node. This solution is independent of any given layout of the semantic structure; therefore the module, called the Giessen Data Dictionary Server, can act independently of a specific clinical information system. The dictionary server will be used to present clinical information, e.g. treatment guidelines or drug information sources, to the clinician in an appropriate working context. The server is invoked from clinical documentation applications which contain an infobutton. Automated navigation will guide the user to all the information relevant to her/his topic that is currently available inside our closed clinical network.
Connecting remote systems for demonstration of automation technologies
NASA Technical Reports Server (NTRS)
Brown, R. M.; Yee, R.
1988-01-01
An initial estimate of the communications requirements of the Systems Autonomy Demonstration Project (SADP) development and demonstration environments is presented. A proposed network paradigm is developed, and options for network topologies are explored.
Centralized and distributed control architectures under Foundation Fieldbus network.
Persechini, Maria Auxiliadora Muanis; Jota, Fábio Gonçalves
2013-01-01
This paper aims at discussing possible automation and control system architectures based on fieldbus networks in which the controllers can be implemented either in a centralized or in a distributed form. An experimental setup is used to demonstrate some of the addressed issues. The control and automation architecture is composed of a supervisory system, a programmable logic controller and various other devices connected to a Foundation Fieldbus H1 network. The procedures used in the network configuration, in the process modelling and in the design and implementation of controllers are described. The specificities of each one of the considered logical organizations are also discussed. Finally, experimental results are analysed using an algorithm for the assessment of control loops to compare the performances between the centralized and the distributed implementations. Copyright © 2012 ISA. Published by Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Oneill-Rood, Nora; Glover, Richard D.
1990-01-01
NASA's Dryden Flight Research Facility (Ames-Dryden) operates a diverse fleet of research aircraft which are heavily instrumented to provide both real-time data for in-flight monitoring and recorded data for postflight analysis. Ames-Dryden's existing automated calibration (AUTOCAL) laboratory is a computerized facility which tests aircraft sensors to certify accuracy for anticipated harsh flight environments. Recently, a major AUTOCAL lab upgrade was initiated; the goal of this modernization is to enhance productivity and improve configuration management for both software and test data. The new system will have multiple testing stations employing distributed processing linked by a local area network to a centralized database. The baseline requirements for the new AUTOCAL lab and the design approach being taken for its mechanization are described.
Gomez, Carles; Paradells, Josep
2015-09-10
Urban Automation Networks (UANs) are being deployed worldwide in order to enable Smart City applications. Given the crucial role of UANs, as well as their diversity, it is critically important to assess their properties and trade-offs. This article introduces the requirements and challenges for UANs, characterizes the main current and emerging UAN paradigms, provides guidelines for their design and/or choice, and comparatively examines their performance in terms of a variety of parameters including coverage, power consumption, latency, standardization status and economic cost.
Gap-free segmentation of vascular networks with automatic image processing pipeline.
Hsu, Chih-Yang; Ghaffari, Mahsa; Alaraj, Ali; Flannery, Michael; Zhou, Xiaohong Joe; Linninger, Andreas
2017-03-01
Current image processing techniques capture large vessels reliably but often fail to preserve connectivity in bifurcations and small vessels. Imaging artifacts and noise can create gaps and discontinuity of intensity that hinder segmentation of vascular trees. However, topological analysis of vascular trees requires proper connectivity, without gaps, loops or dangling segments. Proper tree connectivity is also important for high-quality rendering of surface meshes for scientific visualization or 3D printing. We present a fully automated vessel enhancement pipeline with automated parameter settings for vessel enhancement of tree-like structures from customary imaging sources, including 3D rotational angiography, magnetic resonance angiography, magnetic resonance venography, and computed tomography angiography. The output of the filter pipeline is a vessel-enhanced image which is ideal for generating anatomically consistent network representations of the cerebral angioarchitecture for further topological or statistical analysis. The filter pipeline combined with computational modeling can potentially improve computer-aided diagnosis of cerebrovascular diseases by delivering biometrics and anatomy of the vasculature. It may serve as the first step in fully automatic epidemiological analysis of large clinical datasets. The automatic analysis would enable rigorous statistical comparison of biometrics in subject-specific vascular trees. The robust and accurate image segmentation using a validated filter pipeline would also eliminate the operator dependency that has been observed in manual segmentation. Moreover, manual segmentation is time prohibitive given that vascular trees have thousands of segments and bifurcations, so interactive segmentation consumes excessive human resources. Subject-specific trees are a first step toward patient-specific hemodynamic simulations for assessing treatment outcomes. Copyright © 2017 Elsevier Ltd. All rights reserved.
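A compressed stand-in for the kind of vessel-enhancement stage such a pipeline automates: a Hessian-based vesselness filter followed by automatic thresholding. scikit-image's frangi filter is an illustrative choice here; the paper's pipeline and its parameter-selection logic are richer than this sketch.

```python
# Vesselness enhancement plus Otsu binarization on a synthetic bright vessel.
import numpy as np
from skimage.filters import frangi, threshold_otsu

def enhance_vessels(image):
    vesselness = frangi(image.astype(float), black_ridges=False)  # bright tubes
    mask = vesselness > threshold_otsu(vesselness)                # auto threshold
    return vesselness, mask

rng = np.random.default_rng(6)
img = rng.random((128, 128))
img[60:68, :] += 2.0                                 # crude synthetic "vessel"
vesselness, mask = enhance_vessels(img)
print("vessel pixels:", int(mask.sum()))
```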
SWS: accessing SRS sites contents through Web Services.
Romano, Paolo; Marra, Domenico
2008-03-26
Web Services and Workflow Management Systems can support the creation and deployment of network systems able to automate data analysis and retrieval processes in biomedical research. Web Services have been implemented at bioinformatics centres and workflow systems have been proposed for biological data analysis. New databanks are often developed by taking into account these technologies, but many existing databases do not allow programmatic access. Only a fraction of available databanks can thus be queried through programmatic interfaces. SRS is a well-known indexing and search engine for biomedical databanks offering public access to many databanks and analysis tools. Unfortunately, these data are not easily and efficiently accessible through Web Services. We have developed 'SRS by WS' (SWS), a tool that makes information available in SRS sites accessible through Web Services. Information on known sites is maintained in a database, srsdb. SWS consists of a suite of WS that can query both srsdb, for information on sites and databases, and SRS sites. SWS returns results in a text-only format and can be accessed through a WSDL-compliant client. SWS enables interoperability between workflow systems and SRS implementations by also managing access to alternative sites, in order to cope with network and maintenance problems, and by selecting the most up-to-date among available systems. Development and implementation of Web Services allowing programmatic access to an exhaustive set of biomedical databases can significantly improve the automation of in-silico analysis. SWS supports this activity by making the biological databanks managed in public SRS sites available through a programmatic interface.
A design automation framework for computational bioenergetics in biological networks.
Angione, Claudio; Costanza, Jole; Carapezza, Giovanni; Lió, Pietro; Nicosia, Giuseppe
2013-10-01
The bioenergetic activity of mitochondria can be thoroughly investigated by using computational methods. In particular, in our work we focus on ATP and NADH, namely the metabolites representing the production of energy in the cell. We develop a computational framework to perform an exhaustive investigation at the level of species, reactions, genes and metabolic pathways. The framework integrates several methods implementing the state-of-the-art algorithms for many-objective optimization, sensitivity, and identifiability analysis applied to biological systems. We use this computational framework to analyze three case studies related to the human mitochondria and the algal metabolism of Chlamydomonas reinhardtii, formally described with algebraic differential equations or flux balance analysis. Integrating the results of our framework applied to interacting organelles would provide a general-purpose method for assessing the production of energy in a biological network.
Automated seismic waveform location using Multichannel Coherency Migration (MCM)-I. Theory
NASA Astrophysics Data System (ADS)
Shi, Peidong; Angus, Doug; Rost, Sebastian; Nowacki, Andy; Yuan, Sanyi
2018-03-01
With the proliferation of dense seismic networks sampling the full seismic wavefield, recorded seismic data volumes are getting bigger and automated analysis tools to locate seismic events are essential. Here, we propose a novel Multichannel Coherency Migration (MCM) method to locate earthquakes in continuous seismic data and reveal the location and origin time of seismic events directly from recorded waveforms. By continuously calculating the coherency between waveforms from different receiver pairs, MCM greatly expands the available information which can be used for event location. MCM does not require phase picking or phase identification, which allows fully automated waveform analysis. By migrating the coherency between waveforms, MCM leads to improved source energy focusing. We have tested and compared MCM to other migration-based methods in noise-free and noisy synthetic data. The tests and analysis show that MCM is noise resistant and can achieve more accurate results compared with other migration-based methods. MCM is able to suppress strong interference from other seismic sources occurring at a similar time and location. It can be used with arbitrary 3D velocity models and is able to obtain reasonable location results with smooth but inaccurate velocity models. MCM exhibits excellent location performance and can be easily parallelized giving it large potential to be developed as a real-time location method for very large datasets.
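A minimal sketch of the coherency-migration idea follows: for each candidate location and origin time, windows are aligned by predicted travel times and the mean pairwise waveform correlation is stacked into an image whose maximum picks the event. The homogeneous velocity and single phase are simplifying assumptions not made by the full MCM method.

```python
import numpy as np

def mcm_image(waveforms, rec_xyz, grid_xyz, t0s, v, dt, win):
    """Simplified multichannel coherency migration (sketch).

    waveforms : (n_rec, n_samp) continuous records
    rec_xyz   : (n_rec, 3) receiver coordinates; grid_xyz : (n_src, 3) candidates
    t0s : candidate origin times (s); v : homogeneous velocity (assumed)
    dt : sample interval (s); win : window length in samples
    Assumes every extracted window lies inside the record.
    """
    n_rec = len(rec_xyz)
    img = np.zeros((len(grid_xyz), len(t0s)))
    for i, src in enumerate(grid_xyz):
        tt = np.linalg.norm(rec_xyz - src, axis=1) / v        # predicted travel times
        for j, t0 in enumerate(t0s):
            idx = np.round((t0 + tt) / dt).astype(int)
            segs = np.array([waveforms[k, idx[k]:idx[k] + win] for k in range(n_rec)])
            segs = segs - segs.mean(axis=1, keepdims=True)
            segs = segs / np.linalg.norm(segs, axis=1, keepdims=True)
            c = segs @ segs.T                                  # pairwise correlations
            img[i, j] = (c.sum() - n_rec) / (n_rec * (n_rec - 1))  # mean off-diagonal
    return img  # argmax over (location, origin time) gives the event
```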
Advances in Mössbauer data analysis
NASA Astrophysics Data System (ADS)
de Souza, Paulo A.
1998-08-01
Since the first publication by Rudolf Mössbauer, the Mössbauer community has generated a huge amount of data in several fields of human knowledge. Interlaboratory measurements of the same substance may result in minor differences in the Mössbauer parameters (MP) of isomer shift, quadrupole splitting and internal magnetic field. Therefore, a conventional data bank of published MP will be of limited help in the identification of substances. An exact-match data bank search cannot differentiate values of Mössbauer parameters that lie within the experimental errors (e.g., IS = 0.22 mm/s from IS = 0.23 mm/s), even though physically both values may be considered the same. An artificial neural network (ANN) is able to identify a substance and its crystalline structure from measured MP, and slight variations in the parameters do not represent an obstacle for the ANN identification. A barrier to the popularization of Mössbauer spectroscopy as an analytical technique is the absence of fully automated equipment, since the analysis of a Mössbauer spectrum is normally time-consuming and requires a specialist. In this work, the fitting process of a Mössbauer spectrum was completely automated through the use of genetic algorithms and fuzzy logic. Both software and hardware systems were implemented, resulting in a fully automated Mössbauer data analysis system. The developed system will be presented.
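To illustrate the evolutionary-fitting idea, the sketch below fits a quadrupole doublet (baseline minus two Lorentzians) to a transmission spectrum with SciPy's differential evolution, used here as a stand-in for the genetic algorithm and fuzzy logic actually employed; the doublet model and parameter bounds are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import differential_evolution

def lorentzian(v, v0, gamma, depth):
    # Absorption line of full width gamma centred at velocity v0
    return depth * (gamma / 2) ** 2 / ((v - v0) ** 2 + (gamma / 2) ** 2)

def doublet(v, params):
    base, isomer, qs, gamma, depth = params
    # Quadrupole doublet: two lines centred at IS +/- QS/2
    return (base - lorentzian(v, isomer - qs / 2, gamma, depth)
                 - lorentzian(v, isomer + qs / 2, gamma, depth))

def fit_doublet(v, counts):
    """Evolutionary search over (baseline, IS, QS, linewidth, depth)."""
    bounds = [(0.9 * counts.max(), 1.1 * counts.max()),  # baseline
              (-2.0, 2.0),                               # isomer shift (mm/s)
              (0.0, 4.0),                                # quadrupole splitting
              (0.1, 1.0),                                # linewidth
              (0.0, counts.max())]                       # line depth
    loss = lambda p: np.sum((counts - doublet(v, p)) ** 2)
    return differential_evolution(loss, bounds, seed=0).x
```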
Development and implementation of a PACS network and resource manager
NASA Astrophysics Data System (ADS)
Stewart, Brent K.; Taira, Ricky K.; Dwyer, Samuel J., III; Huang, H. K.
1992-07-01
Clinical acceptance of PACS is predicated upon maximum uptime. Upon component failure, detection, diagnosis, reconfiguration and repair must occur immediately. Our current PACS network is large, heterogeneous, complex and wide-spread geographically. The overwhelming number of network devices, computers and software processes involved in a departmental or inter-institutional PACS makes development of tools for network and resource management critical. The authors have developed and implemented a comprehensive solution (PACS Network-Resource Manager) using the OSI Network Management Framework with network element agents that respond to queries and commands for network management stations. Managed resources include: communication protocol layers for Ethernet, FDDI and UltraNet; network devices; computer and operating system resources; and application, database and network services. The Network-Resource Manager is currently being used for warning, fault, security violation and configuration modification event notification. Analysis, automation and control applications have been added so that PACS resources can be dynamically reconfigured and so that users are notified when active involvement is required. Custom data and error logging have been implemented that allow statistics for each PACS subsystem to be charted for performance data. The Network-Resource Manager allows our departmental PACS system to be monitored continuously and thoroughly, with a minimal amount of personal involvement and time.
Automated conflict resolution issues
NASA Technical Reports Server (NTRS)
Wike, Jeffrey S.
1991-01-01
A discussion is presented of how conflicts for Space Network resources should be resolved in the ATDRSS era. The following topics are presented: a description of how resource conflicts are currently resolved; a description of issues associated with automated conflict resolution; present conflict resolution strategies; and topics for further discussion.
Zand, Pouria; Dilo, Arta; Havinga, Paul
2013-06-27
Current wireless technologies for industrial applications, such as WirelessHART and ISA100.11a, use a centralized management approach where a central network manager handles the requirements of the static network. However, such a centralized approach has several drawbacks. For example, it cannot cope with dynamicity/disturbance in large-scale networks in a real-time manner and it incurs a high communication overhead and latency for exchanging management traffic. In this paper, we therefore propose a distributed network management scheme, D-MSR. It enables the network devices to join the network, schedule their communications, establish end-to-end connections by reserving the communication resources for addressing real-time requirements, and cope with network dynamicity (e.g., node/edge failures) in a distributed manner. To our knowledge, this is the first distributed management scheme based on the IEEE 802.15.4e standard, which guides the nodes in different phases from joining until publishing their sensor data in the network. We demonstrate via simulation that D-MSR can address real-time and reliable communication as well as the high throughput requirements of industrial automation wireless networks, while also achieving higher efficiency in network management than WirelessHART, in terms of delay and overhead.
NASA Astrophysics Data System (ADS)
Abidin, Anas Z.; Chockanathan, Udaysankar; DSouza, Adora M.; Inglese, Matilde; Wismüller, Axel
2017-03-01
Clinically isolated syndrome (CIS) is often considered to be the first neurological episode associated with multiple sclerosis (MS). At an early stage the inflammatory demyelination occurring in the CNS can manifest as a change in neuronal metabolism, with multiple asymptomatic white matter lesions detected in clinical MRI. Such damage may induce topological changes of brain networks, which can be captured by advanced functional MRI (fMRI) analysis techniques. We test this hypothesis by capturing the effective relationships of 90 brain regions, defined in the Automated Anatomic Labeling (AAL) atlas, using a large-scale Granger Causality (lsGC) framework. The resulting networks are then characterized using graph-theoretic measures that quantify various network topology properties at a global as well as at a local level. We test for differences in these properties in network graphs obtained for 18 subjects (10 male and 8 female, 9 with CIS and 9 healthy controls). Global network properties captured trending differences in modularity and clustering coefficient (p<0.1). Additionally, local network properties, such as local efficiency and the strength of connections, captured statistically significant (p<0.01) differences in some regions of the inferior frontal and parietal lobe. We conclude that multivariate analysis of fMRI time-series can reveal interesting information about changes occurring in the brain in early stages of MS.
Neuromorphic Optical Signal Processing and Image Understanding for Automated Target Recognition
1989-12-01
Contents: Stochastic Learning Machine; Neuromorphic Target Identification; Cognitive Networks; 3. Conclusions; 4. Publications; 5. References; 6. Appendices: I. Optoelectronic Neural Networks and Learning Machines; II. Stochastic Optical Learning Machine; III. Learning Network for Extrapolation and Radar Target Identification.
22 CFR 122.2 - Submission of registration statement.
Code of Federal Regulations, 2013 CFR
2013-04-01
... payment via Automated Clearing House or Federal Reserve Wire Network payable to the Department of State of... Federal Reserve Wire Network (FedWire) are electronic networks used to process financial transactions in...., Comptroller, Treasurer, General Counsel) or any member of the board of directors: (i) Has ever been indicted...
Systems Librarian and Automation Review.
ERIC Educational Resources Information Center
Schuyler, Michael
1992-01-01
Discusses software sharing on computer networks and the need for proper bandwidth; and describes the technology behind FidoNet, a computer network made up of electronic bulletin boards. Network features highlighted include front-end mailers, Zone Mail Hour, Nodelist, NetMail, EchoMail, computer conferences, tosser and scanner programs, and host…
Librarian's Handbook for Costing Network Services.
ERIC Educational Resources Information Center
Western Interstate Library Coordinating Organization, Boulder, CO.
This handbook, using an input-process-output model, provides library managers with a method to evaluate network service offerings and decide which would be cost-beneficial to procure. The scope of services addressed encompasses those network services based on an automated bibliographic data base intended primarily for cataloging support. Sections…
NASA Astrophysics Data System (ADS)
Christianson, D. S.; Beekwilder, N.; Chan, S.; Cheah, Y. W.; Chu, H.; Dengel, S.; O'Brien, F.; Pastorello, G.; Sandesh, M.; Torn, M. S.; Agarwal, D.
2017-12-01
AmeriFlux is a network of scientists who independently collect eddy covariance and related environmental observations at over 250 locations across the Americas. As part of the AmeriFlux Management Project, the AmeriFlux Data Team manages standardization, collection, quality assurance / quality control (QA/QC), and distribution of data submitted by network members. To generate data products that are timely, QA/QC'd, and repeatable, and have traceable provenance, we developed a semi-automated data processing pipeline. The new pipeline consists of semi-automated format and data QA/QC checks. Results are communicated via on-line reports as well as an issue-tracking system. Data processing time has been reduced from 2-3 days to a few hours of manual review time, resulting in faster data availability from the time of data submission. The pipeline is scalable to the network level and has the following key features. (1) On-line results of the format QA/QC checks are available immediately for data provider review. This enables data providers to correct and resubmit data quickly. (2) The format QA/QC assessment includes an automated attempt to fix minor format errors. Data submissions that are formatted in the new AmeriFlux FP-In standard can be queued for the data QA/QC assessment, often with minimal delay. (3) Automated data QA/QC checks identify and communicate potentially erroneous data via online, graphical quick views that highlight observations with unexpected values, incorrect units, time drifts, invalid multivariate correlations, and/or radiation shadows. (4) Progress through the pipeline is integrated with an issue-tracking system that facilitates communications between data providers and the data processing team in an organized and searchable fashion. Through development of these and other features of the pipeline, we present solutions to challenges that include balancing automated and manual processing, bridging legacy data management infrastructure with various software tools, and working across interdisciplinary and international science cultures. Additionally, we discuss results from community member feedback that helped refine QA/QC communications for efficient data submission and revision.
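The flavor of such automated data QA/QC checks can be sketched with pandas: a physical range test plus a robust spike test against a rolling median. The thresholds, window length, and variable name below are placeholders, not AmeriFlux's actual rules.

```python
import pandas as pd

def qaqc_flags(series, lo, hi, spike_sigma=6.0, window=48):
    """Flag out-of-range values and spikes in a half-hourly series (sketch).

    lo, hi : assumed physically plausible limits for the variable
    A spike deviates from the rolling median by more than
    spike_sigma robust standard deviations (MAD-based).
    """
    flags = pd.DataFrame(index=series.index)
    flags["out_of_range"] = (series < lo) | (series > hi)
    med = series.rolling(window, center=True, min_periods=window // 2).median()
    mad = (series - med).abs().rolling(window, center=True,
                                       min_periods=window // 2).median()
    flags["spike"] = (series - med).abs() > spike_sigma * 1.4826 * mad
    return flags

# usage (hypothetical column): flags = qaqc_flags(df["TA"], lo=-50, hi=50)
```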
Liu, Yu; Xia, Jun; Shi, Chun-Xiang; Hong, Yang
2009-01-01
The crowning objective of this research was to identify a better cloud classification method to upgrade the current window-based clustering algorithm used operationally for China’s first operational geostationary meteorological satellite FengYun-2C (FY-2C) data. First, the capabilities of six widely-used Artificial Neural Network (ANN) methods are analyzed, together with the comparison of two other methods: Principal Component Analysis (PCA) and a Support Vector Machine (SVM), using 2864 cloud samples manually collected by meteorologists in June, July, and August in 2007 from three FY-2C channel (IR1, 10.3–11.3 μm; IR2, 11.5–12.5 μm and WV 6.3–7.6 μm) imagery. The result shows that: (1) ANN approaches, in general, outperformed the PCA and the SVM given sufficient training samples and (2) among the six ANN networks, higher cloud classification accuracy was obtained with the Self-Organizing Map (SOM) and Probabilistic Neural Network (PNN). Second, to compare the ANN methods to the present FY-2C operational algorithm, this study implemented SOM, one of the best ANN network identified from this study, as an automated cloud classification system for the FY-2C multi-channel data. It shows that SOM method has improved the results greatly not only in pixel-level accuracy but also in cloud patch-level classification by more accurately identifying cloud types such as cumulonimbus, cirrus and clouds in high latitude. Findings of this study suggest that the ANN-based classifiers, in particular the SOM, can be potentially used as an improved Automated Cloud Classification Algorithm to upgrade the current window-based clustering method for the FY-2C operational products. PMID:22346714
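A minimal NumPy self-organizing map conveys the training loop behind such an SOM classifier; the map size, decay schedules, and the assumption that samples are already feature vectors (e.g., multi-channel brightness temperatures) are illustrative choices, not the paper's configuration.

```python
import numpy as np

def train_som(X, m=8, n=8, iters=5000, lr0=0.5, sigma0=3.0, seed=0):
    """Train a small self-organizing map on feature vectors X (n_samples, d)."""
    rng = np.random.default_rng(seed)
    W = rng.random((m, n, X.shape[1]))                    # codebook vectors
    gy, gx = np.mgrid[0:m, 0:n]                           # node grid coordinates
    for t in range(iters):
        x = X[rng.integers(len(X))]
        d = np.linalg.norm(W - x, axis=2)                 # distance to every node
        by, bx = np.unravel_index(np.argmin(d), (m, n))   # best-matching unit
        frac = t / iters
        lr, sigma = lr0 * (1 - frac), sigma0 * (1 - frac) + 1e-3
        h = np.exp(-((gy - by) ** 2 + (gx - bx) ** 2) / (2 * sigma ** 2))
        W += lr * h[..., None] * (x - W)                  # pull neighbourhood toward x
    return W

# After training, each node takes the majority cloud-type label of the
# labelled samples mapped to it; new pixels are classified by their BMU.
```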
Benthin, Cody; Pannu, Sonal; Khan, Akram; Gong, Michelle
2016-10-01
The nature, variability, and extent of early warning clinical practice alerts derived from automated query of electronic health records (e-alerts) currently used in acute care settings for clinical care or research are unknown. To describe e-alerts in current use in acute care settings at medical centers participating in a nationwide critical care research network, we surveyed investigators at 38 institutions involved in the National Institutes of Health-funded Clinical Trials Network for the Prevention and Early Treatment of Acute Lung Injury (PETAL) for quantitative and qualitative analysis. Thirty sites completed the survey (79% response rate). All sites used electronic health record systems. Epic Systems was used at 56% of sites; the others used alternate commercially available vendors or homegrown systems. Respondents at 57% of sites represented in this survey used e-alerts. All but 1 of these 17 sites used an e-alert for early detection of sepsis-related syndromes, and 35% used an e-alert for pneumonia. E-alerts were triggered by abnormal laboratory values (37%), vital signs (37%), or radiology reports (15%) and were used about equally for clinical decision support and research. Only 59% of sites with e-alerts have evaluated them either for accuracy or for validity. A majority of the research network sites participating in this survey use e-alerts for early notification of potential threats to hospitalized patients; however, there was significant variability in the nature of e-alerts between institutions. Use of one common electronic health record vendor at more than half of the participating sites suggests that it may be possible to standardize e-alerts across multiple sites in research networks, particularly among sites using the same medical record platform.
Lee, Hyungwoo; Kang, Kyung Eun; Chung, Hyewon; Kim, Hyung Chan
2018-04-12
To evaluate an automated segmentation algorithm with a convolutional neural network (CNN) to quantify and detect intraretinal fluid (IRF), subretinal fluid (SRF), pigment epithelial detachment (PED), and subretinal hyperreflective material (SHRM) through analyses of spectral domain optical coherence tomography (SD-OCT) images from patients with neovascular age-related macular degeneration (nAMD). Reliability and validity analysis of a diagnostic tool. We constructed a dataset including 930 B-scans from 93 eyes of 93 patients with nAMD. A CNN-based deep neural network was trained using 11550 augmented images derived from 550 B-scans. The performance of the trained network was evaluated using a validation set including 140 B-scans and a test set of 240 B-scans. The Dice coefficient, positive predictive value (PPV), sensitivity, relative area difference (RAD), and intraclass correlation coefficient (ICC) were used to evaluate segmentation and detection performance. Good agreement was observed for both segmentation and detection of lesions between the trained network and clinicians. The Dice coefficients for segmentation of IRF, SRF, SHRM, and PED were 0.78, 0.82, 0.75, and 0.80, respectively; the PPVs were 0.79, 0.80, 0.75, and 0.80, respectively; and the sensitivities were 0.77, 0.84, 0.73, and 0.81, respectively. The RADs were -4.32%, -10.29%, 4.13%, and 0.34%, respectively, and the ICCs were 0.98, 0.98, 0.97, and 0.98, respectively. All lesions were detected with high PPVs (range 0.94-0.99) and sensitivities (range 0.97-0.99). A CNN-based network provides clinicians with quantitative data regarding nAMD through automatic segmentation and detection of pathological lesions, including IRF, SRF, PED, and SHRM. Copyright © 2018 Elsevier Inc. All rights reserved.
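The reported overlap metrics are straightforward to compute from binary masks; a sketch follows, using one common definition of relative area difference (the paper's exact convention may differ).

```python
import numpy as np

def seg_metrics(pred, truth):
    """Overlap metrics between binary segmentation masks (sketch).

    Assumes the lesion is present in the ground-truth mask.
    """
    pred, truth = pred.astype(bool), truth.astype(bool)
    tp = np.logical_and(pred, truth).sum()
    dice = 2 * tp / (pred.sum() + truth.sum())                # Dice coefficient
    ppv = tp / pred.sum()                                     # positive predictive value
    sens = tp / truth.sum()                                   # sensitivity
    rad = 100 * (pred.sum() - truth.sum()) / truth.sum()      # relative area difference (%)
    return dice, ppv, sens, rad
```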
Alternative Dual Mode Network Control Strategies
DOT National Transportation Integrated Search
1972-03-01
From a literature survey a qualitative evaluation was made of four network control strategies for the fundamental control philosophy of the moving synchronous slot. In the literature concerning automated transportation systems, such as dual mode, a g...
The Role of the Network in Automated Acquisitions.
ERIC Educational Resources Information Center
Madden, Mary A.
1980-01-01
This examination of the acquisitions services offered by networks, the not-for-profit bibliographic services, stresses significant characteristics inherent in their structure and functions, and contrasts advantages and disadvantages for individual libraries. (Author/RAA)
Applying Semantic Web Services and Wireless Sensor Networks for System Integration
NASA Astrophysics Data System (ADS)
Berkenbrock, Gian Ricardo; Hirata, Celso Massaki; de Oliveira Júnior, Frederico Guilherme Álvares; de Oliveira, José Maria Parente
In environments like factories, buildings, and homes, automation services often change during their lifetime. Changes concern business rules, process optimization, cost reduction, and so on. It is important to provide a smooth and straightforward way to deal with these changes so that they can be handled quickly and at low cost. Some prominent solutions use the flexibility of Wireless Sensor Networks and the meaningful description of Semantic Web Services to provide service integration. In this work, we give an overview of current solutions for machinery integration that combine both technologies, as well as a discussion of some perspectives and open issues in applying Wireless Sensor Networks and Semantic Web Services to the integration of automation services.
NASA Astrophysics Data System (ADS)
Abidin, Anas Zainul; D'Souza, Adora M.; Nagarajan, Mahesh B.; Wismüller, Axel
2016-03-01
About 50% of subjects infected with HIV present deficits in cognitive domains, which are known collectively as HIV associated neurocognitive disorder (HAND). The underlying synaptodendritic damage can be captured using resting state functional MRI, as has been demonstrated by a few earlier studies. Such damage may induce topological changes of brain connectivity networks. We test this hypothesis by capturing the functional interdependence of 90 brain network nodes using a Mutual Connectivity Analysis (MCA) framework with non-linear time series modeling based on Generalized Radial Basis function (GRBF) neural networks. The network nodes are selected based on the regions defined in the Automated Anatomic Labeling (AAL) atlas. Each node is represented by the average time series of the voxels of that region. The resulting networks are then characterized using graph-theoretic measures that quantify various network topology properties at a global as well as at a local level. We tested for differences in these properties in network graphs obtained for 10 subjects (6 male and 4 female, 5 HIV+ and 5 HIV-). Global network properties captured some differences between these subject cohorts, though significant differences were seen only with the clustering coefficient measure. Local network properties, such as local efficiency and the degree of connections, captured significant differences in regions of the frontal lobe, precentral and cingulate cortex amongst a few others. These results suggest that our method can be used to effectively capture differences occurring in brain network connectivity properties revealed by resting-state functional MRI in neurological disease states, such as HAND.
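Given a connectivity matrix from such an analysis, the graph-theoretic measures mentioned above can be computed with networkx, as in this sketch; the binarizing threshold is an assumed preprocessing choice.

```python
import numpy as np
import networkx as nx

def graph_properties(conn, threshold):
    """Topology measures from a (90 x 90) connectivity matrix (sketch)."""
    adj = (np.abs(conn) > threshold).astype(int)   # binarize connectivity
    np.fill_diagonal(adj, 0)                       # no self-loops
    G = nx.from_numpy_array(adj)
    return {
        "clustering": nx.average_clustering(G),    # global clustering coefficient
        "local_eff": nx.local_efficiency(G),       # mean local efficiency
        "global_eff": nx.global_efficiency(G),
        "degree": dict(G.degree()),                # per-node connection counts
    }
```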
2007-03-01
results (Ingols 2005). 2.4.3 Skybox - Skybox View. Skybox View is a commercially available tool developed by Skybox Security that can automatically...generate attack graphs through the use of host-based agents, management interfaces, and an analysis server located on the target network (Skybox 2006)... an examination of recent patents submitted by Skybox identified the algorithmic complexity of the product as n^4, where n represents the number
2010-04-01
the lower and higher asymptotic 95% confidence intervals were 0.812 and 0.924, respectively. Based on these results, we consider automated ...commercial success of the cardioverter-defibrillator industry clearly demonstrates that the obstacles at hand can be successfully overcome. 4.1...Does HRC Work in Real-Life Noisy Data? One of the methods reportedly designed to deal with nonstationary RRI data sets is the point correlation
Caffarel, Jennifer; Gibson, G John; Harrison, J Phil; Griffiths, Clive J; Drinnan, Michael J
2006-03-01
We have compared sleep staging by an automated neural network (ANN) system, BioSleep (Oxford BioSignals), and a human scorer using the Rechtschaffen and Kales (R&K) scoring system. Sleep study recordings from 114 patients with suspected obstructive sleep apnoea syndrome (OSA) were analysed by ANN and by a blinded human scorer. We also examined human scorer reliability by calculating the agreement between the index scorer and a second independent blinded scorer for 28 of the 114 studies. For each study, we built contingency tables on an epoch-by-epoch (30 s epochs) comparison basis. From these, we derived kappa (kappa) coefficients for different combinations of sleep stages. The overall agreement of automatic and manual scoring for the 114 studies for the classification {wake / light-sleep / deep-sleep / REM} was poor (median kappa = 0.305) and only a little better (kappa = 0.449) for the crude {wake / sleep} distinction. For the subgroup of 28 randomly selected studies, the overall agreement of automatic and manual scoring was again relatively low (kappa = 0.331 for {wake / light-sleep / deep-sleep / REM} and kappa = 0.505 for {wake / sleep}), whereas inter-scorer reliability was higher (kappa = 0.641 for {wake / light-sleep / deep-sleep / REM} and kappa = 0.737 for {wake / sleep}). We conclude that such an ANN-based analysis system is not sufficiently accurate for sleep study analyses using the R&K classification system.
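Epoch-by-epoch kappa agreement of this kind can be reproduced with scikit-learn, as sketched below on toy stage sequences (the stage codes are illustrative).

```python
from sklearn.metrics import cohen_kappa_score

# Epoch-by-epoch agreement between two scorers (30 s epochs).
# Stages coded as W / L (light) / D (deep) / R (REM); collapsing the
# labels gives the cruder wake/sleep comparison.
ann    = ["W", "L", "L", "D", "R", "W", "L", "D"]   # automated scorer (toy data)
manual = ["W", "L", "D", "D", "R", "L", "L", "D"]   # human scorer (toy data)

kappa_4class = cohen_kappa_score(ann, manual)
kappa_sleep = cohen_kappa_score(
    ["W" if s == "W" else "S" for s in ann],
    ["W" if s == "W" else "S" for s in manual],
)
```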
Camerlengo, Terry; Ozer, Hatice Gulcin; Onti-Srinivasan, Raghuram; Yan, Pearlly; Huang, Tim; Parvin, Jeffrey; Huang, Kun
2012-01-01
Next Generation Sequencing (NGS) is highly resource intensive. NGS tasks related to data processing, management and analysis require high-end computing servers or even clusters. Additionally, processing NGS experiments requires suitable storage space and significant manual interaction. At The Ohio State University's Biomedical Informatics Shared Resource, we designed and implemented a scalable architecture to address the challenges associated with the resource-intensive nature of NGS secondary analysis, built around Illumina Genome Analyzer II sequencers and Illumina's Gerald data processing pipeline. The software infrastructure includes a distributed computing platform consisting of a LIMS called QUEST (http://bisr.osumc.edu), an Automation Server, a computer cluster for processing NGS pipelines, and a network-attached storage device expandable up to 40TB. The system has been architected to scale to multiple sequencers without requiring additional computing or labor resources. This platform demonstrates how to manage and automate NGS experiments in an institutional or core facility setting.
Space exploration: The interstellar goal and Titan demonstration
NASA Technical Reports Server (NTRS)
1982-01-01
Automated interstellar space exploration is reviewed. The Titan demonstration mission is discussed. Remote sensing and automated modeling are considered. Nuclear electric propulsion, main orbiting spacecraft, lander/rover, subsatellites, atmospheric probes, powered air vehicles, and a surface science network comprise mission component concepts. Machine intelligence in space exploration is discussed.
Automation of the CAS Document Delivery Service.
ERIC Educational Resources Information Center
Steensland, M. C.; Soukup, K. M.
1986-01-01
The automation of online order retrieval for Chemical Abstracts Service Document Delivery Service was accomplished by shifting to an order retrieval/dispatch process linked to a Unix network. The Unix-based environment, its terminal emulation, page-break, and user-friendly interface software, and later enhancements are reviewed. Resultant increase…
Integrated Communications and Work Efficiency: Impacts on Organizational Structure and Power.
ERIC Educational Resources Information Center
Wigand, Rolf T.
This paper reviews the work environment surrounding integrated office systems, synthesizes the known effects of automated office technologies, and discusses their impact on work efficiency in office environments. Particular attention is given to the effect of automated technologies on networks, workflow/processes, and organizational structure and…
Test and Evaluation of Neural Network Applications for Seismic Signal Discrimination
1992-09-28
IMS) for automated processing and interpretation of regional seismic data. Also reported is the result of a preliminary study on the application of...of analyst-verified events that were missed by the automated processing decreased by more than a factor of 2 (about 10 events/week). The second
Automating the conflict resolution process
NASA Technical Reports Server (NTRS)
Wike, Jeffrey S.
1991-01-01
The purpose is to initiate a discussion of how the conflict resolution process at the Network Control Center can be made more efficient. Described here are how resource conflicts are currently resolved as well as the impacts of automating conflict resolution in the ATDRSS era. A variety of conflict resolution strategies are presented.
Automated Bilingual Circulation System Using PC Local Area Networks.
ERIC Educational Resources Information Center
Iskanderani, A. I.; Anwar, M. A.
1992-01-01
Describes a personal computer and LAN-based automated circulation system capable of handling both Arabic and Latin characters that was developed for use at King Abdullaziz University (Jeddah, Saudi Arabia). Outlines system requirements, system structure, hardware needs, and individual functional modules of the system. Numerous examples and flow…
Automating the Presentation of Computer Networks
2006-12-01
software to overlay operational state information. Other network management tools like Computer Associates Unicenter [6,7] generate internal network...and required manual placement assistance. A number of software libraries [20] offer a wealth of automatic layout algorithms and presentation...
ERIC Educational Resources Information Center
Lynch, Clifford A.
1989-01-01
Reviews the history of the network that supports the MELVYL online union catalog, describes current technological and policy issues, and discusses the role the network plays in integrating local automation, the union catalog, access to resource databases, and other initiatives. Sidebars by Mark Needleman discuss the TCP/IP protocol suite, internet…
A Parallel Adaboost-Backpropagation Neural Network for Massive Image Dataset Classification
NASA Astrophysics Data System (ADS)
Cao, Jianfang; Chen, Lichao; Wang, Min; Shi, Hao; Tian, Yun
2016-12-01
Image classification uses computers to simulate human understanding and cognition of images by automatically categorizing images. This study proposes a faster image classification approach that parallelizes the traditional Adaboost-Backpropagation (BP) neural network using the MapReduce parallel programming model. First, we construct a strong classifier by assembling the outputs of 15 BP neural networks (which are individually regarded as weak classifiers) based on the Adaboost algorithm. Second, we design Map and Reduce tasks for both the parallel Adaboost-BP neural network and the feature extraction algorithm. Finally, we establish an automated classification model by building a Hadoop cluster. We use the Pascal VOC2007 and Caltech256 datasets to train and test the classification model. The results are superior to those obtained using traditional Adaboost-BP neural network or parallel BP neural network approaches. Our approach increased the average classification accuracy rate by approximately 14.5% and 26.0% compared to the traditional Adaboost-BP neural network and parallel BP neural network, respectively. Furthermore, the proposed approach requires less computation time and scales very well as evaluated by speedup, sizeup and scaleup. The proposed approach may provide a foundation for automated large-scale image classification and demonstrates practical value.
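A serial sketch of the Adaboost-BP idea follows: small MLPs act as weak BP learners and are combined with AdaBoost weights. Because scikit-learn's MLPClassifier does not accept per-sample weights, weighted resampling is used, and binary labels are assumed; the MapReduce parallelization itself is not reproduced here.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

def adaboost_bp(X, y, rounds=15, seed=0):
    """AdaBoost over small BP (MLP) weak learners via weighted resampling.

    X, y : NumPy arrays with binary labels in {0, 1} (a simplification;
    the paper handles multi-class image categories).
    """
    rng = np.random.default_rng(seed)
    n = len(X)
    w = np.full(n, 1 / n)                               # sample weights
    learners, alphas = [], []
    for _ in range(rounds):
        idx = rng.choice(n, size=n, replace=True, p=w)  # weighted bootstrap
        clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=300,
                            random_state=seed).fit(X[idx], y[idx])
        miss = clf.predict(X) != y
        err = np.clip(w[miss].sum(), 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)           # learner weight
        w *= np.exp(alpha * np.where(miss, 1, -1))      # re-weight samples
        w /= w.sum()
        learners.append(clf)
        alphas.append(alpha)

    def predict(Xnew):
        votes = sum(a * np.where(c.predict(Xnew) == 1, 1, -1)
                    for a, c in zip(alphas, learners))
        return (votes > 0).astype(int)
    return predict
```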
The role of total laboratory automation in a consolidated laboratory network.
Seaberg, R S; Stallone, R O; Statland, B E
2000-05-01
In an effort to reduce overall laboratory costs and improve overall laboratory efficiencies at all of its network hospitals, the North Shore-Long Island Health System recently established a Consolidated Laboratory Network with a Core Laboratory at its center. We established and implemented a centralized Core Laboratory designed around the Roche/Hitachi CLAS Total Laboratory Automation system to perform the general and esoteric laboratory testing throughout the system in a timely and cost-effective fashion. All remaining STAT testing will be performed within the Rapid Response Laboratories (RRLs) at each of the system's hospitals. Results for this laboratory consolidation and implementation effort demonstrated a decrease in labor costs and improved turnaround time (TAT) at the core laboratory. Anticipated system savings are approximately $2.7 million. TATs averaged 1.3 h within the Core Laboratory and less than 30 min in the RRLs. When properly implemented, automation systems can reduce overall laboratory expenses, enhance patient services, and address the overall concerns facing the laboratory today: job satisfaction, decreased length of stay, and safety. The financial savings realized are primarily a result of labor reductions.
Madec, Morgan; Pecheux, François; Gendrault, Yves; Rosati, Elise; Lallement, Christophe; Haiech, Jacques
2016-10-01
The topic of this article is the development of an open-source automated design framework for synthetic biology, specifically for the design of artificial gene regulatory networks based on a digital approach. In contrast to other tools, GeNeDA is open-source online software based on existing microelectronics tools that have proven their efficiency over the last 30 years. The complete framework is composed of a computation core directly adapted from an Electronic Design Automation tool, input and output interfaces, a library of elementary parts that can be realized with gene regulatory networks, and an interface with an electrical circuit simulator. Each of these modules is an extension of microelectronics tools and concepts: ODIN II, ABC, the Verilog language, the SPICE simulator, and SystemC-AMS. GeNeDA is first validated on a benchmark of several combinatorial circuits. The results highlight the importance of the part library. Then, this framework is used for the design of a sequential circuit including a biological state machine.
Advanced systems engineering and network planning support
NASA Technical Reports Server (NTRS)
Walters, David H.; Barrett, Larry K.; Boyd, Ronald; Bazaj, Suresh; Mitchell, Lionel; Brosi, Fred
1990-01-01
The objective of this task was to take a fresh look at the NASA Space Network Control (SNC) element for the Advanced Tracking and Data Relay Satellite System (ATDRSS) such that it can be made more efficient and responsive to the user by introducing new concepts and technologies appropriate for the 1997 timeframe. In particular, it was desired to investigate the technologies and concepts employed in similar systems that may be applicable to the SNC. The recommendations resulting from this study include resource partitioning, on-line access to subsets of the SN schedule, fluid scheduling, increased use of demand access on the MA service, automating Inter-System Control functions using monitor by exception, increased automation for distributed data management and distributed work management, viewing SN operational control in terms of the OSI Management framework, and the introduction of automated interface management.
Bicycle share feasibility study : New Orleans.
DOT National Transportation Integrated Search
2012-05-01
Bicycle share is a network of bicycles and automated kiosks that allows users to make short trips (1-3 miles) quickly, conveniently and affordably. Bicycle share is a component of a strong transportation network, potentially moving 100,000 people...
Analysis of Malicious Traffic in Modbus/TCP Communications
NASA Astrophysics Data System (ADS)
Kobayashi, Tiago H.; Batista, Aguinaldo B.; Medeiros, João Paulo S.; Filho, José Macedo F.; Brito, Agostinho M.; Pires, Paulo S. Motta
This paper presents the results of our analysis of the influence of Information Technology (IT) malicious traffic on an IP-based automation environment. We utilized a traffic generator, called MACE (Malicious trAffic Composition Environment), to inject malicious traffic into a Modbus/TCP communication system, and a sniffer to capture and analyze network traffic. The tests performed show that malicious traffic represents a serious risk to critical information infrastructures. We show that this kind of traffic can increase the latency of Modbus/TCP communication and that, in some cases, it can put Modbus/TCP devices out of communication.
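Measuring how background traffic affects Modbus/TCP latency can be done with a plain socket probe like the sketch below; the host is a placeholder for a lab device you own, and only the standard function-3 request frame is assumed.

```python
import socket
import struct
import time

def modbus_read_latency(host, unit=1, addr=0, count=1, port=502):
    """Time one Modbus/TCP 'read holding registers' round trip (sketch).

    Never probe equipment you do not own. Frame layout: MBAP header
    (transaction id, protocol id = 0, length, unit id) + function-3 PDU.
    """
    pdu = struct.pack(">BHH", 3, addr, count)              # func 3, start, quantity
    frame = struct.pack(">HHHB", 1, 0, len(pdu) + 1, unit) + pdu
    with socket.create_connection((host, port), timeout=2) as s:
        t0 = time.perf_counter()
        s.sendall(frame)
        s.recv(260)                                        # max Modbus ADU size
        return time.perf_counter() - t0

# Repeating this while injecting background traffic would reproduce the
# latency comparison described above.
```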
Deep learning in color: towards automated quark/gluon jet discrimination
Komiske, Patrick T.; Metodiev, Eric M.; Schwartz, Matthew D.
2017-01-25
Artificial intelligence offers the potential to automate challenging data-processing tasks in collider physics. Here, to establish its prospects, we explore to what extent deep learning with convolutional neural networks can discriminate quark and gluon jets better than observables designed by physicists. Our approach builds upon the paradigm that a jet can be treated as an image, with intensity given by the local calorimeter deposits. We supplement this construction by adding color to the images, with red, green and blue intensities given by the transverse momentum in charged particles, transverse momentum in neutral particles, and pixel-level charged particle counts. Overall, the deep networks match or outperform traditional jet variables. We also find that, while various simulations produce different quark and gluon jets, the neural networks are surprisingly insensitive to these differences, similar to traditional observables. This suggests that the networks can extract robust physical information from imperfect simulations.
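The three-channel jet-image construction described above can be sketched in a few lines of NumPy; the pixel count and image half-width are assumptions chosen to resemble typical jet-image studies, and the normalization is a simple placeholder.

```python
import numpy as np

def jet_image(eta, phi, pt, charge, npix=33, half_width=0.4):
    """Build a three-channel jet image (sketch).

    eta, phi : particle coordinates already centred on the jet axis
    pt : transverse momenta; charge : particle charges
    Channels: charged pT (red), neutral pT (green), charged count (blue).
    """
    bins = np.linspace(-half_width, half_width, npix + 1)
    charged = charge != 0
    red, _, _ = np.histogram2d(eta[charged], phi[charged], bins=bins,
                               weights=pt[charged])
    green, _, _ = np.histogram2d(eta[~charged], phi[~charged], bins=bins,
                                 weights=pt[~charged])
    blue, _, _ = np.histogram2d(eta[charged], phi[charged], bins=bins)
    img = np.stack([red, green, blue], axis=-1)
    return img / max(img.sum(), 1e-9)          # simple overall normalization
```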
Kusumoto, Dai; Lachmann, Mark; Kunihiro, Takeshi; Yuasa, Shinsuke; Kishino, Yoshikazu; Kimura, Mai; Katsuki, Toshiomi; Itoh, Shogo; Seki, Tomohisa; Fukuda, Keiichi
2018-06-05
Deep learning technology is rapidly advancing and is now used to solve complex problems. Here, we used deep learning in convolutional neural networks to establish an automated method to identify endothelial cells derived from induced pluripotent stem cells (iPSCs), without the need for immunostaining or lineage tracing. Networks were trained to predict whether phase-contrast images contain endothelial cells based on morphology only. Predictions were validated by comparison to immunofluorescence staining for CD31, a marker of endothelial cells. Method parameters were then automatically and iteratively optimized to increase prediction accuracy. We found that prediction accuracy was correlated with network depth and pixel size of images to be analyzed. Finally, K-fold cross-validation confirmed that optimized convolutional neural networks can identify endothelial cells with high performance, based only on morphology. Copyright © 2018 The Author(s). Published by Elsevier Inc. All rights reserved.
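A minimal Keras version of such a morphology-only classifier with K-fold cross-validation is sketched below; the patch size, network depth and training budget are placeholders, not the authors' architecture.

```python
import numpy as np
from sklearn.model_selection import KFold
import tensorflow as tf

def make_cnn(shape=(96, 96, 1)):
    # Small binary classifier: endothelial vs. non-endothelial patch (assumed sizes)
    return tf.keras.Sequential([
        tf.keras.layers.Conv2D(32, 3, activation="relu", input_shape=shape),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(64, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(1, activation="sigmoid"),
    ])

def kfold_accuracy(X, y, k=5):
    """Mean held-out accuracy over k folds."""
    scores = []
    for tr, te in KFold(k, shuffle=True, random_state=0).split(X):
        model = make_cnn(X.shape[1:])
        model.compile(optimizer="adam", loss="binary_crossentropy",
                      metrics=["accuracy"])
        model.fit(X[tr], y[tr], epochs=5, verbose=0)
        scores.append(model.evaluate(X[te], y[te], verbose=0)[1])
    return np.mean(scores)
```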
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, Chase Qishi; Zhu, Michelle Mengxia
The advent of large-scale collaborative scientific applications has demonstrated the potential for broad scientific communities to pool globally distributed resources to produce unprecedented data acquisition, movement, and analysis. System resources including supercomputers, data repositories, computing facilities, network infrastructures, storage systems, and display devices have been increasingly deployed at national laboratories and academic institutes. These resources are typically shared by large communities of users over Internet or dedicated networks and hence exhibit an inherent dynamic nature in their availability, accessibility, capacity, and stability. Scientific applications using either experimental facilities or computation-based simulations with various physical, chemical, climatic, and biological models feature diverse scientific workflows as simple as linear pipelines or as complex as directed acyclic graphs, which must be executed and supported over wide-area networks with massively distributed resources. Application users oftentimes need to manually configure their computing tasks over networks in an ad hoc manner, hence significantly limiting the productivity of scientists and constraining the utilization of resources. The success of these large-scale distributed applications requires a highly adaptive and massively scalable workflow platform that provides automated and optimized computing and networking services. This project is to design and develop a generic Scientific Workflow Automation and Management Platform (SWAMP), which contains a web-based user interface specially tailored for a target application, a set of user libraries, and several easy-to-use computing and networking toolkits for application scientists to conveniently assemble, execute, monitor, and control complex computing workflows in heterogeneous high-performance network environments. SWAMP will enable the automation and management of the entire process of scientific workflows with the convenience of a few mouse clicks while hiding the implementation and technical details from end users. Particularly, we will consider two types of applications with distinct performance requirements: data-centric and service-centric applications. For data-centric applications, the main workflow task involves large-volume data generation, catalog, storage, and movement typically from supercomputers or experimental facilities to a team of geographically distributed users; while for service-centric applications, the main focus of workflow is on data archiving, preprocessing, filtering, synthesis, visualization, and other application-specific analysis. We will conduct a comprehensive comparison of existing workflow systems and choose the best suited one with open-source code, a flexible system structure, and a large user base as the starting point for our development. Based on the chosen system, we will develop and integrate new components including a black box design of computing modules, performance monitoring and prediction, and workflow optimization and reconfiguration, which are missing from existing workflow systems. A modular design for separating specification, execution, and monitoring aspects will be adopted to establish a common generic infrastructure suited for a wide spectrum of science applications.
We will further design and develop efficient workflow mapping and scheduling algorithms to optimize the workflow performance in terms of minimum end-to-end delay, maximum frame rate, and highest reliability. We will develop and demonstrate the SWAMP system in a local environment, the grid network, and the 100Gbps Advanced Network Initiative (ANI) testbed. The demonstration will target scientific applications in climate modeling and high energy physics, and the functions to be demonstrated include workflow deployment, execution, steering, and reconfiguration. Throughout the project period, we will work closely with the science communities in the fields of climate modeling and high energy physics, including the Spallation Neutron Source (SNS) and Large Hadron Collider (LHC) projects, to mature the system for production use.
Satellite battery testing status
NASA Astrophysics Data System (ADS)
Haag, R.; Hall, S.
1986-09-01
Because of the large numbers of satellite cells currently being tested and anticipated at the Naval Weapons Support Center (NAVWPNSUPPCEN) Crane, Indiana, satellite cell testing is being integrated into the Battery Test Automation Project (BTAP). The BTAP, designed to meet the growing needs for battery testing at the NAVWPNSUPPCEN Crane, will consist of several Automated Test Stations (ATSs) which monitor batteries under test. Each ATS will interface with an Automation Network Controller (ANC) which will collect test data for reduction.
Automated Help System For A Supercomputer
NASA Technical Reports Server (NTRS)
Callas, George P.; Schulbach, Catherine H.; Younkin, Michael
1994-01-01
Expert-system software developed to provide automated system of user-helping displays in supercomputer system at Ames Research Center Advanced Computer Facility. Users located at remote computer terminals connected to supercomputer and each other via gateway computers, local-area networks, telephone lines, and satellite links. Automated help system answers routine user inquiries about how to use services of computer system. Available 24 hours per day and reduces burden on human experts, freeing them to concentrate on helping users with complicated problems.
Amith, Muhammad; Cunningham, Rachel; Savas, Lara S; Boom, Julie; Schvaneveldt, Roger; Tao, Cui; Cohen, Trevor
2017-10-01
This study demonstrates the use of distributed vector representations and Pathfinder Network Scaling (PFNETS) to represent online vaccine content created by health experts and by laypeople. By analyzing a target audience's conceptualization of a topic, domain experts can develop targeted interventions to improve the basic health knowledge of consumers. The underlying assumption is that the content created by different groups reflects the mental organization of their knowledge. Applying automated text analysis to this content may elucidate differences between the knowledge structures of laypeople (health consumers) and professionals (health experts). This paper utilizes vaccine information generated by laypeople and health experts to investigate the utility of this approach. We used an established technique from cognitive psychology, Pathfinder Network Scaling, to infer the structure of the associational networks between concepts learned from online content using methods of distributional semantics. In doing so, we extend the original application of PFNETS, which inferred knowledge structures from individual participants, to infer the prevailing knowledge structures within communities of content authors. The resulting graphs reveal opportunities for public health and vaccination education experts to improve communication and intervention efforts directed towards health consumers. Our efforts demonstrate the feasibility of using an automated procedure to examine the manifestation of conceptual models within large bodies of free text, revealing evidence of conflicting understanding of vaccine concepts among health consumers as compared with health experts. Additionally, this study provides insight into the differences between consumer and expert abstraction of domain knowledge, revealing vaccine-related knowledge gaps that suggest opportunities to improve provider-patient communication. Copyright © 2017 Elsevier Inc. All rights reserved.
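For the PFNET(r = infinity, q = n - 1) variant commonly used in such studies, scaling reduces to keeping only links that are "minimax" shortest, which a Floyd-Warshall-style pass computes directly; this sketch assumes a symmetric distance matrix derived from the vector representations.

```python
import numpy as np

def pfnet(dist):
    """Pathfinder network PFNET(r = inf, q = n - 1) from a distance matrix.

    Keeps a link only if no indirect path has a smaller maximum link
    weight (the minimax criterion). dist is symmetric with zeros on the
    diagonal; use np.inf for missing links.
    """
    d = dist.astype(float)
    n = len(d)
    mm = d.copy()                                  # minimax path distances
    for k in range(n):
        # mm[i, j] = min(mm[i, j], max(mm[i, k], mm[k, j]))
        mm = np.minimum(mm, np.maximum(mm[:, k:k + 1], mm[k:k + 1, :]))
    return np.isclose(d, mm) & (d > 0)             # boolean mask of surviving links
```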
Stanislawski, Larry V.; Falgout, Jeff T.; Buttenfield, Barbara P.
2015-01-01
Hydrographic networks form an important data foundation for cartographic base mapping and for hydrologic analysis. Drainage density patterns for these networks can be derived to characterize local landscape, bedrock and climate conditions, and further inform hydrologic and geomorphological analysis by indicating areas where too few headwater channels have been extracted. But natural drainage density patterns are not consistently available in existing hydrographic data for the United States because compilation and capture criteria historically varied, along with climate, during the period of data collection over the various terrain types throughout the country. This paper demonstrates an automated workflow that is being tested in a high-performance computing environment by the U.S. Geological Survey (USGS) to map natural drainage density patterns at the 1:24,000-scale (24K) for the conterminous United States. Hydrographic network drainage patterns may be extracted from elevation data to guide corrections for existing hydrographic network data. The paper describes three stages in this workflow including data pre-processing, natural channel extraction, and generation of drainage density patterns from extracted channels. The workflow is concurrently implemented by executing procedures on multiple subbasin watersheds within the U.S. National Hydrography Dataset (NHD). Pre-processing defines parameters that are needed for the extraction process. Extraction proceeds in standard fashion: filling sinks, developing flow direction and weighted flow accumulation rasters. Drainage channels with assigned Strahler stream order are extracted within a subbasin and simplified. Drainage density patterns are then estimated with 100-meter resolution and subsequently smoothed with a low-pass filter. The extraction process is found to be of better quality in higher slope terrains. Concurrent processing through the high performance computing environment is shown to facilitate and refine the choice of drainage density extraction parameters and more readily improve extraction procedures than conventional processing.
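For readers unfamiliar with the extraction steps named above, the following is a minimal numpy sketch of D8 flow direction and weighted flow accumulation on a toy DEM; it assumes sinks have already been filled and is illustrative only, not the USGS production workflow.

```python
import numpy as np

# Minimal D8 sketch (illustrative only): each cell drains to its steepest
# downslope neighbor, and accumulation is computed by visiting cells from
# highest to lowest elevation. Sink filling is assumed done already.
def d8_flow_accumulation(dem):
    rows, cols = dem.shape
    acc = np.ones_like(dem, dtype=float)        # each cell contributes itself
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
               (0, 1), (1, -1), (1, 0), (1, 1)]
    dists = [np.hypot(dr, dc) for dr, dc in offsets]
    # Highest cells first, so every upstream contribution is ready in time.
    for idx in np.argsort(dem, axis=None)[::-1]:
        r, c = divmod(int(idx), cols)
        best_slope, target = 0.0, None
        for (dr, dc), d in zip(offsets, dists):
            rr, cc = r + dr, c + dc
            if 0 <= rr < rows and 0 <= cc < cols:
                slope = (dem[r, c] - dem[rr, cc]) / d
                if slope > best_slope:
                    best_slope, target = slope, (rr, cc)
        if target is not None:                  # interior sinks keep their flow
            acc[target] += acc[r, c]
    return acc

dem = np.array([[5.0, 4.0, 3.0],
                [4.0, 3.0, 2.0],
                [3.0, 2.0, 1.0]])
print(d8_flow_accumulation(dem))     # flow concentrates toward the low corner
```

Strahler ordering and the 100-meter density grid would be further passes over the channels this step extracts.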
Besmer, Michael D.; Weissbrodt, David G.; Kratochvil, Bradley E.; Sigrist, Jürg A.; Weyland, Mathias S.; Hammes, Frederik
2014-01-01
Fluorescent staining coupled with flow cytometry (FCM) is often used for the monitoring, quantification and characterization of bacteria in engineered and environmental aquatic ecosystems including seawater, freshwater, drinking water, wastewater, and industrial bioreactors. However, infrequent grab sampling hampers accurate characterization and subsequent understanding of microbial dynamics in all of these ecosystems. A logical technological progression is high-throughput operation and full automation of the sampling, staining, measurement, and data analysis steps. Here we assess the feasibility and applicability of automated FCM by means of actual data sets produced with prototype instrumentation. As proof-of-concept we demonstrate examples of microbial dynamics in (i) flowing tap water from a municipal drinking water supply network and (ii) river water from a small creek subject to two rainfall events. In both cases, automated measurements were done at 15-min intervals during 12–14 consecutive days, yielding more than 1000 individual data points for each ecosystem. The extensive data sets derived from the automated measurements allowed for the establishment of baseline data for each ecosystem, as well as for the recognition of daily variations and specific events that would most likely be missed (or mischaracterized) by infrequent sampling. In addition, the online FCM data from the river water was combined and correlated with online measurements of abiotic parameters, showing considerable potential for a better understanding of cause-and-effect relationships in aquatic ecosystems. Although several challenges remain, the successful operation of an automated online FCM system and the basic interpretation of the resulting data sets represent a breakthrough toward the eventual establishment of fully automated online microbiological monitoring technologies. PMID:24917858
Exploring the retinal connectome
Anderson, James R.; Jones, Bryan W.; Watt, Carl B.; Shaw, Margaret V.; Yang, Jia-Hui; DeMill, David; Lauritzen, James S.; Lin, Yanhua; Rapp, Kevin D.; Mastronarde, David; Koshevoy, Pavel; Grimm, Bradley; Tasdizen, Tolga; Whitaker, Ross; Marc, Robert E
2011-01-01
Purpose A connectome is a comprehensive description of synaptic connectivity for a neural domain. Our goal was to produce a connectome data set for the inner plexiform layer of the mammalian retina. This paper describes our first retinal connectome, validates the method, and provides key initial findings. Methods We acquired and assembled a 16.5 terabyte connectome data set RC1 for the rabbit retina at ≈2 nm resolution using automated transmission electron microscope imaging, automated mosaicking, and automated volume registration. RC1 represents a column of tissue 0.25 mm in diameter, spanning the inner nuclear, inner plexiform, and ganglion cell layers. To enhance ultrastructural tracing, we included molecular markers for 4-aminobutyrate (GABA), glutamate, glycine, taurine, glutamine, and the in vivo activity marker, 1-amino-4-guanidobutane. This enabled us to distinguish GABAergic and glycinergic amacrine cells; to identify ON bipolar cells coupled to glycinergic cells; and to discriminate different kinds of bipolar, amacrine, and ganglion cells based on their molecular signatures and activity. The data set was explored and annotated with Viking, our multiuser navigation tool. Annotations were exported to additional applications to render cells, visualize network graphs, and query the database. Results Exploration of RC1 showed that the 2 nm resolution readily recapitulated well known connections and revealed several new features of retinal organization: (1) The well known AII amacrine cell pathway displayed more complexity than previously reported, with no less than 17 distinct signaling modes, including ribbon synapse inputs from OFF bipolar cells, wide-field ON cone bipolar cells and rod bipolar cells, and extensive input from cone-pathway amacrine cells. (2) The axons of most cone bipolar cells formed a distinct signal integration compartment, with ON cone bipolar cell axonal synapses targeting diverse cell types. Both ON and OFF bipolar cells receive axonal veto synapses. (3) Chains of conventional synapses were very common, with intercalated glycinergic-GABAergic chains and very long chains associated with starburst amacrine cells. Glycinergic amacrine cells clearly play a major role in ON-OFF crossover inhibition. (4) Molecular and excitation mapping clearly segregates ultrastructurally defined bipolar cell groups into different response clusters. (5) Finally, low-resolution electron or optical imaging cannot reliably map synaptic connections by process geometry, as adjacency without synaptic contact is abundant in the retina. Only direct visualization of synapses and gap junctions suffices. Conclusions Connectome assembly and analysis using conventional transmission electron microscopy is now practical for network discovery. Our surveys of volume RC1 demonstrate that previously studied systems such as the AII amacrine cell network involve more network motifs than previously known. The AII network, primarily considered a scotopic pathway, clearly derives ribbon synapse input from photopic ON and OFF cone bipolar cell networks and extensive photopic GABAergic amacrine cell inputs. Further, bipolar cells show extensive inputs and outputs along their axons, similar to multistratified nonmammalian bipolar cells. Physiologic evidence of significant ON-OFF channel crossover is strongly supported by our anatomic data, showing alternating glycine-to-GABA paths. Long chains of amacrine cell networks likely arise from homocellular GABAergic synapses between starburst amacrine cells. 
Deeper analysis of RC1 offers the opportunity for more complete descriptions of specific networks. PMID:21311605
Lequan Yu; Hao Chen; Qi Dou; Jing Qin; Pheng Ann Heng
2017-01-01
Automated polyp detection in colonoscopy videos has been demonstrated to be a promising way for colorectal cancer prevention and diagnosis. Traditional manual screening is time-consuming, operator dependent, and error prone; hence, an automated detection approach is in high demand in clinical practice. However, automated polyp detection is very challenging due to high intraclass variations in polyp size, color, shape, and texture, and low interclass variations between polyps and hard mimics. In this paper, we propose a novel offline and online three-dimensional (3-D) deep learning integration framework by leveraging the 3-D fully convolutional network (3D-FCN) to tackle this challenging problem. Compared with previous methods employing hand-crafted features or 2-D convolutional neural networks, the 3D-FCN is capable of learning more representative spatio-temporal features from colonoscopy videos, and hence has more powerful discrimination capability. More importantly, we propose a novel online learning scheme to deal with the problem of limited training data by harnessing the specific information of an input video in the learning process. We integrate offline and online learning to effectively reduce the number of false positives generated by the offline network and further improve the detection performance. Extensive experiments on the dataset of the MICCAI 2015 Challenge on Polyp Detection demonstrated the superior performance of our method compared with other competitors.
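To make the architecture concrete, here is a minimal PyTorch sketch of a 3-D fully convolutional network; the layer counts, channel widths, and input sizes are invented for brevity and do not reproduce the authors' 3D-FCN.

```python
import torch
import torch.nn as nn

# Minimal 3-D fully convolutional sketch (invented sizes): stacked Conv3d
# blocks end in a 1x1x1 classifier, so the output is a spatio-temporal score
# map rather than a fixed-size vector.
class Tiny3DFCN(nn.Module):
    def __init__(self, in_ch=3, n_classes=2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(in_ch, 16, kernel_size=3, padding=1), nn.ReLU(inplace=True),
            nn.MaxPool3d(2),
            nn.Conv3d(16, 32, kernel_size=3, padding=1), nn.ReLU(inplace=True),
        )
        self.classifier = nn.Conv3d(32, n_classes, kernel_size=1)

    def forward(self, x):                 # x: (batch, channels, frames, H, W)
        return self.classifier(self.features(x))

clip = torch.randn(1, 3, 8, 64, 64)       # dummy 8-frame RGB colonoscopy clip
scores = Tiny3DFCN()(clip)
print(scores.shape)                       # torch.Size([1, 2, 4, 32, 32])
```

Because every layer is convolutional, the same network accepts clips of varying spatial and temporal extent, which is what makes the fully convolutional design attractive for video.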
Automated Construction of Node Software Using Attributes in a Ubiquitous Sensor Network Environment
Lee, Woojin; Kim, Juil; Kang, JangMook
2010-01-01
In sensor networks, nodes must often operate in a demanding environment facing restrictions such as restricted computing resources, unreliable wireless communication and power shortages. Such factors make the development of ubiquitous sensor network (USN) applications challenging. To help developers construct a large amount of node software for sensor network applications easily and rapidly, this paper proposes an approach to the automated construction of node software for USN applications using attributes. In the proposed technique, application construction proceeds by first developing a model for the sensor network and then designing node software by setting the values of the predefined attributes. After that, the sensor network model and the design of node software are verified. The final source code of the node software is automatically generated from the sensor network model. We illustrate the efficiency of the proposed technique by using a gas/light monitoring application through a case study of a Gas and Light Monitoring System based on the Nano-Qplus operating system. We evaluate the technique using a quantitative metric, the memory size of execution code for node software. Using the proposed approach, developers are able to easily construct sensor network applications and rapidly generate a large amount of node software at a time in a ubiquitous sensor network environment. PMID:22163678
Scaling the PuNDIT project for wide area deployments
NASA Astrophysics Data System (ADS)
McKee, Shawn; Batista, Jorge; Carcassi, Gabriele; Dovrolis, Constantine; Lee, Danny
2017-10-01
In today’s world of distributed scientific collaborations, there are many challenges to providing reliable inter-domain network infrastructure. Network operators use a combination of active monitoring and trouble tickets to detect problems, but these are often ineffective at identifying issues that impact wide-area network users. Additionally, these approaches do not scale to wide-area inter-domain networks due to the unavailability of data from all the domains along typical network paths. The Pythia Network Diagnostic InfrasTructure (PuNDIT) project aims to create a scalable infrastructure for automating the detection and localization of problems across these networks. The project goal is to gather and analyze metrics from existing perfSONAR monitoring infrastructures to identify the signatures of possible problems, locate affected network links, and report them to the user in an intuitive fashion. Simply put, PuNDIT seeks to convert complex network metrics into easily understood diagnoses in an automated manner. We present our progress in creating the PuNDIT system and our status in developing, testing, and deploying it. We report on the project progress to date, describe the current implementation architecture, and demonstrate some of the various user interfaces it will support. We close by discussing the remaining challenges and next steps, and where we see the project going in the future.
VLSI synthesis of digital application specific neural networks
NASA Technical Reports Server (NTRS)
Beagles, Grant; Winters, Kel
1991-01-01
Neural networks tend to fall into two general categories: (1) software simulations, or (2) custom hardware that must be trained. The scope of this project is the merger of these two classifications into a system whereby a software model of a network is trained to perform a specific task and the results are used to synthesize a standard-cell realization of the network using automated tools.
Automated detection of optical counterparts to GRBs with RAPTOR
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wozniak, P. R.; Vestrand, W. T.; Evans, S.
2006-05-19
The RAPTOR system (RAPid Telescopes for Optical Response) is an array of several distributed robotic telescopes that automatically respond to GCN localization alerts. Raptor-S is a 0.4-m telescope with a 24-arcmin field of view employing a 1k x 1k Marconi CCD detector, and has already detected prompt optical emission from several GRBs within the first minute of the explosion. We present a real-time data analysis and alert system for automated identification of optical transients in Raptor-S GRB response data down to the sensitivity limit of ≈19 mag. Our custom data processing pipeline is designed to minimize the time required to reliably identify transients and extract actionable information. The system utilizes a networked PostgreSQL database server for catalog access and distributes email alerts with successful detections.
Automation Hooks Architecture Trade Study for Flexible Test Orchestration
NASA Technical Reports Server (NTRS)
Lansdowne, Chatwin A.; Maclean, John R.; Graffagnino, Frank J.; McCartney, Patrick A.
2010-01-01
We describe the conclusions of a technology and communities survey supported by concurrent and follow-on proof-of-concept prototyping to evaluate the feasibility of defining a durable, versatile, reliable, visible software interface to support strategic modularization of test software development. The objective is that test sets and support software with diverse origins, ages, and abilities can be reliably integrated into test configurations that assemble, tear down, and reassemble with scalable complexity in order to conduct both parametric tests and monitored trial runs. The resulting approach is based on the integration of three recognized technologies that are currently gaining acceptance within the test industry and that, when combined, provide a simple, open, and scalable test orchestration architecture addressing the objectives of the Automation Hooks task. The technologies are automated discovery using multicast DNS Zero Configuration Networking (zeroconf), commanding and data retrieval using resource-oriented RESTful web services, and XML data transfer formats based on Automatic Test Markup Language (ATML). This open-source, standards-based approach provides direct integration with existing commercial off-the-shelf (COTS) analysis software tools.
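As a flavor of the discovery mechanism, the sketch below browses for services over multicast DNS, assuming the third-party python-zeroconf package; the "_teststation._tcp" service type is a hypothetical name used only for illustration.

```python
from zeroconf import ServiceBrowser, Zeroconf

# Hedged sketch of zeroconf discovery: any object providing these three
# callbacks can act as a listener for services appearing on the local network.
class StationListener:
    def add_service(self, zc, type_, name):
        info = zc.get_service_info(type_, name)
        if info:
            print(f"found {name} on port {info.port}")

    def remove_service(self, zc, type_, name):
        print(f"lost {name}")

    def update_service(self, zc, type_, name):
        pass

zc = Zeroconf()
browser = ServiceBrowser(zc, "_teststation._tcp.local.", StationListener())
try:
    input("browsing for test stations; press Enter to stop\n")
finally:
    zc.close()
```

In an orchestration setting like the one described, each test set would announce itself this way and then expose its commanding and data-retrieval endpoints as RESTful resources.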
NASA Astrophysics Data System (ADS)
Borovikov, Yu S.; Sulaymanov, A. O.; Andreev, M. V.
2015-10-01
Development, research, and operation of smart grids (SG) with active-adaptive networks (AAS) are pressing tasks today. The planned integration of high-speed FACTS devices greatly complicates the dynamic properties of power systems, and the operating conditions of power system equipment change significantly as a result. This situation creates a new problem: developing and studying relay protection and automation (RPA) that can operate adequately in SGs and adapt to their regimes. The effectiveness of any solution depends on the tools used, namely simulators of electric power systems. Analysis of the best-known and most widely used simulators led to the conclusion that they cannot be applied to this problem. Tomsk Polytechnic University has developed a prototype hybrid multiprocessor software and hardware system, the Hybrid Real-Time Power System Simulator (HRTSim). Because of its unique features, this simulator can be used for the tasks mentioned above. This article introduces the concept of developing and studying relay protection and automation with the use of HRTSim.
NASA Tech Briefs, December 2006
NASA Technical Reports Server (NTRS)
2006-01-01
Topics include: Inferring Gear Damage from Oil-Debris and Vibration Data; Forecasting of Storm-Surge Floods Using ADCIRC and Optimized DEMs; User Interactive Software for Analysis of Human Physiological Data; Representation of Serendipitous Scientific Data; Automatic Locking of Laser Frequency to an Absorption Peak; Self-Passivating Lithium/Solid Electrolyte/Iodine Cells; Four-Quadrant Analog Multipliers Using G4-FETs; Noise Source for Calibrating a Microwave Polarimeter; Hybrid Deployable Foam Antennas and Reflectors; Coating MCPs with AlN and GaN; Domed, 40-cm-Diameter Ion Optics for an Ion Thruster; Gesture-Controlled Interfaces for Self-Service Machines; Dynamically Alterable Arrays of Polymorphic Data Types; Identifying Trends in Deep Space Network Monitor Data; Predicting Lifetime of a Thermomechanically Loaded Component; Partial Automation of Requirements Tracing; Automated Synthesis of Architecture of Avionic Systems; SSRL Emergency Response Shore Tool; Wholly Aromatic Ether-Imides as n-Type Semiconductors; Carbon-Nanotube-Carpet Heat-Transfer Pads; Pulse-Flow Microencapsulation System; Automated Low-Gravitation Facility Would Make Optical Fibers; Alignment Cube with One Diffractive Face; Graphite Composite Booms with Integral Hinges; Tool for Sampling Permafrost on a Remote Planet; and Special Semaphore Scheme for UHF Spacecraft Communications.
Concept utilizing telex network for operational management requirements
NASA Astrophysics Data System (ADS)
Kowalczyk, E.
1982-09-01
The simplest and least expensive means of ensuring fast transmission of documented (recorded) information on a country-wide scale for all kinds of users is the telex network, which at this time is fully automated in Poland (except for foreign message traffic), with more than 17,000 subscribers. As a digital network, the telex network constitutes, in principle, a base network that is ready to be used to transmit information as part of remote data transmission systems.
3D Actin Network Centerline Extraction with Multiple Active Contours
Xu, Ting; Vavylonis, Dimitrios; Huang, Xiaolei
2013-01-01
Fluorescence microscopy is frequently used to study two and three dimensional network structures formed by cytoskeletal polymer fibers such as actin filaments and actin cables. While these cytoskeletal structures are often dilute enough to allow imaging of individual filaments or bundles of them, quantitative analysis of these images is challenging. To facilitate quantitative, reproducible and objective analysis of the image data, we propose a semi-automated method to extract actin networks and retrieve their topology in 3D. Our method uses multiple Stretching Open Active Contours (SOACs) that are automatically initialized at image intensity ridges and then evolve along the centerlines of filaments in the network. SOACs can merge, stop at junctions, and reconfigure with others to allow smooth crossing at junctions of filaments. The proposed approach is generally applicable to images of curvilinear networks with low SNR. We demonstrate its potential by extracting the centerlines of synthetic meshwork images, actin networks in 2D Total Internal Reflection Fluorescence Microscopy images, and 3D actin cable meshworks of live fission yeast cells imaged by spinning disk confocal microscopy. Quantitative evaluation of the method using synthetic images shows that for images with SNR above 5.0, the average vertex error measured by the distance between our result and ground truth is 1 voxel, and the average Hausdorff distance is below 10 voxels. PMID:24316442
Peirone, Laura S; Pereyra Irujo, Gustavo A; Bolton, Alejandro; Erreguerena, Ignacio; Aguirrezábal, Luis A N
2018-01-01
Drought is the most important factor limiting yield at a global scale, and conventional field phenotyping for drought tolerance is labor-intensive and time-consuming. Automated greenhouse platforms can increase the precision and throughput of plant phenotyping and contribute to a faster release of drought-tolerant varieties. The aim of this work was to establish a framework of analysis to identify early traits that could be efficiently measured in an automated greenhouse phenotyping platform for predicting the drought tolerance of field-grown soybean genotypes. A group of genotypes was evaluated that showed variation in their drought susceptibility index (DSI) for final biomass and leaf area. A large number of traits were measured before and after the onset of a water-deficit treatment and were analyzed under several criteria: the significance of the regression with the DSI, phenotyping cost, earliness, and repeatability. The most efficient trait was found to be transpiration efficiency measured at 13 days after emergence. This trait was further tested in a second experiment with different water-deficit intensities, and validated against field data from a trial network using a different set of genotypes in a third experiment. The framework applied in this work for assessing traits under different criteria could help select those most efficient for automated phenotyping.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, Chase
A number of Department of Energy (DOE) science applications, involving exascale computing systems and large experimental facilities, are expected to generate large volumes of data, in the range of petabytes to exabytes, which will be transported over wide-area networks for the purpose of storage, visualization, and analysis. The objectives of this proposal are to (1) develop and test the component technologies and their synthesis methods to achieve source-to-sink high-performance flows, and (2) develop tools that provide these capabilities through simple interfaces to users and applications. In terms of the former, we propose to develop (1) optimization methods that align and transition multiple storage flows to multiple network flows on multicore, multibus hosts; and (2) edge and long-haul network path realization and maintenance using advanced provisioning methods including OSCARS and OpenFlow. We also propose synthesis methods that combine these individual technologies to compose high-performance flows using a collection of constituent storage-network flows, and realize them across the storage and local network connections as well as long-haul connections. We propose to develop automated user tools that profile the hosts, storage systems, and network connections; compose the source-to-sink complex flows; and set up and maintain the needed network connections.
NASA Technical Reports Server (NTRS)
Rickman, Doug; Shire, J.; Qualters, J.; Mitchell, K.; Pollard, S.; Rao, R.; Kajumba, N.; Quattrochi, D.; Estes, M., Jr.; Meyer, P.;
2009-01-01
Objectives. To provide an overview of four environmental public health surveillance projects developed by CDC and its partners for the Health and Environment Linked for Information Exchange, Atlanta (HELIX-Atlanta) and to illustrate common issues and challenges encountered in developing an environmental public health tracking system. Methods. HELIX-Atlanta, initiated in October 2003 to develop data linkage and analysis methods that can be used by the National Environmental Public Health Tracking Network (Tracking Network), conducted four projects. We highlight the projects' work, assess attainment of the HELIX-Atlanta goals, and discuss three surveillance attributes. Results. Among the major challenges was the complexity of analytic issues, which required multidisciplinary teams with technical expertise. This expertise and the data resided across multiple organizations. Conclusions. Establishing formal procedures for sharing data, defining data analysis standards and automating analyses, and committing staff with appropriate expertise are needed to support wide implementation of environmental public health tracking.
Neural Network Analysis on the NOvA Experiment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Safford, Twymun K.; Himmel, Alex
NOνA is a collaboration of 180 scientists and engineers from 28 institutions that plans to study neutrino oscillations using the existing NuMI neutrino beam at Fermilab. The NOνA experiment is designed to search for oscillations of muon neutrinos to electron neutrinos by comparing the electron neutrino event rate measured at the Fermilab site with the electron neutrino event rate measured at a location just south of International Falls, MN, 810 kilometers distant from Fermilab. If oscillations occur, the far site will see the appearance of electrons in the muon neutrino beam produced at Fermilab. The presence of background radiation obscures the desired particles and trails to be observed. Using neural network analysis, the goal of the project was to implement machine learning to automate the removal of background radiation in order to render pixel maps of the particle trajectories.
ERIC Educational Resources Information Center
Shlechter, Theodore M.; And Others
This paper focuses upon the research and development (R&D) process associated with developing automated feedback materials for the SIMulation NETworking (SIMNET) training system. This R&D process involved a partnership among instructional developers, practitioners, and researchers. Users' input has been utilized to help: (1) design the…
Technical Processing Librarians in the 1980's: Current Trends and Future Forecasts.
ERIC Educational Resources Information Center
Kennedy, Gail
1980-01-01
This review of recent and anticipated advances in library automation technology and methodology includes a review of the effects of OCLC, MARC formatting, AACR2, and increasing costs, as well as predictions of the impact on library technical processing of networking, expansion of automation, minicomputers, specialized reference services, and…
ERIC Educational Resources Information Center
Hartt, Richard W.
This report discusses the characteristics, operations, and automation requirements of technical libraries providing services to organizations involved in aerospace and defense scientific and technical work, and describes the Local Automation Model project. This on-going project is designed to demonstrate the concept of a fully integrated library…
Petkewich, Matthew D.; Daamen, Ruby C.; Roehl, Edwin A.; Conrads, Paul
2016-09-29
The generation of Everglades Depth Estimation Network (EDEN) daily water-level and water-depth maps depends on high-quality real-time data from over 240 water-level stations. To increase the accuracy of the daily water-surface maps, the Automated Data Assurance and Management (ADAM) tool was created by the U.S. Geological Survey as part of Greater Everglades Priority Ecosystems Science. The ADAM tool provides accurate quality-assurance review of the real-time data from the EDEN network and allows estimation or replacement of missing or erroneous data. This user's manual describes how to install and operate the ADAM software; the file structure and operation of the software are explained using examples.
M Dwarf Exoplanet Survey by the Falcon Telescope Network
NASA Astrophysics Data System (ADS)
Carlson, Randall E.
2016-10-01
The Falcon Telescope Network (FTN) consists of twelve automated 20-inch telescopes located around the globe, controlled from the Cadet Space Operations Center at the US Air Force Academy in Colorado Springs, Colorado. We have installed 10 of the 12 sites and anticipate full operational capability by the beginning of 2017. The network's worldwide geographic distribution provides observational advantages, notably extended temporal coverage. The primary mission of the FTN is Space Situational Awareness and studying Near Earth Objects. However, we are also employing the FTN, with its 11' x 11' field of view, for a five-year M dwarf exoplanet survey; specifically, we are searching for Earth-radius exoplanets. We describe the FTN, the design considerations going into its M dwarf exoplanet survey, including automated operations, and initial results of the survey.
2012-04-30
individualistic, Western view of human behavior. That is, it assumes that people adhere to the individualistic principles of Western culture. What happens to people with networks rich in structural holes that live/work in environments that adhere to other principles, such as those of a collectivistic culture? (http://orgtheory.wordpress.com/2007/06/19/structural-holes-in-context/) Preferential Attachment (PA) [Barabási & Albert, 1999]: The most
NASA Astrophysics Data System (ADS)
Baray, J. L.; Fréville, P.; Montoux, N.; Chauvigné, A.; Hadad, D.; Sellegri, K.
2018-04-01
A Rayleigh-Mie-Raman LIDAR has provided vertical profiles of tropospheric variables at Clermont-Ferrand (France) since 2008, in order to describe the boundary layer dynamics, tropospheric aerosols, cirrus, and water vapor. It is included in the EARLINET network. We performed hardware and software developments in order to upgrade the measurement quality and calibration and to improve automation. We present an overview of the system, some examples of measurements, and a preliminary geophysical analysis of the data.
Automated video surveillance: teaching an old dog new tricks
NASA Astrophysics Data System (ADS)
McLeod, Alastair
1993-12-01
The automated video surveillance market is booming, with new players, new systems, new hardware and software, and an extended range of applications. This paper reviews available technology and describes the features required for a good automated surveillance system. Both hardware and software are discussed. An overview of typical applications is also given. A shift towards PC-based hybrid systems, the use of parallel processing, neural networks, and the exploitation of modern telecommunications are introduced, highlighting the evolution of modern video surveillance systems.
Automated Propulsion Data Screening demonstration system
NASA Technical Reports Server (NTRS)
Hoyt, W. Andes; Choate, Timothy D.; Whitehead, Bruce A.
1995-01-01
A fully-instrumented firing of a propulsion system typically generates a very large quantity of data. In the case of the Space Shuttle Main Engine (SSME), data analysis from ground tests and flights is currently a labor-intensive process. Human experts spend a great deal of time examining the large volume of sensor data generated by each engine firing. These experts look for any anomalies in the data which might indicate engine conditions warranting further investigation. The contract effort was to develop a 'first-cut' screening system for application to SSME engine firings that would identify the relatively small volume of data which is unusual or anomalous in some way. With such a system, limited and expensive human resources could focus on this small volume of unusual data for thorough analysis. The overall project objective was to develop a fully operational Automated Propulsion Data Screening (APDS) system with the capability of detecting significant trends and anomalies in transient and steady-state data. However, the effort limited screening of transient data to ground test data for throttle-down cases typical of the 3-g acceleration, and for engine throttling required to reach the maximum dynamic pressure limits imposed on the Space Shuttle. This APDS is based on neural networks designed to detect anomalies in propulsion system data that are not part of the data used for neural network training. The delivered system allows engineers to build their own screening sets for application to completed or planned firings of the SSME. ERC developers also built some generic screening sets that NASA engineers could apply immediately to their data analysis efforts.
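The screening idea, training only on nominal data and flagging whatever the model handles poorly, can be illustrated with a toy one-class model; the sketch below uses a small scikit-learn autoencoder on synthetic sensor vectors and is not the delivered APDS networks.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Toy one-class screen in the spirit of APDS (synthetic data): train a small
# autoencoder on nominal sensor vectors, then flag test vectors whose
# reconstruction error is unusually large.
rng = np.random.default_rng(0)
nominal = rng.normal(0.0, 1.0, (500, 8))              # nominal firing data
ae = MLPRegressor(hidden_layer_sizes=(3,), max_iter=2000,
                  random_state=0).fit(nominal, nominal)

def anomaly_score(x):
    return ((ae.predict(x) - x) ** 2).mean(axis=1)    # reconstruction error

threshold = np.percentile(anomaly_score(nominal), 99)
unusual = rng.normal(4.0, 1.0, (1, 8))                # shifted, anomalous trace
print(anomaly_score(unusual) > threshold)             # [ True]
```

The appeal for propulsion data is the same as in the abstract: only the small fraction of traces scoring above the threshold needs expert attention.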
Toward a Scalable Visualization System for Network Traffic Monitoring
NASA Astrophysics Data System (ADS)
Malécot, Erwan Le; Kohara, Masayoshi; Hori, Yoshiaki; Sakurai, Kouichi
With the multiplication of attacks against computer networks, system administrators must carefully monitor the traffic exchanged by the networks they manage. However, that monitoring task is increasingly laborious because of the growing amount of data to analyze, and that trend will intensify with the explosion of the number of devices connected to computer networks along with the global rise in available network bandwidth. System administrators now heavily rely on automated tools to assist them and simplify the analysis of the data. Yet these tools provide limited support and, most of the time, require highly skilled operators. Recently, some research teams have started to study the application of visualization techniques to the analysis of network traffic data. We believe that this approach can also help system administrators deal with the large amount of data they have to process. In this paper, we introduce a tool for network traffic monitoring using visualization techniques that we developed to assist the system administrators of our corporate network. We explain how we designed the tool and some of the choices we made regarding the visualization techniques used. The resulting tool proposes two linked representations of the network traffic and activity, one in 2D and the other in 3D. As 2D and 3D visualization techniques have different assets, we combined them in our tool to take advantage of their complementarity. We finally tested our tool in order to evaluate the accuracy of our approach.
Erdoğdu, Utku; Tan, Mehmet; Alhajj, Reda; Polat, Faruk; Rokne, Jon; Demetrick, Douglas
2013-01-01
The availability of enough samples for effective analysis and knowledge discovery has been a challenge in the research community, especially in the area of gene expression data analysis. Thus, the approaches being developed for data analysis have mostly suffered from the lack of enough data to train and test the constructed models. We argue that the process of sample generation could be successfully automated by employing some sophisticated machine learning techniques. An automated sample generation framework could successfully complement the actual sample generation from real cases. This argument is validated in this paper by describing a framework that integrates multiple models (perspectives) for sample generation. We illustrate its applicability for producing new gene expression data samples, a highly demanding area that has not received attention. The three perspectives employed in the process are based on models that are not closely related. The independence eliminates the bias of having the produced approach covering only certain characteristics of the domain and leading to samples skewed towards one direction. The first model is based on the Probabilistic Boolean Network (PBN) representation of the gene regulatory network underlying the given gene expression data. The second model integrates Hierarchical Markov Model (HIMM) and the third model employs a genetic algorithm in the process. Each model learns as much as possible characteristics of the domain being analysed and tries to incorporate the learned characteristics in generating new samples. In other words, the models base their analysis on domain knowledge implicitly present in the data itself. The developed framework has been extensively tested by checking how the new samples complement the original samples. The produced results are very promising in showing the effectiveness, usefulness and applicability of the proposed multi-model framework.
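To ground the first perspective, here is a toy update step for a probabilistic Boolean network: each gene picks one of its candidate Boolean predictor functions at random, with fixed selection probabilities, and applies it to the current state. The three genes and their predictors are invented for the example.

```python
import numpy as np

# Toy probabilistic Boolean network (PBN) update. Each gene has one or more
# candidate predictor functions, each with a selection probability; a state
# transition samples one predictor per gene and evaluates it.
rng = np.random.default_rng(0)

predictors = {
    0: [(lambda s: s[1] and not s[2], 0.7), (lambda s: s[1], 0.3)],
    1: [(lambda s: not s[0], 1.0)],
    2: [(lambda s: s[0] or s[1], 1.0)],
}

def pbn_step(state):
    nxt = []
    for gene, funcs in predictors.items():
        fs, ps = zip(*funcs)
        f = fs[rng.choice(len(fs), p=ps)]       # random predictor per gene
        nxt.append(int(f(state)))
    return nxt

state = [1, 0, 1]
for _ in range(5):                              # sample a short trajectory
    state = pbn_step(state)
    print(state)
```

Sampling many such trajectories from a PBN fitted to real expression data is one way the kind of synthetic-sample generation described above can be realized.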
NASA Technical Reports Server (NTRS)
Fayyad, Kristina E.; Hill, Randall W., Jr.; Wyatt, E. J.
1993-01-01
This paper presents a case study of the knowledge engineering process employed to support the Link Monitor and Control Operator Assistant (LMCOA). The LMCOA is a prototype system which automates the configuration, calibration, test, and operation (referred to as precalibration) of the communications, data processing, metric data, antenna, and other equipment used to support space-ground communications with deep space spacecraft in NASA's Deep Space Network (DSN). The primary knowledge base in the LMCOA is the Temporal Dependency Network (TDN), a directed graph which provides a procedural representation of the precalibration operation. The TDN incorporates precedence, temporal, and state constraints and uses several supporting knowledge bases and databases. The paper provides a brief background on the DSN and describes the evolution of the TDN and supporting knowledge bases, the process used for knowledge engineering, and an analysis of the successes and problems of the knowledge engineering effort.
Faro, Alberto; Giordano, Daniela; Spampinato, Concetto
2008-06-01
This paper proposes a traffic monitoring architecture based on a high-speed communication network whose nodes are equipped with fuzzy processors and cellular neural network (CNN) embedded systems. It implements a real-time mobility information system in which visual perceptions reported by people working in the field and video sequences of traffic taken from webcams are jointly processed to evaluate the fundamental traffic parameters for every street of a metropolitan area. This paper presents the whole methodology for data collection and analysis and compares the accuracy and processing time of the proposed soft computing techniques with those of other existing algorithms. Moreover, this paper discusses when and why it is recommended to fuse the visual perceptions of the traffic with the automated measurements taken from the webcams to compute the maximum traveling time likely needed to reach any destination in the traffic network.
Acharya, U Rajendra; Oh, Shu Lih; Hagiwara, Yuki; Tan, Jen Hong; Adeli, Hojjat
2017-09-27
An electroencephalogram (EEG) is a commonly used ancillary test to aid in the diagnosis of epilepsy. The EEG signal contains information about the electrical activity of the brain. Traditionally, neurologists employ direct visual inspection to identify epileptiform abnormalities. This technique can be time-consuming, is limited by technical artifact, provides variable results depending on the reader's expertise level, and is limited in identifying abnormalities. Therefore, it is essential to develop a computer-aided diagnosis (CAD) system to automatically distinguish the class of these EEG signals using machine learning techniques. This is the first study to employ a convolutional neural network (CNN) for analysis of EEG signals. In this work, a 13-layer deep CNN algorithm is implemented to detect normal, preictal, and seizure classes. The proposed technique achieved an accuracy, specificity, and sensitivity of 88.67%, 90.00%, and 95.00%, respectively.
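As a rough sketch of the approach, the PyTorch snippet below builds a small 1-D CNN over single-channel EEG epochs with three output classes; the depth and layer sizes are illustrative and much shallower than the paper's 13-layer design.

```python
import torch
import torch.nn as nn

# Illustrative 1-D CNN for three-class EEG classification
# (normal / preictal / seizure); sizes are invented for brevity.
model = nn.Sequential(
    nn.Conv1d(1, 8, kernel_size=5), nn.ReLU(), nn.MaxPool1d(2),
    nn.Conv1d(8, 16, kernel_size=5), nn.ReLU(), nn.MaxPool1d(2),
    nn.Flatten(),
    nn.LazyLinear(64), nn.ReLU(),          # infers its input size on first use
    nn.Linear(64, 3),
)
eeg = torch.randn(4, 1, 4096)              # batch of 4 single-channel epochs
logits = model(eeg)
print(logits.shape)                        # torch.Size([4, 3])
```

Training such a network end to end on labeled epochs replaces the hand-engineered feature extraction that older EEG CAD pipelines relied on.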
Optimization of a Multi-Stage ATR System for Small Target Identification
NASA Technical Reports Server (NTRS)
Lin, Tsung-Han; Lu, Thomas; Braun, Henry; Edens, Western; Zhang, Yuhan; Chao, Tien- Hsin; Assad, Christopher; Huntsberger, Terrance
2010-01-01
An Automated Target Recognition (ATR) system was developed to locate and target small objects in images and videos. The data are preprocessed and sent to a grayscale optical correlator (GOC) filter to identify possible regions of interest (ROIs). Next, features are extracted from the ROIs based on Principal Component Analysis (PCA) and sent to a neural network (NN) to be classified. The features are analyzed by the NN classifier, which indicates whether each ROI contains the desired target. The ATR system was found useful in identifying small boats in open sea. However, due to "noisy backgrounds," such as weather conditions, background buildings, or water wakes, some false targets are misclassified. Feedforward backpropagation and radial basis neural networks are optimized for generalization of representative features to reduce the false-alarm rate. The neural networks are compared for their performance in classification accuracy, classifying time, and training time.
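The ROI-classification stage, PCA features feeding a small neural network, can be sketched in a few lines with scikit-learn; the flattened patches and the separable cue injected into the "target" class below are synthetic stand-ins for real boat and clutter ROIs.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline

# Sketch of a PCA-plus-neural-network ROI classifier on synthetic data.
rng = np.random.default_rng(0)
rois = rng.normal(size=(200, 32 * 32))          # flattened ROI patches
labels = rng.integers(0, 2, 200)                # 1 = target, 0 = clutter
rois[labels == 1, :50] += 2.0                   # give targets a separable cue

clf = make_pipeline(PCA(n_components=10),
                    MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000,
                                  random_state=0))
clf.fit(rois[:150], labels[:150])
print("holdout accuracy:", clf.score(rois[150:], labels[150:]))
```

The dimensionality reduction is what keeps the classifier small enough to tune for generalization, which is the point of the false-alarm optimization described above.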
Investigation of automated task learning, decomposition and scheduling
NASA Technical Reports Server (NTRS)
Livingston, David L.; Serpen, Gursel; Masti, Chandrashekar L.
1990-01-01
The details and results of research on the application of neural networks to task planning and decomposition are presented. Task planning and decomposition are operations that humans perform in a reasonably efficient manner. Without the use of good heuristics, and usually much human interaction, automatic planners and decomposers generally do not perform well due to the intractable nature of the problems under consideration. The human-like performance of neural networks has shown promise for generating acceptable solutions to intractable problems such as planning and decomposition; this was the primary motivation for the study. The basis for the work is the use of state machines to model tasks. State machine models provide a useful means for examining the structure of tasks, since many formal techniques have been developed for their analysis and synthesis. The approach integrates the strong algebraic foundations of state machines with the heretofore trial-and-error approach to neural network synthesis.
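A minimal example of the state-machine task model the work builds on: states, events, and a transition table, with unknown events ignored. The task states and events are invented for illustration.

```python
# Minimal finite-state-machine task model; states/events are hypothetical.
transitions = {
    ("idle", "start"): "planning",
    ("planning", "decomposed"): "executing",
    ("executing", "done"): "idle",
}

def step(state, event):
    return transitions.get((state, event), state)   # unknown events: stay put

state = "idle"
for event in ["start", "decomposed", "done"]:
    state = step(state, event)
    print(state)                                    # planning, executing, idle
```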
Should Secondary Schools Buy Local Area Networks?
ERIC Educational Resources Information Center
Hyde, Hartley
1986-01-01
The advantages of microcomputer networks include resource sharing, multiple user communications, and integrating data processing and office automation. This article nonetheless favors stand-alone computers for Australian secondary school classrooms because of unreliable hardware, software design, and copyright problems, and individual progress…
ERIC Educational Resources Information Center
Levy, Sue
1992-01-01
Presents a library automation vendor's perspective on networking based on activities at Ameritech Information Systems. Topics discussed include the information explosion; information technology; user expectations and needs; remote access; the library of the future; systems integration; ownership versus access; copyright laws; and the role of the…
The Semi-Planned LAN: Prototyping a Local Area Network.
ERIC Educational Resources Information Center
True, John F.; Rosenwald, Judah
1986-01-01
Five administrative user departments at San Francisco State University discovered that they had common requirements for office automation and data manipulation that could be addressed with microcomputers. The results of a local area network project are presented. (Author/MLW)
ERIC Educational Resources Information Center
Proceedings of the ASIS Mid-Year Meeting, 1992
1992-01-01
Lists the speakers and summarizes the issues addressed for 12 panel sessions on topics related to networking, including libraries and national networks, federal national resources and energy programs, multimedia issues, telecommuting, remote image serving, accessing the Internet, library automation, scientific information, applications of Z39.50,…
Machine learning of network metrics in ATLAS Distributed Data Management
NASA Astrophysics Data System (ADS)
Lassnig, Mario; Toler, Wesley; Vamosi, Ralf; Bogado, Joaquin; ATLAS Collaboration
2017-10-01
The increasing volume of physics data poses a critical challenge to the ATLAS experiment. In anticipation of high luminosity physics, automation of everyday data management tasks has become necessary. Previously many of these tasks required human decision-making and operation. Recent advances in hardware and software have made it possible to entrust more complicated duties to automated systems using models trained by machine learning algorithms. In this contribution we show results from one of our ongoing automation efforts that focuses on network metrics. First, we describe our machine learning framework built atop the ATLAS Analytics Platform. This framework can automatically extract and aggregate data, train models with various machine learning algorithms, and eventually score the resulting models and parameters. Second, we use these models to forecast metrics relevant for network-aware job scheduling and data brokering. We show the characteristics of the data and evaluate the forecasting accuracy of our models.
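As a toy version of the forecasting step, the sketch below fits a gradient-boosted regressor to lagged values of a synthetic throughput series; the real framework extracts its features from the ATLAS Analytics Platform rather than from a sine wave.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

# Toy stand-in for network-metric forecasting: predict a synthetic throughput
# series from its six most recent lagged values.
rng = np.random.default_rng(0)
series = 100 + 10 * np.sin(np.arange(500) / 24) + rng.normal(0, 2, 500)

lags = 6
X = np.column_stack([series[i:len(series) - lags + i] for i in range(lags)])
y = series[lags:]                                  # next value after each window

model = GradientBoostingRegressor().fit(X[:-50], y[:-50])
pred = model.predict(X[-50:])
print("holdout MAE:", np.abs(pred - y[-50:]).mean())
```

Scoring candidate models on held-out data, as in the last two lines, mirrors the scoring stage the framework description mentions.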
The Choice between MapMan and Gene Ontology for Automated Gene Function Prediction in Plant Science
Klie, Sebastian; Nikoloski, Zoran
2012-01-01
Since the introduction of the Gene Ontology (GO), the analysis of high-throughput data has become tightly coupled with the use of ontologies to establish associations between knowledge and data in an automated fashion. Ontologies provide a systematic description of knowledge by a controlled vocabulary of defined structure in which ontological concepts are connected by pre-defined relationships. In plant science, MapMan and GO offer two alternatives for ontology-driven analyses. Unlike GO, initially developed to characterize microbial systems, MapMan was specifically designed to cover plant-specific pathways and processes. While the dependencies between concepts in MapMan are modeled as a tree, in GO these are captured in a directed acyclic graph. Therefore, the difference in ontologies may cause discrepancies in data reduction, visualization, and hypothesis generation. Here we provide the first systematic comparative analysis of GO and MapMan for the case of the model plant species Arabidopsis thaliana (Arabidopsis) with respect to their structural properties and differences in distributions of information content. In addition, we investigate the effect of the two ontologies on the specificity and sensitivity of automated gene function prediction via the coupling of co-expression networks and the guilt-by-association principle. Automated gene function prediction is particularly needed for the model plant Arabidopsis, in which only half of the genes have been functionally annotated based on sequence similarity to known genes. The results highlight the need for structured representation of species-specific biological knowledge, and warrant caution in the design principles employed in future ontologies. PMID:22754563
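A bare-bones rendering of the guilt-by-association principle over a co-expression network: an unannotated gene inherits the label of its most strongly co-expressed annotated neighbor. The expression data and labels below are synthetic.

```python
import numpy as np

# Guilt-by-association on a toy co-expression network.
rng = np.random.default_rng(1)
expr = rng.normal(size=(6, 20))                 # 6 genes x 20 conditions
expr[1] = expr[0] + rng.normal(0, 0.1, 20)      # gene 1 tracks gene 0
labels = {0: "photosynthesis", 2: "defense", 3: "defense", 4: "transport"}

corr = np.corrcoef(expr)                        # co-expression similarity
gene = 1                                        # the unannotated query gene
ranked = np.argsort(-np.abs(corr[gene]))        # strongest co-expression first
neighbors = [g for g in ranked if g != gene and g in labels]
print("predicted function:", labels[neighbors[0]])   # "photosynthesis"
```

Whether the inherited label comes from a GO term or a MapMan bin is exactly where the choice of ontology shapes the specificity and sensitivity the study compares.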
An Automated Solar Synoptic Analysis Software System
NASA Astrophysics Data System (ADS)
Hong, S.; Lee, S.; Oh, S.; Kim, J.; Lee, J.; Kim, Y.; Lee, J.; Moon, Y.; Lee, D.
2012-12-01
We have developed an automated software system for identifying solar active regions, filament channels, and coronal holes, which are three major solar sources of space weather. Space weather forecasters at the NOAA Space Weather Prediction Center produce solar synoptic drawings on a daily basis to predict solar activities, i.e., solar flares, filament eruptions, high-speed solar wind streams, and co-rotating interaction regions, as well as their possible effects on the Earth. As an attempt to emulate this process in a fully automated and consistent way, we developed a software system named ASSA (Automated Solar Synoptic Analysis). When identifying solar active regions, ASSA uses high-resolution SDO HMI intensitygram and magnetogram images as inputs, providing the McIntosh classification and Mt. Wilson magnetic classification of each active region by applying image processing techniques such as thresholding, morphology extraction, and region growing. At the same time, it also extracts morphological and physical properties of active regions in a quantitative way for the short-term prediction of flares and CMEs. When identifying filament channels and coronal holes, images of the global H-alpha network and SDO AIA 193 are used for morphological identification, with SDO HMI magnetograms for quantitative verification. The output results of ASSA are routinely checked and validated against NOAA's daily SRS (Solar Region Summary) and UCOHO (URSIgram code for coronal hole information). A couple of preliminary scientific results are presented using available output results. ASSA will be deployed at the Korean Space Weather Center and will serve its customers in an operational status by the end of 2012.
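The thresholding and region-extraction step can be sketched with scipy.ndimage on a synthetic magnetogram-like array; the field values and the 500 G cutoff below are invented for illustration, not ASSA's actual parameters.

```python
import numpy as np
from scipy import ndimage

# Threshold-and-label sketch for picking out candidate active regions.
img = np.zeros((64, 64))
img[10:18, 12:22] = 900.0                      # fake strong positive patch
img[40:44, 40:45] = -750.0                     # fake strong negative patch

mask = np.abs(img) > 500.0                     # threshold on field strength
labeled, n = ndimage.label(mask)               # connected candidate regions
areas = ndimage.sum(mask, labeled, np.arange(1, n + 1))
print(n, areas)                                # 2 regions with pixel areas
```

In a full pipeline, each labeled component would then be measured and classified (e.g., toward McIntosh or Mt. Wilson classes), as the abstract describes.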
Automation and Robotics for Space-Based Systems, 1991
NASA Technical Reports Server (NTRS)
Williams, Robert L., II (Editor)
1992-01-01
The purpose of this in-house workshop was to assess the state-of-the-art of automation and robotics for space operations from an LaRC perspective and to identify areas of opportunity for future research. Over half of the presentations came from the Automation Technology Branch, covering telerobotic control, extravehicular activity (EVA) and intra-vehicular activity (IVA) robotics, hand controllers for teleoperation, sensors, neural networks, and automated structural assembly, all applied to space missions. Other talks covered the Remote Manipulator System (RMS) active damping augmentation, space crane work, modeling, simulation, and control of large, flexible space manipulators, and virtual passive controller designs for space robots.
Özdemir, Vural; Hekim, Nezih
2018-01-01
Driverless cars with artificial intelligence (AI) and automated supermarkets run by collaborative robots (cobots) working without human supervision have sparked off new debates: what will be the impacts of extreme automation, turbocharged by the Internet of Things (IoT), AI, and Industry 4.0, on Big Data and omics implementation science? The IoT builds on (1) broadband wireless internet connectivity, (2) miniaturized sensors embedded in animate and inanimate objects ranging from the house cat to the milk carton in your smart fridge, and (3) AI and cobots making sense of the Big Data collected by sensors. Industry 4.0 is a high-tech strategy for manufacturing automation that employs the IoT, thus creating the Smart Factory. Extreme automation until "everything is connected to everything else" poses, however, vulnerabilities that have been little considered to date. First, highly integrated systems are vulnerable to systemic risks such as total network collapse in the event of failure of one of their parts, for example, by hacking or Internet viruses that can fully invade integrated systems. Second, extreme connectivity creates new social and political power structures. If left unchecked, they might lead to authoritarian governance by one person in total control of network power, directly or through her/his connected surrogates. We propose Industry 5.0, which can democratize knowledge coproduction from Big Data, building on the new concept of symmetrical innovation. Industry 5.0 utilizes the IoT but differs from predecessor automation systems by having three-dimensional (3D) symmetry in innovation ecosystem design: (1) a built-in safe exit strategy in case of demise of hyperconnected entrenched digital knowledge networks. Importantly, such safe exits are orthogonal, in that they allow "digital detox" by employing pathways unrelated to, and unaffected by, automated networks, for example, electronic patient records versus material/article trails on vital medical information; (2) equal emphasis on both acceleration and deceleration of innovation if diminishing returns become apparent; and (3) next-generation social science and humanities (SSH) research for global governance of emerging technologies: "Post-ELSI Technology Evaluation Research" (PETER). Importantly, PETER considers the technology opportunity costs, ethics, ethics-of-ethics, framings (epistemology), independence, and reflexivity of SSH research in technology policymaking. Industry 5.0 is poised to harness extreme automation and Big Data with safety, innovative technology policy, and responsible implementation science, enabled by 3D symmetry in innovation ecosystem design.
Neural network approaches to capture temporal information
NASA Astrophysics Data System (ADS)
van Veelen, Martijn; Nijhuis, Jos; Spaanenburg, Ben
2000-05-01
The automated design and construction of neural networks is receiving growing attention from the neural networks community. Both the growing availability of computing power and the development of mathematical and probabilistic theory have had a profound impact on the design and modelling approaches for neural networks. This impact is most apparent in the application of neural networks to time series prediction. In this paper, we give our views on past, contemporary and future design and modelling approaches to neural forecasting.
Student beats the teacher: deep neural networks for lateral ventricles segmentation in brain MR
NASA Astrophysics Data System (ADS)
Ghafoorian, Mohsen; Teuwen, Jonas; Manniesing, Rashindra; Leeuw, Frank-Erik d.; van Ginneken, Bram; Karssemeijer, Nico; Platel, Bram
2018-03-01
Ventricular volume and its progression are known to be linked to several brain diseases such as dementia and schizophrenia. Accurate measurement of ventricle volume is therefore vital for longitudinal studies on these disorders, making automated ventricle segmentation algorithms desirable. In the past few years, deep neural networks have been shown to outperform classical models in many imaging domains. However, the success of deep networks depends on manually labeled data sets, which are expensive to acquire, especially for higher dimensional data in the medical domain. In this work, we show that deep neural networks can be trained on much cheaper-to-acquire pseudo-labels (e.g., generated by other, less accurate automated methods) and still produce segmentations more accurate than the labels they were trained on. To show this, we use noisy segmentation labels generated by a conventional region growing algorithm to train a deep network for lateral ventricle segmentation. Then, on a large manually annotated test set, we show that the network significantly outperforms the conventional region growing algorithm that was used to produce its training labels. Our experiments report a Dice Similarity Coefficient (DSC) of 0.874 for the trained network compared to 0.754 for the conventional region growing algorithm (p < 0.001).
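A minimal sketch of the reported comparison metric: the Dice Similarity Coefficient between each automated segmentation and the manual reference. The masks below are synthetic stand-ins, not the study's data.

```python
import numpy as np

def dice_coefficient(pred: np.ndarray, truth: np.ndarray) -> float:
    """Dice Similarity Coefficient between two binary masks."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    intersection = np.logical_and(pred, truth).sum()
    denom = pred.sum() + truth.sum()
    return 2.0 * intersection / denom if denom > 0 else 1.0

# Hypothetical masks: manual annotation vs. the two automated methods.
rng = np.random.default_rng(0)
manual = rng.random((64, 64)) > 0.5                       # stand-in for expert labels
region_growing = manual ^ (rng.random((64, 64)) > 0.90)   # noisier automated output
network = manual ^ (rng.random((64, 64)) > 0.97)          # output closer to manual

print("region growing DSC:", dice_coefficient(region_growing, manual))
print("trained network DSC:", dice_coefficient(network, manual))
```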
TOWARDS AN AUTOMATED TOOL FOR CHANNEL-NETWORK CHARACTERIZATIONS, MODELING, AND ASSESSMENT
Detailed characterization of channel networks for hydrologic and geomorphic models has traditionally been a difficult and expensive proposition, and lack of information has thus been a common limitation of modeling efforts. With the advent of datasets derived from high-resolutio...
TRANSMISSION NETWORK PLANNING METHOD FOR COMPARATIVE STUDIES (JOURNAL VERSION)
An automated transmission network planning method for comparative studies is presented. This method employs logical steps that may closely parallel those taken in practice by the planning engineers. Use is made of a sensitivity matrix to simulate the engineers' experience in sele...
Florida: Library Networking and Technology Development.
ERIC Educational Resources Information Center
Wilkins, Barratt, Ed.
1996-01-01
Explains the development of library networks in Florida and the role of the state library. Topics include regional multitype library consortia; a statewide bibliographic database; interlibrary loan; Internet access in public libraries; government information, including remote public access; automation projects; telecommunications; and free-nets.…
Saha, Sajib Kumar; Fernando, Basura; Cuadros, Jorge; Xiao, Di; Kanagasingam, Yogesan
2018-04-27
Fundus images obtained in a telemedicine program are acquired at different sites by people with varying levels of experience. This results in a relatively high percentage of images which are later marked as unreadable by graders. Unreadable images require a recapture, which is time and cost intensive. An automated method that determines image quality during acquisition is an effective alternative. To this end, we describe here an automated method for the assessment of image quality in the context of diabetic retinopathy (DR). The method applies machine learning techniques to assess each image and assign it to an 'accept' or 'reject' category; images in the 'reject' category require recapture. A deep convolutional neural network is trained to grade the images automatically. A large representative set of 7000 colour fundus images, obtained from EyePACS and made available by the California Healthcare Foundation, was used for the experiment. Three retinal image analysis experts categorised these images into 'accept' and 'reject' classes based on a precise definition of image quality in the context of DR. The network was trained using 3428 images. The method shows an accuracy of 100% in categorising 'accept' and 'reject' images, about 2% higher than the traditional machine learning method. In a clinical trial, the proposed method showed 97% agreement with a human grader. The method can be easily incorporated with the fundus image capturing system in the acquisition centre and can guide the photographer on whether a recapture is necessary.
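A compact illustration of the kind of binary accept/reject classifier described above. The architecture, sizes, and training batch here are invented placeholders, not the published network.

```python
import torch
from torch import nn

class QualityNet(nn.Module):
    """Toy accept/reject image-quality classifier (illustrative only)."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(32, 2)  # logits for accept / reject

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

model = QualityNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

# One illustrative training step on a dummy batch of fundus-sized crops.
images = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, 2, (8,))   # 0 = accept, 1 = reject
loss = loss_fn(model(images), labels)
optimizer.zero_grad(); loss.backward(); optimizer.step()
```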
Development of Automated Image Analysis Software for Suspended Marine Particle Classification
2003-09-30
The objective of this project is to develop automated image analysis software to reduce the effort and time required for manual identification of plankton images. (Scott Samson, Center for Ocean Technology.)
ERIC Educational Resources Information Center
Divilbiss, J. L., Ed.
To help the librarian in negotiating with vendors of automated library services, nine authors have presented methods of dealing with a specific service or situation. Paper topics include computer services, network contracts, innovative service, data processing, automated circulation, a turn-key system, data base sharing, online data base services,…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Katipamula, Srinivas; Gowri, Krishnan; Hernandez, George
This paper describes one such reference process that can be deployed to provide continuous automated condition-based maintenance management for buildings that have building information models (BIM), a building automation system (BAS) and a computerized maintenance management system (CMMS). The process can be deployed using an open source transactional network platform, VOLTTRON™, which is designed for distributed sensing and controls and supports both energy efficiency and grid services.
Automated Consultation for the Diagnosis of Interplanetary Telecommunications
NASA Technical Reports Server (NTRS)
Quan, A. G.; Schwuttke, U. M.; Herstein, J. S.; Spagnuolo, J. S.; Burleigh, S.
1995-01-01
SHARP (Spacecraft Health Automated Reasoning Program) is a knowledge-based system for the diagnosis of problems in NASA's Deep Space Network (DSN) telecommunications system. This system provides the means of communication between a spacecraft and operations personnel at Jet Propulsion Laboratory. SHARP analyzes problems that occur in both the on-board spacecraft telecom subsystem, and the DSN.
Sub-Network Kernels for Measuring Similarity of Brain Connectivity Networks in Disease Diagnosis.
Jie, Biao; Liu, Mingxia; Zhang, Daoqiang; Shen, Dinggang
2018-05-01
As a simple representation of interactions among distributed brain regions, brain networks have been widely applied to automated diagnosis of brain diseases, such as Alzheimer's disease (AD) and its early stage, i.e., mild cognitive impairment (MCI). In brain network analysis, a challenging task is how to measure the similarity between a pair of networks. Although many graph kernels (i.e., kernels defined on graphs) have been proposed for measuring the topological similarity of a pair of brain networks, most of them are defined on general graphs, thus ignoring the uniqueness of each node in brain networks. That is, each node in a brain network denotes a particular brain region, which is a specific characteristic of brain networks. Accordingly, in this paper, we construct a novel sub-network kernel for measuring the similarity between a pair of brain networks and then apply it to brain disease classification. Different from current graph kernels, our proposed sub-network kernel not only takes into account the inherent characteristic of brain networks, but also captures multi-level (from local to global) topological properties of nodes in brain networks, which are essential for defining the similarity measure of brain networks. To validate the efficacy of our method, we perform extensive experiments on subjects with baseline functional magnetic resonance imaging data obtained from the Alzheimer's disease neuroimaging initiative database. Experimental results demonstrate that the proposed method outperforms several state-of-the-art graph-based methods in MCI classification.
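The paper's exact kernel construction is not reproduced here, but a simplified node-aware similarity in the same spirit, where nodes are matched by region identity and compared over multi-hop neighbourhoods, might look like the following sketch (atlas size and data are hypothetical).

```python
import numpy as np

def subnetwork_kernel(A1, A2, hops=2):
    """Node-aware similarity between two brain networks (adjacency matrices).

    Unlike generic graph kernels, nodes are matched by identity: index v is
    the same anatomical region in both networks. For each node we compare its
    h-hop reachability patterns, capturing local-to-global topology."""
    n = A1.shape[0]
    k = 0.0
    for h in range(1, hops + 1):
        R1 = (np.linalg.matrix_power(A1, h) > 0).astype(float)
        R2 = (np.linalg.matrix_power(A2, h) > 0).astype(float)
        for v in range(n):
            denom = np.linalg.norm(R1[v]) * np.linalg.norm(R2[v])
            if denom > 0:  # cosine similarity of node v's neighbourhood vectors
                k += float(R1[v] @ R2[v]) / denom
    return k / (n * hops)

# Hypothetical connectivity matrices for 90 regions (an AAL-sized atlas).
rng = np.random.default_rng(0)
A = np.triu((rng.random((90, 90)) > 0.9).astype(float), 1)
A = A + A.T                                  # symmetric, no self-loops
B = A.copy(); B[3, 7] = B[7, 3] = 1.0        # slightly perturbed network
print(subnetwork_kernel(A, A), subnetwork_kernel(A, B))
```

A kernel (Gram) matrix of such similarities over all subject pairs could then be passed to a precomputed-kernel SVM for MCI classification.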
Automated Depression Analysis Using Convolutional Neural Networks from Speech.
He, Lang; Cao, Cui
2018-05-28
To help clinicians efficiently diagnose the severity of a person's depression, the affective computing community and the artificial intelligence field have shown a growing interest in designing automated systems. Speech features carry useful information for the diagnosis of depression. However, manual design and domain knowledge are still required for feature selection, which makes the process labor-intensive and subjective. In recent years, deep-learned features based on neural networks have shown superior performance to hand-crafted features in various areas. In this paper, to overcome the difficulties mentioned above, we propose a combination of hand-crafted and deep-learned features which can effectively measure the severity of depression from speech. In the proposed method, Deep Convolutional Neural Networks (DCNN) are first built to learn deep features from spectrograms and raw speech waveforms. We then manually extract state-of-the-art texture descriptors, median robust extended local binary patterns (MRELBP), from spectrograms. To capture the complementary information within the hand-crafted and deep-learned features, we propose joint fine-tuning layers that combine the raw and spectrogram DCNNs to boost depression recognition performance. Moreover, to address the problem of small samples, a data augmentation method is proposed. Experiments conducted on the AVEC2013 and AVEC2014 depression databases show that our approach is robust and effective for the diagnosis of depression when compared to state-of-the-art audio-based methods.
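A schematic of the fusion idea, assuming a CNN branch for spectrograms and a dense branch for MRELBP-style texture vectors; all layer sizes are illustrative and not taken from the paper.

```python
import torch
from torch import nn

class FusionNet(nn.Module):
    """Combine a deep-learned spectrogram embedding with hand-crafted
    texture features through a joint head (sizes are illustrative)."""
    def __init__(self, n_handcrafted=128):
        super().__init__()
        self.cnn = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(4),
            nn.Flatten(), nn.Linear(16 * 4 * 4, 64), nn.ReLU(),
        )
        self.dense = nn.Sequential(nn.Linear(n_handcrafted, 64), nn.ReLU())
        self.head = nn.Linear(128, 1)   # regressed severity score

    def forward(self, spec, handcrafted):
        return self.head(torch.cat([self.cnn(spec), self.dense(handcrafted)], dim=1))

model = FusionNet()
spec = torch.randn(8, 1, 128, 128)   # spectrogram batch
feats = torch.randn(8, 128)          # MRELBP-style texture features
print(model(spec, feats).shape)      # -> torch.Size([8, 1])
```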
Del Campo, Antonio; Cintioni, Lorenzo; Spinsante, Susanna; Gambi, Ennio
2017-01-01
With the introduction of low-power wireless technologies, like Bluetooth Low Energy (BLE), new applications are approaching the home automation, healthcare, fitness, automotive and consumer electronics markets. BLE devices are designed to maximize the battery life, i.e., to run for long time on a single coin-cell battery. In typical application scenarios of home automation and Ambient Assisted Living (AAL), the sensors that monitor relatively unpredictable and rare events should coexist with other sensors that continuously communicate health or environmental parameter measurements. The former usually work in connectionless mode, acting as advertisers, while the latter need a persistent connection, acting as slave nodes. The coexistence of connectionless and connection-oriented networks, that share the same central node, can be required to reduce the number of handling devices, thus keeping the network complexity low and limiting the packet’s traffic congestion. In this paper, the medium access management, operated by the central node, has been modeled, focusing on the scheduling procedure in both connectionless and connection-oriented communication. The models have been merged to provide a tool supporting the configuration design of BLE devices, during the network design phase that precedes the real implementation. The results highlight the suitability of the proposed tool: the ability to set the device parameters to allow us to keep a practical discovery latency for event-driven sensors and avoid undesired overlaps between scheduled scanning and connection phases due to bad management performed by the central node. PMID:28387724
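A toy version of the scheduling question the model addresses: checking whether a central's periodic scan windows collide with its periodic connection events. All timing parameters are hypothetical, not values from the paper.

```python
from math import gcd

# Illustrative check for overlap between a BLE central's periodic scan
# windows (connectionless discovery) and its periodic connection events.
# All timing parameters are hypothetical values in milliseconds.
SCAN_INTERVAL, SCAN_WINDOW = 100, 30    # scanning schedule
CONN_INTERVAL, CONN_EVENT = 80, 10      # connection-oriented schedule

def overlaps(horizon_ms):
    """Return (t_scan, t_conn) start-time pairs whose intervals collide."""
    hits = []
    t_scan = 0
    while t_scan < horizon_ms:
        t_conn = 0
        while t_conn < horizon_ms:
            # [a, a+la) and [b, b+lb) overlap iff a < b+lb and b < a+la
            if t_scan < t_conn + CONN_EVENT and t_conn < t_scan + SCAN_WINDOW:
                hits.append((t_scan, t_conn))
            t_conn += CONN_INTERVAL
        t_scan += SCAN_INTERVAL
    return hits

# One hyper-period of the two schedules suffices to enumerate all collisions.
horizon = SCAN_INTERVAL * CONN_INTERVAL // gcd(SCAN_INTERVAL, CONN_INTERVAL)
print(f"{len(overlaps(horizon))} scan/connection collisions per {horizon} ms")
```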
Hanrahan, Lawrence P.; Anderson, Henry A.; Busby, Brian; Bekkedal, Marni; Sieger, Thomas; Stephenson, Laura; Knobeloch, Lynda; Werner, Mark; Imm, Pamela; Olson, Joseph
2004-01-01
In this article we describe the development of an information system for environmental childhood cancer surveillance. The Wisconsin Cancer Registry annually receives more than 25,000 incident case reports. Approximately 269 cases per year involve children. Over time, there has been considerable community interest in understanding the role the environment plays as a cause of these cancer cases. Wisconsin’s Public Health Information Network (WI-PHIN) is a robust web portal integrating both Health Alert Network and National Electronic Disease Surveillance System components. WI-PHIN is the information technology platform for all public health surveillance programs. Functions include the secure, automated exchange of cancer case data between public health–based and hospital-based cancer registrars; web-based supplemental data entry for environmental exposure confirmation and hypothesis testing; automated data analysis, visualization, and exposure–outcome record linkage; directories of public health and clinical personnel for role-based access control of sensitive surveillance information; public health information dissemination and alerting; and information technology security and critical infrastructure protection. For hypothesis generation, cancer case data are sent electronically to WI-PHIN and populate the integrated data repository. Environmental data are linked and the exposure–disease relationships are explored using statistical tools for ecologic exposure risk assessment. For hypothesis testing, case–control interviews collect exposure histories, including parental employment and residential histories. This information technology approach can thus serve as the basis for building a comprehensive system to assess environmental cancer etiology. PMID:15471739
Scalable Probabilistic Inference for Global Seismic Monitoring
NASA Astrophysics Data System (ADS)
Arora, N. S.; Dear, T.; Russell, S.
2011-12-01
We describe a probabilistic generative model for seismic events, their transmission through the earth, and their detection (or mis-detection) at seismic stations. We also describe an inference algorithm that constructs the most probable event bulletin explaining the observed set of detections. The model and inference are called NET-VISA (network processing vertically integrated seismic analysis) and are designed to replace the current automated network processing at the IDC, the SEL3 bulletin. Our results demonstrate that NET-VISA significantly outperforms SEL3 by reducing the missed events from 30.3% down to 12.5%. The difference is even more dramatic for smaller magnitude events. NET-VISA has no difficulty in locating nuclear explosions as well; its predicted location for the second DPRK event was compared against other bulletins. Further evaluation on dense regional networks demonstrates that NET-VISA finds many events missed in the LEB bulletin, which is produced by the human analysts. Large aftershock sequences, as produced by the 2004 December Sumatra earthquake and the 2011 March Tohoku earthquake, can pose a significant load for automated processing, often delaying the IDC bulletins by weeks or months. Indeed, these sequences can overload the serial NET-VISA inference as well. We describe an enhancement to NET-VISA to make it multi-threaded, and hence take full advantage of the processing power of multi-core and multi-CPU machines. Our experiments show that the new inference algorithm is able to achieve 80% efficiency in parallel speedup.
Bass, Ellen J; Baumgart, Leigh A; Shepley, Kathryn Klein
2013-03-01
Displaying both the strategy that information analysis automation employs to make its judgments and variability in the task environment may improve human judgment performance, especially in cases where this variability impacts the judgment performance of the information analysis automation. This work investigated the contribution of providing either information analysis automation strategy information, task environment information, or both, on human judgment performance in a domain where noisy sensor data are used by both the human and the information analysis automation to make judgments. In a simplified air traffic conflict prediction experiment, 32 participants made probability of horizontal conflict judgments under different display content conditions. After being exposed to the information analysis automation, judgment achievement significantly improved for all participants as compared to judgments without any of the automation's information. Participants provided with additional display content pertaining to cue variability in the task environment had significantly higher aided judgment achievement compared to those provided with only the automation's judgment of a probability of conflict. When designing information analysis automation for environments where the automation's judgment achievement is impacted by noisy environmental data, it may be beneficial to show additional task environment information to the human judge in order to improve judgment performance.
Multichannel Convolutional Neural Network for Biological Relation Extraction.
Quan, Chanqin; Hua, Lei; Sun, Xiao; Bai, Wenjun
2016-01-01
The plethora of biomedical relations embedded in medical logs (records) demands researchers' attention. Previous theoretical and practical work focused on traditional machine learning techniques. However, these methods are susceptible to the issues of "vocabulary gap" and data sparseness, and do not allow feature extraction to be automated. To address these issues, in this work we propose a multichannel convolutional neural network (MCCNN) for automated biomedical relation extraction. The proposed model makes two contributions: (1) it enables the fusion of multiple (e.g., five) versions of word embeddings; (2) it obviates the need for manual feature engineering through automated feature learning with a convolutional neural network (CNN). We evaluated our model on two biomedical relation extraction tasks: drug-drug interaction (DDI) extraction and protein-protein interaction (PPI) extraction. For the DDI task, our system achieved an overall F-score of 70.2% compared to a standard linear SVM-based system (67.0%) on the DDIExtraction 2013 challenge dataset. For the PPI task, we evaluated our system on the Aimed and BioInfer PPI corpora; our system exceeded the state-of-the-art ensemble SVM system by 2.7% and 5.6% in F-score.
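A minimal sketch of the multichannel idea, where each embedding version occupies one input channel of a 2D convolution; dimensions are illustrative, not the paper's configuration.

```python
import torch
from torch import nn

class MCCNN(nn.Module):
    """Multichannel CNN: the same sentence under several word-embedding
    versions, stacked like the colour channels of an image."""
    def __init__(self, n_channels=5, emb_dim=100, n_filters=64, n_classes=2):
        super().__init__()
        # the 2D conv slides over a token window x the full embedding dimension
        self.conv = nn.Conv2d(n_channels, n_filters, kernel_size=(3, emb_dim))
        self.fc = nn.Linear(n_filters, n_classes)

    def forward(self, x):             # x: (batch, channels, seq_len, emb_dim)
        h = torch.relu(self.conv(x))  # -> (batch, filters, seq_len - 2, 1)
        h = h.squeeze(3).max(dim=2).values   # max-over-time pooling
        return self.fc(h)

model = MCCNN()
sentences = torch.randn(4, 5, 40, 100)  # 4 sentences, 5 embedding versions
print(model(sentences).shape)           # -> torch.Size([4, 2])
```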
A Neural-Network-Based Semi-Automated Geospatial Classification Tool
NASA Astrophysics Data System (ADS)
Hale, R. G.; Herzfeld, U. C.
2014-12-01
North America's largest glacier system, the Bering Bagley Glacier System (BBGS) in Alaska, surged in 2011-2013, as shown by rapid mass transfer, elevation change, and heavy crevassing. Little is known about the physics controlling surge glaciers' semi-cyclic patterns; therefore, it is crucial to collect and analyze as much data as possible so that predictive models can be made. In addition, physical signs frozen in ice in the form of crevasses may help serve as a warning for future surges. The BBGS surge provided an opportunity to develop an automated tool for crevasse classification based on imagery collected from small aircraft. The classification allows one to link image classification to geophysical processes associated with ice deformation. The tool uses an approach that employs geostatistical functions and a feed-forward perceptron with error back-propagation. The connectionist-geostatistical approach uses directional experimental (discrete) variograms to parameterize images into a form that the Neural Network (NN) can recognize. In an application analyzing airborne videographic data from the surge of the BBGS, an NN was able to distinguish 18 different crevasse classes with 95 percent or higher accuracy across over 3,000 images. Recognizing that each surge wave results in different crevasse types and that environmental conditions affect the appearance in imagery, we designed the tool's semi-automated pre-training algorithm to be adaptable. The tool can be optimized to specific settings and variables of image analysis (airborne and satellite imagery, different camera types, observation altitude, number and types of classes, and resolution). The generalization of the classification tool brings three important advantages: (1) multiple types of problems in geophysics can be studied, (2) the training process is sufficiently formalized to allow non-experts in neural nets to perform the training process, and (3) the time required to manually pre-sort imagery into classes is greatly reduced.
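A sketch of the connectionist-geostatistical parameterization: directional experimental variograms computed from image patches and fed to a feed-forward network. The data, directions, and sizes below are invented stand-ins.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

def directional_variogram(img, dy, dx, max_lag=8):
    """Experimental variogram gamma(h) = 0.5 * mean[(z(x+h) - z(x))^2]
    along direction (dy, dx), for integer lags h = 1..max_lag."""
    n_rows, n_cols = img.shape
    gam = []
    for h in range(1, max_lag + 1):
        sy, sx = h * dy, h * dx
        z0 = img[: n_rows - sy, : n_cols - sx]
        z1 = img[sy:, sx:]
        gam.append(0.5 * np.mean((z1 - z0) ** 2))
    return np.array(gam)

def parameterize(img):
    """Stack variograms from several directions into one feature vector."""
    directions = [(0, 1), (1, 0), (1, 1)]  # across-, along-, diagonal-track
    return np.concatenate([directional_variogram(img, dy, dx) for dy, dx in directions])

# Hypothetical training data: image patches with known crevasse classes.
rng = np.random.default_rng(1)
patches = rng.random((200, 64, 64))
labels = rng.integers(0, 18, 200)   # 18 crevasse classes, as in the paper
X = np.array([parameterize(p) for p in patches])
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500).fit(X, labels)
```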
Chow, Cheryl-Emiliane T; Kim, Diane Y; Sachdeva, Rohan; Caron, David A; Fuhrman, Jed A
2014-01-01
Characterizing ecological relationships between viruses, bacteria and protists in the ocean is critical to understanding ecosystem function, yet these relationships are infrequently investigated together. We evaluated these relationships through microbial association network analysis of samples collected approximately monthly from March 2008 to January 2011 in the surface ocean (0–5 m) at the San Pedro Ocean Time series station. Bacterial, T4-like myoviral and protistan communities were described by Automated Ribosomal Intergenic Spacer Analysis and terminal restriction fragment length polymorphism of the gene encoding the major capsid protein (g23) and 18S ribosomal DNA, respectively. Concurrent shifts in community structure suggested similar timing of responses to environmental and biological parameters. We linked T4-like myoviral, bacterial and protistan operational taxonomic units by local similarity correlations, which were then visualized as association networks. Network links (correlations) potentially represent synergistic and antagonistic relationships such as viral lysis, grazing, competition or other interactions. We found that virus–bacteria relationships were more cross-linked than protist–bacteria relationships, suggestive of increased taxonomic specificity in virus–bacteria relationships. We also found that 80% of bacterial–protist and 74% of bacterial–viral correlations were positive, with the latter suggesting that at monthly and seasonal timescales, viruses may be following their hosts more often than controlling host abundance. PMID:24196323
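A simplified stand-in for the association-network construction, using rank correlations in place of local similarity analysis; the OTU series here are random placeholders.

```python
import numpy as np
import networkx as nx
from scipy.stats import spearmanr

# Build an association network from pairwise rank correlations of monthly
# abundance time series. OTU names and data are hypothetical.
rng = np.random.default_rng(2)
months = 34                        # ~monthly sampling, Mar 2008 - Jan 2011
otus = {f"OTU_{i}": rng.random(months) for i in range(30)}

G = nx.Graph()
names = list(otus)
for i, a in enumerate(names):
    for b in names[i + 1:]:
        rho, p = spearmanr(otus[a], otus[b])
        if p < 0.01:               # keep only strong, significant associations
            G.add_edge(a, b, weight=rho, sign="+" if rho > 0 else "-")

positive = sum(1 for _, _, d in G.edges(data=True) if d["sign"] == "+")
total = G.number_of_edges()
if total:
    print(f"{positive}/{total} associations positive")
```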
Performance of wavelet analysis and neural networks for pathological voices identification
NASA Astrophysics Data System (ADS)
Salhi, Lotfi; Talbi, Mourad; Abid, Sabeur; Cherif, Adnane
2011-09-01
Within the medical environment, diverse techniques exist to assess the state of the voice of the patient. The inspection technique is inconvenient for a number of reasons, such as its high cost, the duration of the inspection, and above all, the fact that it is an invasive technique. This study focuses on a robust, rapid and accurate system for automatic identification of pathological voices. This system employs a non-invasive, inexpensive and fully automated method based on a hybrid approach: wavelet transform analysis and a neural network classifier. First, we present the results obtained in our previous study using classic feature parameters. These results allow visual identification of pathological voices. Second, quantified parameters derived from the wavelet analysis are proposed to characterise the speech sample. In addition, a system of multilayer neural networks (MNNs) has been developed which carries out the automatic detection of pathological voices. The developed method was evaluated using a voice database composed of recorded voice samples (continuous speech) from normophonic and dysphonic speakers. The dysphonic speakers were patients of the RABTA National Hospital of Tunis, Tunisia, and a University Hospital in Brussels, Belgium. Experimental results indicate a success rate ranging between 75% and 98.61% for discrimination of normal and pathological voices using the proposed parameters and neural network classifier. We also compared the average classification rate based on the MNN, Gaussian mixture model and support vector machines.
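A minimal sketch of the hybrid pipeline: wavelet sub-band energies as quantified parameters, classified with a multilayer network. The corpus below is synthetic, and the wavelet settings are illustrative rather than the paper's.

```python
import numpy as np
import pywt
from sklearn.neural_network import MLPClassifier

def wavelet_features(signal, wavelet="db4", level=5):
    """Relative energy of each wavelet sub-band: a compact, quantified
    descriptor of a voice sample (a stand-in for the paper's parameters)."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    energies = np.array([np.sum(c ** 2) for c in coeffs])
    return energies / energies.sum()

# Hypothetical corpus: 1-second samples at 16 kHz, labelled normal/pathological.
rng = np.random.default_rng(3)
voices = rng.standard_normal((100, 16000))
labels = rng.integers(0, 2, 100)
X = np.array([wavelet_features(v) for v in voices])
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500).fit(X, labels)
```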
A Survey on Virtualization of Wireless Sensor Networks
Islam, Md. Motaharul; Hassan, Mohammad Mehedi; Lee, Ga-Won; Huh, Eui-Nam
2012-01-01
Wireless Sensor Networks (WSNs) are gaining tremendous importance thanks to their broad range of commercial applications such as smart home automation, health-care and industrial automation. In these applications multi-vendor and heterogeneous sensor nodes are deployed. Due to strict administrative control over the specific WSN domains, communication barriers, conflicting goals and the economic interests of different WSN sensor node vendors, it is difficult to introduce a large scale federated WSN. By allowing heterogeneous sensor nodes in WSNs to coexist on a shared physical sensor substrate, virtualization in sensor networks may provide flexibility, cost effective solutions, promote diversity, ensure security and increase manageability. This paper surveys the novel approach of using the large scale federated WSN resources in a sensor virtualization environment. Our focus in this paper is to introduce a few design goals, the challenges and opportunities of research in the field of sensor network virtualization as well as to illustrate the current status of research in this field. This paper also presents a wide array of state-of-the-art projects related to sensor network virtualization. PMID:22438759
Botsis, Taxiarchis; Foster, Matthew; Arya, Nina; Kreimeyer, Kory; Pandey, Abhishek; Arya, Deepa
2017-04-26
To evaluate the feasibility of automated dose and adverse event information retrieval in supporting the identification of safety patterns, we extracted all rabbit Anti-Thymocyte Globulin (rATG) reports submitted to the United States Food and Drug Administration Adverse Event Reporting System (FAERS) from the product's initial licensure on April 16, 1984 through February 8, 2016. We processed the narratives using the Medication Extraction (MedEx) and the Event-based Text-mining of Health Electronic Records (ETHER) systems and retrieved the appropriate medication, clinical, and temporal information. When necessary, the extracted information was manually curated. This process resulted in a high-quality dataset that was analyzed with the Pattern-based and Advanced Network Analyzer for Clinical Evaluation and Assessment (PANACEA) to explore the association of rATG dosing with post-transplant lymphoproliferative disorder (PTLD). Although manual curation was necessary to improve the data quality, MedEx and ETHER supported the extraction of the appropriate information. We created a final dataset of 1,380 cases with complete information for rATG dosing and date of administration. Analysis in PANACEA found that PTLD was associated with cumulative doses of rATG >8 mg/kg, even in periods where most of the submissions to FAERS reported low doses of rATG. We demonstrated the feasibility of investigating a dose-related safety pattern for a particular product in FAERS using a set of automated tools.
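An illustrative post-processing step on a curated dataset of this kind, flagging cases whose cumulative rATG dose exceeds the 8 mg/kg level associated with PTLD; the case records here are invented.

```python
# Flag cases whose cumulative dose exceeds a safety threshold (hypothetical data).
THRESHOLD_MG_PER_KG = 8.0

cases = [
    {"id": "A", "doses_mg_per_kg": [1.5, 1.5, 1.5, 1.5]},       # 6.0 total
    {"id": "B", "doses_mg_per_kg": [2.0, 2.0, 2.0, 2.0, 2.0]},  # 10.0 total
]
for case in cases:
    total = sum(case["doses_mg_per_kg"])
    case["high_dose"] = total > THRESHOLD_MG_PER_KG
    print(case["id"], total, "FLAG" if case["high_dose"] else "ok")
```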
Automated detection of geological landforms on Mars using Convolutional Neural Networks
NASA Astrophysics Data System (ADS)
Palafox, Leon F.; Hamilton, Christopher W.; Scheidt, Stephen P.; Alvarez, Alexander M.
2017-04-01
The large volume of high-resolution images acquired by the Mars Reconnaissance Orbiter has opened a new frontier for developing automated approaches to detecting landforms on the surface of Mars. However, most landform classifiers focus on crater detection, which represents only one of many geological landforms of scientific interest. In this work, we use Convolutional Neural Networks (ConvNets) to detect both volcanic rootless cones and transverse aeolian ridges. Our system, named MarsNet, consists of five networks, each of which is trained to detect landforms of different sizes. We compare our detection algorithm with a widely used method for image recognition, Support Vector Machines (SVMs) using Histogram of Oriented Gradients (HOG) features. We show that ConvNets can detect a wide range of landforms and has better accuracy and recall in testing data than traditional classifiers based on SVMs.
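For reference, the baseline being compared against could be assembled roughly as follows, with HOG descriptors feeding a linear SVM; the tiles and labels are hypothetical stand-ins for the orbiter imagery.

```python
import numpy as np
from skimage.feature import hog
from sklearn.svm import SVC

# HOG + SVM baseline: 1 = landform present, 0 = background (synthetic data).
rng = np.random.default_rng(7)
patches = rng.random((100, 64, 64))
labels = rng.integers(0, 2, 100)

X = np.array([hog(p, pixels_per_cell=(8, 8), cells_per_block=(2, 2))
              for p in patches])
clf = SVC(kernel="linear").fit(X, labels)
```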
Inferring gene and protein interactions using PubMed citations and consensus Bayesian networks.
Deeter, Anthony; Dalman, Mark; Haddad, Joseph; Duan, Zhong-Hui
2017-01-01
The PubMed database offers an extensive set of publication data that can be useful, yet inherently complex to use without automated computational techniques. Data repositories such as the Genomic Data Commons (GDC) and the Gene Expression Omnibus (GEO) offer experimental data storage and retrieval as well as curated gene expression profiles. Genetic interaction databases, including Reactome and Ingenuity Pathway Analysis, offer pathway and experiment data analysis using data curated from these publications and data repositories. We have created a method to generate and analyze consensus networks, inferring potential gene interactions, using large numbers of Bayesian networks generated by data mining publications in the PubMed database. Through the concept of network resolution, these consensus networks can be tailored to represent possible genetic interactions. We designed a set of experiments to confirm that our method is stable across variation in both sample and topological input sizes. Using gene product interactions from the KEGG pathway database and data mining PubMed publication abstracts, we verify that regardless of the network resolution or the inferred consensus network, our method is capable of inferring meaningful gene interactions through consensus Bayesian network generation with multiple, randomized topological orderings. Our method can not only confirm the existence of currently accepted interactions, but has the potential to hypothesize new ones as well. We show our method confirms the existence of known gene interactions such as JAK-STAT-PI3K-AKT-mTOR, infers novel gene interactions such as RAS-Bcl-2 and RAS-AKT, and found significant pathway-pathway interactions between the JAK-STAT signaling and Cardiac Muscle Contraction KEGG pathways.
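A minimal sketch of consensus-network construction: edges are counted across an ensemble of individually learned networks and kept above a frequency threshold, which plays the role of the network resolution. Gene names and structures below are illustrative, not the paper's outputs.

```python
from collections import Counter

def consensus(networks, resolution=0.5):
    """Keep edges appearing in at least `resolution` of the learned networks.

    networks: iterable of edge sets, e.g. {("JAK1", "STAT3"), ...}."""
    counts = Counter(edge for net in networks for edge in net)
    n = len(networks)
    return {edge for edge, c in counts.items() if c / n >= resolution}

# Hypothetical ensemble of three structures learned over the same genes.
nets = [
    {("JAK1", "STAT3"), ("PIK3CA", "AKT1")},
    {("JAK1", "STAT3"), ("AKT1", "MTOR")},
    {("JAK1", "STAT3"), ("PIK3CA", "AKT1"), ("AKT1", "MTOR")},
]
print(consensus(nets, resolution=2 / 3))   # all three edges pass at 2/3
```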
Szyrkowiec, Thomas; Autenrieth, Achim; Gunning, Paul; Wright, Paul; Lord, Andrew; Elbers, Jörg-Peter; Lumb, Alan
2014-02-10
For the first time, we demonstrate the orchestration of elastic datacenter and inter-datacenter transport network resources using a combination of OpenStack and OpenFlow. Programmatic control allows a datacenter operator to dynamically request optical lightpaths from a transport network operator to accommodate rapid changes of inter-datacenter workflows.
NASA Astrophysics Data System (ADS)
Nelson, B. R.; Prat, O. P.; Stevens, S. E.; Seo, D. J.; Zhang, J.; Howard, K.
2014-12-01
The processing of radar-only precipitation via the reanalysis from the National Mosaic and Multi-Sensor QPE (NMQ/Q2), based on the WSR-88D Next-generation Radar (NEXRAD) network over the Continental United States (CONUS), is nearly completed for the period covering 2001 to 2012. Reanalysis data are available at 1-km and 5-minute resolution. An important step in generating the best possible precipitation data is to assess the bias in the radar-only product. In this work, we use data from a combination of rain gauge networks to assess the bias in the NMQ reanalysis. Rain gauge networks such as the Hydrometeorological Automated Data System (HADS), the Automated Surface Observing Systems (ASOS), the Climate Reference Network (CRN), and the Global Historical Climatology Network Daily (GHCN-D) are combined for use in the assessment. These rain gauge networks vary in spatial density and temporal resolution; the challenge is hence to utilize them optimally to assess the bias at the finest resolution possible. For initial assessment, we propose to subset the CONUS data into climatologically representative domains and to perform bias assessment using information in the Q2 dataset on precipitation type and phase.
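A minimal illustration of the bias assessment itself, comparing matched gauge and radar accumulations; the values are made up for the sketch.

```python
import numpy as np

# Multiplicative and additive bias of radar-only QPE against collocated
# rain gauge accumulations (synthetic example values, in millimetres).
gauge = np.array([2.1, 0.0, 5.3, 12.8, 0.4])   # gauge accumulations (mm)
radar = np.array([1.7, 0.1, 4.6, 10.9, 0.5])   # matched radar estimates (mm)

multiplicative_bias = radar.sum() / gauge.sum()   # <1 means radar underestimates
mean_error = np.mean(radar - gauge)               # additive bias (mm)
print(f"bias ratio: {multiplicative_bias:.2f}, mean error: {mean_error:.2f} mm")
```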
NASA Astrophysics Data System (ADS)
Green, H. D.; Contractor, N. S.; Yao, Y.
2006-12-01
A knowledge network is a multi-dimensional network created from the interactions and interconnections among the scientists, documents, data, analytic tools, and interactive collaboration spaces (like forums and wikis) associated with a collaborative environment. CI-KNOW is a suite of software tools that leverages automated data collection, social network theories, analysis techniques and algorithms to infer an individual's interests and expertise based on their interactions and activities within a knowledge network. The CI-KNOW recommender system mines the knowledge network associated with a scientific community's use of cyberinfrastructure tools and uses relational metadata to record connections among entities in the knowledge network. Recent developments in social network theories and methods provide the backbone for a modular system that creates recommendations from relational metadata. A network navigation portlet allows users to locate colleagues, documents, data or analytic tools in the knowledge network and to explore their networks through a visual, step-wise process. An internal auditing portlet offers administrators diagnostics to assess the growth and health of the entire knowledge network. The first instantiation of the prototype CI-KNOW system is part of the Environmental Cyberinfrastructure Demonstration project at the National Center for Supercomputing Applications, which supports the activities of hydrologic and environmental science communities (CLEANER and CUAHSI) under the umbrella of the WATERS network environmental observatory planning activities (http://cleaner.ncsa.uiuc.edu). This poster summarizes the key aspects of the CI-KNOW system, highlighting the key inputs, calculation mechanisms, and output modalities.
Technologies for unattended network operations
NASA Technical Reports Server (NTRS)
Jaworski, Allan; Odubiyi, Jide; Holdridge, Mark; Zuzek, John
1991-01-01
The necessary network management functions for a telecommunications, navigation and information management (TNIM) system in the framework of an extension of the ISO model for communications network management are described. Various technologies that could substantially reduce the need for TNIM network management, automate manpower intensive functions, and deal with synchronization and control at interplanetary distances are presented. Specific technologies addressed include the use of the ISO Common Management Interface Protocol, distributed artificial intelligence for network synchronization and fault management, and fault-tolerant systems engineering.
School Library Media Centers in a Statewide Network.
ERIC Educational Resources Information Center
Fox, Carol
1990-01-01
Description of library services in Illinois focuses on school libraries and youth services. Topics discussed include multitype library systems; automation; youth services consultants; data collection for youth services; resource sharing for schools; promotion of reading and library programs; communications networks; and standards and certification…
DOT National Transportation Integrated Search
1980-11-01
The purpose of the Application Area Definition task is to define the travel demands and guideway networks for a set of representative AGT system deployments. These demands and networks, when combined with detailed descriptions of the systems and thei...
Suggestions for Library Network Design.
ERIC Educational Resources Information Center
Salton, Gerald
1979-01-01
Various approaches to the design of automatic library systems are described, suggestions for the design of rational and effective automated library processes are posed, and an attempt is made to assess the importance and effect of library network systems on library operations and library effectiveness. (Author/CWM)
C4I Community of Interest C2 Roadmap
2015-03-24
Topics include QoS-based services (digital policy-based prioritization, dynamic bandwidth allocation, automated network management); communications needs including co-site mitigation, LPD/LPI communications, increased range, increased loss tolerance and recovery, and mobile ad hoc networking; and technology areas spanning algorithms and software, systems and processes, networks and communications, radios and apertures, and information.
Pulse Coupled Neural Networks for the Segmentation of Magnetic Resonance Brain Images.
1996-12-01
This research develops an automated method for segmenting Magnetic Resonance (MR) brain images based on Pulse Coupled Neural Networks (PCNN). (Thesis, Shane Lee Abrahamson, First Lieutenant, USAF, AFIT/GCS/ENG/96D-01.)
Strategic Studies Quarterly. Volume 6, Number 3. Fall 2012
2012-01-01
…a particular TOR site, the number of Iranian users on the network drops precipitously. It picks up again after TOR developers announce a workaround… benefit, hence value, of the network is then proportional to the area under the curve, or the natural log of N (ln N). The increase (decrease) in network… necessary sensors and automation to strengthen and defend network operations at the scale required for a global industry or military operations.
Time frequency analysis for automated sleep stage identification in fullterm and preterm neonates.
Fraiwan, Luay; Lweesy, Khaldon; Khasawneh, Natheer; Fraiwan, Mohammad; Wenz, Heinrich; Dickhaus, Hartmut
2011-08-01
This work presents a new methodology for automated sleep stage identification in neonates based on the time frequency distribution of a single electroencephalogram (EEG) recording and artificial neural networks (ANN). Wigner-Ville distribution (WVD), Hilbert-Huang spectrum (HHS) and continuous wavelet transform (CWT) time frequency distributions were used to represent the EEG signal, from which features were extracted using time frequency entropy. The classification of features was done using a feed-forward back-propagation ANN. The system was trained and tested using data taken from neonates of post-conceptual age of 40 weeks for both preterm (14 recordings) and fullterm (15 recordings). The identification of sleep stages was successfully implemented, and the classification based on the WVD outperformed the approaches based on CWT and HHS. The accuracy and kappa coefficient were found to be 0.84 and 0.65 respectively for the fullterm neonates' recordings and 0.74 and 0.50 respectively for preterm neonates' recordings.
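A sketch of the feature pipeline, using a plain spectrogram in place of the WVD/HHS/CWT distributions and Shannon entropy per time segment; all data below are synthetic.

```python
import numpy as np
from scipy.signal import spectrogram
from sklearn.neural_network import MLPClassifier

def tf_entropy(eeg, fs=256, n_segments=8):
    """Time-frequency entropy features: Shannon entropy of the normalized
    time-frequency distribution, computed per time segment. (A plain
    spectrogram stands in here for the WVD/HHS/CWT of the paper.)"""
    _, _, S = spectrogram(eeg, fs=fs)
    feats = []
    for seg in np.array_split(S, n_segments, axis=1):
        p = seg.ravel()
        p = p / p.sum()
        feats.append(-np.sum(p * np.log2(p + 1e-12)))
    return np.array(feats)

# Hypothetical 30-s EEG epochs labelled with sleep stages (e.g., 0-3).
rng = np.random.default_rng(4)
epochs = rng.standard_normal((120, 30 * 256))
stages = rng.integers(0, 4, 120)
X = np.array([tf_entropy(e) for e in epochs])
clf = MLPClassifier(hidden_layer_sizes=(20,), max_iter=500).fit(X, stages)
```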
Automated classification of multiphoton microscopy images of ovarian tissue using deep learning.
Huttunen, Mikko J; Hassan, Abdurahman; McCloskey, Curtis W; Fasih, Sijyl; Upham, Jeremy; Vanderhyden, Barbara C; Boyd, Robert W; Murugkar, Sangeeta
2018-06-01
Histopathological image analysis of stained tissue slides is routinely used in tumor detection and classification. However, diagnosis requires a highly trained pathologist and can thus be time-consuming, labor-intensive, and potentially subject to bias. Here, we demonstrate a potential complementary approach for diagnosis. We show that multiphoton microscopy images from unstained, reproductive tissues can be robustly classified using deep learning techniques. We fine-tune four pretrained convolutional neural networks using over 200 murine tissue images based on combined second-harmonic generation and two-photon excitation fluorescence contrast, to classify the tissues either as healthy or associated with high-grade serous carcinoma with over 95% sensitivity and 97% specificity. Our approach shows promise for applications involving automated disease diagnosis. It could also be readily applied to other tissues, diseases, and related classification problems.
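A sketch of the fine-tuning recipe under stated assumptions (torchvision >= 0.13 weights API; pretrained weights download on first use); the chosen backbone and hyperparameters are illustrative, not the paper's.

```python
import torch
from torch import nn
from torchvision import models

# Start from an ImageNet-pretrained CNN, replace the final layer with a
# 2-class head (healthy vs. carcinoma-associated), train at a low rate.
model = models.resnet18(weights="IMAGENET1K_V1")
model.fc = nn.Linear(model.fc.in_features, 2)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-5)  # gentle fine-tuning
loss_fn = nn.CrossEntropyLoss()

# One illustrative step on a dummy batch of multiphoton image tiles.
images = torch.randn(4, 3, 224, 224)   # SHG/TPEF contrast mapped to 3 channels
labels = torch.randint(0, 2, (4,))
loss = loss_fn(model(images), labels)
optimizer.zero_grad(); loss.backward(); optimizer.step()
```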
Object localization in handheld thermal images for fireground understanding
NASA Astrophysics Data System (ADS)
Vandecasteele, Florian; Merci, Bart; Jalalvand, Azarakhsh; Verstockt, Steven
2017-05-01
Despite the broad application of handheld thermal imaging cameras in firefighting, their usage is mostly limited to subjective interpretation by the person carrying the device. As remedies to overcome this limitation, object localization and classification mechanisms could assist fireground understanding and help with automated localization, characterization and spatio-temporal (spreading) analysis of the fire. An automated understanding of thermal images can enrich conventional knowledge-based firefighting techniques by providing information from data- and sensing-driven approaches. In this work, transfer learning is applied to multi-label convolutional neural network architectures for object localization and recognition in monocular visual, infrared and multispectral dynamic images. Furthermore, the possibility of analyzing fire scene images is studied and current limitations are discussed. Finally, the understanding of room configuration (i.e., object locations) for indoor localization in reduced-visibility environments and the linking with Building Information Models (BIM) are investigated.
Bjornsson, Christopher S; Lin, Gang; Al-Kofahi, Yousef; Narayanaswamy, Arunachalam; Smith, Karen L; Shain, William; Roysam, Badrinath
2009-01-01
Brain structural complexity has confounded prior efforts to extract quantitative image-based measurements. We present a systematic ‘divide and conquer’ methodology for analyzing three-dimensional (3D) multi-parameter images of brain tissue to delineate and classify key structures, and compute quantitative associations among them. To demonstrate the method, thick (~100 μm) slices of rat brain tissue were labeled using 3 – 5 fluorescent signals, and imaged using spectral confocal microscopy and unmixing algorithms. Automated 3D segmentation and tracing algorithms were used to delineate cell nuclei, vasculature, and cell processes. From these segmentations, a set of 23 intrinsic and 8 associative image-based measurements was computed for each cell. These features were used to classify astrocytes, microglia, neurons, and endothelial cells. Associations among cells and between cells and vasculature were computed and represented as graphical networks to enable further analysis. The automated results were validated using a graphical interface that permits investigator inspection and corrective editing of each cell in 3D. Nuclear counting accuracy was >89%, and cell classification accuracy ranged from 81–92% depending on cell type. We present a software system named FARSIGHT implementing our methodology. Its output is a detailed XML file containing measurements that may be used for diverse quantitative hypothesis-driven and exploratory studies of the central nervous system. PMID:18294697
Automated podosome identification and characterization in fluorescence microscopy images.
Meddens, Marjolein B M; Rieger, Bernd; Figdor, Carl G; Cambi, Alessandra; van den Dries, Koen
2013-02-01
Podosomes are cellular adhesion structures involved in matrix degradation and invasion that comprise an actin core and a ring of cytoskeletal adaptor proteins. They are most often identified by staining with phalloidin, which binds F-actin and therefore visualizes the core. However, not only podosomes, but also many other cytoskeletal structures contain actin, which makes podosome segmentation by automated image processing difficult. Here, we have developed a quantitative image analysis algorithm that is optimized to identify podosome cores within a typical sample stained with phalloidin. By sequential local and global thresholding, our analysis identifies up to 76% of podosome cores excluding other F-actin-based structures. Based on the overlap in podosome identifications and quantification of podosome numbers, our algorithm performs equally well compared to three experts. Using our algorithm we show effects of actin polymerization and myosin II inhibition on the actin intensity in both podosome core and associated actin network. Furthermore, by expanding the core segmentations, we reveal a previously unappreciated differential distribution of cytoskeletal adaptor proteins within the podosome ring. These applications illustrate that our algorithm is a valuable tool for rapid and accurate large-scale analysis of podosomes to increase our understanding of these characteristic adhesion structures.
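A simplified sketch of the sequential local-and-global thresholding idea using scikit-image primitives; the parameter values and the synthetic image are illustrative, not the published ones.

```python
import numpy as np
from skimage.filters import threshold_otsu, threshold_local

def segment_cores(actin: np.ndarray) -> np.ndarray:
    """Sequential local and global thresholding of a phalloidin (F-actin)
    channel: a pixel is kept as podosome core only if it passes both tests."""
    global_mask = actin > threshold_otsu(actin)    # bright relative to image
    local_mask = actin > threshold_local(actin, block_size=51)  # local peak
    return np.logical_and(global_mask, local_mask)

# Hypothetical image: dim F-actin background with a few bright core-like blobs.
rng = np.random.default_rng(5)
img = rng.random((256, 256)) * 0.3
yy, xx = np.mgrid[:256, :256]
for cy, cx in [(60, 80), (150, 200), (200, 60)]:
    img += np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / 50.0)
print("core pixels:", segment_cores(img).sum())
```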
NASA Astrophysics Data System (ADS)
Billonnet, Laurent; Dumas, Jean-Michel; Desbordes, Emmanuel; Lapôtre, Bertrand
To address the problems faced by elderly and disabled people in a rural environment, the district of Guéret (department of Creuse, France) has set up the "Home Automation and Health Pole". In association with the University of Limoges, this structure is based on the use of e-technologies together with home automation techniques. In this framework, many international collaboration attempts have been initiated through a BSc diploma program. This paper sums up these different collaborations and directions.
Portable data collection terminal in the automated power consumption measurement system
NASA Astrophysics Data System (ADS)
Vologdin, S. V.; Shushkov, I. D.; Bysygin, E. K.
2018-01-01
Increasing the efficiency and automation of electric energy data collection and processing is very important at present. The high cost of classic electric energy billing systems prevents their mass application. The Udmurtenergo Branch of IDGC of Center and Volga Region developed an electronic automated system called "Mobile Energy Billing" based on data collection terminals. The system joins electronic components using a service-oriented architecture built on WCF services. At present, all parts of the Udmurtenergo Branch electric network are connected to the "Mobile Energy Billing" project. The system's capabilities can be expanded thanks to its flexible architecture.
2005-09-01
This research explores the need for a high-throughput, high-speed network for use in a network-centric wartime environment and how commercial … Automated Digital Network System (ADNS).
Self-reconfigurable ship fluid-network modeling for simulation-based design
NASA Astrophysics Data System (ADS)
Moon, Kyungjin
Our world is filled with large-scale engineering systems, which provide various services and conveniences in our daily life. A distinctive trend in the development of today's large-scale engineering systems is the extensive and aggressive adoption of automation and autonomy, which enable significant improvement of systems' robustness, efficiency, and performance with considerably reduced manning and maintenance costs; the U.S. Navy's DD(X), the next-generation destroyer program, is considered an extreme example of such a trend. This thesis pursues a modeling solution for performing simulation-based analysis in the conceptual or preliminary design stage of an intelligent, self-reconfigurable ship fluid system, which is one of the concepts of DD(X) engineering plant development. Through investigation of the Navy's approach to designing a more survivable ship system, it was found that the current naval simulation-based analysis environment is limited by capability gaps in damage modeling, dynamic model reconfiguration, and simulation speed of the domain-specific models, especially fluid network models. As enablers for filling these gaps, two essential elements were identified in the formulation of the modeling method. The first is the graph-based topological modeling method, employed for rapid model reconstruction and damage modeling; the second is the recurrent neural network-based, component-level surrogate modeling method, used to improve the affordability and efficiency of the modeling and simulation (M&S) computations. The integration of the two methods can deliver computationally efficient, flexible, and automation-friendly M&S, creating an environment for more rigorous damage analysis and exploration of design alternatives. As a demonstration and evaluation of the developed method, a simulation model of a notional ship fluid system was created and a damage analysis was performed. Next, models representing different design configurations of the fluid system were created, and damage analyses were performed with them in order to find an optimal design configuration for system survivability. Finally, the benefits and drawbacks of the developed method are discussed based on the results of the demonstration.
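A minimal sketch of the graph-based topological side of such a model: a fluid network represented as a graph, with damage applied by node removal and survivability read off as the service loads still reachable from a pump. The layout and names are invented for illustration.

```python
import networkx as nx

# Pumps feed service loads through piping junctions; a damage scenario
# removes components, and we check which loads remain served.
G = nx.Graph()
G.add_edges_from([
    ("pump_fwd", "junction_1"), ("pump_aft", "junction_2"),
    ("junction_1", "junction_2"),                 # cross-connect for redundancy
    ("junction_1", "load_chiller"), ("junction_2", "load_firemain"),
])

def served_loads(graph, pumps, loads):
    """Loads reachable from at least one pump in the (possibly damaged) graph."""
    return {l for l in loads
            if any(p in graph and l in graph and nx.has_path(graph, p, l)
                   for p in pumps)}

pumps, loads = {"pump_fwd", "pump_aft"}, {"load_chiller", "load_firemain"}
print("intact:", served_loads(G, pumps, loads))

damaged = G.copy()
damaged.remove_node("junction_1")                 # simulated hit
print("damaged:", served_loads(damaged, pumps, loads))
```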
Devine, Emily Beth; Capurro, Daniel; van Eaton, Erik; Alfonso-Cristancho, Rafael; Devlin, Allison; Yanez, N. David; Yetisgen-Yildiz, Meliha; Flum, David R.; Tarczy-Hornoch, Peter
2013-01-01
Background: The field of clinical research informatics includes creation of clinical data repositories (CDRs) used to conduct quality improvement (QI) activities and comparative effectiveness research (CER). Ideally, CDR data are accurately and directly abstracted from disparate electronic health records (EHRs) across diverse health systems. Objective: Investigators from Washington State's Surgical Care Outcomes and Assessment Program (SCOAP) Comparative Effectiveness Research Translation Network (CERTAIN) are creating such a CDR. This manuscript describes the automation and validation methods used to create this digital infrastructure. Methods: SCOAP is a QI benchmarking initiative. Data are manually abstracted from EHRs and entered into a data management system. CERTAIN investigators are now deploying Caradigm's Amalga™ tool to facilitate automated abstraction of data from multiple, disparate EHRs. Concordance is calculated to compare automatically abstracted data to manually abstracted data. Performance measures are calculated between Amalga and each parent EHR. Validation takes place in repeated loops, with improvements made over time. When automated abstraction reaches the current benchmark for abstraction accuracy - 95% - it will 'go live' at each site. Progress to Date: A technical analysis was completed at 14 sites. Five sites are contributing; the remaining sites prioritized meeting Meaningful Use criteria. Participating sites are contributing 15-18 unique data feeds, totaling 13 surgical registry use cases. Common feeds are registration, laboratory, transcription/dictation, radiology, and medications. Approximately 50% of 1,320 designated data elements are being automatically abstracted - 25% from structured data and 25% from text mining. Conclusion: In semi-automating data abstraction and conducting a rigorous validation, CERTAIN investigators will semi-automate data collection to conduct QI and CER, while advancing the Learning Healthcare System. PMID:25848565
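A minimal sketch of the concordance check against the 95% go-live benchmark; the case and field names are invented.

```python
# Field-level concordance between automated (Amalga-style) and manual
# abstraction; all records below are hypothetical.
manual = {"case_1": {"asa_class": "3", "op_time_min": "142"},
          "case_2": {"asa_class": "2", "op_time_min": "88"}}
automated = {"case_1": {"asa_class": "3", "op_time_min": "140"},
             "case_2": {"asa_class": "2", "op_time_min": "88"}}

matches = total = 0
for case, fields in manual.items():
    for field, value in fields.items():
        total += 1
        matches += automated.get(case, {}).get(field) == value

concordance = 100.0 * matches / total
print(f"concordance: {concordance:.1f}% (go-live benchmark: 95%)")
```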
Classifying machinery condition using oil samples and binary logistic regression
NASA Astrophysics Data System (ADS)
Phillips, J.; Cripps, E.; Lau, John W.; Hodkiewicz, M. R.
2015-08-01
The era of big data has resulted in an explosion of condition monitoring information. The result is an increasing motivation to automate the costly and time-consuming human elements involved in the classification of machine health. When working with industry it is important to build an understanding and hence some trust in the classification scheme for those who use the analysis to initiate maintenance tasks. Typical "black box" approaches such as artificial neural networks (ANN) and support vector machines (SVM) are difficult to interpret. In contrast, this paper argues that logistic regression offers easy interpretability to industry experts, providing insight into the drivers of the human classification process and into the ramifications of potential misclassification. Of course, accuracy is of foremost importance in any automated classification scheme, so we also provide a comparative study based on the predictive performance of logistic regression, ANN and SVM. A real-world oil analysis data set from engines on mining trucks is presented, and using cross-validation we demonstrate that logistic regression outperforms the ANN and SVM approaches in terms of prediction for healthy/not healthy engines.
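The interpretability argument lends itself to a short illustration: a logistic regression whose fitted coefficients read directly as odds ratios, compared by cross-validated accuracy as in the paper. A minimal sketch with synthetic stand-ins for oil-analysis features (not the authors' data or code):

```python
# Logistic regression for machine-health classification: interpretable
# coefficients plus cross-validated accuracy. Features are hypothetical
# stand-ins (e.g. wear-metal ppm, viscosity), not the paper's data set.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))  # synthetic oil-analysis features
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=200) > 0).astype(int)  # 1 = not healthy

model = LogisticRegression()
acc = cross_val_score(model, X, y, cv=5, scoring="accuracy")
print(f"5-fold CV accuracy: {acc.mean():.2f}")

# Interpretability: exponentiated coefficients are odds ratios per unit feature.
model.fit(X, y)
print("odds ratios:", np.exp(model.coef_).round(2))
```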
Sharing feelings online: studying emotional well-being via automated text analysis of Facebook posts
Settanni, Michele; Marengo, Davide
2015-01-01
Digital traces of activity on social network sites represent a vast source of ecological data with potential connections with individual behavioral and psychological characteristics. The present study investigates the relationship between user-generated textual content shared on Facebook and emotional well-being. Self-report measures of depression, anxiety, and stress were collected from 201 adult Facebook users from North Italy. Emotion-related textual indicators, including emoticon use, were extracted from users' Facebook posts via automated text analysis. Correlation analyses revealed that individuals with higher levels of depression and anxiety expressed negative emotions on Facebook more frequently. In addition, use of emoticons expressing positive emotions correlated negatively with stress level. When comparing age groups, younger users reported higher frequency of both emotion-related words and emoticon use in their posts. Also, the relationship between online emotional expression and self-reported emotional well-being was generally stronger in the younger group. Overall, findings support the feasibility and validity of studying individual emotional well-being by means of examination of Facebook profiles. Implications for online screening purposes and future research directions are discussed. PMID:26257692
Automated MeSH indexing of the World-Wide Web.
Fowler, J.; Kouramajian, V.; Maram, S.; Devadhar, V.
1995-01-01
To facilitate networked discovery and information retrieval in the biomedical domain, we have designed a system for automatic assignment of Medical Subject Headings to documents retrieved from the World-Wide Web. Our prototype implementations show significant promise. We describe our methods and discuss the further development of a completely automated indexing tool called the "Web-MeSH Medibot." PMID:8563421
Implementation of Cyberinfrastructure and Data Management Workflow for a Large-Scale Sensor Network
NASA Astrophysics Data System (ADS)
Jones, A. S.; Horsburgh, J. S.
2014-12-01
Monitoring with in situ environmental sensors and other forms of field-based observation presents many challenges for data management, particularly for large-scale networks consisting of multiple sites, sensors, and personnel. The availability and utility of these data in addressing scientific questions rely on effective cyberinfrastructure that facilitates transformation of raw sensor data into functional data products. They also depend on the ability of researchers to share and access the data in usable formats. In addition to addressing the challenges presented by the quantity of data, monitoring networks need practices to ensure high data quality, including procedures and tools for post processing. Data quality is further enhanced if practitioners are able to track equipment, deployments, calibrations, and other events related to site maintenance and associate these details with observational data. In this presentation we will describe the overall workflow that we have developed for research groups and sites conducting long term monitoring using in situ sensors. Features of the workflow include: software tools to automate the transfer of data from field sites to databases, a Python-based program for data quality control post-processing, a web-based application for online discovery and visualization of data, and a data model and web interface for managing physical infrastructure. By automating the data management workflow, the time from collection to analysis is reduced and sharing and publication are facilitated. The incorporation of metadata standards and descriptions and the use of open-source tools enhance the sustainability and reusability of the data. We will describe the workflow and tools that we have developed in the context of the iUTAH (innovative Urban Transitions and Aridregion Hydrosustainability) monitoring network. The iUTAH network consists of aquatic and climate sensors deployed in three watersheds to monitor Gradients Along Mountain to Urban Transitions (GAMUT). The variety of environmental sensors and the multi-watershed, multi-institutional nature of the network necessitate a well-planned and efficient workflow for acquiring, managing, and sharing sensor data, which should be useful for similar large-scale and long-term networks.
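As an illustrative sketch of the kind of quality-control post-processing the workflow describes (not the iUTAH code itself), two common checks, a range test and a spike test, might look like this in Python:

```python
# Two basic QC checks for a sensor time series: out-of-range values and
# abrupt jumps. Thresholds and the example record are hypothetical.
import numpy as np
import pandas as pd

def qc_series(s: pd.Series, lo: float, hi: float, max_step: float) -> pd.DataFrame:
    """Flag out-of-range values and abrupt jumps in a sensor time series."""
    flags = pd.DataFrame(index=s.index)
    flags["out_of_range"] = (s < lo) | (s > hi)
    flags["spike"] = s.diff().abs() > max_step
    return flags

# Hypothetical water-temperature record at 15-minute intervals, with one
# spike and one sentinel value (-9999) that both should be flagged.
idx = pd.date_range("2014-07-01", periods=8, freq="15min")
temp = pd.Series([14.2, 14.3, 14.1, 35.0, 14.2, 14.4, -9999.0, 14.3], index=idx)
print(qc_series(temp, lo=-5, hi=30, max_step=5))
```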
NASA Astrophysics Data System (ADS)
Sansivero, Fabio; Vilardo, Giuseppe; Caputo, Teresa
2017-04-01
The permanent thermal infrared surveillance network of Osservatorio Vesuviano (INGV) is composed of 6 stations which acquire IR frames of fumarole fields in the Campi Flegrei caldera and inside the Vesuvius crater (Italy). The IR frames are uploaded to a dedicated server in the Surveillance Center of Osservatorio Vesuviano in order to process the infrared data and to extract all the information they contain. In a first phase, the infrared data are processed by an automated system (A.S.I.R.A. Acq - Automated System of IR Analysis and Acquisition) developed in the Matlab environment with a user-friendly graphical user interface (GUI). ASIRA daily generates time-series of residual temperature values of the maximum temperatures observed in the IR scenes after the removal of seasonal effects. These time-series are displayed in the Surveillance Room of Osservatorio Vesuviano and provide information about the evolution of the shallow temperature field of the observed areas. In particular, the features of ASIRA Acq include: a) efficient quality selection of IR scenes, b) co-registration of IR images with respect to a reference frame, c) seasonal correction using a background-removal methodology, and d) filing of IR matrices and of the processed data in shared archives accessible to interrogation. The daily archived records can also be processed by ASIRA Plot (Matlab code with GUI) to visualize IR data time-series and to help in evaluating input parameters for further data processing and analysis. Additional processing features are accomplished in a second phase by ASIRA Tools, Matlab code with a GUI developed to extract further information from the dataset in an automated way. The main functions of ASIRA Tools are: a) the analysis of temperature variations of each pixel of the IR frame in a given time interval, b) the removal of seasonal effects from the temperature of every pixel in the IR frames by using an analytic approach (removal of the sinusoidal long-term seasonal component using a polynomial fit Matlab function - LTFC_SCOREF), c) the export of data in different raster formats (e.g. Surfer grd). An interesting example of elaborations of the data produced by ASIRA Tools is the map of the temperature changing rate, which provides remarkable information about the potential migration of fumarole activity. The high efficiency of Matlab in processing matrix data from IR scenes and the flexibility of this code-developing tool proved to be very useful for producing applications for use in volcanic surveillance aimed at monitoring the evolution of the surface temperature field in diffuse degassing volcanic areas.
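The seasonal correction described for ASIRA Tools, removing a long-term sinusoidal component from each temperature series, can be sketched as a fit-and-subtract step. The model, data, and parameters below are illustrative assumptions, not the Matlab LTFC_SCOREF routine:

```python
# Fit an annual sinusoid plus a slow linear trend to a daily maximum-
# temperature series, then subtract it to obtain a de-seasoned residual
# suitable for surveillance. Synthetic data stand in for IR measurements.
import numpy as np
from scipy.optimize import curve_fit

def seasonal_model(t, a, phi, c0, c1):
    # annual sinusoid + linear trend (t in days)
    return a * np.sin(2 * np.pi * t / 365.25 + phi) + c0 + c1 * t

t = np.arange(730.0)  # two years of daily samples
rng = np.random.default_rng(1)
temps = 8 * np.sin(2 * np.pi * t / 365.25) + 60 + 0.01 * t + rng.normal(0, 0.5, t.size)

popt, _ = curve_fit(seasonal_model, t, temps, p0=[5, 0, 60, 0])
residual = temps - seasonal_model(t, *popt)  # residual temperature series
print(f"residual std: {residual.std():.2f} degC")
```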
Data-driven automated acoustic analysis of human infant vocalizations using neural network tools.
Warlaumont, Anne S; Oller, D Kimbrough; Buder, Eugene H; Dale, Rick; Kozma, Robert
2010-04-01
Acoustic analysis of infant vocalizations has typically employed traditional acoustic measures drawn from adult speech acoustics, such as f(0), duration, formant frequencies, amplitude, and pitch perturbation. Here an alternative and complementary method is proposed in which data-derived spectrographic features are central. 1-s-long spectrograms of vocalizations produced by six infants recorded longitudinally between ages 3 and 11 months are analyzed using a neural network consisting of a self-organizing map and a single-layer perceptron. The self-organizing map acquires a set of holistic, data-derived spectrographic receptive fields. The single-layer perceptron receives self-organizing map activations as input and is trained to classify utterances into prelinguistic phonatory categories (squeal, vocant, or growl), identify the ages at which they were produced, and identify the individuals who produced them. Classification performance was significantly better than chance for all three classification tasks. Performance is compared to another popular architecture, the fully supervised multilayer perceptron. In addition, the network's weights and patterns of activation are explored from several angles, for example, through traditional acoustic measurements of the network's receptive fields. Results support the use of this and related tools for deriving holistic acoustic features directly from infant vocalization data and for the automatic classification of infant vocalizations.
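A structural sketch of the two-stage architecture, a self-organizing map feeding a single-layer perceptron, is shown below using the third-party minisom package; the 64-dimensional vectors and labels are synthetic stand-ins for the spectrogram data, so only the plumbing (not the reported results) is demonstrated:

```python
# Two-stage pipeline: a self-organizing map (SOM) learns receptive fields,
# and a single-layer perceptron classifies SOM activation maps. Requires
# the third-party `minisom` package; inputs are synthetic.
import numpy as np
from minisom import MiniSom
from sklearn.linear_model import Perceptron

rng = np.random.default_rng(4)
X = rng.random((300, 64))                 # stand-ins for 1-s spectrogram patches
y = np.digitize(X[:, 0], [0.33, 0.66])    # toy squeal/vocant/growl labels

som = MiniSom(8, 8, 64, sigma=1.0, learning_rate=0.5, random_seed=0)
som.train_random(X, 2000)

# Activation map: (negative) distance of each input to every SOM unit.
units = som.get_weights().reshape(-1, 64)
A = -np.linalg.norm(X[:, None, :] - units[None, :, :], axis=2)

clf = Perceptron(random_state=0).fit(A, y)
print(f"training accuracy: {clf.score(A, y):.2f}")
```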
Distributed Interplanetary Delay/Disruption Tolerant Network (DTN) Monitor and Control System
NASA Technical Reports Server (NTRS)
Wang, Shin-Ywan
2012-01-01
The Distributed Interplanetary Delay/Disruption Tolerant Network (DTN) Monitor and Control System, JPL's DTN network-management implementation, provides methods and tools that monitor DTN operation status and detect and resolve DTN operation failures in an automated fashion when a space network or a heterogeneous network is infused with DTN capability. In this paper, "DTN Monitor and Control system in Deep Space Network (DSN)" exemplifies how the DTN Monitor and Control system can be adapted to a space network as it becomes DTN enabled.
Peirone, Laura S.; Pereyra Irujo, Gustavo A.; Bolton, Alejandro; Erreguerena, Ignacio; Aguirrezábal, Luis A. N.
2018-01-01
Conventional field phenotyping for tolerance to drought, the most important factor limiting yield at a global scale, is labor-intensive and time-consuming. Automated greenhouse platforms can increase the precision and throughput of plant phenotyping and contribute to a faster release of drought tolerant varieties. The aim of this work was to establish a framework of analysis to identify early traits which could be efficiently measured in a greenhouse automated phenotyping platform, for predicting the drought tolerance of field grown soybean genotypes. A group of genotypes was evaluated, which showed variation in their drought susceptibility index (DSI) for final biomass and leaf area. A large number of traits were measured before and after the onset of a water deficit treatment, which were analyzed under several criteria: the significance of the regression with the DSI, phenotyping cost, earliness, and repeatability. The most efficient trait was found to be transpiration efficiency measured at 13 days after emergence. This trait was further tested in a second experiment with different water deficit intensities, and validated using a different set of genotypes against field data from a trial network in a third experiment. The framework applied in this work for assessing traits under different criteria could be helpful for selecting those most efficient for automated phenotyping. PMID:29774042
Protocols for Automated Protist Analysis
2011-12-01
Report No. CG-D-14-13, December 2011. United States Coast Guard Research & Development Center, 1 Chelsea Street, New London, CT 06320. Distribution Statement A: Approved for public release; distribution is unlimited.
Xiang, Kun; Li, Yinglei; Ford, William; Land, Walker; Schaffer, J David; Congdon, Robert; Zhang, Jing; Sadik, Omowunmi
2016-02-21
We hereby report the design and implementation of an Autonomous Microbial Cell Culture and Classification (AMC(3)) system for rapid detection of food pathogens. Traditional food testing methods require multistep procedures and long incubation periods, and are thus prone to human error. AMC(3) introduces a "one click approach" to the detection and classification of pathogenic bacteria. Once the cultured materials are prepared, all operations are automatic. AMC(3) is an integrated sensor array platform in a microbial fuel cell system composed of a multi-potentiostat, an automated data collection system (Python program, Yocto Maxi-coupler electromechanical relay module) and a powerful classification program. The classification scheme consists of Probabilistic Neural Network (PNN), Support Vector Machines (SVM) and General Regression Neural Network (GRNN) oracle-based system. Differential Pulse Voltammetry (DPV) is performed on standard samples or unknown samples. Then, using preset feature extractions and quality control, accepted data are analyzed by the intelligent classification system. In a typical use, thirty-two extracted features were analyzed to correctly classify the following pathogens: Escherichia coli ATCC#25922, Escherichia coli ATCC#11775, and Staphylococcus epidermidis ATCC#12228. An accuracy of 85.4% was recorded for unknown samples, within a shorter time period than the industry standard of 24 hours.
Automated Detection and Modeling of Slow Slip: Case Study of the Cascadia Subduction Zone
NASA Astrophysics Data System (ADS)
Crowell, B. W.; Bock, Y.; Liu, Z.
2012-12-01
The discovery of transient slow slip events over the past decade has changed our understanding of tectonic hazards and the earthquake cycle. Proper geodetic characterization of transient deformation is necessary for studies of regional interseismic, coseismic and postseismic tectonics, and miscalculations can affect our understanding of the regional stress field. We utilize two different methods to create a complete record of slow slip from continuous GPS stations in the Cascadia subduction zone between 1996 and 2012: spatiotemporal principal component analysis (PCA) and the relative strength index (RSI). The PCA is performed on 100 day windows of nearby stations to locate signals that exist across many stations in the network by looking at the ratio of the first two eigenvalues. The RSI is a financial momentum oscillator that looks for changes in individual time series with respect to previous epochs to locate rapid changes, indicative of transient deformation. Using both methods, we create a complete history of slow slip across the Cascadia subduction zone, fully characterizing the timing, progression, and magnitude of events. We inject the results from the automated transient detection into a time-dependent slip inversion and apply a Kalman filter based network inversion method to image the spatiotemporal variation of slip transients along the Cascadia margin.
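The relative strength index the authors borrow from finance has a standard definition that is easy to state in code; here it is applied to a synthetic displacement series with a transient reversal (the window length, threshold behavior, and data are illustrative, not the study's implementation):

```python
# Relative strength index (RSI) applied to a GPS displacement component:
# sustained reversals against the interseismic trend drive RSI low.
import numpy as np

def rsi(x: np.ndarray, window: int = 14) -> np.ndarray:
    """Classic RSI: 100 - 100/(1 + avg_gain/avg_loss) over a sliding window."""
    d = np.diff(x)
    gains = np.where(d > 0, d, 0.0)
    losses = np.where(d < 0, -d, 0.0)
    out = np.full(x.size, np.nan)
    for i in range(window, d.size + 1):
        g = gains[i - window:i].mean()
        l = losses[i - window:i].mean()
        out[i] = 100.0 if l == 0 else 100.0 - 100.0 / (1.0 + g / l)
    return out

# Synthetic east-component series: steady trend with a slow-slip-like reversal.
t = np.arange(200.0)
disp = 0.01 * t + np.where((t > 100) & (t < 120), -0.05 * (t - 100), 0)
print(f"minimum RSI: {np.nanmin(rsi(disp)):.1f}")  # low RSI flags the transient
```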
Narula, Sukrit; Shameer, Khader; Salem Omar, Alaa Mabrouk; Dudley, Joel T; Sengupta, Partho P
2016-11-29
Machine-learning models may aid cardiac phenotypic recognition by using features of cardiac tissue deformation. This study investigated the diagnostic value of a machine-learning framework that incorporates speckle-tracking echocardiographic data for automated discrimination of hypertrophic cardiomyopathy (HCM) from physiological hypertrophy seen in athletes (ATH). Expert-annotated speckle-tracking echocardiographic datasets obtained from 77 ATH and 62 HCM patients were used for developing an automated system. An ensemble machine-learning model with 3 different machine-learning algorithms (support vector machines, random forests, and artificial neural networks) was developed and a majority voting method was used for conclusive predictions with further K-fold cross-validation. Feature selection using an information gain (IG) algorithm revealed that volume was the best predictor for differentiating between HCM and ATH (IG = 0.24) followed by mid-left ventricular segmental (IG = 0.134) and average longitudinal strain (IG = 0.131). The ensemble machine-learning model showed increased sensitivity and specificity compared with early-to-late diastolic transmitral velocity ratio (p < 0.01), average early diastolic tissue velocity (e') (p < 0.01), and strain (p = 0.04). Because ATH were younger, adjusted analysis was undertaken in younger HCM patients and compared with ATH with left ventricular wall thickness >13 mm. In this subgroup analysis, the automated model continued to show equal sensitivity, but increased specificity relative to early-to-late diastolic transmitral velocity ratio, e', and strain. Our results suggested that machine-learning algorithms can assist in the discrimination of physiological versus pathological patterns of hypertrophic remodeling. This effort represents a step toward the development of a real-time, machine-learning-based system for automated interpretation of echocardiographic images, which may help novice readers with limited experience. Copyright © 2016 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.
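The ensemble described, three learners combined by majority vote, maps directly onto scikit-learn primitives. A hedged sketch with synthetic stand-ins for the speckle-tracking features (not the study's data or code):

```python
# Majority-voting ensemble of SVM, random forest, and a small neural network,
# evaluated with K-fold cross-validation. Features are synthetic stand-ins.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = make_classification(n_samples=139, n_features=20, random_state=0)  # 77 ATH + 62 HCM
ensemble = VotingClassifier(
    estimators=[
        ("svm", make_pipeline(StandardScaler(), SVC())),
        ("rf", RandomForestClassifier(random_state=0)),
        ("ann", make_pipeline(StandardScaler(), MLPClassifier(max_iter=2000, random_state=0))),
    ],
    voting="hard",  # majority vote, as described in the abstract
)
print(f"5-fold CV accuracy: {cross_val_score(ensemble, X, y, cv=5).mean():.2f}")
```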
Jamaludin, Amir; Lootus, Meelis; Kadir, Timor; Zisserman, Andrew; Urban, Jill; Battié, Michele C; Fairbank, Jeremy; McCall, Iain
2017-05-01
Investigation of the automation of radiological features from magnetic resonance images (MRIs) of the lumbar spine. To automate the process of grading lumbar intervertebral discs and vertebral bodies from MRIs. MR imaging is the most common imaging technique used in investigating low back pain (LBP). Various features of degradation, based on MRIs, are commonly recorded and graded, e.g., Modic change and Pfirrmann grading of intervertebral discs. Consistent scoring and grading is important for developing robust clinical systems and research. Automation facilitates this consistency and reduces the time of radiological analysis considerably and hence the expense. 12,018 intervertebral discs, from 2009 patients, were graded by a radiologist and were then used to train: (1) a system to detect and label vertebrae and discs in a given scan, and (2) a convolutional neural network (CNN) model that predicts several radiological gradings. The performance of the model, in terms of class average accuracy, was compared with the intra-observer class average accuracy of the radiologist. The detection system achieved 95.6% accuracy in terms of disc detection and labeling. The model is able to produce predictions of multiple pathological gradings that consistently matched those of the radiologist. The model identifies 'Evidence Hotspots' that are the voxels that most contribute to the degradation scores. Automation of radiological grading is now on par with human performance. The system can be beneficial in aiding clinical diagnoses in terms of objectivity of gradings and the speed of analysis. It can also draw the attention of a radiologist to regions of degradation. This objectivity and speed is an important stepping stone in the investigation of the relationship between MRIs and clinical diagnoses of back pain in large cohorts. Level 3.
NASA Astrophysics Data System (ADS)
Burba, G. G.; Johnson, D.; Velgersdyk, M.; Beaty, K.; Forgione, A.; Begashaw, I.; Allyn, D.
2015-12-01
Significant increases in data generation and computing power in recent years have greatly improved spatial and temporal flux data coverage on multiple scales, from a single station to continental flux networks. At the same time, operating budgets for flux teams and station infrastructure are getting ever more difficult to acquire and sustain. With more stations and networks, larger data flows from each station, and smaller operating budgets, modern tools are needed to effectively and efficiently handle the entire process. This would help maximize time dedicated to answering research questions, and minimize time and expenses spent on data processing, quality control and station management. Cross-sharing the stations with external institutions may also help leverage available funding, increase scientific collaboration, and promote data analyses and publications. FluxSuite, a new advanced tool combining hardware, software and web-service, was developed to address these specific demands. It automates key stages of flux workflow, minimizes day-to-day site management, and modernizes the handling of data flows: (i) each next-generation station measures all parameters needed for flux computations; (ii) the field microcomputer calculates final fully-corrected flux rates in real time, including computation-intensive Fourier transforms, spectra, co-spectra, multiple rotations, stationarity, footprint, etc.; (iii) final fluxes, radiation, weather and soil data are merged into a single quality-controlled file; (iv) multiple flux stations are linked into an automated time-synchronized network; (v) the flux network manager, or PI, can see all stations in real time, including fluxes, supporting data, automated reports, and email alerts; (vi) the PI can assign rights, allow or restrict access to stations and data: selected stations can be shared via rights-managed access internally or with external institutions; (vii) researchers without stations could form "virtual networks" for specific projects by collaborating with PIs from different actual networks. This presentation provides detailed examples of FluxSuite currently utilized by two large flux networks in China (National Academy of Sciences & Agricultural Academy of Sciences), and smaller networks with stations in the USA, Germany, Ireland, Malaysia and other locations around the globe.
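At its core, the real-time flux computation such a station performs is an eddy covariance: the mean product of fluctuations in vertical wind and a scalar. A minimal sketch on synthetic 10 Hz data (not FluxSuite code; rotations, spectral and WPL-type corrections are omitted):

```python
# Eddy covariance in its simplest form: flux = mean(w' * c') over an
# averaging period, after Reynolds decomposition. Data are synthetic.
import numpy as np

rng = np.random.default_rng(3)
n = 10 * 60 * 30                          # 30 min of 10 Hz samples
w = rng.normal(0.0, 0.3, n)               # vertical wind (m/s), mean ~0 after rotation
c = 400 + rng.normal(0, 5, n) + 2.0 * w   # CO2 density (umol/m3), correlated with w

w_prime = w - w.mean()                    # Reynolds decomposition
c_prime = c - c.mean()
flux = np.mean(w_prime * c_prime)         # umol m-2 s-1, before further corrections
print(f"CO2 flux: {flux:.2f} umol m-2 s-1")
```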
First Steps to Automated Interior Reconstruction from Semantically Enriched Point Clouds and Imagery
NASA Astrophysics Data System (ADS)
Obrock, L. S.; Gülch, E.
2018-05-01
The automated generation of a BIM-Model from sensor data is a huge challenge for the modeling of existing buildings. Currently the measurements and analyses are time consuming, allow little automation and require expensive equipment. We do lack an automated acquisition of semantical information of objects in a building. We are presenting first results of our approach based on imagery and derived products aiming at a more automated modeling of interior for a BIM building model. We examine the building parts and objects visible in the collected images using Deep Learning Methods based on Convolutional Neural Networks. For localization and classification of building parts we apply the FCN8s-Model for pixel-wise Semantic Segmentation. We, so far, reach a Pixel Accuracy of 77.2 % and a mean Intersection over Union of 44.2 %. We finally use the network for further reasoning on the images of the interior room. We combine the segmented images with the original images and use photogrammetric methods to produce a three-dimensional point cloud. We code the extracted object types as colours of the 3D-points. We thus are able to uniquely classify the points in three-dimensional space. We preliminary investigate a simple extraction method for colour and material of building parts. It is shown, that the combined images are very well suited to further extract more semantic information for the BIM-Model. With the presented methods we see a sound basis for further automation of acquisition and modeling of semantic and geometric information of interior rooms for a BIM-Model.
Integration of Sensors, Controllers and Instruments Using a Novel OPC Architecture.
González, Isaías; Calderón, Antonio José; Barragán, Antonio Javier; Andújar, José Manuel
2017-06-27
The interconnection between sensors, controllers and instruments through a communication network plays a vital role in the performance and effectiveness of a control system. Since its inception in the 90s, the Object Linking and Embedding for Process Control (OPC) protocol has provided open connectivity for monitoring and automation systems. It has been widely used in several environments such as industrial facilities, building and energy automation, engineering education and many others. This paper presents a novel OPC-based architecture to implement automation systems devoted to R&D and educational activities. The proposal is a novel conceptual framework, structured into four functional layers where the diverse components are categorized aiming to foster the systematic design and implementation of automation systems involving OPC communication. Due to the benefits of OPC, the proposed architecture provides features like open connectivity, reliability, scalability, and flexibility. Furthermore, four successful experimental applications of such an architecture, developed at the University of Extremadura (UEX), are reported. These cases are a proof of concept of the ability of this architecture to support interoperability for different domains. Namely, the automation of energy systems like a smart microgrid and photobioreactor facilities, the implementation of a network-accessible industrial laboratory and the development of an educational hardware-in-the-loop platform are described. All cases include a Programmable Logic Controller (PLC) to automate and control the plant behavior, which exchanges operative data (measurements and signals) with a multiplicity of sensors, instruments and supervisory systems under the structure of the novel OPC architecture. Finally, the main conclusions and open research directions are highlighted.
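As a hedged sketch of the OPC-mediated data exchange the architecture organizes, the snippet below uses the OPC UA python-opcua package standing in for the paper's OPC connectivity; the endpoint URL and node identifiers are assumptions, not the UEX deployment:

```python
# Read a PLC measurement and write a setpoint over OPC UA using the
# third-party `opcua` (python-opcua) package. Endpoint and node ids are
# hypothetical placeholders for a real controller's address space.
from opcua import Client

client = Client("opc.tcp://192.168.0.10:4840")  # hypothetical PLC endpoint
client.connect()
try:
    # Read a measurement exposed by the PLC (node id is illustrative).
    temperature = client.get_node("ns=2;s=Microgrid.Temperature").get_value()
    print("temperature:", temperature)
    # Write a setpoint back to the controller.
    client.get_node("ns=2;s=Microgrid.Setpoint").set_value(42.0)
finally:
    client.disconnect()
```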
Research & Technology Report Goddard Space Flight Center
NASA Technical Reports Server (NTRS)
Soffen, Gerald A. (Editor); Truszkowski, Walter (Editor); Ottenstein, Howard (Editor); Frost, Kenneth (Editor); Maran, Stephen (Editor); Walter, Lou (Editor); Brown, Mitch (Editor)
1995-01-01
The main theme of this edition of the annual Research and Technology Report is Mission Operations and Data Systems. Shifting from centralized to distributed mission operations, and from human interactive operations to highly automated operations is reported. The following aspects are addressed: Mission planning and operations; TDRSS, Positioning Systems, and orbit determination; hardware and software associated with Ground System and Networks; data processing and analysis; and World Wide Web. Flight projects are described along with the achievements in space sciences and earth sciences. Spacecraft subsystems, cryogenic developments, and new tools and capabilities are also discussed.
2012-10-23
Quantum Intelligence, Inc. She was principal investigator (PI) for six contracts awarded by the DoD Small Business Innovation Research (SBIR) Program. She...with at OSD? I hope you don’t mind if I indulge in a little ‘stream of consciousness ’ musing about where LLA could really add value. One of the...implemented by Quantum Intelligence, Inc. (QI, 2001–2012). The unique contribution of this architecture is to leverage a peer-to-peer agent network
2005-09-01
discovery of network security threats and vulnerabilities will be done by doing penetration testing during the C&A process. This can be done on a...2.1.1; Appendix E, J COBR -1 Protection of Backup and Restoration Assets Availability 1.3.1; 2.1.3; 2.1.7; 3.1; 4.3; Appendix J, M CODB-2 Data... discovery , inventory, scanning and loading of C&A information in its central database, (2) automatic generation of the SRTM , (3) automatic generation
NASA Astrophysics Data System (ADS)
Viger, R. J.; Van Beusekom, A. E.
2016-12-01
The treatment of glaciers in modeling requires information about their shape and extent. This presentation discusses new methods and their application in a new glacier-capable variant of the USGS PRMS model, a physically-based, spatially distributed daily time-step model designed to simulate the runoff and evolution of glaciers through time. In addition to developing parameters describing PRMS land surfaces (hydrologic response units, HRUs), several of the analyses and products are likely of interest to the cryospheric science community in general. The first method is a (fully automated) variation of logic previously presented in the literature for definition of the glacier centerline. Given that the surface of a glacier might be convex, using traditional topographic analyses based on a DEM to trace a path down the glacier is not reliable. Instead a path is derived based on a cost function. Although only a single path is presented in our results, the method can be easily modified to delineate a branched network of centerlines for each glacier. The second method extends the glacier terminus downslope by an arbitrary distance, according to local surface topography. This product can be used to explore possible, if unlikely, scenarios under which glacier area grows. More usefully, this method can be used to approximate glacier extents from previous years without needing historical imagery. The final method presents an approach for segmenting the glacier into altitude-based HRUs. Successful integration of this information with traditional approaches for discretizing the non-glacierized portions of a basin requires several additional steps. These include synthesizing the glacier centerline network with one developed with a traditional DEM analysis, ensuring that flow can be routed under and beyond glaciers to a basin outlet. Results are presented based on analysis of the Copper River Basin, Alaska.
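The least-cost centerline idea can be illustrated with an off-the-shelf routine: trace the cheapest path across a cost raster from glacier head to terminus. The cost function below (normalized elevation plus a small positive floor) is an illustrative assumption, not the paper's:

```python
# Least-cost path over a toy cost raster with scikit-image, standing in for
# the centerline-by-cost-function approach the abstract describes.
import numpy as np
from skimage.graph import route_through_array

elev = np.linspace(3000, 1500, 50)[:, None] * np.ones((50, 50))  # toy DEM (m)
cost = (elev - elev.min()) / (elev.max() - elev.min()) + 0.01    # strictly positive

path, weight = route_through_array(cost, start=(0, 25), end=(49, 25),
                                   fully_connected=True, geometric=True)
print(len(path), "cells from head to terminus; total cost", round(weight, 2))
```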
Dias, Roberto A; Gonçalves, Bruno P; da Rocha, Joana F; da Cruz E Silva, Odete A B; da Silva, Augusto M F; Vieira, Sandra I
2017-12-01
Neurons are specialized cells of the Central Nervous System whose function is intricately related to the neuritic network they develop to transmit information. Morphological evaluation of this network and other neuronal structures is required to establish relationships between neuronal morphology and function, and may allow monitoring physiological and pathophysiologic alterations. Fluorescence-based microphotographs are the most widely used in cellular bioimaging, but phase contrast (PhC) microphotographs are easier to obtain, more affordable, and do not require invasive, complicated and disruptive techniques. Despite the various freeware tools available for fluorescence-based image analysis, few exist that can tackle the more elusive and harder-to-analyze PhC images. To address this, an interactive semi-automated image processing workflow was developed to easily extract relevant information (e.g. total neuritic length, average cell body area) from both PhC and fluorescence neuronal images. This workflow, named 'NeuronRead', was developed in the form of an ImageJ macro. Its robustness and adaptability were tested and validated on rat cortical primary neurons under control and differentiation inhibitory conditions. Validation included a comparison to manual determinations and to a gold-standard freeware tool for fluorescence image analysis. NeuronRead was subsequently applied to PhC images of neurons at distinct differentiation days and exposed or not to DAPT, a pharmacological inhibitor of the γ-secretase enzyme, which cleaves the well-known Alzheimer's amyloid precursor protein (APP) and the Notch receptor. The data obtained confirm a neuritogenic regulatory role for γ-secretase products and validate NeuronRead as a time- and cost-effective monitoring tool. Copyright © 2017. Published by Elsevier Inc.
Westlake, Bryce; Bouchard, Martin; Frank, Richard
2017-10-01
The distribution of child sexual exploitation (CE) material has been aided by the growth of the Internet. The graphic nature and prevalence of the material have made researching and combating it difficult. Although used to study online CE distribution, automated data collection tools (e.g., webcrawlers) have yet to be shown effective at targeting only relevant data. Using CE-related image and keyword criteria, we compare networks starting from CE websites to those from similar non-CE sexuality websites and dissimilar sports websites. Our results provide evidence that (a) webcrawlers have the potential to provide valid CE data, if the appropriate criterion is selected; (b) CE distribution is still heavily image-based suggesting images as an effective criterion; (c) CE-seeded networks are more hub-based and differ from non-CE-seeded networks on several website characteristics. Recommendations for improvements to reliable criteria selection are discussed.
Building a Library Network from Scratch: Eric & Veronica's Excellent Adventure.
ERIC Educational Resources Information Center
Sisler, Eric; Smith, Veronica
2000-01-01
Describes library automation issues during the planning and construction of College Hill Library (Colorado), a joint-use facility shared by a community college and a public library. Discusses computer networks; hardware selection; public access to catalogs and electronic resources; classification schemes and bibliographic data; children's…
Automated Network Anomaly Detection with Learning, Control and Mitigation
ERIC Educational Resources Information Center
Ippoliti, Dennis
2014-01-01
Anomaly detection is a challenging problem that has been researched within a variety of application domains. In network intrusion detection, anomaly based techniques are particularly attractive because of their ability to identify previously unknown attacks without the need to be programmed with the specific signatures of every possible attack.…
Computerization of Library and Information Services in Mainland China.
ERIC Educational Resources Information Center
Lin, Sharon Chien
1994-01-01
Describes two phases of the automation of library and information services in mainland China. From 1974-86, much effort was concentrated on developing computer systems, databases, online retrieval, and networking. From 1986 to the present, practical progress became possible largely because of CD-ROM technology; and large scale networking for…
Networked Information: Finding What's Out There.
ERIC Educational Resources Information Center
Lynch, Clifford A.
1997-01-01
Clifford A. Lynch, developer of MELVYL and former director of library automation at the University of California, is now executive director for the Coalition for Networked Information (CNI). This interview discusses Lynch's background, MELVYL, the Web and the role of libraries and librarians, community and collaborative filtering, the library of…
ERIC Educational Resources Information Center
Zhao, Weiyi
2011-01-01
Wireless mesh networks (WMNs) have recently emerged to be a cost-effective solution to support large-scale wireless Internet access. They have numerous applications, such as broadband Internet access, building automation, and intelligent transportation systems. One research challenge for Internet-based WMNs is to design efficient mobility…
Technology-Driven Resource Sharing: Paying for Improvement.
ERIC Educational Resources Information Center
Rush, James E.
1993-01-01
Addresses the inadequacies of traditional methods of library financing and proposes a strategy to be implemented in an environment supported by automation and networks. Publisher pricing of data rather than publications, options for charging library users with debit cards, and the role of regional library networks are discussed. (EAM)
PILOT: An intelligent distributed operations support system
NASA Technical Reports Server (NTRS)
Rasmussen, Arthur N.
1993-01-01
The Real-Time Data System (RTDS) project is exploring the application of advanced technologies to the real-time flight operations environment of the Mission Control Centers at NASA's Johnson Space Center. The system, based on a network of engineering workstations, provides services such as delivery of real time telemetry data to flight control applications. To automate the operation of this complex distributed environment, a facility called PILOT (Process Integrity Level and Operation Tracker) is being developed. PILOT comprises a set of distributed agents cooperating with a rule-based expert system; together they monitor process operation and data flows throughout the RTDS network. The goal of PILOT is to provide unattended management and automated operation under user control.
A human factors approach to range scheduling for satellite control
NASA Technical Reports Server (NTRS)
Wright, Cameron H. G.; Aitken, Donald J.
1991-01-01
Range scheduling for satellite control presents a classical problem: supervisory control of a large-scale dynamic system, with unwieldy amounts of interrelated data used as inputs to the decision process. Increased automation of the task, with the appropriate human-computer interface, is highly desirable. The development and user evaluation of a semi-automated network range scheduling system is described. The system incorporates a synergistic human-computer interface consisting of a large screen color display, voice input/output, a 'sonic pen' pointing device, a touchscreen color CRT, and a standard keyboard. From a human factors standpoint, this development represents the first major improvement in almost 30 years to the satellite control network scheduling task.
Pre-trained convolutional neural networks as feature extractors for tuberculosis detection.
Lopes, U K; Valiati, J F
2017-10-01
It is estimated that in 2015, approximately 1.8 million people infected by tuberculosis died, most of them in developing countries. Many of those deaths could have been prevented if the disease had been detected at an earlier stage, but the most advanced diagnosis methods are still cost prohibitive for mass adoption. One of the most popular tuberculosis diagnosis methods is the analysis of frontal thoracic radiographs; however, the impact of this method is diminished by the need for individual analysis of each radiography by properly trained radiologists. Significant research can be found on automating diagnosis by applying computational techniques to medical images, thereby eliminating the need for individual image analysis and greatly diminishing overall costs. In addition, recent improvements on deep learning accomplished excellent results classifying images on diverse domains, but its application for tuberculosis diagnosis remains limited. Thus, the focus of this work is to produce an investigation that will advance the research in the area, presenting three proposals to the application of pre-trained convolutional neural networks as feature extractors to detect the disease. The proposals presented in this work are implemented and compared to the current literature. The obtained results are competitive with published works demonstrating the potential of pre-trained convolutional networks as medical image feature extractors. Copyright © 2017 Elsevier Ltd. All rights reserved.
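The core recipe, freeze a pre-trained CNN and train a shallow classifier on its features, can be sketched as follows; ResNet18 stands in for the architectures compared in the paper, the input tensors are stand-ins for preprocessed radiographs, and the weights API assumes a recent torchvision:

```python
# Pre-trained CNN as a fixed feature extractor plus a shallow classifier.
# Requires torch/torchvision (downloads ImageNet weights on first run).
import torch
import torchvision
from sklearn.svm import SVC

backbone = torchvision.models.resnet18(weights=torchvision.models.ResNet18_Weights.DEFAULT)
backbone.fc = torch.nn.Identity()  # drop the ImageNet classification head
backbone.eval()

with torch.no_grad():
    fake_xrays = torch.rand(16, 3, 224, 224)   # stand-in for preprocessed radiographs
    feats = backbone(fake_xrays).numpy()       # 512-d feature vectors

labels = [0, 1] * 8                            # hypothetical TB / non-TB labels
clf = SVC().fit(feats, labels)                 # shallow classifier on deep features
print(f"training accuracy: {clf.score(feats, labels):.2f}")
```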
Biswas, Amitava; Liu, Chen; Monga, Inder; ...
2016-01-01
For the last few years, there has been tremendous growth in data traffic due to the high adoption rate of mobile devices and cloud computing. The Internet of things (IoT) will stimulate even further growth. This is increasing the scale and complexity of telecom/internet service provider (SP) and enterprise data centre (DC) compute and network infrastructures. As a result, managing these large network-compute converged infrastructures is becoming complex and cumbersome. To cope, network and DC operators are trying to automate network and system operations, administration and management (OAM) functions. OAM includes all non-functional mechanisms which keep the network running.
NASA Astrophysics Data System (ADS)
Fedorov, D.; Miller, R. J.; Kvilekval, K. G.; Doheny, B.; Sampson, S.; Manjunath, B. S.
2016-02-01
Logistical and financial limitations of underwater operations are inherent in marine science, including biodiversity observation. Imagery is a promising way to address these challenges, but the diversity of organisms thwarts simple automated analysis. Recent developments in computer vision methods, such as convolutional neural networks (CNN), are promising for automated classification and detection tasks but are typically very computationally expensive and require extensive training on large datasets. Therefore, managing and connecting distributed computation, large storage and human annotations of diverse marine datasets is crucial for effective application of these methods. BisQue is a cloud-based system for management, annotation, visualization, analysis and data mining of underwater and remote sensing imagery and associated data. Designed to hide the complexity of distributed storage, large computational clusters, diversity of data formats and inhomogeneous computational environments behind a user-friendly web-based interface, BisQue is built around the idea of flexible and hierarchical annotations defined by the user. Such textual and graphical annotations can describe captured attributes and the relationships between data elements. Annotations are powerful enough to describe cells in fluorescent 4D images, fish species in underwater videos and kelp beds in aerial imagery. Presently we are developing BisQue-based analysis modules for automated identification of benthic marine organisms. Recent experiments with drop-out and CNN-based classification of several thousand annotated underwater images demonstrated an overall accuracy above 70% for the 15 best performing species and above 85% for the top 5 species. Based on these promising results, we have extended BisQue with a CNN-based classification system allowing continuous training on user-provided data.
Application of high performance asynchronous socket communication in power distribution automation
NASA Astrophysics Data System (ADS)
Wang, Ziyu
2017-05-01
With the development of information technology and Internet technology, and the growing demand for electricity, the stable and reliable operation of the power system has been the goal of power grid workers. With the advent of the era of big data, power data will gradually become an important means of guaranteeing the safe and reliable operation of the power grid. In the electric power industry, the key question is therefore how to receive, efficiently and robustly, the data transmitted by data acquisition devices, so that the power distribution automation system can make sound decisions quickly. In this paper, some existing problems in power system communication are analysed and, with the help of network technology, a set of solutions, called Asynchronous Socket Technology, is proposed for network communication requiring high concurrency and high throughput. The paper also looks forward to the development direction of power distribution automation in the era of big data and artificial intelligence.
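A minimal sketch of the asynchronous-socket idea in Python's asyncio: one lightweight coroutine per acquisition-device connection rather than one thread per socket. The port and line-based framing are assumptions, not the paper's protocol:

```python
# Asynchronous socket server for high-concurrency data collection:
# each connected data-acquisition device is served by its own coroutine.
import asyncio

async def handle_device(reader: asyncio.StreamReader, writer: asyncio.StreamWriter):
    # Read newline-delimited measurements until the device disconnects.
    while data := await reader.readline():
        print("measurement:", data.decode().strip())
    writer.close()
    await writer.wait_closed()

async def main():
    server = await asyncio.start_server(handle_device, "0.0.0.0", 9000)
    async with server:
        await server.serve_forever()

if __name__ == "__main__":
    asyncio.run(main())
```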
NASA Astrophysics Data System (ADS)
Fernandez, Carlos; Platero, Carlos; Campoy, Pascual; Aracil, Rafael
1994-11-01
This paper describes some texture-based techniques that can be applied to quality assessment of continuously produced flat products (metal strips, wooden surfaces, cork, textile products, ...). Since the most difficult task is that of inspecting for product appearance, human-like inspection ability is required. A feature common to all these products is the presence of non-deterministic texture on their surfaces. Two main subjects are discussed: statistical techniques for both surface finishing determination and surface defect analysis, as well as real-time implementation for on-line inspection in high-speed applications. For surface finishing determination, a Gray Level Difference technique is presented that operates on low-resolution, that is, non-zoomed, images. Defect analysis is performed by means of statistical texture analysis over defective portions of the surface. On-line implementation is accomplished by means of neural networks. When a defect arises, textural analysis is applied, resulting in a data vector that acts as the input to a neural net previously trained in a supervised way. This approach aims at on-line performance in automated visual inspection applications where texture is present on flat product surfaces.
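A gray-level-difference statistic of the kind used for surface finishing determination can be computed in a few lines; the offset, the statistics chosen, and the synthetic images below are illustrative:

```python
# Gray-level-difference texture statistics: the distribution of
# |I(x) - I(x+d)| separates smooth from rough surface finishes.
import numpy as np

def gld_stats(img: np.ndarray, dx: int = 1, dy: int = 0) -> dict:
    """Mean, contrast, and entropy of gray-level differences at offset (dy, dx)."""
    a = img[: img.shape[0] - dy, : img.shape[1] - dx].astype(float)
    b = img[dy:, dx:].astype(float)
    diff = np.abs(a - b)
    hist = np.bincount(diff.astype(int).ravel(), minlength=256) / diff.size
    nz = hist[hist > 0]
    return {"mean": diff.mean(), "contrast": (diff ** 2).mean(),
            "entropy": float(-(nz * np.log2(nz)).sum())}

rng = np.random.default_rng(2)
smooth = rng.normal(128, 2, (64, 64)).clip(0, 255)   # fine finish
rough = rng.normal(128, 40, (64, 64)).clip(0, 255)   # coarse finish
print("smooth:", gld_stats(smooth))
print("rough: ", gld_stats(rough))
```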
Automated Protist Analysis of Complex Samples: Recent Investigations Using Motion and Thresholding
2012-01-01
Report No. CG-D-15-13, January 2012. United States Coast Guard Research & Development Center, 1 Chelsea Street, New London, CT 06320. Distribution Statement A: Approved for public release; distribution is unlimited.
Automated target recognition and tracking using an optical pattern recognition neural network
NASA Technical Reports Server (NTRS)
Chao, Tien-Hsin
1991-01-01
The ongoing development of an automatic target recognition and tracking system at the Jet Propulsion Laboratory is presented. This system is an optical pattern recognition neural network (OPRNN) that is an integration of an innovative optical parallel processor and a feature-extraction-based neural net training algorithm. The parallel optical processor provides high speed and vast parallelism as well as full shift invariance. The neural network algorithm enables simultaneous discrimination of multiple noisy targets in spite of their scales, rotations, perspectives, and various deformations. This fully developed OPRNN system can be effectively utilized for the automated spacecraft recognition and tracking that will lead to success in the Automated Rendezvous and Capture (AR&C) of the unmanned Cargo Transfer Vehicle (CTV). One of the most powerful optical parallel processors for automatic target recognition is the multichannel correlator. With the inherent advantages of parallel processing capability and shift invariance, multiple objects can be simultaneously recognized and tracked using this multichannel correlator. This target tracking capability can be greatly enhanced by utilizing a powerful feature-extraction-based neural network training algorithm such as the neocognitron. The OPRNN, currently under investigation at JPL, is constructed with an optical multichannel correlator where holographic filters have been prepared using the neocognitron training algorithm. The computation speed of the neocognitron-type OPRNN is up to 10^14 analog connections/sec, enabling the OPRNN to outperform its state-of-the-art electronic counterpart by at least two orders of magnitude.
Lee, Hyunkwang; Troschel, Fabian M; Tajmir, Shahein; Fuchs, Georg; Mario, Julia; Fintelmann, Florian J; Do, Synho
2017-08-01
Pretreatment risk stratification is key for personalized medicine. While many physicians rely on an "eyeball test" to assess whether patients will tolerate major surgery or chemotherapy, "eyeballing" is inherently subjective and difficult to quantify. The concept of morphometric age derived from cross-sectional imaging has been found to correlate well with outcomes such as length of stay, morbidity, and mortality. However, the determination of the morphometric age is time intensive and requires highly trained experts. In this study, we propose a fully automated deep learning system for the segmentation of skeletal muscle cross-sectional area (CSA) on an axial computed tomography image taken at the third lumbar vertebra. We utilized a fully automated deep segmentation model derived from an extended implementation of a fully convolutional network with weight initialization of an ImageNet pre-trained model, followed by post processing to eliminate intramuscular fat for a more accurate analysis. This experiment was conducted by varying window level (WL), window width (WW), and bit resolutions in order to better understand the effects of the parameters on the model performance. Our best model, fine-tuned on 250 training images and ground truth labels, achieves 0.93 ± 0.02 Dice similarity coefficient (DSC) and 3.68 ± 2.29% difference between predicted and ground truth muscle CSA on 150 held-out test cases. Ultimately, the fully automated segmentation system can be embedded into the clinical environment to accelerate the quantification of muscle and expanded to volume analysis of 3D datasets.
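The reported segmentation quality metric, the Dice similarity coefficient (DSC), has a compact reference implementation on binary masks; the toy masks below are illustrative:

```python
# Dice similarity coefficient: DSC = 2|A ∩ B| / (|A| + |B|) for binary masks.
import numpy as np

def dice(pred: np.ndarray, truth: np.ndarray) -> float:
    pred, truth = pred.astype(bool), truth.astype(bool)
    denom = pred.sum() + truth.sum()
    return 1.0 if denom == 0 else 2.0 * np.logical_and(pred, truth).sum() / denom

# Toy example: a mask shifted by one pixel overlaps heavily but imperfectly.
a = np.zeros((10, 10), dtype=bool)
a[2:8, 2:8] = True
b = np.roll(a, 1, axis=1)
print(f"DSC = {dice(a, b):.3f}")
```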
NASA Astrophysics Data System (ADS)
Burba, George; Johnson, Dave; Velgersdyk, Michael; Begashaw, Israel; Allyn, Douglas
2016-04-01
In recent years, spatial and temporal flux data coverage improved significantly and on multiple scales, from a single station to continental networks, due to standardization, automation, and management of the data collection, and better handling of the extensive amounts of generated data. However, operating budgets for flux research items, such as labor, travel, and hardware, are becoming more difficult to acquire and sustain. With more stations and networks, larger data flows from each station, and smaller operating budgets, modern tools are required to effectively and efficiently handle the entire process, including sharing data among collaborative groups. On one hand, such tools can maximize time dedicated to publications answering research questions, and minimize time and expenses spent on data acquisition, processing, quality control and overall station management. On the other hand, cross-sharing the stations with external collaborators may help leverage available funding, and promote data analyses and publications. A new low-cost, advanced system, FluxSuite, utilizes a combination of hardware, software and web-services to address these specific demands. It automates key stages of flux workflow, minimizes day-to-day site management, and modernizes the handling of data flows: (i) The system can be easily incorporated into a new flux station, or as an upgrade to many presently operating flux stations, via a weatherized, remotely accessible microcomputer, SmartFlux 2, with fully digital inputs (ii) Each next-generation station will measure all parameters needed for flux computations in a digital and PTP time-synchronized mode, accepting digital signals from a number of anemometers and data loggers (iii) The field microcomputer will calculate final fully-processed flux rates in real time, including computation-intensive Fourier transforms, spectra, co-spectra, multiple rotations, stationarity, footprint, etc. (iv) Final fluxes, radiation, weather and soil data will be merged into a single quality-control file (v) Multiple flux stations can be linked into an automated time-synchronized network (vi) Flux network managers, or PIs, can see all stations in real-time, including fluxes, supporting data, automated reports, and email alerts (vii) PIs can assign rights, allow or restrict access to stations and data: selected stations can be shared via rights-managed access internally or with external institutions (viii) Researchers without stations could form "virtual networks" for specific projects by collaborating with PIs from different actual networks. This presentation provides detailed examples of FluxSuite currently utilized to manage two large flux networks in China (National Academy of Sciences and Agricultural Academy of Sciences), and smaller networks with stations in the USA, Germany, Ireland, Malaysia and other locations around the globe. The latest 2016 developments and expanded functionality are also discussed.
A Wireless Sensor Network-Based Portable Vehicle Detector Evaluation System
Yoo, Seong-eun
2013-01-01
In an upcoming smart transportation environment, performance evaluations of existing Vehicle Detection Systems are crucial to maintain their accuracy. The existing evaluation method for Vehicle Detection Systems is based on a wired Vehicle Detection System reference and a video recorder, which must be operated and analyzed by capable traffic experts. However, this conventional evaluation system has many disadvantages. It is inconvenient to deploy, the evaluation takes a long time, and it lacks scalability and objectivity. To improve the evaluation procedure, this paper proposes a Portable Vehicle Detector Evaluation System based on wireless sensor networks. We describe both the architecture and design of a Vehicle Detector Evaluation System and the implementation results, focusing on the wireless sensor networks and methods for traffic information measurement. With the help of wireless sensor networks and automated analysis, our Vehicle Detector Evaluation System can evaluate a Vehicle Detection System conveniently and objectively. The extensive evaluations of our Vehicle Detector Evaluation System show that it can measure the traffic information such as volume counts and speed with over 98% accuracy. PMID:23344388
Automated Target Acquisition, Recognition and Tracking (ATTRACT). Phase 1
NASA Technical Reports Server (NTRS)
Abdallah, Mahmoud A.
1995-01-01
The primary objective of phase 1 of this research project is to conduct multidisciplinary research that will contribute to fundamental scientific knowledge in several of the USAF critical technology areas. Specifically, neural networks, signal processing techniques, and electro-optic capabilities are utilized to solve problems associated with automated target acquisition, recognition, and tracking. To accomplish the stated objective, several tasks were identified and executed.
ERIC Educational Resources Information Center
Ozonoff, Sally; Cook, Ian; Coon, Hilary; Dawson, Geraldine; Joseph, Robert M.; Klin, Ami; McMahon, William M.; Minshew, Nancy; Munson, Jeffrey A.
2004-01-01
Recent structural and functional imaging work, as well as neuropathology and neuropsychology studies, provide strong empirical support for the involvement of frontal cortex in autism. The Cambridge Neuropsychological Test Automated Battery (CANTAB) is a computer-administered set of neuropsychological tests developed to examine specific components…
An Architecture for SCADA Network Forensics
NASA Astrophysics Data System (ADS)
Kilpatrick, Tim; Gonzalez, Jesus; Chandia, Rodrigo; Papa, Mauricio; Shenoi, Sujeet
Supervisory control and data acquisition (SCADA) systems are widely used in industrial control and automation. Modern SCADA protocols often employ TCP/IP to transport sensor data and control signals. Meanwhile, corporate IT infrastructures are interconnecting with previously isolated SCADA networks. The use of TCP/IP as a carrier protocol and the interconnection of IT and SCADA networks raise serious security issues. This paper describes an architecture for SCADA network forensics. In addition to supporting forensic investigations of SCADA network incidents, the architecture incorporates mechanisms for monitoring process behavior, analyzing trends and optimizing plant performance.
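Since modern SCADA traffic rides on TCP/IP, the capture layer of such an architecture can parse industrial protocol headers directly from packet payloads. As one hedged illustration (Modbus/TCP is a common SCADA protocol, though the paper does not single it out), a minimal parser for the MBAP header of a captured payload:

```python
import struct

def parse_mbap(payload: bytes) -> dict:
    """Parse the 7-byte Modbus/TCP MBAP header plus the PDU function code.

    MBAP layout: transaction id (2 B), protocol id (2 B, 0 for Modbus),
    length (2 B), unit id (1 B); the PDU begins with a 1-byte function code.
    """
    if len(payload) < 8:
        raise ValueError("payload too short for MBAP header + function code")
    tid, pid, length, unit, func = struct.unpack(">HHHBB", payload[:8])
    return {"transaction_id": tid, "protocol_id": pid,
            "length": length, "unit_id": unit, "function_code": func}

# Example: a "read holding registers" request as captured from TCP port 502.
pkt = bytes.fromhex("000100000006010300000002")
print(parse_mbap(pkt))  # function_code 3 = read holding registers
```

Retaining such parsed fields alongside capture timestamps is the kind of raw material a forensic data store for SCADA incidents would hold.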
Lommen, Arjen; van der Kamp, Henk J; Kools, Harrie J; van der Lee, Martijn K; van der Weg, Guido; Mol, Hans G J
2012-11-09
A new alternative data processing tool set, metAlignID, is developed for automated pre-processing and library-based identification and concentration estimation of target compounds after analysis by comprehensive two-dimensional gas chromatography with mass spectrometric detection. The tool set has been developed for and tested on LECO data. The software is developed to run multi-threaded (one thread per processor core) on a standard PC (personal computer) under different operating systems, and is thus capable of processing multiple data sets simultaneously. Raw data files are converted into netCDF (network Common Data Form) format using a fast conversion tool. They are then preprocessed using previously developed algorithms originating from metAlign software. Next, the resulting reduced data files are searched against a user-composed library (derived from user or commercial NIST-compatible libraries; NIST = National Institute of Standards and Technology), and the identified compounds, including an indicative concentration, are reported in Excel format. Data can be processed batch-wise. The overall time needed to convert, process, and search 30 raw data sets for 560 compounds is routinely under an hour. The screening performance is evaluated for detection of pesticides and contaminants in raw data obtained after analysis of soil and plant samples. Results are compared to the existing data-handling routine based on proprietary software (LECO, ChromaTOF). The developed software tool set, which is freely downloadable at www.metalign.nl, greatly accelerates data analysis and offers more options for fine-tuning automated identification toward specific application needs. The quality of the results obtained is slightly better than that of the standard processing, and a quantitative estimate is added. The software tool set, in combination with two-dimensional gas chromatography coupled to time-of-flight mass spectrometry, shows great potential as a highly automated and fast multi-residue instrumental screening method.
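The one-thread-per-core, batch-wise design maps naturally onto a worker-pool pattern; a generic sketch with a placeholder per-file function (the file names and the work inside `process_run` are hypothetical, not metAlignID internals):

```python
import os
from concurrent.futures import ThreadPoolExecutor

def process_run(path: str) -> str:
    """Stand-in for convert-to-netCDF, data reduction, and library search."""
    # Real per-file work (conversion, preprocessing, matching) would go here.
    return f"{path}: done"

raw_files = [f"run_{i:02d}.raw" for i in range(30)]  # a hypothetical batch

# One worker per CPU core, mirroring the one-thread-per-core design.
with ThreadPoolExecutor(max_workers=os.cpu_count()) as pool:
    for result in pool.map(process_run, raw_files):
        print(result)
```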
Doctor, Daniel H.; Young, John A.
2013-01-01
LiDAR (Light Detection and Ranging) surveys of karst terrains provide high-resolution digital elevation models (DEMs) that are particularly useful for mapping sinkholes. In this study, we used automated processing tools within ArcGIS (v. 10.0) operating on a 1.0 m resolution LiDAR DEM in order to delineate sinkholes and closed depressions in the Boyce 7.5 minute quadrangle located in the northern Shenandoah Valley of Virginia. The results derived from the use of the automated tools were then compared with depressions manually delineated by a geologist. Manual delineation of closed depressions was conducted using a combination of 1.0 m DEM hillshade, slopeshade, aerial imagery, and Topographic Position Index (TPI) rasters. The most effective means of visualizing depressions in the GIS was an overlay of the partially transparent TPI raster atop the slopeshade raster at 1.0 m resolution. Manually identified depressions were subsequently checked using aerial imagery to screen for false positives, and targeted ground-truthing was undertaken in the field. The automated tools utilized include the routines in ArcHydro Tools (v. 2.0) for prescreening, evaluating, and selecting sinks and depressions, as well as thresholding, grouping, and assessing depressions from the TPI raster. Results showed that the automated delineation of sinks and depressions within the ArcHydro tools was highly dependent upon pre-conditioning of the DEM to produce "hydrologically correct" surface flow routes. Using stream vectors obtained from the National Hydrologic Dataset alone to condition the flow routing was not sufficient to produce a suitable drainage network, and numerous artificial depressions were generated where roads, railways, or other manmade structures acted as flow barriers in the elevation model. Additional conditioning of the DEM with drainage paths across these barriers was required prior to automated delineation of sinks and depressions. In regions where the DEM had been properly conditioned, the tools for automated delineation performed reasonably well compared to the manually delineated depressions, but generally overestimated the number of depressions, thus necessitating manual filtering of the final results. Results from the TPI thresholding analysis were not dependent on DEM pre-conditioning, but the ability to extract meaningful depressions depended on careful assessment of analysis scale and TPI thresholding.
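Since TPI is simply each cell's elevation minus the mean elevation of a surrounding neighborhood, the thresholding stage lends itself to a compact sketch (the window radius, threshold, and synthetic DEM are illustrative assumptions, not the study's parameters):

```python
import numpy as np
from scipy.ndimage import uniform_filter

def topographic_position_index(dem: np.ndarray, radius_px: int) -> np.ndarray:
    """TPI = cell elevation minus the mean elevation of a square neighborhood."""
    window = 2 * radius_px + 1
    neighborhood_mean = uniform_filter(dem, size=window, mode="nearest")
    return dem - neighborhood_mean

rng = np.random.default_rng(1)
dem = rng.normal(100.0, 2.0, (200, 200))   # stand-in for the 1.0 m LiDAR DEM
tpi = topographic_position_index(dem, radius_px=10)

# Strongly negative TPI cells are candidate closed depressions.
candidates = tpi < -1.5
print(f"{candidates.sum()} candidate depression cells")
```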
NASA Astrophysics Data System (ADS)
Mizukami, Masato; Makihara, Mitsuhiro
2013-07-01
Conventionally, in intelligent buildings in a metropolitan area network and in small-scale facilities in the optical access network, optical connectors are joined manually using an optical connection board and a patch panel. In this manual connection approach, mistakes occur due to discrepancies between the actual physical settings of the connections and their management, because these processes are independent. Moreover, manual cross-connection is time-consuming and expensive because maintenance personnel must be dispatched to remote places to correct mistakes. We have developed a fiber-handling robot and optical connection mechanisms for automatic cross-connection of multiple optical connectors, which are the key elements of automatic optical fiber cross-connect equipment, and we evaluate the performance of the equipment, such as its optical characteristics and environmental specifications. We also devise new optical connection mechanisms that enable the automated optical fiber cross-connect module to handle and connect angled physical contact (APC) optical connector plugs, and evaluate their optical performance. The evaluation results confirm that the automated optical fiber cross-connect equipment can connect APC connectors with low loss and high return loss, indicating that it is suitable for practical use in intelligent buildings and optical access networks.
Thode, Aaron M; Kim, Katherine H; Blackwell, Susanna B; Greene, Charles R; Nations, Christopher S; McDonald, Trent L; Macrander, A Michael
2012-05-01
An automated procedure has been developed for detecting and localizing frequency-modulated bowhead whale sounds in the presence of seismic airgun surveys. The procedure was applied to four years of data, collected from over 30 directional autonomous recording packages deployed over a 280 km span of continental shelf in the Alaskan Beaufort Sea. The procedure has six sequential stages that begin by extracting 25-element feature vectors from spectrograms of potential call candidates. Two cascaded neural networks then classify some feature vectors as bowhead calls, and the procedure then matches calls between recorders to triangulate locations. To train the networks, manual analysts flagged 219,471 bowhead call examples from 2008 and 2009. Manual analyses were also used to identify 1.17 million transient signals that were not whale calls. The network output thresholds were adjusted to reject 20% of whale calls in the training data. Validation runs using 2007 and 2010 data found that the procedure missed 30%-40% of manually detected calls. Furthermore, 20%-40% of the sounds flagged as calls are not present in the manual analyses; however, these extra detections incorporate legitimate whale calls overlooked by human analysts. Both manual and automated methods produce similar spatial and temporal call distributions.
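A sketch of the first stage, reducing a call candidate's spectrogram to a fixed-length feature vector (the band-energy features and the synthetic frequency-modulated tone are illustrative; the paper's actual 25 features are not reproduced here):

```python
import numpy as np
from scipy.signal import spectrogram

def call_features(x: np.ndarray, fs: float, n_features: int = 25) -> np.ndarray:
    """Summarize a call candidate by mean spectral energy in n_features bands."""
    f, t, sxx = spectrogram(x, fs=fs, nperseg=256, noverlap=128)
    bands = np.array_split(sxx, n_features, axis=0)  # split the frequency axis
    return np.array([band.mean() for band in bands])

fs = 1000.0                                      # Hz, assumed sampling rate
t = np.arange(0, 1.0, 1 / fs)
call = np.sin(2 * np.pi * (100 + 50 * t) * t)    # synthetic FM sweep
print(call_features(call, fs).shape)             # (25,)
```

Vectors like these would then feed the cascaded classifiers, whose output thresholds set the miss/false-alarm trade-off described above.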
Smart border: ad-hoc wireless sensor networks for border surveillance
NASA Astrophysics Data System (ADS)
He, Jun; Fallahi, Mahmoud; Norwood, Robert A.; Peyghambarian, Nasser
2011-06-01
Wireless sensor networks have been proposed as promising candidates to provide automated monitoring, target tracking, and intrusion detection for border surveillance. In this paper, we demonstrate an ad-hoc wireless sensor network system for border surveillance. The network consists of heterogeneously autonomous sensor nodes that distributively cooperate with each other to enable a smart border in remote areas. This paper also presents energy-aware and sleeping algorithms designed to maximize the operating lifetime of the deployed sensor network. Lessons learned in building the network and important findings from field experiments are shared in the paper.
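The value of energy-aware sleeping is easy to make concrete with a toy duty-cycle energy model (all battery and power figures below are assumed for illustration):

```python
# Toy energy model for a duty-cycled sensor node.
BATTERY_J = 20_000.0     # usable battery energy (assumed)
P_ACTIVE_W = 0.060       # radio + sensing while awake (assumed)
P_SLEEP_W = 0.000_05     # deep-sleep draw (assumed)

def lifetime_days(duty_cycle: float) -> float:
    """Expected lifetime if the node is awake a fixed fraction of each period."""
    p_avg = duty_cycle * P_ACTIVE_W + (1 - duty_cycle) * P_SLEEP_W
    return BATTERY_J / p_avg / 86_400

for dc in (1.0, 0.10, 0.01):
    print(f"duty cycle {dc:>4.0%}: {lifetime_days(dc):8.1f} days")
```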
Automated imaging system for single molecules
Schwartz, David Charles; Runnheim, Rodney; Forrest, Daniel
2012-09-18
There is provided a high throughput automated single molecule image collection and processing system that requires minimal initial user input. The unique features embodied in the present disclosure allow automated collection and initial processing of optical images of single molecules and their assemblies. Correct focus may be automatically maintained while images are collected. Uneven illumination in fluorescence microscopy is accounted for, and an overall robust imaging operation is provided yielding individual images prepared for further processing in external systems. Embodiments described herein are useful in studies of any macromolecules such as DNA, RNA, peptides and proteins. The automated image collection and processing system and method of same may be implemented and deployed over a computer network, and may be ergonomically optimized to facilitate user interaction.
Inferring gene and protein interactions using PubMed citations and consensus Bayesian networks
Dalman, Mark; Haddad, Joseph; Duan, Zhong-Hui
2017-01-01
The PubMed database offers an extensive set of publication data that can be useful, yet inherently complex to use without automated computational techniques. Data repositories such as the Genomic Data Commons (GDC) and the Gene Expression Omnibus (GEO) offer experimental data storage and retrieval as well as curated gene expression profiles. Genetic interaction databases, including Reactome and Ingenuity Pathway Analysis, offer pathway and experiment data analysis using data curated from these publications and data repositories. We have created a method to generate and analyze consensus networks, inferring potential gene interactions, using large numbers of Bayesian networks generated by data mining publications in the PubMed database. Through the concept of network resolution, these consensus networks can be tailored to represent possible genetic interactions. We designed a set of experiments to confirm that our method is stable across variation in both sample and topological input sizes. Using gene product interactions from the KEGG pathway database and data mining PubMed publication abstracts, we verify that regardless of the network resolution or the inferred consensus network, our method is capable of inferring meaningful gene interactions through consensus Bayesian network generation with multiple, randomized topological orderings. Our method can not only confirm the existence of currently accepted interactions, but has the potential to hypothesize new ones as well. We show our method confirms the existence of known gene interactions such as JAK-STAT-PI3K-AKT-mTOR, infers novel gene interactions such as RAS-Bcl-2 and RAS-AKT, and finds significant pathway-pathway interactions between the JAK-STAT signaling and Cardiac Muscle Contraction KEGG pathways. PMID:29049295
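The consensus step itself reduces to voting over edges across the generated networks, with the network resolution acting as the vote threshold; a toy sketch with hypothetical gene-pair edges:

```python
from collections import Counter

# Edge sets from several Bayesian networks learned under randomized
# topological orderings (toy data; edges are undirected gene pairs).
networks = [
    {("JAK", "STAT"), ("STAT", "PI3K"), ("PI3K", "AKT")},
    {("JAK", "STAT"), ("PI3K", "AKT"), ("RAS", "AKT")},
    {("JAK", "STAT"), ("STAT", "PI3K"), ("RAS", "AKT")},
]

def consensus_edges(nets, resolution: float):
    """Keep edges appearing in at least a `resolution` fraction of networks."""
    counts = Counter(edge for net in nets for edge in net)
    return {edge for edge, n in counts.items() if n >= resolution * len(nets)}

# Raising the resolution keeps only the best-supported interactions.
print(consensus_edges(networks, resolution=1.0))   # only ("JAK", "STAT")
print(consensus_edges(networks, resolution=0.6))   # admits 2-of-3 edges
```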
NASA Automatic Information Security Handbook
NASA Technical Reports Server (NTRS)
1993-01-01
This handbook details the Automated Information Security (AIS) management process for NASA. Automated information system security is becoming an increasingly important issue for all NASA managers: rapid advancements in computer and network technologies and the demanding nature of space exploration and space research have made NASA increasingly dependent on automated systems to store, process, and transmit vast amounts of mission support information, hence the need for AIS systems and management. This handbook provides the consistent policies, procedures, and guidance to assure that an aggressive and effective AIS program is developed, implemented, and sustained at all NASA organizations and NASA support contractors.
NASA Astrophysics Data System (ADS)
Bazhin, V. Yu; Danilov, I. V.; Petrov, P. A.
2018-05-01
During the casting of light alloys and ligatures based on aluminum and magnesium, problems arise with the qualitative distribution of the metal and its crystallization in the mold. To monitor the defects of molds on the casting conveyor, a camera with a resolution of 780 x 580 pixels and a shooting rate of 75 frames per second was selected. Images of molds from casting machines were used as input data for the neural network algorithm. At the stage of preparing the digital database and its analytical evaluation, a convolutional neural network architecture was chosen for the algorithm. The information flow from the local controller is transferred to the OPC server and then to the SCADA system of the foundry. After training, the defect-recognition accuracy of the neural network was about 95.1% on a validation split. The trained weight coefficients were then applied to the test split, where the algorithm achieved accuracy identical to that on the validation images. The proposed technical solutions make it possible to increase the efficiency of the automated process control system in the foundry by expanding the digital database.
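A minimal convolutional classifier in the spirit of the described defect-recognition step might look as follows (a tf.keras sketch; the input size and layer configuration are assumptions, not the paper's architecture):

```python
import tensorflow as tf
from tensorflow.keras import layers

# Binary defect/no-defect classifier for downscaled grayscale mold frames.
model = tf.keras.Sequential([
    layers.Conv2D(16, 3, activation="relu", input_shape=(96, 128, 1)),
    layers.MaxPooling2D(),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dense(1, activation="sigmoid"),   # P(defect) for the frame
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])
model.summary()
```

In a deployment like the one described, per-frame predictions from such a model would be forwarded to the OPC server and on to the SCADA layer.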
Deciphering kinase-substrate relationships by analysis of domain-specific phosphorylation network.
Damle, Nikhil Prakash; Mohanty, Debasisa
2014-06-15
In silico prediction of site-specific kinase-substrate relationships (ssKSRs) is crucial for deciphering phosphorylation networks by linking kinomes to phosphoproteomes. However, currently available predictors for ssKSRs give rise to a large number of false-positive results because they use only a short sequence stretch around the phosphosite as the determinant of kinase specificity and do not consider the biological context of kinase-substrate recognition. Based on the analysis of domain-specific kinase-substrate relationships, we have constructed a domain-level phosphorylation network that implicitly incorporates various contextual factors. It reveals preferential phosphorylation of specific domains by certain kinases. These novel correlations have been implemented in PhosNetConstruct, an automated program for predicting target kinases for a substrate protein. PhosNetConstruct distinguishes cognate kinase-substrate pairs from a large number of non-cognate combinations. Benchmarking on independent datasets using various statistical measures demonstrates the superior performance of PhosNetConstruct over ssKSR-based predictors. PhosNetConstruct is freely available at http://www.nii.ac.in/phosnetconstruct.html.
Sybil--efficient constraint-based modelling in R.
Gelius-Dietrich, Gabriel; Desouki, Abdelmoneim Amer; Fritzemeier, Claus Jonathan; Lercher, Martin J
2013-11-13
Constraint-based analyses of metabolic networks are widely used to simulate the properties of genome-scale metabolic networks. Publicly available implementations tend to be slow, impeding large scale analyses such as the genome-wide computation of pairwise gene knock-outs, or the automated search for model improvements. Furthermore, available implementations cannot easily be extended or adapted by users. Here, we present sybil, an open source software library for constraint-based analyses in R; R is a free, platform-independent environment for statistical computing and graphics that is widely used in bioinformatics. Among other functions, sybil currently provides efficient methods for flux-balance analysis (FBA), MOMA, and ROOM that are about ten times faster than previous implementations when calculating the effect of whole-genome single gene deletions in silico on a complete E. coli metabolic model. Due to the object-oriented architecture of sybil, users can easily build analysis pipelines in R or even implement their own constraint-based algorithms. Based on its highly efficient communication with different mathematical optimisation programs, sybil facilitates the exploration of high-dimensional optimisation problems on small time scales. Sybil and all its dependencies are open source. Sybil and its documentation are available for download from the comprehensive R archive network (CRAN).
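Although sybil itself is an R library, the FBA computation at its core is a linear program, maximizing a target flux subject to the steady-state constraint S·v = 0 and flux bounds, which can be sketched generically (a toy two-metabolite model, not a genome-scale network):

```python
import numpy as np
from scipy.optimize import linprog

# Stoichiometric matrix for a toy network:
# R1: -> A (uptake), R2: A -> B, R3: B -> biomass, R4: B -> export
S = np.array([
    [1, -1,  0,  0],   # metabolite A balance
    [0,  1, -1, -1],   # metabolite B balance
])
c = [0, 0, -1, 0]                                    # maximize v3 (negated)
bounds = [(0, 10), (0, 1000), (0, 1000), (0, 1000)]  # uptake capped at 10

res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
print("optimal fluxes:", res.x)   # biomass flux runs at the uptake limit
```

A whole-genome single-gene knockout screen of the kind sybil accelerates then amounts to re-solving this program with each affected reaction's bounds pinned to (0, 0) in turn.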
Standardized Automated CO2/H2O Flux Systems for Individual Research Groups and Flux Networks
NASA Astrophysics Data System (ADS)
Burba, George; Begashaw, Israel; Fratini, Gerardo; Griessbaum, Frank; Kathilankal, James; Xu, Liukang; Franz, Daniela; Joseph, Everette; Larmanou, Eric; Miller, Scott; Papale, Dario; Sabbatini, Simone; Sachs, Torsten; Sakai, Ricardo; McDermitt, Dayle
2017-04-01
In recent years, spatial and temporal flux data coverage has improved significantly on multiple scales, from a single station to continental networks, due to standardization, automation, and management of data collection, and better handling of the extensive amounts of generated data. With more stations and networks, larger data flows from each station, and smaller operating budgets, modern tools are required to effectively and efficiently handle the entire process. Such tools are needed to maximize time dedicated to authoring publications and answering research questions, and to minimize time and expenses spent on data acquisition, processing, and quality control. Thus, these tools should produce standardized verifiable datasets and provide a way to cross-share the standardized data with external collaborators to leverage available funding, and promote data analyses and publications. LI-COR gas analyzers are widely used in past and present flux networks such as AmeriFlux, ICOS, AsiaFlux, OzFlux, NEON, CarboEurope, and FluxNet-Canada. These analyzers have gone through several major improvements over the past 30 years. However, in 2016, a three-prong development was completed to create an automated flux system which can accept multiple sonic anemometer and datalogger models, compute final and complete fluxes on-site, merge final fluxes with supporting weather, soil, and radiation data, monitor station outputs and send automated alerts to researchers, and allow secure sharing and cross-sharing of station and data access. Two types of these research systems were developed: open-path (LI-7500RS) and enclosed-path (LI-7200RS). Key developments included: improvement of gas analyzer performance; standardization and automation of final flux calculations on-site and in real time; and seamless integration with the latest site management and data-sharing tools. In terms of the gas analyzer performance, the RS analyzers are based on the established LI-7500/A and LI-7200 models, and the improvements focused on increased stability in the presence of contamination, refined temperature control and compensation, and more accurate fast gas concentration measurements. In terms of the flux calculations, improvements focused on automating the on-site flux calculations using EddyPro® software run by a weatherized, fully digital microcomputer, SmartFlux 2. In terms of site management and data sharing, the development focused on web-based software, FluxSuite, which allows real-time station monitoring and data access by multiple users. The presentation will describe the key developments in detail and will include results from field tests of the RS gas analyzer models in comparison with older models and control reference instruments.
Deep convolutional networks for automated detection of posterior-element fractures on spine CT
NASA Astrophysics Data System (ADS)
Roth, Holger R.; Wang, Yinong; Yao, Jianhua; Lu, Le; Burns, Joseph E.; Summers, Ronald M.
2016-03-01
Injuries of the spine, and its posterior elements in particular, are a common occurrence in trauma patients, with potentially devastating consequences. Computer-aided detection (CADe) could assist in the detection and classification of spine fractures. Furthermore, CAD could help assess the stability and chronicity of fractures, as well as facilitate research into optimization of treatment paradigms. In this work, we apply deep convolutional networks (ConvNets) for the automated detection of posterior element fractures of the spine. First, the vertebra bodies of the spine with its posterior elements are segmented in spine CT using multi-atlas label fusion. Then, edge maps of the posterior elements are computed. These edge maps serve as candidate regions for predicting a set of probabilities for fractures along the image edges using ConvNets in a 2.5D fashion (three orthogonal patches in axial, coronal and sagittal planes). We explore three different methods for training the ConvNet using 2.5D patches along the edge maps of `positive', i.e. fractured posterior-elements and `negative', i.e. non-fractured elements. An experienced radiologist retrospectively marked the location of 55 displaced posterior-element fractures in 18 trauma patients. We randomly split the data into training and testing cases. In testing, we achieve an area-under-the-curve of 0.857. This corresponds to 71% or 81% sensitivities at 5 or 10 false-positives per patient, respectively. Analysis of our set of trauma patients demonstrates the feasibility of detecting posterior-element fractures in spine CT images using computer vision techniques such as deep convolutional networks.
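The 2.5D representation, three orthogonal patches around each candidate location, is straightforward to make concrete (the patch size and random stand-in volume are illustrative assumptions):

```python
import numpy as np

def patches_2_5d(vol: np.ndarray, z: int, y: int, x: int, half: int = 16):
    """Extract axial, coronal, and sagittal patches centered on a candidate
    voxel: the 2.5D input stack for ConvNet classification."""
    axial    = vol[z, y - half:y + half, x - half:x + half]
    coronal  = vol[z - half:z + half, y, x - half:x + half]
    sagittal = vol[z - half:z + half, y - half:y + half, x]
    return np.stack([axial, coronal, sagittal])   # shape (3, 2*half, 2*half)

vol = np.random.rand(64, 128, 128).astype(np.float32)  # stand-in CT volume
print(patches_2_5d(vol, 32, 64, 64).shape)             # (3, 32, 32)
```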
Bashford, Gregory R; Burnfield, Judith M; Perez, Lance C
2013-01-01
Automating documentation of physical activity data (e.g., duration and speed of walking or propelling a wheelchair) into the electronic medical record (EMR) offers promise for improving efficiency of documentation and understanding of best practices in the rehabilitation and home health settings. Commercially available devices which could be used to automate documentation of physical activities are either cumbersome to wear or lack the specificity required to differentiate activities. We have designed a novel system to differentiate and quantify physical activities, using inexpensive accelerometer-based biomechanical data technology and wireless sensor networks, a technology combination that has not been used in a rehabilitation setting to date. As a first step, a feasibility study was performed where 14 healthy young adults (mean age = 22.6 ± 2.5 years, mean height = 173 ± 10.0 cm, mean mass = 70.7 ± 11.3 kg) carried out eight different activities while wearing a biaxial accelerometer sensor. Activities were performed at each participant's self-selected pace during a single testing session in a controlled environment. Linear discriminant analysis was performed by extracting spectral parameters from the subjects' accelerometer patterns. It is shown that physical activity classification alone results in an average accuracy of 49.5%, but when combined with rule-based constraints using a wireless sensor network with localization capabilities in an in silico simulated room, accuracy improves to 99.3%. When fully implemented, our technology package is expected to improve goal setting, treatment interventions, and patient outcomes by enhancing clinicians' understanding of patients' physical performance within a day and across the rehabilitation program.
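A compact sketch of that pipeline, spectral features followed by linear discriminant analysis, on synthetic traces (the sampling rate, band count, and the two stand-in activities are assumptions, not the study's protocol):

```python
import numpy as np
from scipy.signal import welch
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(2)
fs = 50.0  # Hz, an assumed accelerometer sampling rate

def spectral_features(signal: np.ndarray) -> np.ndarray:
    """Summarize a windowed accelerometer trace by five band powers."""
    f, pxx = welch(signal, fs=fs, nperseg=128)
    return np.array([band.mean() for band in np.array_split(pxx, 5)])

def fake_trace(freq: float) -> np.ndarray:
    """Synthetic periodic movement at a given cadence, plus noise."""
    t = np.arange(0, 10, 1 / fs)
    return np.sin(2 * np.pi * freq * t) + rng.normal(0, 0.3, t.size)

# Twenty windows each for two activities with different cadences.
X = np.array([spectral_features(fake_trace(f))
              for f in [1.8] * 20 + [1.0] * 20])
y = np.array([0] * 20 + [1] * 20)

lda = LinearDiscriminantAnalysis().fit(X, y)
print("training accuracy:", lda.score(X, y))
```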
Raith, Stefan; Vogel, Eric Per; Anees, Naeema; Keul, Christine; Güth, Jan-Frederik; Edelhoff, Daniel; Fischer, Horst
2017-01-01
Chairside manufacturing based on digital image acquisition is gaining increasing importance in dentistry. For the standardized application of these methods, it is paramount to have highly automated digital workflows that can process acquired 3D image data of dental surfaces. Artificial Neural Networks (ANNs) are numerical methods primarily used to mimic the complex networks of neural connections in the natural brain. Our hypothesis is that an ANN can be developed that is capable of classifying dental cusps with sufficient accuracy. This bears enormous potential for an application in chairside manufacturing workflows in the dental field, as it closes the gap between digital acquisition of dental geometries and modern computer-aided manufacturing techniques. Three-dimensional surface scans of dental casts representing natural full dental arches were transformed to range image data. These data were processed using an automated algorithm to detect candidates for tooth cusps according to salient geometrical features. These candidates were classified following common dental terminology and used as training data for a tailored ANN. For the actual cusp feature description, two different approaches were developed and applied to the available data: the first uses the relative location of the detected cusps as input data, and the second directly takes the image information given in the range images. In addition, a combination of both was implemented and investigated. Both approaches showed high performance, with correct classifications of 93.3% and 93.5%, respectively; improvements from the combination were shown to be minor. This article presents for the first time a fully automated method for the classification of teeth that could be confirmed to work with sufficient precision to exhibit the potential for its use in clinical practice, which is a prerequisite for automated computer-aided planning of prosthetic treatments with subsequent automated chairside manufacturing.
Sharpe, Jennifer B.; Soong, David T.
2015-01-01
This study used the National Land Cover Dataset (NLCD) and developed an automated process for determining the area of the three land cover types, thereby allowing faster updating of future models, and for evaluating land cover changes by use of historical NLCD datasets. The study also carried out a raingage partitioning analysis so that the segmentation of land cover and rainfall in each modeled unit is directly applicable to the HSPF modeling. Historical and existing impervious, grass, and forest land acreages partitioned by percentages covered by two sets of raingages for the Lake Michigan diversion SCAs, gaged basins, and ungaged basins are presented.
Ebert, Lars C; Heimer, Jakob; Schweitzer, Wolf; Sieberth, Till; Leipner, Anja; Thali, Michael; Ampanozi, Garyfalia
2017-12-01
Post mortem computed tomography (PMCT) can be used as a triage tool to better identify cases with a possibly non-natural cause of death, especially when high caseloads make it impossible to perform autopsies on all cases. Substantial data can be generated by modern medical scanners, especially in a forensic setting where the entire body is documented at high resolution. A solution for the resulting issues could be the use of deep learning techniques for automatic analysis of radiological images. In this article, we wanted to test the feasibility of such methods for forensic imaging by hypothesizing that deep learning methods can detect and segment a hemopericardium in PMCT. For deep learning image analysis software, we used the ViDi Suite 2.0. We retrospectively selected 28 cases with, and 24 cases without, hemopericardium. Based on these data, we trained two separate deep learning networks. The first one classified images into hemopericardium/not hemopericardium, and the second one segmented the blood content. We randomly selected 50% of the data for training and 50% for validation. This process was repeated 20 times. The best performing classification network classified all cases of hemopericardium from the validation images correctly with only a few false positives. The best performing segmentation network would tend to underestimate the amount of blood in the pericardium, which is the case for most networks. This is the first study that shows that deep learning has potential for automated image analysis of radiological images in forensic medicine.
Class identity assignment for amphetamines using neural networks and GC-FTIR data
NASA Astrophysics Data System (ADS)
Gosav, S.; Praisler, M.; Van Bocxlaer, J.; De Leenheer, A. P.; Massart, D. L.
2006-08-01
An exploratory analysis was performed in order to evaluate the feasibility of building neural network (NN) systems to automate the identification of amphetamines, as needed in the investigation of drugs of abuse for epidemiological, clinical, and forensic purposes. A first neural network system was built to distinguish between amphetamines and nonamphetamines. A second, more refined system aimed at the recognition of amphetamines according to their toxicological activity (stimulant amphetamines, hallucinogenic amphetamines, nonamphetamines). Both systems proved that discrimination between amphetamines and nonamphetamines, as well as between stimulants, hallucinogens, and nonamphetamines, is possible (83.44% and 85.71% correct classification rates, respectively). The spectroscopic interpretation of the 40 most important input variables (GC-FTIR absorption intensities) shows that the modeling power of an input variable seems to be correlated with the stability, and not with the intensity, of the spectral interaction. Thus, discarding variables only because they correspond to spectral windows with weak absorptions does not seem advisable.
Mozumdar, Mohammad; Song, Zhen Yu; Lavagno, Luciano; Sangiovanni-Vincentelli, Alberto L.
2014-01-01
The Model Based Design (MBD) approach is a popular trend to speed up application development of embedded systems, which uses high-level abstractions to capture functional requirements in an executable manner, and which automates implementation code generation. Wireless Sensor Networks (WSNs) are an emerging very promising application area for embedded systems. However, there is a lack of tools in this area, which would allow an application developer to model a WSN application by using high level abstractions, simulate it mapped to a multi-node scenario for functional analysis, and finally use the refined model to automatically generate code for different WSN platforms. Motivated by this idea, in this paper we present a hybrid simulation framework that not only follows the MBD approach for WSN application development, but also interconnects a simulated sub-network with a physical sub-network and then allows one to co-simulate them, which is also known as Hardware-In-the-Loop (HIL) simulation. PMID:24960083
IAS telecommunication infrastructure and value added network services provided by IASNET
NASA Astrophysics Data System (ADS)
Smirnov, Oleg L.; Marchenko, Sergei
The topology of a packet switching network for the Soviet National Centre for Automated Data Exchange with Foreign Computer Networks and Databanks (NCADE), based on a design by the Institute for Automated Systems (IAS), is discussed. NCADE has partners all over the world: it is linked to East European countries via telephone lines, while satellites are used for communication with remote partners such as Cuba, Mongolia, and Vietnam. Moreover, there are connections to the Austrian, British, Canadian, Finnish, French, U.S., and other western networks, through which users can have access to databases on each network. At the same time, NCADE provides western customers with access to more than 70 Soviet databases. The software and hardware of IASNET follow data exchange recommendations agreed with the International Standard Organization (ISO) and the International Telegraph and Telephone Consultative Committee (CCITT). Technical parameters of IASNET are compatible with the majority of foreign networks, such as DATAPAK, TRANSPAC, and TELENET. By means of IASNET, the NCADE provides connection of Soviet and foreign users to information and computer centers around the world on the basis of the CCITT X.25 and X.75 recommendations. The information resources of IASNET and its value-added network services, such as computer teleconferences, e-mail, information retrieval systems, and intelligent support for access to databanks and databases, are discussed. The topology of the ACADEMNET, connected to IASNET over an X.25 gateway, is also discussed.
31 CFR 50.54 - Payment of Federal share of compensation.
Code of Federal Regulations, 2010 CFR
2010-07-01
... Automated Clearing House (ACH) network. (d) Segregated account for advance payments. An insurer that seeks... eligible to receive payments through the ACH network. Such an account is limited to the purposes of: (i... subpart. (f) Affiliated group. In the case of an affiliated group of insurers, Treasury will make payment...
Spatiotemporal models for data-anomaly detection in dynamic environmental monitoring campaigns
E.W. Dereszynski; T.G. Dietterich
2011-01-01
The ecological sciences have benefited greatly from recent advances in wireless sensor technologies. These technologies allow researchers to deploy networks of automated sensors, which can monitor a landscape at very fine temporal and spatial scales. However, these networks are subject to harsh conditions, which lead to malfunctions in individual sensors and failures...
The NASA Fireball Network Database
NASA Technical Reports Server (NTRS)
Moser, Danielle E.
2011-01-01
The NASA Meteoroid Environment Office (MEO) has been operating an automated video fireball network since late-2008. Since that time, over 1,700 multi-station fireballs have been observed. A database containing orbital data and trajectory information on all these events has recently been compiled and is currently being mined for information. Preliminary results are presented here.
Ialongo, Cristiano; Pieri, Massimo; Bernardini, Sergio
2017-02-01
Diluting a sample to obtain a measure within the analytical range is a common task in clinical laboratories. However, for urgent samples, it can cause delays in test reporting, which can put patients' safety at risk. The aim of this work is to show a simple artificial neural network that can be used to predict whether a sample requires predilution, using the information available through the laboratory information system. Particularly, the Multilayer Perceptron neural network built on a data set of 16,106 cardiac troponin I test records produced a correct inference rate of 100% for samples not requiring predilution and 86.2% for those requiring predilution. With respect to the inference reliability, the most relevant inputs were the presence of a cardiac event or surgery and the result of the previous assay. Therefore, such an artificial neural network can be easily implemented into a total automation framework to appreciably reduce the turnaround time of critical orders delayed by the operations required to retrieve, dilute, and retest the sample.
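A sketch of such a predilution predictor (the features, toy labeling rule, and network size are illustrative assumptions, not the study's model or data):

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(3)
n = 400
# Hypothetical LIS-derived features per order: [cardiac event/surgery flag,
# log-scaled previous result, hours since the previous assay].
X = np.column_stack([
    rng.integers(0, 2, n),
    rng.normal(2.0, 1.0, n),
    rng.uniform(1, 48, n),
])
# Toy stand-in rule: a high previous result after a cardiac event tends to
# exceed the analytical range and hence to need predilution.
y = ((X[:, 0] == 1) & (X[:, 1] > 2.5)).astype(int)

clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
clf.fit(X, y)
print("training accuracy:", clf.score(X, y))
```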
Automation of wear analysis for large caliber weapons
NASA Astrophysics Data System (ADS)
Salafia, Dominick; DeLeon, Norberto L.; Outlaw, James F.
1999-12-01
As part of the Test and Evaluation Command (TECOM) the Metrology and Simulation Division (MT-MS) at the U.S. Army Yuma Proving Ground (USAYPG) has the mission to measure and record the wear effects of conventional and experimental munitions on large caliber weapons. The primary objective is to ensure that the weapon to be fired will safely meet the mission requirements for the quantity and energy of the munitions under live fire testing. Currently, there are two criteria used to "deadline" a weapon. One is the actual physical wear tolerance. The other relates to the energy (zone) expended by the round and the subsequent fatigue induced in the microstructure of the gun tube. The latter is referred to as the Equivalent Full Charge (EFC) for the particular round. In order to maximize safety and reduce the time required to manually search records for the appropriate level of useful life, the Measurements and Simulation Branch of MT-MS at USAYPG has made use of the installation network such that critical information may be accessed from the local area network or the Internet. An electronic database has been constructed and the query routines have been written so that systems test personnel, test directors (TD), and other government organizations may conduct a search for a particular weapon. The user may enter specifications such as percent physical life, percent EFC life, caliber, model, modifications, and serial number or any combination thereof. This paper is intended to inform the engineering and scientific community, engaged in weapons performance evaluation using simulations and field testing, of the existence of wear analysis automation for large caliber weapons.