Sample records for large scale digital

  1. Large Scale Cross Drive Correlation Of Digital Media

    DTIC Science & Technology

    2016-03-01

    NAVAL POSTGRADUATE SCHOOL, MONTEREY, CALIFORNIA. Thesis: Large-Scale Cross-Drive Correlation of Digital Media, by Joseph Van Bruaene, March 2016. Abstract fragment: "…the ability to make large-scale cross-drive correlations among a large corpus of digital media becomes increasingly important. We propose a…"

  2. Architectural Optimization of Digital Libraries

    NASA Technical Reports Server (NTRS)

    Biser, Aileen O.

    1998-01-01

    This work investigates performance and scaling issues relevant to large-scale distributed digital libraries. Presently, performance and scaling studies focus on specific implementations of production or prototype digital libraries. Although these studies give designers and other researchers useful insight into performance and scaling issues, the broader issues relevant to very large-scale distributed libraries are not addressed. In particular, no current studies examine the extreme or worst-case possibilities in digital library implementations. A survey of digital library research issues is presented: scaling and performance issues are mentioned frequently in the digital library literature but are generally not the focus of current research. In this thesis, a model for a Generic Distributed Digital Library (GDDL) and nine cases of typical user activities are defined. The model is used to support basic analysis of scaling issues, specifically the calculation of the Internet traffic generated for different configurations of the study parameters and an estimate of the future bandwidth needed for a large-scale distributed digital library implementation. This analysis demonstrates the potential impact a future distributed digital library would have on the Internet traffic load and raises questions concerning the architecture decisions being made for future distributed digital library designs.
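The abstract does not reproduce the GDDL traffic formulas, so the following is only a back-of-the-envelope sketch of the kind of calculation such a model supports; every parameter name and value below is hypothetical.

```python
# Rough illustration of a distributed-digital-library traffic estimate.
# All parameters are hypothetical; the GDDL model defines its own cases.

def daily_traffic_bytes(users, queries_per_user, results_per_query,
                        metadata_bytes, retrievals_per_query, object_bytes):
    """Total bytes moved per day for one usage pattern."""
    query_traffic = users * queries_per_user * results_per_query * metadata_bytes
    object_traffic = users * queries_per_user * retrievals_per_query * object_bytes
    return query_traffic + object_traffic

traffic = daily_traffic_bytes(
    users=1_000_000, queries_per_user=5, results_per_query=20,
    metadata_bytes=2_000, retrievals_per_query=1, object_bytes=5_000_000)

# Average bandwidth if the load were spread evenly over a day.
print(f"{traffic / 1e12:.1f} TB/day, {traffic * 8 / 86_400 / 1e9:.1f} Gbit/s average")
```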

  3. US National Large-scale City Orthoimage Standard Initiative

    USGS Publications Warehouse

    Zhou, G.; Song, C.; Benjamin, S.; Schickler, W.

    2003-01-01

    The early procedures and algorithms for national digital orthophoto generation in the National Digital Orthophoto Program (NDOP) were based on earlier USGS mapping operations, such as field control, aerotriangulation (derived in the early 1920s), the quarter-quadrangle-centered layout (3.75 minutes of longitude and latitude in geographic extent), 1:40,000 aerial photographs, and 2.5D digital elevation models. However, large-scale city orthophotos produced with these early procedures have revealed many shortcomings, e.g., ghost images, occlusions, and shadows. Thus, providing the technical base (algorithms, procedures) and experience needed for large-scale city digital orthophoto creation is essential for the forthcoming national large-scale digital orthophoto deployment and for the revision of the Standards for National Large-scale City Digital Orthophoto in the NDOP. This paper reports our initial research results as follows: (1) high-precision 3D city DSM generation through LIDAR data processing; (2) spatial object/feature extraction using surface material information and high-accuracy 3D DSM data; (3) 3D city model development; (4) algorithm development for generating DTM-based and DBM-based orthophotos; (5) true orthophoto generation by merging DBM-based and DTM-based orthophotos; and (6) automatic mosaicking by optimizing and combining imagery from many perspectives.

  4. A behavioral-level HDL description of SFQ logic circuits for quantitative performance analysis of large-scale SFQ digital systems

    NASA Astrophysics Data System (ADS)

    Matsuzaki, F.; Yoshikawa, N.; Tanaka, M.; Fujimaki, A.; Takai, Y.

    2003-10-01

    Recently, many single flux quantum (SFQ) logic circuits containing several thousand Josephson junctions have been designed successfully using digital-domain simulation based on a hardware description language (HDL). In present HDL-based design of SFQ circuits, a structure-level HDL description has been used, where circuits are built up from basic gate cells. However, in order to analyze large-scale SFQ digital systems, such as a microprocessor, a higher level of circuit abstraction is necessary to reduce the circuit simulation time. In this paper we have investigated a way to describe the functionality of large-scale SFQ digital circuits with a behavior-level HDL description, in which the functionality and timing of each circuit block are defined directly by describing its behavior in the HDL. Using this method, we can dramatically reduce the simulation time of large-scale SFQ digital circuits.
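The paper's HDL code is not quoted in the record, but the structural-versus-behavioral distinction it draws can be illustrated outside any HDL: a behavior-level model reduces a block to "pulse in at t, pulse out at t plus a characterized delay" instead of simulating every Josephson junction. A minimal event-driven sketch in Python, with purely hypothetical block names and delays:

```python
import heapq

# Behavior-level abstraction: a block is reduced to "input pulse at t ->
# output pulse at t + delay". Block names and delays are hypothetical.

class BehavioralBlock:
    def __init__(self, name, delay_ps):
        self.name, self.delay_ps, self.fanout = name, delay_ps, []

events = []  # min-heap of (time_ps, block_name, block)

def inject(t, block):
    heapq.heappush(events, (t, block.name, block))

adder = BehavioralBlock("adder", delay_ps=50)
latch = BehavioralBlock("latch", delay_ps=20)
adder.fanout.append(latch)

inject(0, adder)
while events:
    t, _, blk = heapq.heappop(events)
    t_out = t + blk.delay_ps
    print(f"{blk.name}: pulse in at {t} ps, out at {t_out} ps")
    for nxt in blk.fanout:         # propagate the pulse downstream
        inject(t_out, nxt)
```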

  5. A comprehensive study on urban true orthorectification

    USGS Publications Warehouse

    Zhou, G.; Chen, W.; Kelmelis, J.A.; Zhang, Dongxiao

    2005-01-01

    To provide advanced technical bases (algorithms and procedures) and the experience needed for national large-scale digital orthophoto generation and for revision of the Standards for National Large-Scale City Digital Orthophoto in the National Digital Orthophoto Program (NDOP), this paper presents a comprehensive study of the theories, algorithms, and methods of large-scale urban orthoimage generation. The procedures of orthorectification for digital terrain model (DTM)-based and digital building model (DBM)-based orthoimage generation, and their merging for true orthoimage generation, are discussed in detail. A method of compensating for building occlusions using photogrammetric geometry is developed. The data structure needed to model urban buildings for accurately generating urban orthoimages is presented. Shadow detection and removal, seamline optimization for automatic mosaicking, and the radiometric balancing of neighboring images are discussed. Street visibility analysis, including the relationship between flight height, building height, street width, and the location of the street relative to the imaging center, is presented for complete true orthoimage generation. The experimental results demonstrate that our method can effectively and correctly orthorectify the displacements caused by terrain and buildings in large-scale urban aerial images. © 2005 IEEE.
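In a 2D cross-section, the street-visibility relationship the abstract describes reduces to a ray test: a street point is hidden when the camera-to-point ray passes below the top of an intervening building. A minimal sketch under that simplification (the function and parameter names are assumptions, not the paper's):

```python
def street_point_visible(flight_height_m, building_height_m,
                         building_dist_m, point_dist_m):
    """2D cross-section visibility test. The camera is at the nadir
    (horizontal distance 0) at flight_height_m; a building facade stands
    at building_dist_m and the street point at point_dist_m (farther out).
    The point is occluded when the camera-to-point ray passes below the
    building top."""
    ray_height_at_building = flight_height_m * (1 - building_dist_m / point_dist_m)
    return building_height_m <= ray_height_at_building

# A 30 m facade 100 m from nadir, imaged from 1500 m:
print(street_point_visible(1500, 30, 100, 102))  # False: hidden just behind
print(street_point_visible(1500, 30, 100, 110))  # True: visible farther out
```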

  6. Implementation of large-scale routine diagnostics using whole slide imaging in Sweden: Digital pathology experiences 2006-2013

    PubMed Central

    Thorstenson, Sten; Molin, Jesper; Lundström, Claes

    2014-01-01

    Recent technological advances have improved the whole slide imaging (WSI) scanner quality and reduced the cost of storage, thereby enabling the deployment of digital pathology for routine diagnostics. In this paper we present the experiences from two Swedish sites having deployed routine large-scale WSI for primary review. At Kalmar County Hospital, the digitization process started in 2006 to reduce the time spent at the microscope in order to improve the ergonomics. Since 2008, more than 500,000 glass slides have been scanned in the routine operations of Kalmar and the neighboring Linköping University Hospital. All glass slides are digitally scanned yet they are also physically delivered to the consulting pathologist who can choose to review the slides on screen, in the microscope, or both. The digital operations include regular remote case reporting by a few hospital pathologists, as well as around 150 cases per week where primary review is outsourced to a private clinic. To investigate how the pathologists choose to use the digital slides, a web-based questionnaire was designed and sent out to the pathologists in Kalmar and Linköping. The responses showed that almost all pathologists think that ergonomics have improved and that image quality was sufficient for most histopathologic diagnostic work. 38 ± 28% of the cases were diagnosed digitally, but the survey also revealed that the pathologists commonly switch back and forth between digital and conventional microscopy within the same case. The fact that two full-scale digital systems have been implemented and that a large portion of the primary reporting is voluntarily performed digitally shows that large-scale digitization is possible today. PMID:24843825

  7. Full-color digitized holography for large-scale holographic 3D imaging of physical and nonphysical objects.

    PubMed

    Matsushima, Kyoji; Sonobe, Noriaki

    2018-01-01

    Digitized holography techniques are used to reconstruct three-dimensional (3D) images of physical objects using large-scale computer-generated holograms (CGHs). The object field is captured at three wavelengths over a wide area at high densities. Synthetic aperture techniques using single sensors are used for image capture in phase-shifting digital holography. The captured object field is incorporated into a virtual 3D scene that includes nonphysical objects, e.g., polygon-meshed CG models. The synthetic object field is optically reconstructed as a large-scale full-color CGH using red-green-blue color filters. The CGH has a wide full-parallax viewing zone and reconstructs a deep 3D scene with natural motion parallax.
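The phase-shifting capture step mentioned here is standard: from four frames taken at reference phase shifts of 0, π/2, π and 3π/2, the complex object field follows from intensity differences. A minimal single-wavelength sketch (the paper additionally captures three wavelengths and stitches synthetic apertures, which is not shown):

```python
import numpy as np

def object_field(i0, i1, i2, i3, r_amp=1.0):
    """Complex object field from four frames with reference phase
    shifts 0, pi/2, pi, 3*pi/2 (plane reference of real amplitude r_amp)."""
    return ((i0 - i2) + 1j * (i1 - i3)) / (4.0 * r_amp)

# Synthetic round-trip check on a small random field.
rng = np.random.default_rng(0)
obj = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
frames = [np.abs(obj + np.exp(1j * d)) ** 2
          for d in (0, np.pi / 2, np.pi, 3 * np.pi / 2)]
print(np.allclose(object_field(*frames), obj))  # True
```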

  8. Event management for large scale event-driven digital hardware spiking neural networks.

    PubMed

    Caron, Louis-Charles; D'Haene, Michiel; Mailhot, Frédéric; Schrauwen, Benjamin; Rouat, Jean

    2013-09-01

    The interest in brain-like computation has led to the design of a plethora of innovative neuromorphic systems. Individually, spiking neural networks (SNNs), event-driven simulation and digital hardware neuromorphic systems get a lot of attention. Despite the popularity of event-driven SNNs in software, very few digital hardware architectures are found. This is because existing hardware solutions for event management scale badly with the number of events. This paper introduces the structured heap queue, a pipelined digital hardware data structure, and demonstrates its suitability for event management. The structured heap queue scales gracefully with the number of events, allowing the efficient implementation of large scale digital hardware event-driven SNNs. The scaling is linear for memory, logarithmic for logic resources and constant for processing time. The use of the structured heap queue is demonstrated on a field-programmable gate array (FPGA) with an image segmentation experiment and a SNN of 65,536 neurons and 513,184 synapses. Events can be processed at the rate of 1 every 7 clock cycles and a 406×158 pixel image is segmented in 200 ms. Copyright © 2013 Elsevier Ltd. All rights reserved.
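The structured heap queue is a pipelined hardware structure, but functionally it behaves like a software priority queue: pop the earliest pending spike event, process it, and schedule the events it causes. A toy functional reference (the two-neuron network, delays, and fire-on-every-event rule are illustrative, not the paper's FPGA design):

```python
import heapq

queue = []                        # min-heap of (event_time, neuron_id)
fanout = {0: [(1, 2.0)], 1: []}   # neuron -> list of (target, delay_ms)

def push(t, nid):
    heapq.heappush(queue, (t, nid))

push(0.0, 0)                      # stimulate neuron 0 at t = 0
while queue:
    t, nid = heapq.heappop(queue) # earliest event first
    print(f"t={t} ms: neuron {nid} fires")
    for target, delay in fanout[nid]:
        push(t + delay, target)   # schedule downstream spike arrivals
```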

  9. The study of integration about measurable image and 4D production

    NASA Astrophysics Data System (ADS)

    Zhang, Chunsen; Hu, Pingbo; Niu, Weiyun

    2008-12-01

    In this paper, we create geospatial data for three-dimensional (3D) modeling by combining digital photogrammetry and digital close-range photogrammetry. For the large-scale geographical background, we establish a combined DEM and DOM three-dimensional landscape model based on digital photogrammetry, which uses aerial image data to produce the "4D" products (DOM: Digital Orthophoto Map; DEM: Digital Elevation Model; DLG: Digital Line Graphic; DRG: Digital Raster Graphic). For buildings and other man-made features the users are interested in, we reconstruct the real features in three dimensions using digital close-range photogrammetry, on the basis of the following steps: data collection with non-metric cameras, camera calibration, feature extraction, and image matching. Finally, we combine the three-dimensional background with locally measured real images of these large geographic data and realize the integration of measurable real images and the 4D products. The article discusses the whole workflow and technology, achieving the three-dimensional reconstruction and the integration of the large-scale three-dimensional landscape and the metric buildings.

  10. Digital disruption 'syndromes'.

    PubMed

    Sullivan, Clair; Staib, Andrew

    2017-05-18

    The digital transformation of hospitals in Australia is occurring rapidly in order to facilitate innovation and improve efficiency. Rapid transformation can cause temporary disruption of hospital workflows and staff as processes are adapted to the new digital workflows. The aim of this paper is to outline various types of digital disruption and some strategies for effective management. A large tertiary university hospital recently underwent a rapid, successful roll-out of an integrated electronic medical record (EMR). We observed this transformation and propose several digital disruption "syndromes" to assist with understanding and management during digital transformation: digital deceleration, digital transparency, digital hypervigilance, data discordance, digital churn and post-digital 'depression'. These 'syndromes' are defined and discussed in detail. Successful management of this temporary digital disruption is important to ensure a successful transition to a digital platform. What is known about this topic? Digital disruption is defined as the changes facilitated by digital technologies that occur at a pace and magnitude that disrupt established ways of value creation, social interactions, doing business and more generally our thinking. Increasing numbers of Australian hospitals are implementing digital solutions to replace traditional paper-based systems for patient care in order to create opportunities for improved care and efficiencies. Such large scale change has the potential to create transient disruption to workflows and staff. Managing this temporary disruption effectively is an important factor in the successful implementation of an EMR. What does this paper add? A large tertiary university hospital recently underwent a successful rapid roll-out of an integrated electronic medical record (EMR) to become Australia's largest digital hospital over a 3-week period. We observed and assisted with the management of several cultural, behavioural and operational forms of digital disruption which lead us to propose some digital disruption 'syndromes'. The definition and management of these 'syndromes' are discussed in detail. What are the implications for practitioners? Minimising the temporary effects of digital disruption in hospitals requires an understanding that these digital 'syndromes' are to be expected and actively managed during large-scale transformation.

  11. Scaling up digital circuit computation with DNA strand displacement cascades.

    PubMed

    Qian, Lulu; Winfree, Erik

    2011-06-03

    To construct sophisticated biochemical circuits from scratch, one needs to understand how simple the building blocks can be and how robustly such circuits can scale up. Using a simple DNA reaction mechanism based on a reversible strand displacement process, we experimentally demonstrated several digital logic circuits, culminating in a four-bit square-root circuit that comprises 130 DNA strands. These multilayer circuits include thresholding and catalysis within every logical operation to perform digital signal restoration, which enables fast and reliable function in large circuits with roughly constant switching time and linear signal propagation delays. The design naturally incorporates other crucial elements for large-scale circuitry, such as general debugging tools, parallel circuit preparation, and an abstraction hierarchy supported by an automated circuit compiler.
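The "thresholding and catalysis within every logical operation" is what performs digital signal restoration: sub-threshold leak is absorbed, and above-threshold signals are amplified back toward full level. A toy numerical illustration of that idea (the threshold and gain are invented, not the paper's seesaw-gate parameters):

```python
def restore(x, threshold=0.3, gain=3.0):
    """Toy signal restoration on normalized levels in [0, 1]:
    values below the threshold decay toward logic 0, values above are
    amplified (catalysis) and clipped toward logic 1."""
    return max(0.0, min(1.0, (x - threshold) * gain))

print(restore(0.65))  # degraded "1" after a logic stage -> 1.0
print(restore(0.20))  # sub-threshold leak -> 0.0
```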

  12. A Study on Fast Gates for Large-Scale Quantum Simulation with Trapped Ions

    PubMed Central

    Taylor, Richard L.; Bentley, Christopher D. B.; Pedernales, Julen S.; Lamata, Lucas; Solano, Enrique; Carvalho, André R. R.; Hope, Joseph J.

    2017-01-01

    Large-scale digital quantum simulations require thousands of fundamental entangling gates to construct the simulated dynamics. Despite success in a variety of small-scale simulations, quantum information processing platforms have hitherto failed to demonstrate the combination of precise control and scalability required to systematically outmatch classical simulators. We analyse how fast gates could enable trapped-ion quantum processors to achieve the requisite scalability to outperform classical computers without error correction. We analyze the performance of a large-scale digital simulator, and find that fidelity of around 70% is realizable for π-pulse infidelities below 10^-5 in traps subject to realistic rates of heating and dephasing. This scalability relies on fast gates: entangling gates faster than the trap period. PMID:28401945

  13. A Study on Fast Gates for Large-Scale Quantum Simulation with Trapped Ions.

    PubMed

    Taylor, Richard L; Bentley, Christopher D B; Pedernales, Julen S; Lamata, Lucas; Solano, Enrique; Carvalho, André R R; Hope, Joseph J

    2017-04-12

    Large-scale digital quantum simulations require thousands of fundamental entangling gates to construct the simulated dynamics. Despite success in a variety of small-scale simulations, quantum information processing platforms have hitherto failed to demonstrate the combination of precise control and scalability required to systematically outmatch classical simulators. We analyse how fast gates could enable trapped-ion quantum processors to achieve the requisite scalability to outperform classical computers without error correction. We analyze the performance of a large-scale digital simulator, and find that fidelity of around 70% is realizable for π-pulse infidelities below 10^-5 in traps subject to realistic rates of heating and dephasing. This scalability relies on fast gates: entangling gates faster than the trap period.
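The ~70% figure can be sanity-checked with the usual compounding of independent gate errors, F ≈ (1 - ε)^N. The quick check below ignores the heating and dephasing the paper actually models, so it is only a consistency check:

```python
# Sanity check: fidelity after N gates with per-gate infidelity eps,
# assuming independent errors (heating/dephasing not modeled).
eps = 1e-5
for n_gates in (10_000, 35_000, 100_000):
    print(f"{n_gates:>7} gates -> fidelity {(1 - eps) ** n_gates:.2f}")
# ~35,000 gates at 1e-5 infidelity already give about 0.70
```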

  14. A practical overview and comparison of certain commercial forensic software tools for processing large-scale digital investigations

    NASA Astrophysics Data System (ADS)

    Kröger, Knut; Creutzburg, Reiner

    2013-05-01

    The aim of this paper is to show the usefulness of modern forensic software tools for processing large-scale digital investigations. In particular, we focus on the new version of Nuix 4.2 and compare it with AccessData FTK 4.2, X-Ways Forensics 16.9 and Guidance EnCase Forensic 7 regarding performance, functionality, usability and capability. We show how these software tools work with large forensic images and how capable they are at examining complex, big-data scenarios.

  15. Current Barriers to Large-scale Interoperability of Traceability Technology in the Seafood Sector.

    PubMed

    Hardt, Marah J; Flett, Keith; Howell, Colleen J

    2017-08-01

    Interoperability is a critical component of full-chain digital traceability, but is almost nonexistent in the seafood industry. Using both quantitative and qualitative methodology, this study explores the barriers impeding progress toward large-scale interoperability among digital traceability systems in the seafood sector from the perspectives of seafood companies, technology vendors, and supply chains as a whole. We highlight lessons from recent research and field work focused on implementing traceability across full supply chains and make some recommendations for next steps in terms of overcoming challenges and scaling current efforts. © 2017 Institute of Food Technologists®.

  16. Iterative learning-based decentralized adaptive tracker for large-scale systems: a digital redesign approach.

    PubMed

    Tsai, Jason Sheng-Hong; Du, Yan-Yi; Huang, Pei-Hsiang; Guo, Shu-Mei; Shieh, Leang-San; Chen, Yuhua

    2011-07-01

    In this paper, a digital redesign methodology for an iterative learning-based decentralized adaptive tracker is proposed to improve the dynamic performance of sampled-data linear large-scale control systems consisting of N interconnected multi-input multi-output subsystems, so that the system output follows any trajectory, even one not initially represented by the analytic reference model. To overcome interference among subsystems and simplify the controller design, the proposed model reference decentralized adaptive control scheme first constructs a decoupled, well-designed reference model. Then, based on that model, a digital decentralized adaptive tracker is developed using optimal analog control and a prediction-based digital redesign technique for the sampled-data large-scale coupled system. To enhance the tracking performance of the digital tracker at the specified sampling instants, iterative learning control (ILC) is applied to train the control input through continual learning. As a result, the proposed iterative learning-based decentralized adaptive tracker not only has a robust closed-loop decoupling property but also achieves good tracking performance in both the transient and steady state. In addition, evolutionary programming is applied to search for a good learning gain to speed up the learning process of the ILC. Copyright © 2011 ISA. Published by Elsevier Ltd. All rights reserved.
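At its core, a P-type ILC update repeats a trial and corrects the stored input with the previous trial's tracking error, u_{k+1}(t) = u_k(t) + L*e_k(t+1). A minimal sketch on a scalar plant (the plant, horizon, and fixed gain L are stand-ins; the paper handles N coupled MIMO subsystems and tunes the gain with evolutionary programming):

```python
import numpy as np

# Minimal P-type iterative learning control on a scalar first-order
# plant x[t+1] = a*x[t] + b*u[t], y[t] = x[t]. Gain L is hypothetical.

a, b, T, L = 0.9, 0.5, 20, 1.2
ref = np.ones(T + 1)                 # desired output trajectory
u = np.zeros(T)                      # learned input, refined each trial

for trial in range(200):
    x = 0.0
    y = np.zeros(T + 1)
    for t in range(T):
        x = a * x + b * u[t]
        y[t + 1] = x                 # input acts on the next output sample
    e = ref - y
    u += L * e[1:]                   # learn from the error at t+1

print(f"max tracking error after learning: {np.abs(e[1:]).max():.2e}")
```

Convergence here follows the standard P-type condition |1 - L*b| < 1 (0.4 for these values), so the trial-to-trial error contracts.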

  17. Research on the Application of Rapid Surveying and Mapping for Large Scale Topographic Map by UAV Aerial Photography System

    NASA Astrophysics Data System (ADS)

    Gao, Z.; Song, Y.; Li, C.; Zeng, F.; Wang, F.

    2017-08-01

    A rapid acquisition and processing method for large-scale topographic map data, relying on an unmanned aerial vehicle (UAV) low-altitude aerial photogrammetry system, is studied in this paper, and the main workflow is elaborated. Key technologies of UAV photographic mapping are also studied, and a rapid mapping system based on an electronic plate mapping system is developed, changing the traditional mapping mode and greatly improving mapping efficiency. Production tests and accuracy evaluations of Digital Orthophoto Map (DOM), Digital Line Graphic (DLG) and other digital products were carried out in combination with a city basic topographic map update project, which provides new techniques for large-scale rapid surveying and has obvious technical advantages and good application prospects.

  18. Mass Digitization at Yale University Library: Exposing the Treasures in Our Stacks

    ERIC Educational Resources Information Center

    Weintraub, Jennifer; Wisner, Melissa

    2008-01-01

    In September 2007, Yale University Library (YUL) and Microsoft agreed to partner in a large-scale project to digitize 100,000 books from the YUL collections--an ambitious effort that would substantially increase the library's digitized holdings, particularly in the area of its own text collections. YUL has been digitizing materials from its…

  19. Evaluation of nucleus segmentation in digital pathology images through large scale image synthesis

    NASA Astrophysics Data System (ADS)

    Zhou, Naiyun; Yu, Xiaxia; Zhao, Tianhao; Wen, Si; Wang, Fusheng; Zhu, Wei; Kurc, Tahsin; Tannenbaum, Allen; Saltz, Joel; Gao, Yi

    2017-03-01

    Digital histopathology images of more than a gigapixel are drawing more and more attention in the clinical, biomedical research, and computer vision fields. Among the observable features spanning multiple scales in pathology images, nuclear morphology is one of the central criteria for diagnosis and grading, and as a result it is also the most studied target in image computing. A large number of research papers have been devoted to the problem of extracting nuclei from digital pathology images, which is the foundation of any further correlation study. However, the validation and evaluation of nucleus extraction have not yet been formulated rigorously and systematically. Some studies report human-verified segmentations with thousands of nuclei, whereas a single whole-slide image may contain up to a million. The main obstacle lies in the difficulty of obtaining such a large number of validated nuclei, which is essentially an impossible task for a pathologist. We propose a systematic validation and evaluation approach based on large-scale image synthesis. This could facilitate more quantitatively validated studies for the current and future histopathology image analysis field.
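With synthesized slides the ground truth is known exactly for every nucleus, so overlap metrics can be computed at whole-slide scale. A minimal example using the Dice coefficient (the metric choice and toy masks are mine, for illustration only):

```python
import numpy as np

def dice(gt, seg):
    """Dice overlap between a synthetic ground-truth mask and a
    segmentation result (both boolean arrays)."""
    inter = np.logical_and(gt, seg).sum()
    return 2.0 * inter / (gt.sum() + seg.sum())

# Toy masks: a synthesized nucleus and a slightly shifted segmentation.
gt = np.zeros((100, 100), bool);  gt[20:40, 20:40] = True
seg = np.zeros((100, 100), bool); seg[22:42, 22:42] = True
print(f"Dice = {dice(gt, seg):.3f}")  # 0.810
```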

  20. Exploration of malingering indices in the Wechsler Adult Intelligence Scale-Fourth Edition Digit Span subtest.

    PubMed

    Reese, Caitlin S; Suhr, Julie A; Riddle, Tara L

    2012-03-01

    Prior research shows that Digit Span is a useful embedded measure of malingering. However, the Wechsler Adult Intelligence Scale-IV (Wechsler, 2008) altered Digit Span in meaningful ways, necessitating another look at Digit Span as an embedded measure of malingering. Using a simulated malingerer design, we examined the predictive accuracy of existing Digit Span validity indices and explored whether patterns of performance utilizing the new version would provide additional evidence for malingering. Undergraduates with a history of mild head injury performed with best effort or simulated impaired cognition and were also compared with a large sample of non-head-injured controls. Previously established cutoffs for the age-corrected scaled score and Reliable Digit Span (RDS) performed similarly in the present samples. Patterns of RDS length using all three subscales of the new scale were different in malingerers when compared with both head-injured and non-head-injured controls. Two potential alternative RDS scores were introduced, which showed better sensitivity than the traditional RDS, while retaining specificity to malingering.
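For readers unfamiliar with the index: Reliable Digit Span is conventionally the longest forward span with both trials correct plus the longest backward span with both trials correct. An illustrative scorer under that simplified convention (the WAIS-IV adds a Sequencing subscale, which this study's alternative RDS scores draw on; exact scoring follows the test manual):

```python
def reliable_digit_span(forward_trials, backward_trials):
    """Illustrative RDS scorer: longest span with BOTH trials correct,
    forward plus backward. Trials map span length -> (trial1_ok, trial2_ok).
    Simplified; exact scoring follows the WAIS-IV manual."""
    def longest(trials):
        return max((s for s, (t1, t2) in trials.items() if t1 and t2), default=0)
    return longest(forward_trials) + longest(backward_trials)

fwd = {3: (True, True), 4: (True, True), 5: (True, False)}
bwd = {2: (True, True), 3: (False, True)}
print(reliable_digit_span(fwd, bwd))  # 4 + 2 = 6
```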

  1. Digital selective growth of a ZnO nanowire array by large scale laser decomposition of zinc acetate.

    PubMed

    Hong, Sukjoon; Yeo, Junyeob; Manorotkul, Wanit; Kang, Hyun Wook; Lee, Jinhwan; Han, Seungyong; Rho, Yoonsoo; Suh, Young Duk; Sung, Hyung Jin; Ko, Seung Hwan

    2013-05-07

    We develop a digital direct-writing method for micro-patterned ZnO NW growth on a large scale by selective laser decomposition of zinc acetate. By replacing bulk heating with a scanning focused laser as a fully digital local heat source, zinc acetate crystallites can be selectively activated as a ZnO seed pattern to grow ZnO nanowires locally over a large area. Together with the selective laser sintering of metal nanoparticles, more than 10,000 UV sensors have been fabricated on a 4 cm × 4 cm glass substrate, demonstrating all-solution-processable, all-laser, mask-less digital fabrication of electronic devices, including the active layer and metal electrodes, without any conventional vacuum deposition, photolithographic process, premade mask, high temperature, or vacuum environment.

  2. Discovering and Mitigating Software Vulnerabilities through Large-Scale Collaboration

    ERIC Educational Resources Information Center

    Zhao, Mingyi

    2016-01-01

    In today's rapidly digitizing society, people place their trust in a wide range of digital services and systems that deliver latest news, process financial transactions, store sensitive information, etc. However, this trust does not have a solid foundation, because software code that supports this digital world has security vulnerabilities. These…

  3. A large scale GIS geodatabase of soil parameters supporting the modeling of conservation practice alternatives in the United States

    USDA-ARS?s Scientific Manuscript database

    Water quality modeling requires across-scale support of combined digital soil elements and simulation parameters. This paper presents the unprecedented development of a large spatial scale (1:250,000) ArcGIS geodatabase coverage designed as a functional repository of soil-parameters for modeling an...

  4. The Factors and Impacts of Large-Scale Digital Content Accreditations

    ERIC Educational Resources Information Center

    Kuo, Tony C. T.; Chen, Hong-Ren; Hwang, Wu-Yuin; Chen, Nian-Shing

    2015-01-01

    E-learning is an important and widespread contemporary trend in education. Because its success depends on the quality of digital materials, the mechanism by which such materials are accredited has received considerable attention and has influenced the design and implementation of digital courseware. For this reason, this study examined the…

  5. Documenting Student Connectivity and Use of Digital Annotation Devices in Virginia Commonwealth University Connected Courses: An Assessment Toolkit for Digital Pedagogies in Higher Education

    ERIC Educational Resources Information Center

    Gogia, Laura Park

    2016-01-01

    Virginia Commonwealth University (VCU) is implementing a large scale exploration of digital pedagogies, including connected learning and open education, in an effort to promote digital fluency and integrative thinking among students. The purpose of this study was to develop a classroom assessment toolkit for faculty who wish to document student…

  6. Digital footprints: facilitating large-scale environmental psychiatric research in naturalistic settings through data from everyday technologies

    PubMed Central

    Bidargaddi, N; Musiat, P; Makinen, V-P; Ermes, M; Schrader, G; Licinio, J

    2017-01-01

    Digital footprints, the automatically accumulated by-products of our technology-saturated lives, offer an exciting opportunity for psychiatric research. The commercial sector has already embraced the electronic trails of customers as an enabling tool for guiding consumer behaviour, and analogous efforts are ongoing to monitor and improve the mental health of psychiatric patients. The untargeted collection of digital footprints that may or may not be health orientated comprises a large untapped information resource for epidemiological-scale research into psychiatric disorders. Real-time monitoring of mood, sleep and physical and social activity in a substantial portion of the affected population in a naturalistic setting is unprecedented in psychiatry. We propose that digital footprints can provide these measurements from real-world settings unobtrusively and in a longitudinal fashion. In this perspective article, we outline the concept of digital footprints and the services and devices that create them, and present examples where digital footprints have been successfully used in research. We then critically discuss the opportunities and fundamental challenges associated with digital footprints in psychiatric research, such as collecting data from different sources, analysis, and ethical and research design challenges. PMID:27922603

  7. Digital footprints: facilitating large-scale environmental psychiatric research in naturalistic settings through data from everyday technologies.

    PubMed

    Bidargaddi, N; Musiat, P; Makinen, V-P; Ermes, M; Schrader, G; Licinio, J

    2017-02-01

    Digital footprints, the automatically accumulated by-products of our technology-saturated lives, offer an exciting opportunity for psychiatric research. The commercial sector has already embraced the electronic trails of customers as an enabling tool for guiding consumer behaviour, and analogous efforts are ongoing to monitor and improve the mental health of psychiatric patients. The untargeted collection of digital footprints that may or may not be health orientated comprises a large untapped information resource for epidemiological-scale research into psychiatric disorders. Real-time monitoring of mood, sleep and physical and social activity in a substantial portion of the affected population in a naturalistic setting is unprecedented in psychiatry. We propose that digital footprints can provide these measurements from real-world settings unobtrusively and in a longitudinal fashion. In this perspective article, we outline the concept of digital footprints and the services and devices that create them, and present examples where digital footprints have been successfully used in research. We then critically discuss the opportunities and fundamental challenges associated with digital footprints in psychiatric research, such as collecting data from different sources, analysis, and ethical and research design challenges.

  8. Study of a hybrid multispectral processor

    NASA Technical Reports Server (NTRS)

    Marshall, R. E.; Kriegler, F. J.

    1973-01-01

    A hybrid processor is described offering enough handling capacity and speed to process efficiently the large quantities of multispectral data that can be gathered by scanner systems such as MSDS, SKYLAB, ERTS, and ERIM M-7. Combinations of general-purpose and special-purpose hybrid computers were examined to include both analog and digital types as well as all-digital configurations. The current trend toward lower costs for medium-scale digital circuitry suggests that the all-digital approach may offer the better solution within the time frame of the next few years. The study recommends and defines such a hybrid digital computing system in which both special-purpose and general-purpose digital computers would be employed. The tasks of recognizing surface objects would be performed in a parallel, pipeline digital system while the tasks of control and monitoring would be handled by a medium-scale minicomputer system. A program to design and construct a small, prototype, all-digital system has been started.

  9. Digital geomorphological landslide hazard mapping of the Alpago area, Italy

    NASA Astrophysics Data System (ADS)

    van Westen, Cees J.; Soeters, Rob; Sijmons, Koert

    Large-scale geomorphological maps of mountainous areas are traditionally made using complex symbol-based legends. They can serve as excellent "geomorphological databases", from which an experienced geomorphologist can extract a large amount of information for hazard mapping. However, these maps are not designed to be used in combination with a GIS, due to their complex cartographic structure. In this paper, two methods are presented for digital geomorphological mapping at large scales using GIS and digital cartographic software. The methods are applied to an area with a complex geomorphological setting in the Borsoia catchment, located in the Alpago region, near Belluno in the Italian Alps. The GIS database set-up is presented with an overview of the data layers that have been generated and how they are interrelated. The GIS database was also converted into a paper map, using a digital cartographic package. The resulting large-scale geomorphological hazard map is attached. The resulting GIS database and cartographic product can be used to analyse the hazard type and hazard degree for each polygon, and to find the reasons for the hazard classification.

  10. YF-12 cooperative airframe/propulsion control system program, volume 1

    NASA Technical Reports Server (NTRS)

    Anderson, D. L.; Connolly, G. F.; Mauro, F. M.; Reukauf, P. J.; Marks, R. (Editor)

    1980-01-01

    Several YF-12C airplane analog control systems were converted to a digital system. Included were the air data computer, autopilot, inlet control system, and autothrottle systems. This conversion was performed to allow assessment of digital technology applications to supersonic cruise aircraft. The digital system was composed of a digital computer and specialized interface unit. A large scale mathematical simulation of the airplane was used for integration testing and software checkout.

  11. Inexpensive Tools To Quantify And Map Vegetative Cover For Large-Scale Research Or Management Decisions.

    USDA-ARS?s Scientific Manuscript database

    Vegetative cover can be quantified quickly and consistently and often at lower cost with image analysis of color digital images than with visual assessments. Image-based mapping of vegetative cover for large-scale research and management decisions can now be considered with the accuracy of these met...

  12. USGS Releases New Digital Aerial Products

    USGS Publications Warehouse

    ,

    2005-01-01

    The U.S. Geological Survey (USGS) Center for Earth Resources Observation and Science (EROS) has initiated distribution of digital aerial photographic products produced by scanning or digitizing film from its historical aerial photography film archive. This archive, located in Sioux Falls, South Dakota, contains thousands of rolls of film that contain more than 8 million frames of historic aerial photographs. The largest portion of this archive consists of original film acquired by Federal agencies from the 1930s through the 1970s to produce 1:24,000-scale USGS topographic quadrangle maps. Most of this photography is reasonably large scale (USGS photography ranges from 1:8,000 to 1:80,000) to support the production of the maps. Two digital products are currently available for ordering: high-resolution scanned products and medium-resolution digitized products.

  13. Characterizing stroke lesions using digital templates and lesion quantification tools in a web-based imaging informatics system for a large-scale stroke rehabilitation clinical trial

    NASA Astrophysics Data System (ADS)

    Wang, Ximing; Edwardson, Matthew; Dromerick, Alexander; Winstein, Carolee; Wang, Jing; Liu, Brent

    2015-03-01

    Previously, we presented an Interdisciplinary Comprehensive Arm Rehabilitation Evaluation (ICARE) imaging informatics system that supports a large-scale phase III stroke rehabilitation trial. The ePR system is capable of displaying anonymized patient imaging studies and reports, and the system is accessible to multiple clinical trial sites and users across the United States via the web. However, the prior multicenter stroke rehabilitation trials lack any significant neuroimaging analysis infrastructure. In stroke related clinical trials, identification of the stroke lesion characteristics can be meaningful as recent research shows that lesion characteristics are related to stroke scale and functional recovery after stroke. To facilitate the stroke clinical trials, we hope to gain insight into specific lesion characteristics, such as vascular territory, for patients enrolled into large stroke rehabilitation trials. To enhance the system's capability for data analysis and data reporting, we have integrated new features with the system: a digital brain template display, a lesion quantification tool and a digital case report form. The digital brain templates are compiled from published vascular territory templates at each of 5 angles of incidence. These templates were updated to include territories in the brainstem using a vascular territory atlas and the Medical Image Processing, Analysis and Visualization (MIPAV) tool. The digital templates are displayed for side-by-side comparisons and transparent template overlay onto patients' images in the image viewer. The lesion quantification tool quantifies planimetric lesion area from user-defined contour. The digital case report form stores user input into a database, then displays contents in the interface to allow for reviewing, editing, and new inputs. In sum, the newly integrated system features provide the user with readily-accessible web-based tools to identify the vascular territory involved, estimate lesion area, and store these results in a web-based digital format.
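The record says only that the tool quantifies planimetric area from a user-defined contour; a standard way to do that is the shoelace formula scaled by the pixel spacing, sketched below (the function name and spacing convention are assumptions, not the system's actual code):

```python
import numpy as np

def planimetric_area_mm2(contour_px, px_spacing_mm):
    """Area enclosed by a user-drawn contour via the shoelace formula.
    contour_px: (N, 2) array of (x, y) pixel coordinates, closed implicitly.
    px_spacing_mm: (sx, sy) in-plane pixel spacing from the image header."""
    x = contour_px[:, 0] * px_spacing_mm[0]
    y = contour_px[:, 1] * px_spacing_mm[1]
    return 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))

square = np.array([[0, 0], [10, 0], [10, 10], [0, 10]])
print(planimetric_area_mm2(square, (0.5, 0.5)))  # 25.0 mm^2
```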

  14. Children's Understanding of Large-Scale Mapping Tasks: An Analysis of Talk, Drawings, and Gesture

    ERIC Educational Resources Information Center

    Kotsopoulos, Donna; Cordy, Michelle; Langemeyer, Melanie

    2015-01-01

    This research examined how children represent motion in large-scale mapping tasks that we referred to as "motion maps". The underlying mathematical content was transformational geometry. In total, 19 children, 8- to 10-year-old, created motion maps and captured their motion maps with accompanying verbal description digitally. Analysis of…

  15. The Effect of Digital Publishing on Technical Services in University Libraries

    ERIC Educational Resources Information Center

    Hunter, Ben

    2013-01-01

    The past decade has brought enormous changes in scholarly communication, leading many libraries to undertake large-scale digital publishing initiatives. However, no study has investigated how technical services departments are changing to support these new services. Using change management as a theoretical framework, the investigator uses content…

  16. Digital Systems Validation Handbook. Volume 2. Chapter 18. Avionic Data Bus Integration Technology

    DTIC Science & Technology

    1993-11-01

    …interaction between a digital data bus and an avionic system. Very Large Scale Integration (VLSI) ICs and multiversion software, which make up digital… In 1984, the Sperry Corporation developed a fault-tolerant system which employed multiversion programming, voting, and monitoring for error detection and… Glossary fragments: "…formulate all the significant behavior of a system." MULTIVERSION PROGRAMMING: see N-version programming. N-VERSION PROGRAMMING: the independent coding of a…

  17. Symbolic magnitude processing in elementary school children: A group administered paper-and-pencil measure (SYMP Test).

    PubMed

    Brankaer, Carmen; Ghesquière, Pol; De Smedt, Bert

    2017-08-01

    The ability to compare symbolic numerical magnitudes correlates with children's concurrent and future mathematics achievement. We developed and evaluated a quick timed paper-and-pencil measure that can easily be used, for example in large-scale research, in which children have to cross out the numerically larger of two Arabic one- and two-digit numbers (SYMP Test). We investigated performance on this test in 1,588 primary school children (Grades 1-6) and examined in each grade its associations with mathematics achievement. The SYMP Test had satisfactory test-retest reliability. The SYMP Test showed significant and stable correlations with mathematics achievement for both one-digit and two-digit comparison, across all grades. This replicates the previously observed association between symbolic numerical magnitude processing and mathematics achievement, but extends it by showing that the association is observed in all grades in primary education and occurs for single- as well as multi-digit processing. Children with mathematical learning difficulties performed significantly lower on one-digit comparison and two-digit comparison in all grades. This all suggests satisfactory construct and criterion-related validity of the SYMP Test, which can be used in research, when performing large-scale (intervention) studies, and by practitioners, as screening measure to identify children at risk for mathematical difficulties or dyscalculia.

  18. Thermal Testing and Integration: Magnetospheric MultiScale (MMS) Observatories with Digital 1-Wire Sensors

    NASA Technical Reports Server (NTRS)

    Solimani, Jason A.; Rosanova, Santino

    2015-01-01

    Thermocouples require two thin wires to be routed out of the spacecraft to connect to the ground support equipment used to monitor and record the temperature data. This large number of wires exiting the observatory complicates integration and creates an undesirable heat path during testing; these wires need to be characterized as a thermal short that will not exist during flight. To minimize complexity and reduce the thermal variables introduced by these ground support equipment (GSE) wires, MMS pursued a hybrid path for temperature monitoring, utilizing both thermocouples and digital 1-wire temperature sensors. Digital 1-wire sensors can greatly reduce harness mass, length and complexity because they can be spliced together. For MMS, 350 digital 1-wire sensors were installed on the spacecraft with only 18 wires exiting, as opposed to a potential 700 thermocouple wires. Digital 1-wire sensors had not been used on such a large scale at NASA GSFC prior to the MMS mission. During MMS thermal vacuum testing, a lessons-learned matrix was formulated that will assist future integration of 1-wire sensors into thermal testing and, one day, into flight.
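The article does not describe the GSE software, but the reason spliced sensors need so few wires is that every 1-Wire device carries a unique ROM ID on a shared bus. Purely for illustration, this is how DS18B20-style 1-Wire temperature sensors are read under Linux's w1 subsystem (an assumed setup for demonstration, not the MMS equipment):

```python
import glob

# Illustrative only: reading many spliced 1-Wire temperature sensors
# through Linux's w1 subsystem. Each device appears under its unique
# ROM ID even though all sensors share the same bus wires.

for dev in sorted(glob.glob("/sys/bus/w1/devices/28-*/w1_slave")):
    with open(dev) as f:
        lines = f.read().splitlines()
    if lines and lines[0].endswith("YES"):       # CRC check passed
        milli_c = int(lines[1].split("t=")[1])   # temperature in milli-deg C
        print(f"{dev.split('/')[5]}: {milli_c / 1000:.3f} C")
```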

  19. Validation Study on Alos Prism Dsm Mosaic and Aster Gdem 2

    NASA Astrophysics Data System (ADS)

    Tadono, T.; Takaku, J.; Shimada, M.

    2012-07-01

    This study evaluates the height accuracy of two digital elevation datasets, obtained by spaceborne optical instruments, over a large area. The digital surface model (DSM) was generated by the Panchromatic Remote-sensing Instrument for Stereo Mapping (PRISM) onboard the Advanced Land Observing Satellite (ALOS, nicknamed 'Daichi'), and the global digital elevation model version 2 (GDEM-2) was derived from the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) onboard NASA's TERRA satellite. The test site was the entire country of Bhutan, on the southern slopes of the eastern Himalayas. Bhutan is not a large country, covering about 330 km from east to west and 170 km from north to south; however, it has a large height range, from 200 m to more than 7,000 m, which makes it very interesting for validating digital topographic information at a national scale as well as over a wide height range. For reference data, field surveys were conducted in 2010 and 2011, and ground control points collected by a global positioning system were used as check points (CPs) to evaluate point-scale height accuracy, while a 3 arc-sec DEM from the Shuttle Radar Topography Mission (SRTM-3) was used to validate the wide region. The results confirmed a root mean square error of 8.1 m for the PRISM DSM and 29.4 m for GDEM-2 at the CPs.
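The headline accuracy numbers are plain root-mean-square errors of the elevation products against the GPS check points; the computation is just the following (the sample heights are hypothetical, not the study's data):

```python
import numpy as np

def rmse(dsm_heights_m, cp_heights_m):
    """Root mean square error of DSM heights against GPS check points."""
    d = np.asarray(dsm_heights_m) - np.asarray(cp_heights_m)
    return float(np.sqrt(np.mean(d ** 2)))

# Hypothetical sample: DSM heights sampled at five check points.
est = rmse([212.3, 1504.9, 3310.2, 5120.5, 7005.1],
           [205.0, 1500.0, 3302.0, 5110.0, 7012.0])
print(f"RMSE = {est:.1f} m")
```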

  20. Variation of Supergranule Parameters with Solar Cycles: Results from Century-long Kodaikanal Digitized Ca ii K Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chatterjee, Subhamoy; Mandal, Sudip; Banerjee, Dipankar, E-mail: dipu@iiap.res.in

    The Ca ii K spectroheliograms spanning over a century (1907–2007) from the Kodaikanal Solar Observatory, India, have recently been digitized and calibrated. Applying a fully automated algorithm (which includes contrast enhancement and the "Watershed method") to these data, we have identified the supergranules and calculated the associated parameters, such as scale, circularity, and fractal dimension. We have segregated the quiet and active regions and obtained the supergranule parameters separately for these two domains. In this way, we have isolated the effect of large-scale and small-scale magnetic fields on these structures and find a significantly different behavior of the supergranule parameters over solar cycles. These differences indicate intrinsic changes in the physical mechanism behind the generation and evolution of supergranules in the presence of small-scale and large-scale magnetic fields. This also highlights the need for further studies using solar dynamo theory along with magneto-convection models.
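The identification pipeline is described only at a high level (contrast enhancement plus the Watershed method); the sketch below shows what that class of pipeline typically looks like for Ca II K images, where supergranule cells are dark interiors bounded by the bright chromospheric network. Steps and parameters are illustrative assumptions, not the authors' algorithm:

```python
import numpy as np
from scipy import ndimage as ndi
from skimage.filters import threshold_otsu
from skimage.segmentation import watershed

def segment_supergranules(img):
    """Illustrative watershed pipeline: contrast normalization,
    thresholding of the bright chromospheric network, then watershed
    on a distance transform to label the enclosed cells."""
    norm = (img - img.mean()) / img.std()            # contrast enhancement
    network = norm > threshold_otsu(norm)            # bright network mask
    dist = ndi.distance_transform_edt(~network)      # distance to network
    markers, _ = ndi.label(dist > 0.5 * dist.max())  # one seed per cell
    return watershed(-dist, markers, mask=~network)  # labelled supergranules
```

From the resulting label image, per-cell parameters such as area (scale), circularity and fractal dimension can then be measured.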

  1. Addressing Methodological Challenges in Large Communication Data Sets: Collecting and Coding Longitudinal Interactions in Home Hospice Cancer Care.

    PubMed

    Reblin, Maija; Clayton, Margaret F; John, Kevin K; Ellington, Lee

    2016-07-01

    In this article, we present strategies for collecting and coding a large longitudinal communication data set collected across multiple sites, consisting of more than 2000 hours of digital audio recordings from approximately 300 families. We describe our methods within the context of implementing a large-scale study of communication during cancer home hospice nurse visits, but this procedure could be adapted to communication data sets across a wide variety of settings. This research is the first study designed to capture home hospice nurse-caregiver communication, a highly understudied location and type of communication event. We present a detailed example protocol encompassing data collection in the home environment, large-scale, multisite secure data management, the development of theoretically-based communication coding, and strategies for preventing coder drift and ensuring reliability of analyses. Although each of these challenges has the potential to undermine the utility of the data, reliability between coders is often the only issue consistently reported and addressed in the literature. Overall, our approach demonstrates rigor and provides a "how-to" example for managing large, digitally recorded data sets from collection through analysis. These strategies can inform other large-scale health communication research.

  2. Addressing Methodological Challenges in Large Communication Datasets: Collecting and Coding Longitudinal Interactions in Home Hospice Cancer Care

    PubMed Central

    Reblin, Maija; Clayton, Margaret F; John, Kevin K; Ellington, Lee

    2015-01-01

    In this paper, we present strategies for collecting and coding a large longitudinal communication dataset collected across multiple sites, consisting of over 2000 hours of digital audio recordings from approximately 300 families. We describe our methods within the context of implementing a large-scale study of communication during cancer home hospice nurse visits, but this procedure could be adapted to communication datasets across a wide variety of settings. This research is the first study designed to capture home hospice nurse-caregiver communication, a highly understudied location and type of communication event. We present a detailed example protocol encompassing data collection in the home environment, large-scale, multi-site secure data management, the development of theoretically-based communication coding, and strategies for preventing coder drift and ensuring reliability of analyses. Although each of these challenges has the potential to undermine the utility of the data, reliability between coders is often the only issue consistently reported and addressed in the literature. Overall, our approach demonstrates rigor and provides a "how-to" example for managing large, digitally-recorded data sets from collection through analysis. These strategies can inform other large-scale health communication research. PMID:26580414

  3. Neuromorphic Hardware Architecture Using the Neural Engineering Framework for Pattern Recognition.

    PubMed

    Wang, Runchun; Thakur, Chetan Singh; Cohen, Gregory; Hamilton, Tara Julia; Tapson, Jonathan; van Schaik, Andre

    2017-06-01

    We present a hardware architecture that uses the neural engineering framework (NEF) to implement large-scale neural networks on field programmable gate arrays (FPGAs) for performing massively parallel real-time pattern recognition. The NEF is a framework capable of synthesising large-scale cognitive systems from subnetworks, and we have previously presented an FPGA implementation of the NEF that successfully performs nonlinear mathematical computations. That work was based on a compact digital neural core consisting of 64 neurons instantiated by a single physical neuron using a time-multiplexing approach. We have now scaled this approach up to build a pattern recognition system by combining identical neural cores. As a proof of concept, we developed a handwritten digit recognition system using the MNIST database and achieved a recognition rate of 96.55%. The system is implemented on a state-of-the-art FPGA and can process 5.12 million digits per second. The architecture and hardware optimisations presented offer a resource-efficient means of performing high-speed, neuromorphic, massively parallel pattern recognition and classification tasks.
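The time-multiplexing idea is that one physical update circuit cycles through a small state memory, so 64 virtual neurons cost roughly one neuron's worth of logic plus memory. A software caricature (the leaky integrate-and-fire update is a stand-in, not the NEF neuron used on the FPGA):

```python
import numpy as np

N = 64                         # virtual neurons per physical core
state = np.zeros(N)            # per-neuron state held in core memory
decay, threshold = 0.9, 1.0    # illustrative dynamics parameters

def core_tick(inputs):
    """One pass of the single physical update circuit over all 64 slots."""
    spikes = np.zeros(N, dtype=bool)
    for slot in range(N):              # strictly sequential, as in hardware
        state[slot] = decay * state[slot] + inputs[slot]
        if state[slot] >= threshold:
            spikes[slot] = True
            state[slot] = 0.0          # reset after a spike
    return spikes

rng = np.random.default_rng(1)
for _ in range(5):                     # drive the core for a few ticks
    spikes = core_tick(rng.uniform(0, 0.5, N))
print(spikes.sum(), "of 64 slots spiked on the last tick")
```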

  4. Cartography of irregularly shaped satellites

    NASA Technical Reports Server (NTRS)

    Batson, R. M.; Edwards, Kathleen

    1987-01-01

    Irregularly shaped satellites, such as Phobos and Amalthea, do not lend themselves to mapping by conventional methods because mathematical projections of their surfaces fail to convey an accurate visual impression of the landforms, and because large and irregular scale changes make their features difficult to measure on maps. A digital mapping technique has therefore been developed by which maps are compiled from digital topographic and spacecraft image files. The digital file is geometrically transformed as desired for human viewing, either on video screens or on hard copy. Digital files of this kind consist of digital images superimposed on another digital file representing the three-dimensional form of a body.

  5. "Baby-Cam" and Researching with Infants: Viewer, Image and (Not) Knowing

    ERIC Educational Resources Information Center

    Elwick, Sheena

    2015-01-01

    This article offers a methodological reflection on how "baby-cam" enhanced ethically reflective attitudes in a large-scale research project that set out to research with infants in Australian early childhood education and care settings. By juxtaposing digital images produced by two different digital-camera technologies and drawing on…

  6. Marginalised Behaviour: Digital Annotations, Spatial Encoding and the Implications for Reading Comprehension

    ERIC Educational Resources Information Center

    Johnson, Martin; Nadas, Rita

    2009-01-01

    Within large scale educational assessment agencies in the UK, there has been a shift towards assessors marking digitally scanned copies rather than the original paper scripts that were traditionally used. This project uses extended essay examination scripts to consider whether the mode in which an essay is read potentially influences the…

  7. A Better Blend: A Vision for Boosting Student Outcomes with Digital Learning

    ERIC Educational Resources Information Center

    Public Impact, 2013

    2013-01-01

    Blended learning that combines digital instruction with live, accountable teachers holds unique promise to improve student outcomes dramatically. Schools will not realize this promise at large scale with technology improvements alone, though, or with technology and today's typical teaching roles. This brief explains how schools can use blended…

  8. Electronic Scientific Data & Literature Aggregation: A Review for Librarians

    ERIC Educational Resources Information Center

    Losoff, Barbara

    2009-01-01

    The advent of large-scale digital repositories, along with the need for sharing useful data world-wide, demands change to the current information structure. The merging of digital scientific data with scholarly literature has the potential to fulfill the Semantic Web design principles. This paper will identify factors leading to integration of…

  9. Network Access to Visual Information: A Study of Costs and Uses.

    ERIC Educational Resources Information Center

    Besser, Howard

    This paper summarizes a subset of the findings of a study of digital image distribution that focused on the Museum Educational Site Licensing (MESL) project--the first large-scale multi-institutional project to explore digital delivery of art images and accompanying text/metadata from disparate sources. This Mellon Foundation-sponsored study…

  10. Science Teachers' Response to the Digital Education Revolution

    ERIC Educational Resources Information Center

    Nielsen, Wendy; Miller, K. Alex; Hoban, Garry

    2015-01-01

    We report a case study of two highly qualified science teachers as they implemented laptop computers in their Years 9 and 10 science classes at the beginning of the "Digital Education Revolution," Australia's national one-to-one laptop program initiated in 2009. When a large-scale investment is made in a significant educational change,…

  11. Three-dimensional displays for natural hazards analysis, using classified Landsat Thematic Mapper digital data and large-scale digital elevation models

    NASA Technical Reports Server (NTRS)

    Butler, David R.; Walsh, Stephen J.; Brown, Daniel G.

    1991-01-01

    Methods are described for using Landsat Thematic Mapper digital data and digital elevation models for the display of natural hazard sites in a mountainous region of northwestern Montana, USA. Hazard zones can be easily identified on the three-dimensional images. Proximity of facilities such as highways and building locations to hazard sites can also be easily displayed. A temporal sequence of Landsat TM (or similar) satellite data sets could also be used to display landscape changes associated with dynamic natural hazard processes.

  12. Implementation factors affecting the large-scale deployment of digital health and well-being technologies: A qualitative study of the initial phases of the 'Living-It-Up' programme.

    PubMed

    Agbakoba, Ruth; McGee-Lennon, Marilyn; Bouamrane, Matt-Mouley; Watson, Nicholas; Mair, Frances S

    2016-12-01

    Little is known about the factors which facilitate or impede the large-scale deployment of health and well-being consumer technologies. The Living-It-Up project is a large-scale digital intervention led by NHS 24, aiming to transform health and well-being services delivery throughout Scotland. We conducted a qualitative study of the factors affecting the implementation and deployment of the Living-It-Up services. We collected a range of data during the initial phase of deployment, including semi-structured interviews (N = 6); participant observation sessions (N = 5) and meetings with key stakeholders (N = 3). We used the Normalisation Process Theory as an explanatory framework to interpret the social processes at play during the initial phases of deployment. Initial findings illustrate that it is clear - and perhaps not surprising - that the size and diversity of the Living-It-Up consortium made implementation processes more complex within a 'multi-stakeholder' environment. To overcome these barriers, there is a need to clearly define roles, tasks and responsibilities among the consortium partners. Furthermore, varying levels of expectations and requirements, as well as diverse cultures and ways of working, must be effectively managed. Factors which facilitated implementation included extensive stakeholder engagement, such as co-design activities, which can contribute to an increased 'buy-in' from users in the long term. An important lesson from the Living-It-Up initiative is that attempting to co-design innovative digital services while at the same time recruiting large numbers of users is likely to generate conflicting implementation priorities which hinder - or at least substantially slow down - the effective rollout of services at scale. The deployment of Living-It-Up services is ongoing, but our results to date suggest that - in order to be successful - the roll-out of digital health and well-being technologies at scale requires a delicate and pragmatic trade-off between co-design activities, the development of innovative services and the efforts allocated to widespread marketing and recruitment initiatives. © The Author(s) 2015.

  13. Low power signal processing research at Stanford

    NASA Technical Reports Server (NTRS)

    Burr, J.; Williamson, P. R.; Peterson, A.

    1991-01-01

    This paper gives an overview of the research being conducted at Stanford University's Space, Telecommunications, and Radioscience Laboratory in the area of low energy computation. It discusses the work we are doing in large scale digital VLSI neural networks, interleaved processor and pipelined memory architectures, energy estimation and optimization, multichip module packaging, and low voltage digital logic.
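
    As a hedged illustration of the energy-estimation theme mentioned above, the sketch below evaluates the standard first-order CMOS switching-energy model E = alpha * C * Vdd^2; all parameter values are assumptions for demonstration, not figures from the Stanford work.

    ```python
    # First-order dynamic energy estimate for CMOS logic: halving Vdd roughly
    # quarters the switching energy, the core idea behind low-voltage logic.

    def dynamic_energy_per_cycle(c_load_farads, vdd_volts, activity=0.5):
        """E = alpha * C * Vdd^2 (switching energy per clock cycle)."""
        return activity * c_load_farads * vdd_volts ** 2

    c_total = 2e-9          # total switched datapath capacitance, F (assumed)
    for vdd in (5.0, 3.3, 1.5):
        e = dynamic_energy_per_cycle(c_total, vdd)
        print(f"Vdd={vdd:>4} V  ->  {e * 1e9:.2f} nJ/cycle")
    ```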

  14. Preservation and Access to Manuscript Collections of the Czech National Library.

    ERIC Educational Resources Information Center

    Karen, Vladimir; Psohlavec, Stanislav

    In 1996, the Czech National Library started a large-scale digitization of its extensive and invaluable collection of historical manuscripts and printed books. Each page of the selected documents is scanned using a high-resolution, full-color digital camera, processed, and archived on a CD-ROM disk. HTML coded description is added to the entire…

  15. Large-scale quantitative analysis of painting arts.

    PubMed

    Kim, Daniel; Son, Seung-Woo; Jeong, Hawoong

    2014-12-11

    Scientists have made efforts to understand the beauty of painting art in their own languages. As digital image acquisition of painting arts has made rapid progress, researchers have come to a point where it is possible to perform statistical analysis of a large-scale database of artistic paintings to build a bridge between art and science. Using digital image processing techniques, we investigate three quantitative measures of images - the usage of individual colors, the variety of colors, and the roughness of the brightness. We found a difference in color usage between classical paintings and photographs, and a significantly low color variety in the medieval period. Interestingly, moreover, the increase in the roughness exponent as painting techniques such as chiaroscuro and sfumato advanced is consistent with historical circumstances.
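
    A rough sketch of two of the three measures is given below: color variety as the Shannon entropy of a quantized color histogram, and a roughness exponent fitted from the brightness structure function. Both estimators are plausible stand-ins assumed here, not necessarily the authors' exact definitions.

    ```python
    import numpy as np

    def color_entropy(img, bins=8):
        # Quantize each channel to `bins` levels, then take the entropy of
        # the joint color histogram as a "variety of colors" measure.
        q = (img // (256 // bins)).reshape(-1, 3)
        codes = q[:, 0] * bins * bins + q[:, 1] * bins + q[:, 2]
        p = np.bincount(codes, minlength=bins ** 3).astype(float)
        p = p[p > 0] / p.sum()
        return -(p * np.log2(p)).sum()

    def roughness_exponent(brightness, lags=(1, 2, 4, 8, 16)):
        # Fit sigma(lag) ~ lag**H on horizontal brightness increments.
        sig = [np.std(brightness[:, l:] - brightness[:, :-l]) for l in lags]
        H, _ = np.polyfit(np.log(lags), np.log(sig), 1)
        return H

    rng = np.random.default_rng(0)
    img = rng.integers(0, 256, size=(256, 256, 3))   # stand-in for a painting scan
    print(color_entropy(img), roughness_exponent(img.mean(axis=2)))
    ```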

  16. Capabilities of the Large-Scale Sediment Transport Facility

    DTIC Science & Technology

    2016-04-01

    experiments in wave/current environments. INTRODUCTION: The LSTF (Figure 1) is a large-scale laboratory facility capable of simulating conditions ... comparable to low-wave-energy coasts. The facility was constructed to address deficiencies in existing methods for calculating longshore sediment ... transport. The LSTF consists of a 30 m wide, 50 m long, 1.4 m deep basin. Waves are generated by four digitally controlled wave makers capable of producing

  17. Delivering digital health and well-being at scale: lessons learned during the implementation of the dallas program in the United Kingdom

    PubMed Central

    Devlin, Alison M; McGee-Lennon, Marilyn; O’Donnell, Catherine A; Bouamrane, Matt-Mouley; Agbakoba, Ruth; O’Connor, Siobhan; Grieve, Eleanor; Finch, Tracy; Wyke, Sally; Watson, Nicholas; Browne, Susan

    2016-01-01

    Objective: To identify implementation lessons from the United Kingdom Delivering Assisted Living Lifestyles at Scale (dallas) program - a large-scale, national technology program that aims to deliver a broad range of digital services and products to the public to promote health and well-being. Materials and Methods: Prospective, longitudinal qualitative research study investigating implementation processes. Qualitative data collected includes semi-structured e-Health Implementation Toolkit-led interviews at baseline/mid-point (n = 38), quarterly evaluation, quarterly technical and barrier and solutions reports, observational logs, quarterly evaluation alignment interviews with project leads, observational data collected during meetings, and ethnographic data from dallas events (n > 200 distinct pieces of qualitative data). Data analysis was guided by Normalization Process Theory, a sociological theory that aids conceptualization of implementation issues in complex healthcare settings. Results: Five key challenges were identified: 1) the challenge of establishing and maintaining large heterogeneous, multi-agency partnerships to deliver new models of healthcare; 2) the need for resilience in the face of barriers and set-backs, including the backdrop of continually changing external environments; 3) the inherent tension between embracing innovative co-design and achieving delivery at pace and at scale; 4) the effects of branding and marketing issues in consumer healthcare settings; and 5) the challenge of interoperability and information governance when commercial proprietary models are dominant. Conclusions: The magnitude and ambition of the dallas program provides a unique opportunity to investigate the macro-level implementation challenges faced when designing and delivering digital health and wellness services at scale. Flexibility, adaptability, and resilience are key implementation facilitators when shifting to new digitally enabled models of care. PMID:26254480

  18. Delivering digital health and well-being at scale: lessons learned during the implementation of the dallas program in the United Kingdom.

    PubMed

    Devlin, Alison M; McGee-Lennon, Marilyn; O'Donnell, Catherine A; Bouamrane, Matt-Mouley; Agbakoba, Ruth; O'Connor, Siobhan; Grieve, Eleanor; Finch, Tracy; Wyke, Sally; Watson, Nicholas; Browne, Susan; Mair, Frances S

    2016-01-01

    To identify implementation lessons from the United Kingdom Delivering Assisted Living Lifestyles at Scale (dallas) program-a large-scale, national technology program that aims to deliver a broad range of digital services and products to the public to promote health and well-being. Prospective, longitudinal qualitative research study investigating implementation processes. Qualitative data collected includes semi-structured e-Health Implementation Toolkit-led interviews at baseline/mid-point (n = 38), quarterly evaluation, quarterly technical and barrier and solutions reports, observational logs, quarterly evaluation alignment interviews with project leads, observational data collected during meetings, and ethnographic data from dallas events (n > 200 distinct pieces of qualitative data). Data analysis was guided by Normalization Process Theory, a sociological theory that aids conceptualization of implementation issues in complex healthcare settings. Five key challenges were identified: 1) The challenge of establishing and maintaining large heterogeneous, multi-agency partnerships to deliver new models of healthcare; 2) The need for resilience in the face of barriers and set-backs including the backdrop of continually changing external environments; 3) The inherent tension between embracing innovative co-design and achieving delivery at pace and at scale; 4) The effects of branding and marketing issues in consumer healthcare settings; and 5) The challenge of interoperability and information governance, when commercial proprietary models are dominant. The magnitude and ambition of the dallas program provides a unique opportunity to investigate the macro level implementation challenges faced when designing and delivering digital health and wellness services at scale. Flexibility, adaptability, and resilience are key implementation facilitators when shifting to new digitally enabled models of care. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association.

  19. Comparison of Multi-Scale Digital Elevation Models for Defining Waterways and Catchments Over Large Areas

    NASA Astrophysics Data System (ADS)

    Harris, B.; McDougall, K.; Barry, M.

    2012-07-01

    Digital Elevation Models (DEMs) allow for the efficient and consistent creation of waterways and catchment boundaries over large areas. Studies of waterway delineation from DEMs are usually undertaken over small or single catchment areas due to the nature of the problems being investigated. Improvements in Geographic Information Systems (GIS) techniques, software, hardware and data allow for analysis of larger data sets and also facilitate a consistent tool for the creation and analysis of waterways over extensive areas. However, such analyses are rarely developed over large regional areas because raw data sets are often unavailable and creating the underlying DEMs requires considerable work. This paper examines the definition of waterways and catchments over an area of approximately 25,000 km2 to establish the optimal DEM scale required for waterway delineation over large regional projects. The comparative study analysed multi-scale DEMs over two test areas (the Wivenhoe catchment, 543 km2, and a detailed 13 km2 area within it), including various data types, scales, quality, and variable catchment input parameters. Historic and available DEM data were compared to high-resolution lidar-based DEMs to assess variations in the formation of stream networks. The results identified that, particularly in areas of high elevation change, DEMs at 20 m cell size created from broad-scale 1:25,000 data (combined with more detailed data or manual delineation in flat areas) are adequate for the creation of waterways and catchments at a regional scale.
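
    For readers unfamiliar with the delineation step, the sketch below shows a toy D8 flow-accumulation routine of the kind GIS packages apply to a DEM before thresholding the accumulation grid into a stream network; it is a teaching-scale illustration, not the study's workflow.

    ```python
    import numpy as np

    def d8_flow_accumulation(dem):
        rows, cols = dem.shape
        acc = np.ones_like(dem, dtype=float)          # each cell contributes itself
        # Visit cells from highest to lowest so donors settle before receivers
        order = np.dstack(np.unravel_index(
            np.argsort(dem, axis=None)[::-1], dem.shape))[0]
        for r, c in order:
            best, target = 0.0, None
            for dr in (-1, 0, 1):
                for dc in (-1, 0, 1):
                    nr, nc = r + dr, c + dc
                    if (dr or dc) and 0 <= nr < rows and 0 <= nc < cols:
                        drop = (dem[r, c] - dem[nr, nc]) / np.hypot(dr, dc)
                        if drop > best:
                            best, target = drop, (nr, nc)
            if target:                                 # pass area to steepest neighbor
                acc[target] += acc[r, c]
        return acc

    dem = np.add.outer(np.arange(5), np.arange(5)).astype(float)[::-1]
    streams = d8_flow_accumulation(dem) > 4            # threshold defines the network
    print(streams.astype(int))
    ```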

  20. Hardware synthesis from DDL description. [simulating a digital system for computerized design of large scale integrated circuits

    NASA Technical Reports Server (NTRS)

    Shiva, S. G.; Shah, A. M.

    1980-01-01

    The details of digital systems can be conveniently input into the design automation system by means of hardware description language (HDL). The computer aided design and test (CADAT) system at NASA MSFC is used for the LSI design. The digital design language (DDL) was selected as HDL for the CADAT System. DDL translator output can be used for the hardware implementation of the digital design. Problems of selecting the standard cells from the CADAT standard cell library to realize the logic implied by the DDL description of the system are addressed.

  1. Digital CODEC for real-time processing of broadcast quality video signals at 1.8 bits/pixel

    NASA Technical Reports Server (NTRS)

    Shalkhauser, Mary Jo; Whyte, Wayne A., Jr.

    1989-01-01

    Advances in very large-scale integration and recent work in the field of bandwidth efficient digital modulation techniques have combined to make digital video processing technically feasible and potentially cost competitive for broadcast quality television transmission. A hardware implementation was developed for a DPCM-based digital television bandwidth compression algorithm which processes standard NTSC composite color television signals and produces broadcast quality video in real time at an average of 1.8 bits/pixel. The data compression algorithm and the hardware implementation of the CODEC are described, and performance results are provided.
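
    The sketch below illustrates the closed-loop DPCM principle behind such a codec: transmit quantized prediction errors rather than raw samples, with the encoder tracking the decoder's state. The previous-pixel predictor and uniform quantizer are simplifying assumptions; the hardware's actual predictor, quantizer, and bit allocation are not reproduced here.

    ```python
    import numpy as np

    def dpcm_encode(line, step=8):
        pred, codes = 0, []
        for s in line:
            e = int(s) - pred
            q = int(round(e / step))                   # coarse error -> few bits/pixel
            codes.append(q)
            pred = int(np.clip(pred + q * step, 0, 255))  # mirror the decoder's state
        return codes

    def dpcm_decode(codes, step=8):
        pred, out = 0, []
        for q in codes:
            pred = int(np.clip(pred + q * step, 0, 255))
            out.append(pred)
        return out

    line = (128 + 60 * np.sin(np.linspace(0, 3, 32))).astype(np.uint8)
    rec = dpcm_decode(dpcm_encode(line))
    print(np.abs(line.astype(int) - rec).max())        # bounded by about step/2
    ```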

  2. Digital CODEC for real-time processing of broadcast quality video signals at 1.8 bits/pixel

    NASA Technical Reports Server (NTRS)

    Shalkhauser, Mary Jo; Whyte, Wayne A.

    1991-01-01

    Advances in very large scale integration and recent work in the field of bandwidth efficient digital modulation techniques have combined to make digital video processing technically feasible and potentially cost competitive for broadcast quality television transmission. A hardware implementation was developed for a DPCM (differential pulse code modulation)-based digital television bandwidth compression algorithm which processes standard NTSC composite color television signals and produces broadcast quality video in real time at an average of 1.8 bits/pixel. The data compression algorithm and the hardware implementation of the codec are described, and performance results are provided.

  3. Changes in the High-Latitude Topside Ionospheric Vertical Electron-Density Profiles in Response to Solar-Wind Perturbations During Large Magnetic Storms

    NASA Technical Reports Server (NTRS)

    Benson, Robert F.; Fainberg, Joseph; Osherovich, Vladimir; Truhlik, Vladimir; Wang, Yongli; Arbacher, Becca

    2011-01-01

    The latest results from an investigation to establish links between solar-wind and topside-ionospheric parameters will be presented including a case where high-latitude topside electron-density Ne(h) profiles indicated dramatic rapid changes in the scale height during the main phase of a large magnetic storm (Dst < -200 nT). These scale-height changes suggest a large heat input to the topside ionosphere at this time. The topside profiles were derived from ISIS-1 digital ionograms obtained from the NASA Space Physics Data Facility (SPDF) Coordinated Data Analysis Web (CDA Web). Solar-wind data obtained from the NASA OMNIWeb database indicated that the magnetic storm was due to a magnetic cloud. This event is one of several large magnetic storms being investigated during the interval from 1965 to 1984 when both solar-wind and digital topside ionograms, from either Alouette-2, ISIS-1, or ISIS-2, are potentially available.

  4. Concept For Generation Of Long Pseudorandom Sequences

    NASA Technical Reports Server (NTRS)

    Wang, C. C.

    1990-01-01

    A conceptual very-large-scale integrated (VLSI) digital circuit performs exponentiation in a finite field. An algorithm that generates unusually long sequences of pseudorandom numbers is executed by a digital processor that includes such circuits. The concept is particularly advantageous for applications such as spread-spectrum communications, cryptography, and the generation of ranging codes, synthetic noise, and test data, where it is usually desirable to make pseudorandom sequences as long as possible.
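
    As a software analogue of the hardware concept, the sketch below generates a maximal-length pseudorandom bit sequence with a Galois LFSR over GF(2); the 16-bit primitive polynomial (taps 0xB400) and seed are conventional textbook choices, not taken from the record.

    ```python
    # Galois LFSR: one xor per step, period 2**16 - 1 = 65535 for these taps,
    # i.e. the polynomial x^16 + x^14 + x^13 + x^11 + 1.

    def lfsr_stream(seed=0xACE1, taps=0xB400):
        state = seed
        while True:
            lsb = state & 1
            state >>= 1
            if lsb:
                state ^= taps
            yield lsb

    gen = lfsr_stream()
    print([next(gen) for _ in range(20)])   # first 20 pseudorandom bits
    ```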

  5. Implementation and Performance of GaAs Digital Signal Processing ASICs

    NASA Technical Reports Server (NTRS)

    Whitaker, William D.; Buchanan, Jeffrey R.; Burke, Gary R.; Chow, Terrance W.; Graham, J. Scott; Kowalski, James E.; Lam, Barbara; Siavoshi, Fardad; Thompson, Matthew S.; Johnson, Robert A.

    1993-01-01

    The feasibility of performing high speed digital signal processing in GaAs gate array technology has been demonstrated with the successful implementation of a VLSI communications chip set for NASA's Deep Space Network. This paper describes the techniques developed to solve some of the technology and implementation problems associated with large scale integration of GaAs gate arrays.

  6. Large-Scale Quantitative Analysis of Painting Arts

    PubMed Central

    Kim, Daniel; Son, Seung-Woo; Jeong, Hawoong

    2014-01-01

    Scientists have made efforts to understand the beauty of painting art in their own languages. As digital image acquisition of painting arts has made rapid progress, researchers have come to a point where it is possible to perform statistical analysis of a large-scale database of artistic paintings to build a bridge between art and science. Using digital image processing techniques, we investigate three quantitative measures of images – the usage of individual colors, the variety of colors, and the roughness of the brightness. We found a difference in color usage between classical paintings and photographs, and a significantly low color variety in the medieval period. Interestingly, moreover, the increase in the roughness exponent as painting techniques such as chiaroscuro and sfumato advanced is consistent with historical circumstances. PMID:25501877

  7. Spatial Modeling and Uncertainty Assessment of Fine Scale Surface Processes Based on Coarse Terrain Elevation Data

    NASA Astrophysics Data System (ADS)

    Rasera, L. G.; Mariethoz, G.; Lane, S. N.

    2017-12-01

    Frequent acquisition of high-resolution digital elevation models (HR-DEMs) over large areas is expensive and difficult. Satellite-derived low-resolution digital elevation models (LR-DEMs) provide extensive coverage of Earth's surface but at coarser spatial and temporal resolutions. Although useful for large scale problems, LR-DEMs are not suitable for modeling hydrologic and geomorphic processes at scales smaller than their spatial resolution. In this work, we present a multiple-point geostatistical approach for downscaling a target LR-DEM based on available high-resolution training data and recurrent high-resolution remote sensing images. The method aims at generating several equiprobable HR-DEMs conditioned to a given target LR-DEM by borrowing small scale topographic patterns from an analogue containing data at both coarse and fine scales. An application of the methodology is demonstrated by using an ensemble of simulated HR-DEMs as input to a flow-routing algorithm. The proposed framework enables a probabilistic assessment of the spatial structures generated by natural phenomena operating at scales finer than the available terrain elevation measurements. A case study in the Swiss Alps is provided to illustrate the methodology.
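
    A toy version of the downscaling idea is sketched below: fine-scale residuals from a high-resolution analogue are added to an upsampled target LR-DEM to produce several equiprobable HR realizations. Real multiple-point geostatistics conditions on local patterns; this simplified stand-in only recycles the analogue's residual field.

    ```python
    import numpy as np

    def downscale_realizations(lr_target, hr_analogue, factor=4, n_real=3, seed=1):
        rng = np.random.default_rng(seed)
        # Coarsen the analogue to expose its fine-scale residual component
        lr_analogue = hr_analogue.reshape(
            hr_analogue.shape[0] // factor, factor,
            hr_analogue.shape[1] // factor, factor).mean(axis=(1, 3))
        residual = hr_analogue - np.kron(lr_analogue, np.ones((factor, factor)))
        coarse = np.kron(lr_target, np.ones((factor, factor)))
        out = []
        for _ in range(n_real):
            # A random shift draws a different, equally plausible fine pattern
            shift = (rng.integers(0, residual.shape[0]),
                     rng.integers(0, residual.shape[1]))
            out.append(coarse + np.roll(residual, shift, axis=(0, 1)))
        return out

    rng = np.random.default_rng(0)
    hr_analogue = rng.normal(size=(32, 32)).cumsum(0).cumsum(1)  # training surface
    lr_target = rng.normal(size=(8, 8)).cumsum(0).cumsum(1)      # coarse target DEM
    reals = downscale_realizations(lr_target, hr_analogue)
    print(reals[0].shape)   # (32, 32): one candidate HR-DEM realization
    ```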

  8. Large scale digital atlases in neuroscience

    NASA Astrophysics Data System (ADS)

    Hawrylycz, M.; Feng, D.; Lau, C.; Kuan, C.; Miller, J.; Dang, C.; Ng, L.

    2014-03-01

    Imaging in neuroscience has revolutionized our current understanding of brain structure, architecture and, increasingly, function. Many characteristics of morphology, cell type, and neuronal circuitry have been elucidated through methods of neuroimaging. Combining these data in a meaningful, standardized, and accessible manner is the scope and goal of the digital brain atlas. Digital brain atlases are used today in neuroscience to characterize the spatial organization of neuronal structures, for planning and guidance during neurosurgery, and as a reference for interpreting other data modalities such as gene expression and connectivity data. The field of digital atlases is extensive and, in addition to atlases of the human brain, includes high-quality brain atlases of the mouse, rat, rhesus macaque, and other model organisms. Using techniques based on histology, structural and functional magnetic resonance imaging, as well as gene expression data, modern digital atlases employ probabilistic and multimodal techniques, as well as sophisticated visualization software, to form an integrated product. Toward this goal, brain atlases form a common coordinate framework for summarizing, accessing, and organizing this knowledge and will undoubtedly remain a key technology in neuroscience in the future. Since the development of its flagship project, a genome-wide image-based atlas of the mouse brain, the Allen Institute for Brain Science has used imaging as a primary data modality for many of its large-scale atlas projects. We present an overview of Allen Institute digital atlases in neuroscience, with a focus on the challenges and opportunities for image processing and computation.

  9. MicroEcos: Micro-Scale Explorations of Large-Scale Late Pleistocene Ecosystems

    NASA Astrophysics Data System (ADS)

    Gellis, B. S.

    2017-12-01

    Pollen data can inform the reconstruction of early-floral environments by providing data for artistic representations of what early-terrestrial ecosystems looked like, and how existing terrestrial landscapes have evolved. For example, what did the Bighorn Basin look like when large ice sheets covered modern Canada, the Yellowstone Plateau had an ice cap, and the Bighorn Mountains were mantled with alpine glaciers? MicroEcos is an immersive, multimedia project that aims to strengthen human-nature connections through the understanding and appreciation of biological ecosystems. Collected pollen data elucidate flora that are visible in the Late-Pleistocene fossil record and have been illustrated and described in botanical literature. The project aims to make scientific data accessible and interesting to all audiences through a series of interactive digital sculptures, large-scale photography and field-based videography. While this project is driven by scientific data, it is rooted in deeply artistic and outreach-based practices, including digital design, illustration, photography, video and sound design. Using 3D modeling and printing technology, MicroEcos centers on a series of 3D-printed models of the Last Canyon rock shelter on the Wyoming and Montana border, the Little Windy Hill pond site in Wyoming's Medicine Bow National Forest, and the Natural Trap Cave site in Wyoming's Big Horn Basin. These digital, interactive 3D sculptures provide audiences with glimpses of three-dimensional Late-Pleistocene environments and help create dialogue about how grass-, sagebrush-, and spruce-based ecosystems form. To help audiences better contextualize how MicroEcos bridges notions of time, space, and place, modern photography and videography of the Last Canyon, Little Windy Hill and Natural Trap Cave sites surround these 3D-digital reconstructions.

  10. Benefits and applications of interdisciplinary digital tools for environmental meta-reviews and analyses

    NASA Astrophysics Data System (ADS)

    Grubert, Emily; Siders, Anne

    2016-09-01

    Digitally-aided reviews of large bodies of text-based information, such as academic literature, are growing in capability but are not yet common in environmental fields. Environmental sciences and studies can benefit from application of digital tools to create comprehensive, replicable, interdisciplinary reviews that provide rapid, up-to-date, and policy-relevant reports of existing work. This work reviews the potential for applications of computational text mining and analysis tools originating in the humanities to environmental science and policy questions. Two process-oriented case studies of digitally-aided environmental literature reviews and meta-analyses illustrate potential benefits and limitations. A medium-sized, medium-resolution review (∼8000 journal abstracts and titles) focuses on topic modeling as a rapid way to identify thematic changes over time. A small, high-resolution review (∼300 full text journal articles) combines collocation and network analysis with manual coding to synthesize and question empirical field work. We note that even small digitally-aided analyses are close to the upper limit of what can be done manually. Established computational methods developed in humanities disciplines and refined by humanities and social science scholars to interrogate large bodies of textual data are applicable and useful in environmental sciences but have not yet been widely applied. Two case studies provide evidence that digital tools can enhance insight. Two major conclusions emerge. First, digital tools enable scholars to engage large literatures rapidly and, in some cases, more comprehensively than is possible manually. Digital tools can confirm manually identified patterns or identify additional patterns visible only at a large scale. Second, digital tools allow for more replicable and transparent conclusions to be drawn from literature reviews and meta-analyses. The methodological subfields of digital humanities and computational social sciences will likely continue to create innovative tools for analyzing large bodies of text, providing opportunities for interdisciplinary collaboration with the environmental fields.
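
    A minimal sketch of the medium-resolution workflow (topic modeling over abstracts) follows, using scikit-learn's LatentDirichletAllocation on four toy abstracts; the corpus, topic count, and vocabulary handling are illustrative assumptions.

    ```python
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.decomposition import LatentDirichletAllocation

    abstracts = [
        "glacier melt and alpine streamflow response to climate warming",
        "urban air quality policy and particulate emission controls",
        "streamflow modelling under snowpack decline in mountain basins",
        "emission trading schemes and air pollution regulation outcomes",
    ]
    vec = CountVectorizer(stop_words="english")
    X = vec.fit_transform(abstracts)                    # document-term matrix
    lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)

    vocab = vec.get_feature_names_out()
    for k, topic in enumerate(lda.components_):
        top = topic.argsort()[-4:][::-1]                # four strongest words
        print(f"topic {k}:", ", ".join(vocab[i] for i in top))
    ```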

  11. English Teachers in the Digital Age--A Case Study of Policy and Expert Practice from England

    ERIC Educational Resources Information Center

    Goodwyn, Andy

    2011-01-01

    This article is a case study of how English teachers in England have coped with the paradigm shift from print to digital literacy. It reviews a large scale national initiative that was intended to upskill all teachers, considers its weak impact and explores the author's involvement in the evaluation of the project's direct value to English…

  12. Evaluation of a digital food photography atlas used as portion size measurement aid in dietary surveys in Greece.

    PubMed

    Naska, Androniki; Valanou, Elisavet; Peppa, Eleni; Katsoulis, Michail; Barbouni, Anastasia; Trichopoulou, Antonia

    2016-09-01

    To evaluate how well respondents perceive digital images of food portions commonly consumed in Greece. The picture series was defined on the basis of usual dietary intakes assessed in earlier large-scale studies in Greece. The evaluation included 2218 pre-weighed actual portions shown to participants, who were subsequently asked to link each portion to a food picture. Mean differences between picture numbers selected and portions actually shown were compared using the Wilcoxon paired signed-rank test. The effect of personal characteristics on participants' selections was evaluated through unpaired t tests (sex and school years) or through Tukey-Kramer pairwise comparisons (age and food groups). Testing of participants' perception of digital food images used in the Greek national nutrition survey. Individuals (n 103, 61 % females) aged 12 years and over, selected on the basis of the target population of the Greek nutrition survey using convenience sampling. Individuals selected the correct or adjacent image in about 90 % of the assessments and tended to overestimate small and underestimate large quantities. Photographs of Greek traditional pies and meat-based pastry dishes led participants to perceive the amounts in the photos larger than they actually were. Adolescents were more prone to underestimating food quantities through the pictures. The digital food atlas appears generally suitable to be used for the estimation of average food intakes in large-scale dietary surveys in Greece. However, individuals who consistently consume only small or only large food portions may have biased perceptions in relation to others.
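
    The core of the evaluation - comparing selected picture numbers against the pictures matching the weighed portions with a Wilcoxon paired signed-rank test - can be sketched as below, with mock data standing in for the 2218 assessments.

    ```python
    import numpy as np
    from scipy.stats import wilcoxon

    shown    = np.array([1, 2, 2, 3, 4, 4, 5, 6, 3, 5])  # picture matching true portion
    selected = np.array([1, 2, 3, 3, 3, 4, 4, 5, 3, 5])  # picture the participant chose

    stat, p = wilcoxon(selected, shown)                  # paired signed-rank test
    bias = (selected - shown).mean()
    print(f"W={stat}, p={p:.3f}, mean bias={bias:+.2f} picture steps")
    # A negative bias would mirror the reported tendency to underestimate
    # large portions; adjacent-picture choices count as near-misses.
    ```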

  13. Historical glacier outlines from digitized topographic maps of the Swiss Alps

    NASA Astrophysics Data System (ADS)

    Freudiger, Daphné; Mennekes, David; Seibert, Jan; Weiler, Markus

    2018-04-01

    Since the end of the Little Ice Age around 1850, the total glacier area of the central European Alps has considerably decreased. In order to understand the changes in glacier coverage at various scales and to model past and future streamflow accurately, long-term and large-scale datasets of glacier outlines are needed. To fill the gap between the morphologically reconstructed glacier outlines from the moraine extent corresponding to the time period around 1850 and the first complete dataset of glacier areas in the Swiss Alps from aerial photographs in 1973, glacier areas from 80 sheets of a historical topographic map (the Siegfried map) were manually digitized for the publication years 1878-1918 (further called first period, with most sheets being published around 1900) and 1917-1944 (further called second period, with most sheets being published around 1935). The accuracy of the digitized glacier areas was then assessed through a two-step validation process: the data were (1) visually and (2) quantitatively compared to glacier area datasets of the years 1850, 1973, 2003, and 2010, which were derived from different sources, at the large scale, basin scale, and locally. The validation showed that at least 70 % of the digitized glaciers were comparable to the outlines from the other datasets and were therefore plausible. Furthermore, the inaccuracy of the manual digitization was found to be less than 5 %. The presented datasets of glacier outlines for the first and second periods are a valuable source of information for long-term glacier mass balance or hydrological modelling in glacierized basins. The uncertainty of the historical topographic maps should be considered during the interpretation of the results. The datasets can be downloaded from the FreiDok plus data repository (https://freidok.uni-freiburg.de/data/15008, https://doi.org/10.6094/UNIFR/15008).

  14. Compact wavelength-selective optical switch based on digital optical phase conjugation.

    PubMed

    Li, Zhiyang; Claver, Havyarimana

    2013-11-15

    In this Letter, we show that digital optical phase conjugation might be utilized to construct a new kind of wavelength-selective switch. When incorporated with a multimode interferometer, these switches have wide bandwidth, high tolerance for fabrication error, and low polarization dependency. They might help to build large-scale multiwavelength nonblocking switching systems, or even to fabricate an optical cross-connect or routing system on a chip.

  15. Large-scale retrieval for medical image analytics: A comprehensive review.

    PubMed

    Li, Zhongyu; Zhang, Xiaofan; Müller, Henning; Zhang, Shaoting

    2018-01-01

    Over the past decades, medical image analytics has been greatly facilitated by the explosion of digital imaging techniques, with huge amounts of medical images produced with ever-increasing quality and diversity. However, conventional methods for analyzing medical images have achieved limited success, as they are not capable of tackling the huge amount of image data. In this paper, we review state-of-the-art approaches for large-scale medical image analysis, which are mainly based on recent advances in computer vision, machine learning and information retrieval. Specifically, we first present the general pipeline of large-scale retrieval and summarize the challenges and opportunities of medical image analytics at large scale. Then, we provide a comprehensive review of algorithms and techniques relevant to the major processes in the pipeline, including feature representation, feature indexing, searching, etc. On the basis of existing work, we introduce the evaluation protocols and multiple applications of large-scale medical image retrieval, with a variety of exploratory and diagnostic scenarios. Finally, we discuss future directions of large-scale retrieval, which can further improve the performance of medical image analysis. Copyright © 2017 Elsevier B.V. All rights reserved.
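
    A minimal sketch of the pipeline stages named in the review (feature representation, indexing, searching) follows; random vectors stand in for learned image descriptors, and scikit-learn's exact NearestNeighbors stands in for the approximate large-scale indexes the paper surveys.

    ```python
    import numpy as np
    from sklearn.neighbors import NearestNeighbors

    rng = np.random.default_rng(0)
    db_features = rng.normal(size=(10_000, 128))       # descriptors of archived images
    index = NearestNeighbors(n_neighbors=5, metric="cosine").fit(db_features)

    query = rng.normal(size=(1, 128))                  # descriptor of a new exam
    dist, idx = index.kneighbors(query)
    print("closest archived images:", idx[0])
    print("cosine distances:", np.round(dist[0], 3))
    ```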

  16. A multiparametric automatic method to monitor long-term reproducibility in digital mammography: results from a regional screening programme.

    PubMed

    Gennaro, G; Ballaminut, A; Contento, G

    2017-09-01

    This study aims to illustrate a multiparametric automatic method for monitoring the long-term reproducibility of digital mammography systems, and its application on a large scale. Twenty-five digital mammography systems employed within a regional screening programme were controlled weekly using the same type of phantom, whose images were analysed by an automatic software tool. To assess system reproducibility levels, 15 image quality indices (IQIs) were extracted and compared with the corresponding indices previously determined by a baseline procedure. The coefficients of variation (COVs) of the IQIs were used to assess the overall variability. A total of 2553 phantom images were collected from the 25 digital mammography systems from March 2013 to December 2014. Most of the systems showed excellent image quality reproducibility over the surveillance interval, with mean variability below 5%. Variability of each IQI was within 5%, with the exception of one index associated with the smallest phantom objects (0.25 mm), which was below 10%. The method applied for the reproducibility tests - multi-detail phantoms, a cloud-based automatic software tool measuring multiple image quality indices, and statistical process control - was proven to be effective and applicable on a large scale and to any type of digital mammography system. • Reproducibility of mammography image quality should be monitored by appropriate quality controls. • Use of automatic software tools allows image quality evaluation by multiple indices. • System reproducibility can be assessed by comparing current index values with baseline data. • Overall system reproducibility of modern digital mammography systems is excellent. • The method proposed and applied is cost-effective and easily scalable.
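
    A hedged sketch of the statistical-process-control step is shown below: weekly image-quality indices are summarized by their coefficient of variation and compared against a baseline, with a flag raised when an assumed 5% tolerance is exceeded. The tolerance and data are illustrative, not the programme's.

    ```python
    import numpy as np

    def reproducibility_report(weekly_iqi, baseline_mean, limit_pct=5.0):
        weekly_iqi = np.asarray(weekly_iqi, dtype=float)
        cov = 100 * weekly_iqi.std(ddof=1) / weekly_iqi.mean()      # variability, %
        drift = 100 * (weekly_iqi.mean() - baseline_mean) / baseline_mean
        flag = cov > limit_pct or abs(drift) > limit_pct
        return cov, drift, flag

    weeks = [0.98, 1.01, 1.00, 0.99, 1.02, 0.97, 1.00]  # one IQI, normalized units
    cov, drift, flag = reproducibility_report(weeks, baseline_mean=1.00)
    print(f"COV={cov:.1f}%  drift={drift:+.1f}%  out-of-control={flag}")
    ```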

  17. Large-scale feature searches of collections of medical imagery

    NASA Astrophysics Data System (ADS)

    Hedgcock, Marcus W.; Karshat, Walter B.; Levitt, Tod S.; Vosky, D. N.

    1993-09-01

    Large scale feature searches of accumulated collections of medical imagery are required for multiple purposes, including clinical studies, administrative planning, epidemiology, teaching, quality improvement, and research. To perform a feature search of large collections of medical imagery, one can either search text descriptors of the imagery in the collection (usually the interpretation), or (if the imagery is in digital format) the imagery itself. At our institution, text interpretations of medical imagery are all available in our VA Hospital Information System. These are downloaded daily into an off-line computer. The text descriptors of most medical imagery are usually formatted as free text, and so require a user friendly database search tool to make searches quick and easy for any user to design and execute. We are tailoring such a database search tool (Liveview), developed by one of the authors (Karshat). To further facilitate search construction, we are constructing (from our accumulated interpretation data) a dictionary of medical and radiological terms and synonyms. If the imagery database is digital, the imagery which the search discovers is easily retrieved from the computer archive. We describe our database search user interface, with examples, and compare the efficacy of computer assisted imagery searches from a clinical text database with manual searches. Our initial work on direct feature searches of digital medical imagery is outlined.

  18. How much a galaxy knows about its large-scale environment?: An information theoretic perspective

    NASA Astrophysics Data System (ADS)

    Pandey, Biswajit; Sarkar, Suman

    2017-05-01

    The small-scale environment characterized by the local density is known to play a crucial role in deciding galaxy properties, but the role of the large-scale environment in galaxy formation and evolution remains less clear. We propose an information theoretic framework to investigate the influence of large-scale environment on galaxy properties and apply it to data from the Galaxy Zoo project, which provides visual morphological classifications of ~1 million galaxies from the Sloan Digital Sky Survey. We find a non-zero mutual information between morphology and environment that decreases with increasing length-scale but persists throughout the entire range of length-scales probed. We estimate the conditional mutual information and the interaction information between morphology and environment by conditioning the environment on different length-scales and find a synergic interaction between them that operates up to length-scales of at least ~30 h^-1 Mpc. Our analysis indicates that these interactions largely arise from the mutual information shared between the environments on different length-scales.
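
    The mutual-information measurement at the heart of the analysis can be sketched as follows; mock data with a weak morphology-density coupling stand in for the Galaxy Zoo morphologies and SDSS environments.

    ```python
    import numpy as np
    from sklearn.metrics import mutual_info_score

    rng = np.random.default_rng(0)
    density = rng.normal(size=5000)                    # environment on some scale
    # Make morphology depend weakly on environment so MI is non-zero
    morphology = (density + rng.normal(scale=2.0, size=5000) > 0).astype(int)

    # Bin the continuous density into quartiles to form discrete labels
    env_bins = np.digitize(density, np.quantile(density, [0.25, 0.5, 0.75]))
    mi = mutual_info_score(morphology, env_bins)       # reported in nats
    print(f"I(morphology; environment) = {mi:.4f} nats")
    ```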

  19. Operation and tests of a DDC101 A/D

    NASA Astrophysics Data System (ADS)

    Nguyen, H.

    1994-11-01

    For the KTeV PMT laser monitoring system, one needs a high-resolution device with a large dynamic range to be used for digitizing PIN photodiodes. The dynamic range should be wider than or comparable to that of the KTeV digitizer (17 bits). The Burr-Brown DDC101 is a precision, wide-dynamic-range, charge-digitizing A/D converter with 20-bit resolution, packaged in a 28-pin plastic, double-wide DIP. Low-level current-output devices such as photosensors can be directly connected to its input. The digital output can be clocked out serially from the pins. For typical operations, a relatively wide gate of 1 msec should be used. The full-scale charge is 500 pC for unipolar mode; the bipolar-mode scale is +/- 250 pC. The advertised integral nonlinearity is 0.003% of FSR. This document describes only the basic DDC101 operations, since full detail can be found in the DDC101 manual. Test results are given in section 3.
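
    A back-of-envelope check of the quoted figures, under the stated unipolar full scale and resolution:

    ```python
    # 500 pC full scale over 20 bits gives the charge per least significant bit.
    full_scale_c = 500e-12        # unipolar full-scale charge, coulombs
    bits = 20
    lsb = full_scale_c / 2 ** bits
    print(f"charge resolution: {lsb * 1e15:.2f} fC/LSB")   # ~0.48 fC per count
    # At the same full scale, a 17-bit converter's LSB is 2**3 = 8x coarser,
    # which is why the DDC101's dynamic range is "wider than or comparable".
    ```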

  20. Sloan Digital Sky Survey III photometric quasar clustering: Probing the initial conditions of the Universe

    DOE PAGES

    Ho, Shirley; Agarwal, Nishant; Myers, Adam D.; ...

    2015-05-22

    Here, the Sloan Digital Sky Survey has surveyed 14,555 square degrees of the sky, and delivered over a trillion pixels of imaging data. We present the large-scale clustering of 1.6 million quasars between z=0.5 and z=2.5 that have been classified from this imaging, representing the highest density of quasars ever studied for clustering measurements. This data set spans ~11,000 square degrees and probes a volume of 80 h^-3 Gpc^3. In principle, such a large volume and medium density of tracers should facilitate high-precision cosmological constraints. We measure the angular clustering of photometrically classified quasars using an optimal quadratic estimator in four redshift slices, with an accuracy of ~25% over a bin width of δℓ ~ 10–15, on scales corresponding to matter-radiation equality and larger (ℓ ~ 2–30).

  1. Seismic safety in conducting large-scale blasts

    NASA Astrophysics Data System (ADS)

    Mashukov, I. V.; Chaplygin, V. V.; Domanov, V. P.; Semin, A. A.; Klimkin, M. A.

    2017-09-01

    In mining enterprises, a drilling and blasting method is used to prepare hard rock for excavation. As mining operations approach settlements, the negative effects of large-scale blasts increase. To assess the level of seismic impact of large-scale blasts, the scientific staff of Siberian State Industrial University carried out assessments for coal mines and iron ore enterprises. The magnitude of surface seismic vibrations caused by mass explosions was determined using seismic receivers and an analog-to-digital converter recording to a laptop. Results from recordings of surface seismic vibrations during more than 280 large-scale blasts at 17 mining enterprises in 22 settlements are presented. The maximum velocity values of the Earth's surface vibrations are determined. The seismic effects were evaluated for safety against the permissible value of vibration velocity. For cases where permissible values were exceeded, recommendations were developed to reduce the level of seismic impact.

  2. A kilobyte rewritable atomic memory

    NASA Astrophysics Data System (ADS)

    Kalff, Floris; Rebergen, Marnix; Fahrenfort, Nora; Girovsky, Jan; Toskovic, Ranko; Lado, Jose; FernáNdez-Rossier, JoaquíN.; Otte, Sander

    The ability to manipulate individual atoms by means of scanning tunneling microscopy (STM) opens up opportunities for storage of digital data on the atomic scale. Recent achievements in this direction include data storage based on bits encoded in the charge state, the magnetic state, or the local presence of single atoms or atomic assemblies. However, a key challenge at this stage is the extension of such technologies into large-scale rewritable bit arrays. We demonstrate a digital atomic-scale memory of up to 1 kilobyte (8000 bits) using an array of individual surface vacancies in a chlorine-terminated Cu(100) surface. The chlorine vacancies are found to be stable at temperatures up to 77 K. The memory, crafted using scanning tunneling microscopy at low temperature, can be read and re-written automatically by means of atomic-scale markers, and offers an areal density of 502 terabits per square inch, outperforming state-of-the-art hard disk drives by three orders of magnitude.

  3. Mapping the Heavens: Probing Cosmology with Large Surveys

    ScienceCinema

    Frieman, Joshua [Fermilab]

    2017-12-09

    This talk will provide an overview of recent and on-going sky surveys, focusing on their implications for cosmology. I will place particular emphasis on the Sloan Digital Sky Survey, the most ambitious mapping of the Universe yet undertaken, showing a virtual fly-through of the survey that reveals the large-scale structure of the galaxy distribution. Recent measurements of this large-scale structure, in combination with observations of the cosmic microwave background, have provided independent evidence for a Universe dominated by dark matter and dark energy as well as insights into how galaxies and larger-scale structures formed. Future planned surveys will build on these foundations to probe the history of the cosmic expansion--and thereby the dark energy--with greater precision.

  4. Soil-geographical regionalization as a basis for digital soil mapping: Karelia case study

    NASA Astrophysics Data System (ADS)

    Krasilnikov, P.; Sidorova, V.; Dubrovina, I.

    2010-12-01

    Recent developments in digital soil mapping (DSM) have significantly improved the quality of soil maps. We set out to build a set of empirical models for the territory of Karelia, a republic in the north-east of the European territory of the Russian Federation. This territory was selected for the DSM pilot study for two reasons. First, the soils of the region are mainly monogenetic; thus, the effect of the paleogeographic environment on recent soils is reduced. Second, the territory was poorly mapped because of low agricultural development: only 1.8% of the total area of the republic is used for agriculture and has large-scale soil maps. The rest of the territory has only small-scale soil maps, compiled based on general geographic concepts rather than on field surveys. Thus, the only solution for soil inventory was predictive digital mapping. The absence of large-scale soil maps did not allow data mining from previous soil surveys, and only empirical models could be applied. For regionalization purposes, we accepted the division into Northern and Southern Karelia proposed in the general scheme of soil regionalization of Russia; boundaries between the regions were somewhat modified. Within each region, we specified from 15 (Northern Karelia) to 32 (Southern Karelia) individual soilscapes and proposed soil-topographic and soil-lithological relationships for every soilscape. Further field verification is needed to adjust the models.

  5. Building continental-scale 3D subsurface layers in the Digital Crust project: constrained interpolation and uncertainty estimation.

    NASA Astrophysics Data System (ADS)

    Yulaeva, E.; Fan, Y.; Moosdorf, N.; Richard, S. M.; Bristol, S.; Peters, S. E.; Zaslavsky, I.; Ingebritsen, S.

    2015-12-01

    The Digital Crust EarthCube building block creates a framework for integrating disparate 3D/4D information from multiple sources into a comprehensive model of the structure and composition of the Earth's upper crust, and demonstrates the utility of this model in several research scenarios. One such scenario is the estimation of various crustal properties related to fluid dynamics (e.g. permeability and porosity) at each node of an arbitrary unstructured 3D grid to support continental-scale numerical models of fluid flow and transport. Starting from Macrostrat, an existing 4D database of 33,903 chronostratigraphic units, and employing GeoDeepDive, a software system for extracting structured information from unstructured documents, we construct 3D gridded fields of sediment/rock porosity, permeability and geochemistry for large sedimentary basins of North America, which will be used to improve our understanding of large-scale fluid flow, chemical weathering rates, and geochemical fluxes into the ocean. In this talk, we discuss the methods, data gaps (particularly in geologically complex terrain), and various physical and geological constraints on interpolation and uncertainty estimation.
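
    As a hedged sketch of the grid-estimation scenario, the snippet below interpolates a scattered crustal property onto arbitrary 3-D nodes with scipy's griddata; the synthetic samples and plain linear interpolation stand in for the project's constrained, geologically informed interpolators.

    ```python
    import numpy as np
    from scipy.interpolate import griddata

    rng = np.random.default_rng(0)
    sample_xyz = rng.uniform(0, 1, size=(500, 3))            # known sample locations
    # Mock property: log10 permeability trending with depth plus noise
    log_k = -12 + 2 * sample_xyz[:, 2] + rng.normal(scale=0.1, size=500)

    grid_nodes = rng.uniform(0.1, 0.9, size=(8, 3))          # unstructured model grid
    est = griddata(sample_xyz, log_k, grid_nodes, method="linear")
    print(np.round(est, 2))   # NaN would flag a node outside the data hull
    ```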

  6. Studies of Global Solar Magnetic Field Patterns Using a Newly Digitized Archive

    NASA Astrophysics Data System (ADS)

    Hewins, I.; Webb, D. F.; Gibson, S. E.; McFadden, R.; Emery, B. A.; Malanushenko, A. V.

    2017-12-01

    The McIntosh Archive consists of a set of hand-drawn solar Carrington maps created by Patrick McIntosh from 1964 to 2009. McIntosh used mainly Hα, He 10830 Å and photospheric magnetic measurements from both ground-based and NASA satellite observations. With these he traced polarity inversion lines (PILs), filaments, sunspots and plage and, later, coronal holes, yielding a unique 45-year record of features associated with the large-scale organization of the solar magnetic field. We discuss our efforts to preserve and digitize this archive: the original hand-drawn maps have been scanned, a method for processing these scans into a digital, searchable format has been developed, and a website and an archival repository at NOAA's National Centers for Environmental Information (NCEI) have been created. The archive is complete for SC 23 and partially complete for SCs 21 and 22. In this paper we show examples of how the database can be utilized for scientific applications. We compare the evolution of the areas and boundaries of CHs with other recent results, and we use the maps to track the global, solar-cycle evolution of filaments, large-scale positive and negative polarity regions, PILs and sunspots.

  7. Force-independent distribution of correlated neural inputs to hand muscles during three-digit grasping.

    PubMed

    Poston, Brach; Danna-Dos Santos, Alessander; Jesunathadas, Mark; Hamm, Thomas M; Santello, Marco

    2010-08-01

    The ability to modulate digit forces during grasping relies on the coordination of multiple hand muscles. Because many muscles innervate each digit, the CNS can potentially choose from a large number of muscle coordination patterns to generate a given digit force. Studies of single-digit force production tasks have revealed that the electromyographic (EMG) activity scales uniformly across all muscles as a function of digit force. However, the extent to which this finding applies to the coordination of forces across multiple digits is unknown. We addressed this question by asking subjects (n = 8) to exert isometric forces using a three-digit grip (thumb, index, and middle fingers) that allowed for the quantification of hand muscle coordination within and across digits as a function of grasp force (5, 20, 40, 60, and 80% maximal voluntary force). We recorded EMG from 12 muscles (6 extrinsic and 6 intrinsic) of the three digits. Hand muscle coordination patterns were quantified in the amplitude and frequency domains (EMG-EMG coherence). EMG amplitude scaled uniformly across all hand muscles as a function of grasp force (muscle x force interaction: P = 0.997; cosines of angle between muscle activation pattern vector pairs: 0.897-0.997). Similarly, EMG-EMG coherence was not significantly affected by force (P = 0.324). However, coherence was stronger across extrinsic than that across intrinsic muscle pairs (P = 0.0039). These findings indicate that the distribution of neural drive to multiple hand muscles is force independent and may reflect the anatomical properties or functional roles of hand muscle groups.

  8. Image Harvest: an open-source platform for high-throughput plant image processing and analysis

    PubMed Central

    Knecht, Avi C.; Campbell, Malachy T.; Caprez, Adam; Swanson, David R.; Walia, Harkamal

    2016-01-01

    High-throughput plant phenotyping is an effective approach to bridge the genotype-to-phenotype gap in crops. Phenomics experiments typically result in large-scale image datasets, which are not amenable for processing on desktop computers, thus creating a bottleneck in the image-analysis pipeline. Here, we present an open-source, flexible image-analysis framework, called Image Harvest (IH), for processing images originating from high-throughput plant phenotyping platforms. Image Harvest is developed to perform parallel processing on computing grids and provides an integrated feature for metadata extraction from large-scale file organization. Moreover, the integration of IH with the Open Science Grid provides academic researchers with the computational resources required for processing large image datasets at no cost. Image Harvest also offers functionalities to extract digital traits from images to interpret plant architecture-related characteristics. To demonstrate the applications of these digital traits, a rice (Oryza sativa) diversity panel was phenotyped and genome-wide association mapping was performed using digital traits that are used to describe different plant ideotypes. Three major quantitative trait loci were identified on rice chromosomes 4 and 6, which co-localize with quantitative trait loci known to regulate agronomically important traits in rice. Image Harvest is an open-source software for high-throughput image processing that requires a minimal learning curve for plant biologists to analyze phenomics datasets. PMID:27141917

  9. Radiologic image communication and archive service: a secure, scalable, shared approach

    NASA Astrophysics Data System (ADS)

    Fellingham, Linda L.; Kohli, Jagdish C.

    1995-11-01

    The Radiologic Image Communication and Archive (RICA) service is designed to provide a shared archive for medical images to the widest possible audience of customers. Images are acquired from a number of different modalities, each available from many different vendors. Images are acquired digitally from those modalities which support direct digital output and by digitizing films for projection x-ray exams. The RICA Central Archive receives standard DICOM 3.0 messages and data streams from the medical imaging devices at customer institutions over the public telecommunication network. RICA represents a completely scalable resource. The user pays only for what he is using today with the full assurance that as the volume of image data that he wishes to send to the archive increases, the capacity will be there to accept it. To provide this seamless scalability imposes several requirements on the RICA architecture: (1) RICA must support the full array of transport services. (2) The Archive Interface must scale cost-effectively to support local networks that range from the very small (one x-ray digitizer in a medical clinic) to the very large and complex (a large hospital with several CTs, MRs, Nuclear medicine devices, ultrasound machines, CRs, and x-ray digitizers). (3) The Archive Server must scale cost-effectively to support rapidly increasing demands for service providing storage for and access to millions of patients and hundreds of millions of images. The architecture must support the incorporation of improved technology as it becomes available to maintain performance and remain cost-effective as demand rises.

  10. Integration of airborne Thematic Mapper Simulator (TMS) data and digitized aerial photography via an ISH transformation. [Intensity Saturation Hue

    NASA Technical Reports Server (NTRS)

    Ambrosia, Vincent G.; Myers, Jeffrey S.; Ekstrand, Robert E.; Fitzgerald, Michael T.

    1991-01-01

    A simple method for enhancing the spatial and spectral resolution of disparate data sets is presented. Two data sets, digitized aerial photography at a nominal spatial resolution of 3.7 meters and TMS digital data at 24.6 meters, were coregistered through a bilinear interpolation to solve the problem of blocky pixel groups resulting from rectification expansion. The two data sets were then subjected to intensity-saturation-hue (ISH) transformations in order to 'blend' the high-spatial-resolution (3.7 m) digitized RC-10 photography with the high-spectral (12-band), lower-spatial-resolution (24.6 m) TMS digital data. The resultant merged products make it possible to perform large-scale mapping, ease photointerpretation, and can be derived for any of the 12 available TMS spectral bands.
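
    The merge can be sketched as below using an HSV transform as a stand-in for the paper's ISH transform: the multispectral composite is upsampled to the photo grid, its intensity channel is replaced by the high-resolution panchromatic image, and the result is transformed back. Arrays are synthetic.

    ```python
    import numpy as np
    from skimage.color import rgb2hsv, hsv2rgb
    from skimage.transform import resize

    rng = np.random.default_rng(0)
    tms_rgb = rng.random((64, 64, 3))                 # 24.6 m multispectral bands
    pan_hires = rng.random((256, 256))                # 3.7 m digitized photography

    hsv = rgb2hsv(resize(tms_rgb, (256, 256, 3)))     # upsample color to the pan grid
    hsv[..., 2] = pan_hires                           # intensity <- hi-res detail
    merged = hsv2rgb(hsv)
    print(merged.shape)                               # (256, 256, 3) sharpened product
    ```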

  11. Demonstration of three gorges archaeological relics based on 3D-visualization technology

    NASA Astrophysics Data System (ADS)

    Xu, Wenli

    2015-12-01

    This paper focuses on the digital demonstration of Three Gorges archaeological relics to exhibit the achievements of the protective measures. A novel and effective method based on 3D-visualization technology, which includes large-scale landscape reconstruction, virtual studio, and virtual panoramic roaming, is proposed to create a digitized interactive demonstration system. The method contains three stages: pre-processing, 3D modeling and integration. First, abundant archaeological information is classified according to its historical and geographical context. Second, a 3D-model library is built up with digital image processing and 3D modeling technology. Third, virtual reality technology is used to display the archaeological scenes and cultural relics vividly and realistically. The present work promotes the application of virtual reality to digital projects and enriches the content of digital archaeology.

  12. Large-scale mapping of hard-rock aquifer properties applied to Burkina Faso.

    PubMed

    Courtois, Nathalie; Lachassagne, Patrick; Wyns, Robert; Blanchin, Raymonde; Bougaïré, Francis D; Somé, Sylvain; Tapsoba, Aïssata

    2010-01-01

    A country-scale (1:1,000,000) methodology has been developed for hydrogeologic mapping of hard-rock aquifers (granitic and metamorphic rocks) of the type that underlie a large part of the African continent. The method is based on quantifying the "useful thickness" and hydrodynamic properties of such aquifers and uses a recent conceptual model developed for this hydrogeologic context. This model links hydrodynamic parameters (transmissivity, storativity) to lithology and the geometry of the various layers constituting a weathering profile. The country-scale hydrogeological mapping was implemented in Burkina Faso, where a recent 1:1,000,000-scale digital geological map and a database of some 16,000 water wells were used to evaluate the methodology.

  13. Issues for bringing digital libraries into public use

    NASA Technical Reports Server (NTRS)

    Flater, David W.; Yesha, Yelena

    1993-01-01

    In much the same way that the field of artificial intelligence produced a cult which fervently believed that computers would soon think like human beings, the existence of electronic books has resurrected the paperless society as a utopian vision to some, an apocalyptic horror to others. In this essay we have attempted to provide realistic notions of what digital libraries are likely to become if they are a popular success. E-books are capable of subsuming most of the media we use today and have the potential for added functionality by being interactive. The environmental impact of having millions more computers will be offset to some degree, perhaps even exceeded, by the fact that televisions, stereos, VCR's, CD players, newspapers, magazines, and books will become part of the computer system or be made redundant. On the whole, large-scale use of digital libraries is likely to be a winning proposition. Whether or not this comes to pass depends on the directions taken by today's researchers and software developers. By involving the public, the effort being put into digital libraries can be leveraged into something which is big enough to make a real change for the better. If digital libraries remain the exclusive property of government, universities, and large research firms, then large parts of the world will remain without digital libraries for years to come, just as they have remained without digital phone service for far too long. If software companies try to scuttle the project by patenting crucial algorithms and using proprietary data formats, all of us will suffer. Let us reverse the errors of the past and create a truly open digital library system.

  14. Digital version of "Open-File Report 92-179: Geologic map of the Cow Cove Quadrangle, San Bernardino County, California"

    USGS Publications Warehouse

    Wilshire, Howard G.; Bedford, David R.; Coleman, Teresa

    2002-01-01

    3. Plottable map representations of the database at 1:24,000 scale in PostScript and Adobe PDF formats. The plottable files consist of a color geologic map derived from the spatial database, composited with a topographic base map in the form of the USGS Digital Raster Graphic for the map area. Color symbology from each of these datasets is maintained, which can cause plot file sizes to be large.

  15. Research on an optoelectronic measurement system of dynamic envelope measurement for China Railway high-speed train

    NASA Astrophysics Data System (ADS)

    Zhao, Ziyue; Gan, Xiaochuan; Zou, Zhi; Ma, Liqun

    2018-01-01

    Dynamic envelope measurement plays a very important role in the external dimension design of high-speed trains, yet until now no digital measurement system has addressed this problem. This paper develops an optoelectronic measurement system based on monocular digital cameras and presents the underlying measurement theory, visual target design, calibration algorithm, software implementation and so on. The system consists of several CMOS digital cameras, several luminous measurement targets, a scale bar, data processing software and a terminal computer. It offers a large measurement volume, a high degree of automation, strong anti-interference ability, noise rejection and real-time operation. We resolve key technical issues such as the transfer, storage and processing of high-resolution digital images from multiple cameras. Experimental data show that the repeatability of the system is within 0.02 mm and its distance error is within 0.12 mm over the whole workspace, verifying the rationality of the system design and the correctness, precision and effectiveness of the methods involved.
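
    The repeatability and scale-bar checks quoted above can be reproduced in a few lines. A minimal Python sketch (numpy; synthetic numbers, not the authors' data) of both statistics:

        import numpy as np

        def repeatability(points):
            """Combined standard deviation of repeated 3D fixes of one target (mm)."""
            points = np.asarray(points)                   # shape (n_repeats, 3)
            return np.linalg.norm(points.std(axis=0, ddof=1))

        def distance_error(p1, p2, nominal_mm):
            """Deviation of a measured scale-bar length from its calibrated value."""
            measured = np.linalg.norm(np.asarray(p1) - np.asarray(p2))
            return measured - nominal_mm

        rng = np.random.default_rng(0)
        reps = 300.0 + rng.normal(0, 0.01, size=(30, 3))  # 30 repeat fixes of one target
        print(repeatability(reps))                        # on the order of 0.02 mm
        print(distance_error([0, 0, 0], [999.9, 0, 0.1], 1000.0))  # within 0.12 mm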

  16. Evaluation of Cartosat-1 Multi-Scale Digital Surface Modelling Over France

    PubMed Central

    Gianinetto, Marco

    2009-01-01

    On 5 May 2005, the Indian Space Research Organization launched Cartosat-1, the eleventh satellite of its constellation, dedicated to the stereo viewing of the Earth's surface for terrain modeling and large-scale mapping, from the Satish Dhawan Space Centre (India). In early 2006, the Indian Space Research Organization started the Cartosat-1 Scientific Assessment Programme, jointly established with the International Society for Photogrammetry and Remote Sensing. Within this framework, this study evaluated the capabilities of digital surface modeling from Cartosat-1 stereo data for the French test sites of Mausanne les Alpilles and Salon de Provence. The investigation pointed out that for hilly territories it is possible to produce high-resolution digital surface models with a root mean square error less than 7.1 m and a linear error at 90% confidence level less than 9.5 m. The accuracy of the generated digital surface models also fulfilled the requirements of the French Reference 3D®, so Cartosat-1 data may be used to produce or update such kinds of products. PMID:22412311
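
    The two accuracy figures reported (root mean square error and linear error at 90% confidence) are standard residual statistics. A hedged Python sketch with synthetic residuals, purely to show the definitions:

        import numpy as np

        def rmse(dz):
            """Root mean square error of elevation residuals dz = DSM - reference."""
            dz = np.asarray(dz, dtype=float)
            return np.sqrt(np.mean(dz ** 2))

        def le90(dz):
            """Empirical linear error at 90% confidence: 90th percentile of |dz|."""
            return np.percentile(np.abs(dz), 90)

        dz = np.random.default_rng(1).normal(0.0, 4.0, 10_000)  # synthetic residuals (m)
        print(f"RMSE = {rmse(dz):.2f} m, LE90 = {le90(dz):.2f} m")
        # For zero-mean normal residuals, LE90 is approximately 1.645 * sigma.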

  17. Bundle block adjustment of large-scale remote sensing data with Block-based Sparse Matrix Compression combined with Preconditioned Conjugate Gradient

    NASA Astrophysics Data System (ADS)

    Zheng, Maoteng; Zhang, Yongjun; Zhou, Shunping; Zhu, Junfeng; Xiong, Xiaodong

    2016-07-01

    In recent years, new platforms and sensors in photogrammetry, remote sensing and computer vision have become available, such as Unmanned Aerial Vehicles (UAV), oblique camera systems, common digital cameras and even mobile phone cameras. Images collected by all these kinds of sensors can be used as remote sensing data sources, yielding large-scale datasets that consist of a great number of images. Bundle block adjustment of such large-scale data with conventional algorithms is very time- and memory-consuming because of the extremely large normal matrix involved. In this paper, an efficient Block-based Sparse Matrix Compression (BSMC) method combined with the Preconditioned Conjugate Gradient (PCG) algorithm is used to develop a stable and efficient bundle block adjustment system for large-scale remote sensing data. The main contribution of this work is the BSMC-based PCG algorithm, which is more efficient in time and memory than the traditional algorithm without compromising accuracy. In total, eight real datasets are used to test the proposed method. Preliminary results show that the BSMC method can efficiently decrease the time and memory requirements of large-scale bundle adjustment.
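
    The BSMC storage scheme itself is specific to the paper, but the surrounding solver is standard: solve the normal equations N x = b with conjugate gradients and a simple preconditioner instead of a dense factorization. A minimal sketch using scipy's generic sparse storage and a Jacobi preconditioner (a toy stand-in, not the authors' BSMC code):

        import numpy as np
        import scipy.sparse as sp
        from scipy.sparse.linalg import cg, LinearOperator

        # Toy sparse SPD "normal matrix" N = A^T A + I, standing in for the
        # normal equations of a large bundle block adjustment.
        A = sp.random(2000, 1000, density=0.005, random_state=2, format="csr")
        N = (A.T @ A + sp.identity(1000)).tocsr()
        b = np.random.default_rng(2).normal(size=1000)

        d = N.diagonal()                                   # Jacobi preconditioner
        M = LinearOperator(N.shape, matvec=lambda x: x / d)

        x, info = cg(N, b, M=M)                            # info == 0: converged
        print(info, np.linalg.norm(N @ x - b))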

  18. The use of digital imaging, video conferencing, and telepathology in histopathology: a national survey.

    PubMed

    Dennis, T; Start, R D; Cross, S S

    2005-03-01

    To undertake a large scale survey of histopathologists in the UK to determine the current infrastructure, training, and attitudes to digital pathology. A postal questionnaire was sent to 500 consultant histopathologists randomly selected from the membership of the Royal College of Pathologists in the UK. There was a response rate of 47%. Sixty four per cent of respondents had a digital camera mounted on their microscope, but only 12% had any sort of telepathology equipment. Thirty per cent used digital images in electronic presentations at meetings at least once a year and only 24% had ever used telepathology in a diagnostic situation. Fifty nine per cent had received no training in digital imaging. Fifty eight per cent felt that the medicolegal implications of duty of care were a barrier to its use. A large proportion of pathologists (69%) were interested in using video conferencing for remote attendance at multidisciplinary team meetings. There is a reasonable level of equipment and communications infrastructure among histopathologists in the UK but a very low level of training. There is resistance to the use of telepathology in the diagnostic context but enthusiasm for the use of video conferencing in multidisciplinary team meetings.

  19. The large-scale effect of environment on galactic conformity

    NASA Astrophysics Data System (ADS)

    Sun, Shuangpeng; Guo, Qi; Wang, Lan; Lacey, Cedric G.; Wang, Jie; Gao, Liang; Pan, Jun

    2018-07-01

    We use a volume-limited galaxy sample from the Sloan Digital Sky Survey Data Release 7 to explore the dependence of galactic conformity on the large-scale environment, measured on ~4 Mpc scales. We find that the star formation activity of neighbour galaxies depends more strongly on the environment than on the activity of their primary galaxies. In underdense regions most neighbour galaxies tend to be active, while in overdense regions neighbour galaxies are mostly passive, regardless of the activity of their primary galaxies. At a given stellar mass, passive primary galaxies reside in higher density regions than active primary galaxies, leading to the apparently strong conformity signal. The dependence of the activity of neighbour galaxies on environment can be explained by the corresponding dependence of the fraction of satellite galaxies. Similar results are found for galaxies in a semi-analytical model, suggesting that no new physics is required to explain the observed large-scale conformity.

  20. Random access in large-scale DNA data storage.

    PubMed

    Organick, Lee; Ang, Siena Dumas; Chen, Yuan-Jyue; Lopez, Randolph; Yekhanin, Sergey; Makarychev, Konstantin; Racz, Miklos Z; Kamath, Govinda; Gopalan, Parikshit; Nguyen, Bichlien; Takahashi, Christopher N; Newman, Sharon; Parker, Hsing-Yeh; Rashtchian, Cyrus; Stewart, Kendall; Gupta, Gagan; Carlson, Robert; Mulligan, John; Carmean, Douglas; Seelig, Georg; Ceze, Luis; Strauss, Karin

    2018-03-01

    Synthetic DNA is durable and can encode digital data with high density, making it an attractive medium for data storage. However, recovering stored data on a large scale currently requires all the DNA in a pool to be sequenced, even if only a subset of the information needs to be extracted. Here, we encode and store 35 distinct files (over 200 MB of data), in more than 13 million DNA oligonucleotides, and show that we can recover each file individually and with no errors, using a random access approach. We design and validate a large library of primers that enable individual recovery of all files stored within the DNA. We also develop an algorithm that greatly reduces the sequencing read coverage required for error-free decoding by maximizing information from all sequence reads. These advances demonstrate a viable, large-scale system for DNA data storage and retrieval.
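
    The random-access idea can be illustrated without any chemistry: every oligo of a file starts with that file's primer, so selective amplification behaves like filtering reads by a key. A toy Python sketch (hypothetical primers and payloads):

        # Each file's oligos carry a unique primer; PCR amplification is modeled
        # here as simple prefix matching over the pooled reads.
        PRIMERS = {"file_a": "ACGTACGT", "file_b": "TTGGCCAA"}  # hypothetical

        pool = [
            "ACGTACGT" + "AATTCCGG",   # payload fragment of file_a
            "TTGGCCAA" + "GGAATTCC",   # payload fragment of file_b
            "ACGTACGT" + "CCGGTTAA",
        ]

        def random_access(file_id, pool):
            primer = PRIMERS[file_id]
            return [read[len(primer):] for read in pool if read.startswith(primer)]

        print(random_access("file_a", pool))  # only file_a payloads are "sequenced"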

  1. Large-Scale Document Automation: The Systems Integration Issue.

    ERIC Educational Resources Information Center

    Kalthoff, Robert J.

    1985-01-01

    Reviews current technologies for electronic imaging and its recording and transmission, including digital recording, optical data disks, automated image-delivery micrographics, high-density-magnetic recording, and new developments in telecommunications and computers. The role of the document automation systems integrator, who will bring these…

  2. To Make Archives Available Online: Transcending Boundaries or Building Walls?

    ERIC Educational Resources Information Center

    Hansen, Lars-Erik; Sundqvist, Anneli

    2012-01-01

    The development of information technology and the rise of the Internet have enabled large-scale digitization and dissemination of originally analog information objects. On the Web sites "Lararnas Historia" ("History of Teachers" www.lararhistoria.se) and "Ingenjorshistoria" ("History of Engineers"…

  3. Introducing Large-Scale Innovation in Schools

    ERIC Educational Resources Information Center

    Sotiriou, Sofoklis; Riviou, Katherina; Cherouvis, Stephanos; Chelioti, Eleni; Bogner, Franz X.

    2016-01-01

    Education reform initiatives tend to promise higher effectiveness in classrooms especially when emphasis is given to e-learning and digital resources. Practical changes in classroom realities or school organization, however, are lacking. A major European initiative entitled Open Discovery Space (ODS) examined the challenge of modernizing school…

  4. Efficient design of clinical trials and epidemiological research: is it possible?

    PubMed

    Lauer, Michael S; Gordon, David; Wei, Gina; Pearson, Gail

    2017-08-01

    Randomized clinical trials and large-scale, cohort studies continue to have a critical role in generating evidence in cardiovascular medicine; however, there is increasing concern that ballooning costs threaten the clinical trial enterprise. In this Perspectives article, we discuss the changing landscape of clinical research, and clinical trials in particular, focusing on reasons for the increasing costs and inefficiencies. These reasons include excessively complex design, overly restrictive inclusion and exclusion criteria, burdensome regulations, excessive source-data verification, and concerns about the effect of clinical research conduct on workflow. Thought leaders have called on the clinical research community to consider alternative, transformative business models, including those models that focus on simplicity and leveraging of digital resources. We present some examples of innovative approaches by which some investigators have successfully conducted large-scale, clinical trials at relatively low cost. These examples include randomized registry trials, cluster-randomized trials, adaptive trials, and trials that are fully embedded within digital clinical care or administrative platforms.

  5. On the impact of approximate computation in an analog DeSTIN architecture.

    PubMed

    Young, Steven; Lu, Junjie; Holleman, Jeremy; Arel, Itamar

    2014-05-01

    Deep machine learning (DML) holds the potential to revolutionize machine learning by automating rich feature extraction, which has become the primary bottleneck of human engineering in pattern recognition systems. However, the heavy computational burden renders DML systems implemented on conventional digital processors impractical for large-scale problems. The highly parallel computations required to implement large-scale deep learning systems are well suited to custom hardware. Analog computation has demonstrated power efficiency advantages of multiple orders of magnitude relative to digital systems while performing nonideal computations. In this paper, we investigate typical error sources introduced by analog computational elements and their impact on system-level performance in DeSTIN--a compositional deep learning architecture. These inaccuracies are evaluated on a pattern classification benchmark, clearly demonstrating the robustness of the underlying algorithm to the errors introduced by analog computational elements. A clear understanding of the impacts of nonideal computations is necessary to fully exploit the efficiency of analog circuits.

  6. Sensitivity and specificity of a digit symbol recognition trial in the identification of response bias.

    PubMed

    Kim, Nancy; Boone, Kyle B; Victor, Tara; Lu, Po; Keatinge, Carolyn; Mitchell, Cary

    2010-08-01

    Recently published practice standards recommend that multiple effort indicators be interspersed throughout neuropsychological evaluations to assess for response bias, which is most efficiently accomplished through use of effort indicators from standard cognitive tests already included in test batteries. The present study examined the utility of a timed recognition trial added to standard administration of the WAIS-III Digit Symbol subtest in a large sample of "real world" noncredible patients (n=82) as compared with credible neuropsychology clinic patients (n=89). Scores from the recognition trial were more sensitive in identifying poor effort than were standard Digit Symbol scores, and use of an equation incorporating Digit Symbol Age-Corrected Scaled Scores plus accuracy and time scores from the recognition trial was associated with nearly 80% sensitivity at 88.7% specificity. Thus, inclusion of a brief recognition trial to Digit Symbol administration has the potential to provide accurate assessment of response bias.
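
    The headline numbers (sensitivity at a fixed specificity) come from sweeping a cutoff on the composite score. A schematic Python version with synthetic score distributions (the real equation and weights are in the paper):

        import numpy as np

        def sens_at_spec(credible, noncredible, target_spec=0.887):
            """Best sensitivity over all cutoffs keeping specificity >= target.
            Scores are oriented so that lower values indicate poorer effort."""
            cuts = np.unique(np.concatenate([credible, noncredible]))
            best = 0.0
            for cut in cuts:
                spec = np.mean(credible > cut)       # credible correctly passed
                if spec >= target_spec:
                    best = max(best, np.mean(noncredible <= cut))
            return best

        rng = np.random.default_rng(3)
        credible = rng.normal(10, 2, 89)      # toy scores, n = 89 credible
        noncredible = rng.normal(6, 2, 82)    # toy scores, n = 82 noncredible
        print(sens_at_spec(credible, noncredible))  # ~0.8 for this separation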

  7. Hierarchical hybrid control of manipulators: Artificial intelligence in large scale integrated circuits

    NASA Technical Reports Server (NTRS)

    Greene, P. H.

    1972-01-01

    Both in practical engineering and in control of muscular systems, low level subsystems automatically provide crude approximations to the proper response. Through low level tuning of these approximations, the proper response variant can emerge from standardized high level commands. Such systems are expressly suited to emerging large scale integrated circuit technology. A computer, using symbolic descriptions of subsystem responses, can select and shape responses of low level digital or analog microcircuits. A mathematical theory that reveals significant informational units in this style of control and software for realizing such information structures are formulated.

  8. Multi-resonant piezoelectric shunting induced by digital controllers for subwavelength elastic wave attenuation in smart metamaterial

    NASA Astrophysics Data System (ADS)

    Wang, Gang; Cheng, Jianqing; Chen, Jingwei; He, Yunze

    2017-02-01

    Instead of analog electronic circuits and components, digital controllers that are capable of active multi-resonant piezoelectric shunting are applied to elastic metamaterials integrated with piezoelectric patches. Thanks to recently introduced digital control techniques, shunting strategies are now possible with transfer functions that could hardly be realized with analog circuits. As an example, the 'pole-zero' method is developed to design single- or multi-resonant bandgaps by adjusting poles and zeros in the transfer function of piezoelectric shunting directly. Large simultaneous attenuations in up to three frequency bands at deep subwavelength scale (with normalized frequency as low as 0.077) are achieved. The underlying physical mechanism is attributable to the negative group velocity of the flexural wave within bandgaps. As digital controllers can be readily adapted via wireless broadcasting, the bandgaps can be tuned easily, unlike the electric components in analog shunting circuits, which must be tuned one by one manually. The theoretical results are verified experimentally with the measured vibration transmission properties, where large insulations of up to 20 dB in low-frequency ranges are observed.
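
    The 'pole-zero' idea is easy to prototype offline: place lightly damped pole pairs at the target resonances of the shunting transfer function and inspect the response. A minimal scipy sketch for a single resonance (frequency and damping values are illustrative, not the paper's):

        import numpy as np
        from scipy import signal

        f0 = 77.0                         # illustrative target resonance (Hz)
        w0 = 2 * np.pi * f0
        zeros = [1j * 0.9 * w0, -1j * 0.9 * w0]          # zero pair below f0
        poles = [(-0.02 + 1j) * w0, (-0.02 - 1j) * w0]   # lightly damped pole pair
        H = signal.ZerosPolesGain(zeros, poles, 1.0)

        w, mag, phase = signal.bode(H, w=np.linspace(0.1 * w0, 3 * w0, 500))
        print(f"peak response near {w[np.argmax(mag)] / (2 * np.pi):.1f} Hz")

    Repeating such pole/zero pairs in the transfer function yields the multi-resonant bandgap case described above.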

  9. Reconnaissance geologic map of Kodiak Island and adjacent islands, Alaska

    USGS Publications Warehouse

    Wilson, Frederic H.

    2013-01-01

    Kodiak Island and its adjacent islands, located on the west side of the Gulf of Alaska, contain one of the largest areas of exposure of the flysch and melange of the Chugach terrane of southern Alaska. However, in the past 25 years, only detailed mapping covering small areas in the archipelago has been done. This map and its associated digital files (Wilson and others, 2005) present the best available mapping compiled in an integrated fashion. The map and associated digital files represent part of a systematic effort to release geologic map data for the United States in a uniform manner. The geologic data have been compiled from a wide variety of sources, ranging from state and regional geologic maps to large-scale field mapping. The map data are presented for use at a nominal scale of 1:500,000, although individual datasets (see Wilson and others, 2005) may contain data suitable for use at larger scales.

  10. Digitally programmable microfluidic automaton for multiscale combinatorial mixing and sample processing†

    PubMed Central

    Jensen, Erik C.; Stockton, Amanda M.; Chiesl, Thomas N.; Kim, Jungkyu; Bera, Abhisek; Mathies, Richard A.

    2013-01-01

    A digitally programmable microfluidic Automaton consisting of a 2-dimensional array of pneumatically actuated microvalves is programmed to perform new multiscale mixing and sample processing operations. Large (µL-scale) volume processing operations are enabled by precise metering of multiple reagents within individual nL-scale valves followed by serial repetitive transfer to programmed locations in the array. A novel process exploiting new combining valve concepts is developed for continuous rapid and complete mixing of reagents in less than 800 ms. Mixing, transfer, storage, and rinsing operations are implemented combinatorially to achieve complex assay automation protocols. The practical utility of this technology is demonstrated by performing automated serial dilution for quantitative analysis as well as the first demonstration of on-chip fluorescent derivatization of biomarker targets (carboxylic acids) for microchip capillary electrophoresis on the Mars Organic Analyzer. A language is developed to describe how unit operations are combined to form a microfluidic program. Finally, this technology is used to develop a novel microfluidic 6-sample processor for combinatorial mixing of large sets (>26 unique combinations) of reagents. The digitally programmable microfluidic Automaton is a versatile programmable sample processor for a wide range of process volumes, for multiple samples, and for different types of analyses. PMID:23172232

  11. Application of Virtual and Augmented reality to geoscientific teaching and research.

    NASA Astrophysics Data System (ADS)

    Hodgetts, David

    2017-04-01

    The geological sciences are an ideal candidate for the application of Virtual Reality (VR) and Augmented Reality (AR). Digital data collection techniques such as laser scanning, digital photogrammetry and the increasing use of Unmanned Aerial Vehicle (UAV) or Small Unmanned Aircraft (SUA) technology allow us to collect large datasets efficiently and ever more affordably. This, linked with the recent resurgence in VR and AR technologies, makes these 3D digital datasets even more valuable. These advances in VR and AR have been further supported by rapid improvements in graphics card technologies and by the development of high-performance software applications to support them. Visualising data in VR is more complex than normal 3D rendering: consideration needs to be given to latency, frame rate and the comfort of the viewer to enable reasonably long immersion times. Each frame has to be rendered from two viewpoints (one for each eye), requiring twice the rendering of normal monoscopic views. Any unnatural effects (e.g. incorrect lighting) can lead to an uncomfortable VR experience, so these have to be minimised. With large digital outcrop datasets comprising tens to hundreds of millions of triangles this is challenging but achievable. Apart from the obvious "wow factor" of VR there are some serious applications. It is often the case that users of digital outcrop data do not appreciate the size of the features they are dealing with. This is not the case when using correctly scaled VR, and a true sense of scale can be achieved. In addition, VR provides an excellent way of performing quality control on 3D models and interpretations, as errors are much more easily visible. VR models can then be used to create content for AR applications, closing the loop and taking interpretations back into the field.

  12. Image Harvest: an open-source platform for high-throughput plant image processing and analysis.

    PubMed

    Knecht, Avi C; Campbell, Malachy T; Caprez, Adam; Swanson, David R; Walia, Harkamal

    2016-05-01

    High-throughput plant phenotyping is an effective approach to bridge the genotype-to-phenotype gap in crops. Phenomics experiments typically result in large-scale image datasets, which are not amenable for processing on desktop computers, thus creating a bottleneck in the image-analysis pipeline. Here, we present an open-source, flexible image-analysis framework, called Image Harvest (IH), for processing images originating from high-throughput plant phenotyping platforms. Image Harvest is developed to perform parallel processing on computing grids and provides an integrated feature for metadata extraction from large-scale file organization. Moreover, the integration of IH with the Open Science Grid provides academic researchers with the computational resources required for processing large image datasets at no cost. Image Harvest also offers functionalities to extract digital traits from images to interpret plant architecture-related characteristics. To demonstrate the applications of these digital traits, a rice (Oryza sativa) diversity panel was phenotyped and genome-wide association mapping was performed using digital traits that are used to describe different plant ideotypes. Three major quantitative trait loci were identified on rice chromosomes 4 and 6, which co-localize with quantitative trait loci known to regulate agronomically important traits in rice. Image Harvest is open-source software for high-throughput image processing that requires a minimal learning curve for plant biologists to analyze phenomics datasets. © The Author 2016. Published by Oxford University Press on behalf of the Society for Experimental Biology.

  13. The Electronic Librarian: Inching Towards the Revolution

    ERIC Educational Resources Information Center

    Cuesta, Emerita M.

    2005-01-01

    Electronic resources are transforming the way librarians work. New technological skills have been added to the librarian's tool kit. Some libraries have undertaken large-scale organizational reconfigurations to meet the challenges of the digital environment. Yet libraries still rely on traditional functions such as acquisitions, cataloging, and…

  14. Economizing Education: Assessment Algorithms and Calculative Agencies

    ERIC Educational Resources Information Center

    O'Keeffe, Cormac

    2017-01-01

    International Large Scale Assessments have been producing data about educational attainment for over 60 years. More recently however, these assessments as tests have become digitally and computationally complex and increasingly rely on the calculative work performed by algorithms. In this article I first consider the coordination of relations…

  15. Measurement Invariance of the Digital Natives Assessment Scale across Gender in a Sample of Turkish University Students

    ERIC Educational Resources Information Center

    Ursavas, Ömer Faruk; Kabakçi Yurdakul, Isil; Türk, Mesut; Mcilroy, David

    2016-01-01

    With reference to the digital natives' debate, there is a gap on digital natives' characteristics. To fill this gap, the Digital Natives Assessment Scale was developed to measure students' assessment of the degree to which they perceived themselves to possess the attributes of digital natives. The scale was developed within the Turkish language…

  16. Global energy and water cycle experiment (GEWEX) continental-scale international project (GCIP); reference data sets CD-ROM

    USGS Publications Warehouse

    Rea, Alan; Cederstrand, Joel R.

    1994-01-01

    The data sets on this compact disc are a compilation of several geographic reference data sets of interest to the global-change research community. The data sets were chosen with input from the Global Energy and Water Cycle Experiment (GEWEX) Continental-Scale International Project (GCIP) Data Committee and the GCIP Hydrometeorology and Atmospheric Subpanels. The data sets include: locations and periods of record for stream gages, reservoir gages, and meteorological stations; a 500-meter-resolution digital elevation model; grid-node locations for the Eta numerical weather-prediction model; and digital map data sets of geology, land use, streams, large reservoirs, average annual runoff, average annual precipitation, average annual temperature, average annual heating and cooling degree days, hydrologic units, and state and county boundaries. Also included are digital index maps for LANDSAT scenes, and for the U.S. Geological Survey 1:250,000, 1:100,000, and 1:24,000-scale map series. Most of the data sets cover the conterminous United States; the digital elevation model also includes part of southern Canada. The stream and reservoir gage and meteorological station files cover all states having area within the Mississippi River Basin plus that part of the Mississippi River Basin lying within Canada. Several data-base retrievals were processed by state; therefore, many sites outside the Mississippi River Basin are included.

  17. Evaluating Unmanned Aerial Platforms for Cultural Heritage Large Scale Mapping

    NASA Astrophysics Data System (ADS)

    Georgopoulos, A.; Oikonomou, C.; Adamopoulos, E.; Stathopoulou, E. K.

    2016-06-01

    When it comes to large-scale mapping of limited areas, especially cultural heritage sites, requirements become critical. Optical and non-optical sensors, e.g. LiDAR units, are now developed to sizes and weights that such platforms can lift. At the same time there is increasing emphasis on solutions that enable users to get access to 3D information faster and cheaper. Considering the multitude of platforms and cameras, and the advancement of algorithms in conjunction with the increase in available computing power, this challenge should be, and indeed is, further investigated. In this paper a short review of UAS technologies today is attempted. A discussion follows as to their applicability and advantages, depending on their specifications, which vary immensely. The on-board cameras available are also compared and evaluated for large-scale mapping. Furthermore, a thorough analysis, review and experimentation with different software implementations of Structure from Motion and Multiple View Stereo algorithms, able to process such dense and mostly unordered sequences of digital images, is conducted and presented. As test data we use a rich optical and thermal dataset acquired with different cameras from both fixed-wing and multi-rotor platforms over an archaeological excavation with adverse height variations. Dense 3D point clouds, digital terrain models and orthophotos have been produced and evaluated for their radiometric as well as metric qualities.

  18. Neural Networks For Demodulation Of Phase-Modulated Signals

    NASA Technical Reports Server (NTRS)

    Altes, Richard A.

    1995-01-01

    Hopfield neural networks proposed for demodulating quadrature phase-shift-keyed (QPSK) signals carrying digital information. Networks solve nonlinear integral equations prior demodulation circuits cannot solve. Consists of set of N operational amplifiers connected in parallel, with weighted feedback from output terminal of each amplifier to input terminals of other amplifiers. Used to solve signal processing problems. Implemented as analog very-large-scale integrated circuit that achieves rapid convergence. Alternatively, implemented as digital simulation of such circuit. Also used to improve phase estimation performance over that of phase-locked loop.

  19. A parallel VLSI architecture for a digital filter using a number theoretic transform

    NASA Technical Reports Server (NTRS)

    Truong, T. K.; Reed, I. S.; Yeh, C. S.; Shao, H. M.

    1983-01-01

    The advantages of a very large scale integration (VLSI) architecture for implementing a digital filter using Fermat number transforms (FNT) are the following: It requires no multiplication; only additions and bit rotations are needed. It alleviates the usual dynamic range limitation for long-sequence FNTs. It utilizes the FNT and inverse FNT circuits 100% of the time. The lengths of the input data and filter sequences can be arbitrary and different. It is regular, simple, and expandable, and as a consequence suitable for VLSI implementation.
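
    A reference implementation of the transform itself is compact. The sketch below is a naive number-theoretic transform over the Fermat prime F4 = 2^16 + 1 used for circular convolution (the multiplication-free bit-rotation arithmetic is a hardware property and is not modeled here):

        P = (1 << 16) + 1    # Fermat prime F4 = 65537; 3 is a primitive root mod P

        def ntt(a, invert=False):
            n = len(a)                         # n must divide 2^16
            root = pow(3, (P - 1) // n, P)     # n-th root of unity mod P
            if invert:
                root = pow(root, P - 2, P)     # modular inverse of the root
            out = [sum(x * pow(root, i * k, P) for i, x in enumerate(a)) % P
                   for k in range(n)]
            if invert:
                n_inv = pow(n, P - 2, P)
                out = [x * n_inv % P for x in out]
            return out

        def circular_convolution(a, b):
            A, B = ntt(a), ntt(b)
            return ntt([x * y % P for x, y in zip(A, B)], invert=True)

        print(circular_convolution([1, 2, 3, 4], [1, 0, 0, 0]))  # [1, 2, 3, 4]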

  20. Large-Scale Simulations of Plastic Neural Networks on Neuromorphic Hardware

    PubMed Central

    Knight, James C.; Tully, Philip J.; Kaplan, Bernhard A.; Lansner, Anders; Furber, Steve B.

    2016-01-01

    SpiNNaker is a digital, neuromorphic architecture designed for simulating large-scale spiking neural networks at speeds close to biological real-time. Rather than using bespoke analog or digital hardware, the basic computational unit of a SpiNNaker system is a general-purpose ARM processor, allowing it to be programmed to simulate a wide variety of neuron and synapse models. This flexibility is particularly valuable in the study of biological plasticity phenomena. A recently proposed learning rule based on the Bayesian Confidence Propagation Neural Network (BCPNN) paradigm offers a generic framework for modeling the interaction of different plasticity mechanisms using spiking neurons. However, it can be computationally expensive to simulate large networks with BCPNN learning since it requires multiple state variables for each synapse, each of which needs to be updated every simulation time-step. We discuss the trade-offs in efficiency and accuracy involved in developing an event-based BCPNN implementation for SpiNNaker based on an analytical solution to the BCPNN equations, and detail the steps taken to fit this within the limited computational and memory resources of the SpiNNaker architecture. We demonstrate this learning rule by learning temporal sequences of neural activity within a recurrent attractor network which we simulate at scales of up to 2.0 × 10⁴ neurons and 5.1 × 10⁷ plastic synapses: the largest plastic neural network ever to be simulated on neuromorphic hardware. We also run a comparable simulation on a Cray XC-30 supercomputer system and find that, if it is to match the run-time of our SpiNNaker simulation, the supercomputer uses approximately 45× more power. This suggests that cheaper, more power-efficient neuromorphic systems are becoming useful discovery tools in the study of plasticity in large-scale brain models. PMID:27092061
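
    The event-based trick mentioned for BCPNN is that an exponentially decaying trace has a closed-form solution, so a synapse only needs updating when a spike arrives. A toy Python illustration of that one idea (time constant is illustrative):

        import numpy as np

        TAU_Z = 20.0    # trace time constant in ms (illustrative value)

        def update_trace(z, t_last, t_now, spike=1.0):
            """Decay the trace analytically over the gap, then add the spike."""
            z = z * np.exp(-(t_now - t_last) / TAU_Z)
            return z + spike, t_now

        z, t = 0.0, 0.0
        for t_spike in [5.0, 12.0, 48.0]:      # hypothetical presynaptic spikes
            z, t = update_trace(z, t, t_spike)
            print(f"t = {t_spike:5.1f} ms  z = {z:.3f}")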

  1. Computers in Electrical Engineering Education at Virginia Polytechnic Institute.

    ERIC Educational Resources Information Center

    Bennett, A. Wayne

    1982-01-01

    Discusses use of computers in Electrical Engineering (EE) at Virginia Polytechnic Institute. Topics include: departmental background, level of computing power using large scale systems, mini and microcomputers, use of digital logic trainers and analog/hybrid computers, comments on integrating computers into EE curricula, and computer use in…

  2. Implementing Technology: A Change Process

    ERIC Educational Resources Information Center

    Atwell, Nedra; Maxwell, Marge; Romero, Elizabeth

    2008-01-01

    The state of Kentucky has embarked upon a large scale systems change effort to integrate Universal Design for Learning (UDL) principles, including use of digital curriculum and computerized reading supports to improve overall student achievement. A major component of this initiative is the use of Read & Write Gold. As higher expectations are…

  3. Quake Final Video

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    Critical infrastructure around the world is at constant risk from earthquakes. Most of these critical structures were designed using archaic seismic simulation methods built on early digital computers of the 1970s. Idaho National Laboratory's Seismic Research Group is working to modernize these simulation methods through computational research and large-scale laboratory experiments.

  4. Large Scale IR Evaluation

    ERIC Educational Resources Information Center

    Pavlu, Virgil

    2008-01-01

    Today, search engines are embedded into all aspects of digital world: in addition to Internet search, all operating systems have integrated search engines that respond even as you type, even over the network, even on cell phones; therefore the importance of their efficacy and efficiency cannot be overstated. There are many open possibilities for…

  5. Extraction of drainage networks from large terrain datasets using high throughput computing

    NASA Astrophysics Data System (ADS)

    Gong, Jianya; Xie, Jibo

    2009-02-01

    Advanced digital photogrammetry and remote sensing technology produce large terrain datasets (LTD), and how to process and use them has become a big challenge for GIS users. Extracting drainage networks, which are basic to hydrological applications, from LTD is one of the typical applications of digital terrain analysis (DTA) in geographical information applications. Existing serial drainage algorithms cannot deal with large data volumes in a timely fashion, and few GIS platforms can process LTD beyond the gigabyte size. High throughput computing (HTC), a distributed parallel computing mode, is proposed to improve the efficiency of drainage network extraction from LTD. Drainage network extraction using HTC involves two key issues: (1) how to decompose the large DEM datasets into independent computing units, and (2) how to merge the separate outputs into a final result. A new decomposition method is presented in which the large datasets are partitioned into independent computing units along natural watershed boundaries instead of regular one-dimensional (strip-wise) or two-dimensional (block-wise) decompositions. Because the distribution of drainage networks is strongly related to watershed boundaries, the new decomposition method is more effective and natural. The method to extract natural watershed boundaries was improved by using multi-scale DEMs instead of single-scale DEMs. An HTC environment is employed to test the proposed methods with real datasets.
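
    The per-cell core that each watershed-sized computing unit runs is simple; the engineering lies in the decomposition and merge steps. A minimal D8 flow-direction sketch in Python (numpy; real pipelines add pit filling and flow accumulation):

        import numpy as np

        # D8: each cell drains to the steepest-descent one of its 8 neighbours.
        OFFSETS = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
                   (0, 1), (1, -1), (1, 0), (1, 1)]

        def d8_direction(dem, r, c):
            best, best_drop = -1, 0.0
            for k, (dr, dc) in enumerate(OFFSETS):
                rr, cc = r + dr, c + dc
                if 0 <= rr < dem.shape[0] and 0 <= cc < dem.shape[1]:
                    drop = (dem[r, c] - dem[rr, cc]) / np.hypot(dr, dc)
                    if drop > best_drop:
                        best, best_drop = k, drop
            return best                        # -1 marks a pit or flat cell

        dem = np.array([[9, 8, 7], [8, 5, 6], [7, 6, 1]], dtype=float)
        print(d8_direction(dem, 1, 1))         # 7: drains toward the low corner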

  6. Comparing automated classification and digitization approaches to detect change in eelgrass bed extent during restoration of a large river delta

    USGS Publications Warehouse

    Davenport, Anna Elizabeth; Davis, Jerry D.; Woo, Isa; Grossman, Eric; Barham, Jesse B.; Ellings, Christopher S.; Takekawa, John Y.

    2017-01-01

    Native eelgrass (Zostera marina) is an important contributor to ecosystem services: it supplies cover for juvenile fish, supports a variety of invertebrate prey resources for fish and waterbirds, provides substrate for herring roe consumed by numerous fish and birds, helps stabilize sediment, and sequesters organic carbon. Seagrasses are in decline globally, and monitoring changes in their growth and extent is increasingly valuable to determine impacts from large-scale estuarine restoration and inform blue carbon mapping initiatives. Thus, we examined the efficacy of two remote sensing mapping methods with high-resolution (0.5 m pixel size) color near-infrared imagery and ground validation to assess change following major tidal marsh restoration. Automated classification of false color aerial imagery and digitized polygons both documented a slight decline in eelgrass area directly after restoration followed by an increase two years later. Classification of sparse and low- to medium-density eelgrass was confounded in areas with algal cover; however, large dense patches of eelgrass were well delineated. Automated classification of aerial imagery with unsupervised and supervised methods provided reasonable accuracies of 73%, and hand-digitized polygons from the same imagery yielded similar results. Visual clues for hand digitizing from the high-resolution imagery provided as reliable a map of dense eelgrass extent as automated image classification. We found that automated classification had no advantages over manual digitization, particularly because of the limitations of detecting eelgrass with only three bands of imagery and near infrared.

  7. A pilot rating scale for evaluating failure transients in electronic flight control systems

    NASA Technical Reports Server (NTRS)

    Hindson, William S.; Schroeder, Jeffery A.; Eshow, Michelle M.

    1990-01-01

    A pilot rating scale was developed to describe the effects of transients in helicopter flight-control systems on safety-of-flight and on pilot recovery action. The scale was applied to the evaluation of hardovers that could potentially occur in the digital flight-control system being designed for a variable-stability UH-60A research helicopter. Tests were conducted in a large moving-base simulator and in flight. The results of the investigation were combined with existing airworthiness criteria to determine quantitative reliability design goals for the control system.

  8. The use of digital imaging, video conferencing, and telepathology in histopathology: a national survey

    PubMed Central

    Dennis, T; Start, R D; Cross, S S

    2005-01-01

    Aims: To undertake a large scale survey of histopathologists in the UK to determine the current infrastructure, training, and attitudes to digital pathology. Methods: A postal questionnaire was sent to 500 consultant histopathologists randomly selected from the membership of the Royal College of Pathologists in the UK. Results: There was a response rate of 47%. Sixty four per cent of respondents had a digital camera mounted on their microscope, but only 12% had any sort of telepathology equipment. Thirty per cent used digital images in electronic presentations at meetings at least once a year and only 24% had ever used telepathology in a diagnostic situation. Fifty nine per cent had received no training in digital imaging. Fifty eight per cent felt that the medicolegal implications of duty of care were a barrier to its use. A large proportion of pathologists (69%) were interested in using video conferencing for remote attendance at multidisciplinary team meetings. Conclusions: There is a reasonable level of equipment and communications infrastructure among histopathologists in the UK but a very low level of training. There is resistance to the use of telepathology in the diagnostic context but enthusiasm for the use of video conferencing in multidisciplinary team meetings. PMID:15735155

  9. The Receiver System for the Ooty Wide Field Array

    NASA Astrophysics Data System (ADS)

    Subrahmanya, C. R.; Prasad, P.; Girish, B. S.; Somashekar, R.; Manoharan, P. K.; Mittal, A. K.

    2017-03-01

    The legacy Ooty Radio Telescope (ORT) is being reconfigured as a 264-element synthesis telescope, called the Ooty Wide Field Array (OWFA). Its antenna elements are the contiguous 1.92 m sections of the parabolic cylinder. It will operate in a 38-MHz frequency band centred at 326.5 MHz and will be equipped with a digital receiver including a 264-element spectral correlator with a spectral resolution of 48 kHz. OWFA is designed to retain the benefits of equatorial mount, continuous 9-hour tracking ability and large collecting area of the legacy telescope and use of modern digital techniques to enhance the instantaneous field-of-view by more than an order of magnitude. OWFA has unique advantages for contemporary investigations related to large scale structure, transient events and space weather watch. In this paper, we describe the RF subsystems, digitizers and fibre optic communication of OWFA and highlight some specific aspects of the system relevant for the observations planned during the initial operation.

  10. Concepts and Plans towards fast large scale Monte Carlo production for the ATLAS Experiment

    NASA Astrophysics Data System (ADS)

    Ritsch, E.; Atlas Collaboration

    2014-06-01

    The huge success of the physics program of the ATLAS experiment at the Large Hadron Collider (LHC) during Run 1 relies upon a great number of simulated Monte Carlo events. This Monte Carlo production currently takes the largest share of the computing resources in use by ATLAS. In this document we describe the plans to overcome the computing resource limitations for large-scale Monte Carlo production in the ATLAS Experiment for Run 2 and beyond. A number of fast detector simulation, digitization and reconstruction techniques are discussed, based upon a new flexible detector simulation framework. To optimally benefit from these developments, a redesigned ATLAS MC production chain is presented at the end of this document.

  11. Evaluation of Reading Habits of Teacher Candidates: Study of Scale Development

    ERIC Educational Resources Information Center

    Erkan, Senem Seda Sahenk; Dagal, Asude Balaban; Tezcan, Özlem

    2016-01-01

    The main purpose of this study was to develop a valid and reliable scale for printed and digital competencies ("The Printed and Digital Reading Habits Scale"). The problem statement of this research can be expressed as: "The Printed and Digital Reading Habits Scale: is a valid and reliable scale?" In this study, the scale…

  12. Introducing Large-Scale Innovation in Schools

    NASA Astrophysics Data System (ADS)

    Sotiriou, Sofoklis; Riviou, Katherina; Cherouvis, Stephanos; Chelioti, Eleni; Bogner, Franz X.

    2016-08-01

    Education reform initiatives tend to promise higher effectiveness in classrooms especially when emphasis is given to e-learning and digital resources. Practical changes in classroom realities or school organization, however, are lacking. A major European initiative entitled Open Discovery Space (ODS) examined the challenge of modernizing school education via a large-scale implementation of an open-scale methodology in using technology-supported innovation. The present paper describes this innovation scheme which involved schools and teachers all over Europe, embedded technology-enhanced learning into wider school environments and provided training to teachers. Our implementation scheme consisted of three phases: (1) stimulating interest, (2) incorporating the innovation into school settings and (3) accelerating the implementation of the innovation. The scheme's impact was monitored for a school year using five indicators: leadership and vision building, ICT in the curriculum, development of ICT culture, professional development support, and school resources and infrastructure. Based on about 400 schools, our study produced four results: (1) The growth in digital maturity was substantial, even for previously high scoring schools. This was even more important for indicators such as "vision and leadership" and "professional development." (2) The evolution of networking is presented graphically, showing the gradual growth of connections achieved. (3) These communities became core nodes, involving numerous teachers in sharing educational content and experiences: One out of three registered users (36 %) has shared his/her educational resources in at least one community. (4) Satisfaction scores ranged from 76 % (offer of useful support through teacher academies) to 87 % (good environment to exchange best practices). Initiatives such as ODS add substantial value to schools on a large scale.

  13. Towards a Standard Mixed-Signal Parallel Processing Architecture for Miniature and Microrobotics.

    PubMed

    Sadler, Brian M; Hoyos, Sebastian

    2014-01-01

    The conventional analog-to-digital conversion (ADC) and digital signal processing (DSP) architecture has led to major advances in miniature and micro-systems technology over the past several decades. The outlook for these systems is significantly enhanced by advances in sensing, signal processing, communications and control, and the combination of these technologies enables autonomous robotics on the miniature to micro scales. In this article we look at trends in the combination of analog and digital (mixed-signal) processing, and consider a generalized sampling architecture. Employing a parallel analog basis expansion of the input signal, this scalable approach is adaptable and reconfigurable, and is suitable for a large variety of current and future applications in networking, perception, cognition, and control.

  14. Towards a Standard Mixed-Signal Parallel Processing Architecture for Miniature and Microrobotics

    PubMed Central

    Sadler, Brian M; Hoyos, Sebastian

    2014-01-01

    The conventional analog-to-digital conversion (ADC) and digital signal processing (DSP) architecture has led to major advances in miniature and micro-systems technology over the past several decades. The outlook for these systems is significantly enhanced by advances in sensing, signal processing, communications and control, and the combination of these technologies enables autonomous robotics on the miniature to micro scales. In this article we look at trends in the combination of analog and digital (mixed-signal) processing, and consider a generalized sampling architecture. Employing a parallel analog basis expansion of the input signal, this scalable approach is adaptable and reconfigurable, and is suitable for a large variety of current and future applications in networking, perception, cognition, and control. PMID:26601042

  15. Large-scale environments of narrow-line Seyfert 1 galaxies

    NASA Astrophysics Data System (ADS)

    Järvelä, E.; Lähteenmäki, A.; Lietzen, H.; Poudel, A.; Heinämäki, P.; Einasto, M.

    2017-09-01

    Studying large-scale environments of narrow-line Seyfert 1 (NLS1) galaxies gives a new perspective on their properties, particularly their radio loudness. The large-scale environment is believed to have an impact on the evolution and intrinsic properties of galaxies; however, NLS1 sources have not been studied in this context before. We have a large and diverse sample of 1341 NLS1 galaxies and three separate environment data sets constructed using the Sloan Digital Sky Survey. We use various statistical methods to investigate how the properties of NLS1 galaxies are connected to the large-scale environment, and compare the large-scale environments of NLS1 galaxies with other active galactic nuclei (AGN) classes, for example, other jetted AGN and broad-line Seyfert 1 (BLS1) galaxies, to study how they are related. NLS1 galaxies reside in less dense environments than any of the comparison samples, thus confirming their young age. The average large-scale environment density and environmental distribution of NLS1 sources is clearly different compared to BLS1 galaxies, thus it is improbable that they could be the parent population of NLS1 galaxies and unified by orientation. Within the NLS1 class there is a trend of increasing radio loudness with increasing large-scale environment density, indicating that the large-scale environment affects their intrinsic properties. Our results suggest that the NLS1 class of sources is not homogeneous, and furthermore, that a considerable fraction of them are misclassified. We further support a published proposal to replace the traditional classification to radio-loud, and radio-quiet or radio-silent sources with a division into jetted and non-jetted sources.

  16. Challenges in Managing Trustworthy Large-scale Digital Science

    NASA Astrophysics Data System (ADS)

    Evans, B. J. K.

    2017-12-01

    The increased use of large-scale international digital science has opened a number of challenges for managing, handling, using and preserving scientific information. The large volumes of information are driven by three main categories: model outputs, including coupled models and ensembles; data products that have been processed to a level of usability; and increasingly heuristically driven data analysis. These data products are increasingly the ones used by the broad community, and far exceed the raw instrument outputs in volume. The data, software and workflows are then shared and replicated to allow broad use at an international scale, which places further demands on the infrastructure that supports reliable management of the information across distributed resources. Users necessarily rely on these underlying "black boxes" to remain productive in producing new scientific outcomes. The software for these systems depends on computational infrastructure, interconnected software systems, and information capture systems, ranging from the fundamental reliability of the compute hardware, system software stacks and libraries to the model software itself. Because of these complexities and the capacity limits of the infrastructure, there is an increased emphasis on transparency of approach and robustness of methods rather than full reproducibility. Furthermore, with large-volume data management it is increasingly difficult to store historical versions of all model and derived data. Instead, the emphasis is on the ability to access updated products and on confidence that previous outcomes remain relevant and can be updated with new information. We will discuss these challenges and some of the approaches underway to address these issues.

  17. Accuracy and consistency of weights provided by home bathroom scales.

    PubMed

    Yorkin, Meredith; Spaccarotella, Kim; Martin-Biggers, Jennifer; Quick, Virginia; Byrd-Bredbenner, Carol

    2013-12-17

    Self-reported body weight is often used for calculation of Body Mass Index because it is easy to collect. Little is known about sources of error introduced by using bathroom scales to measure weight at home. The objective of this study was to evaluate the accuracy and consistency of digital versus dial-type bathroom scales commonly used for self-reported weight. Participants brought functioning bathroom scales (n=18 dial-type, n=43 digital-type) to a central location. Trained researchers assessed accuracy and consistency using certified calibration weights at 10 kg, 25 kg, 50 kg, 75 kg, 100 kg, and 110 kg. Data also were collected on frequency of calibration, age and floor surface beneath the scale. All participants reported using their scale on hard surface flooring. Before calibration, all digital scales displayed 0, but dial scales displayed a mean absolute initial weight of 0.95 (1.9 SD) kg. Digital scales accurately weighed test loads whereas dial-type scale weights differed significantly (p<0.05). Imprecision of dial scales was significantly greater than that of digital scales at all weights (p<0.05). Accuracy and precision did not vary by scale age. Digital home bathroom scales provide sufficiently accurate and consistent weights for public health research. Reminders to zero scales before each use may further improve accuracy of self-reported weight.
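
    The accuracy (bias) and consistency (spread) comparison reduces to simple statistics per scale type at each calibration load. A hedged Python sketch with synthetic readings (not the study's data):

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(4)
        digital = 50 + rng.normal(0.0, 0.05, size=43)   # 50 kg load, tight readings
        dial = 50 + rng.normal(0.4, 0.30, size=18)      # biased and noisier

        for name, x in [("digital", digital), ("dial", dial)]:
            print(f"{name}: bias = {x.mean() - 50:+.2f} kg, SD = {x.std(ddof=1):.2f} kg")

        t, p = stats.ttest_ind(digital, dial, equal_var=False)  # Welch's t-test
        print(f"digital vs dial difference: p = {p:.3g}")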

  18. Keeping Connected: A Review of the Research Relationship

    ERIC Educational Resources Information Center

    Moss, Julianne; Hay, Trevor

    2014-01-01

    In this paper, some key findings of the Keeping Connected project are discussed in light of the methodological challenges of developing an analytical approach in a large-scale study, particularly in starting with open-ended, participant-selected, digital still visual images as part of 31 longitudinal case studies. The paper works to clarify the…

  19. A Review of Large-Scale "How Much Information?" Inventories: Variations, Achievements and Challenges

    ERIC Educational Resources Information Center

    Hilbert, Martin

    2015-01-01

    Introduction: Pressed by the increasing social importance of digital information, including the current attention given to the "big data paradigm", several research projects have taken up the challenge to quantify the amount of technologically mediated information. Method: This meta-study reviews the eight most important inventories in a…

  20. Design and Evaluation of Simulations for the Development of Complex Decision-Making Skills.

    ERIC Educational Resources Information Center

    Hartley, Roger; Varley, Glen

    2002-01-01

    Command and Control Training Using Simulation (CACTUS) is a computer digital mapping system used by police to manage large-scale public events. Audio and video records of adaptive training scenarios using CACTUS show how the simulation develops decision-making skills for strategic and tactical event management. (SK)

  1. Predicting Southern Appalachian overstory vegetation with digital terrain data

    Treesearch

    Paul V. Bolstad; Wayne Swank; James Vose

    1998-01-01

    Vegetation in mountainous regions responds to small-scale variation in terrain, largely due to effects on both temperature and soil moisture. However, there are few studies of quantitative, terrain-based methods for predicting vegetation composition. This study investigated relationships between forest composition, elevation, and a derived index of terrain shape, and...

  2. Task-driven dictionary learning.

    PubMed

    Mairal, Julien; Bach, Francis; Ponce, Jean

    2012-04-01

    Modeling data with linear combinations of a few elements from a learned dictionary has been the focus of much recent research in machine learning, neuroscience, and signal processing. For signals such as natural images that admit such sparse representations, it is now well established that these models are well suited to restoration tasks. In this context, learning the dictionary amounts to solving a large-scale matrix factorization problem, which can be done efficiently with classical optimization tools. The same approach has also been used for learning features from data for other purposes, e.g., image classification, but tuning the dictionary in a supervised way for these tasks has proven to be more difficult. In this paper, we present a general formulation for supervised dictionary learning adapted to a wide variety of tasks, and present an efficient algorithm for solving the corresponding optimization problem. Experiments on handwritten digit classification, digital art identification, nonlinear inverse image problems, and compressed sensing demonstrate that our approach is effective in large-scale settings, and is well suited to supervised and semi-supervised classification, as well as regression tasks for data that admit sparse representations.
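
    The inner problem of any such dictionary method is sparse coding: given a fixed dictionary D and signal x, find a sparse code a minimizing 0.5*||Da - x||^2 + lam*||a||_1. A compact ISTA sketch in Python (a generic l1 solver, not the authors' task-driven algorithm):

        import numpy as np

        def ista(D, x, lam=0.1, n_iter=200):
            """Iterative shrinkage-thresholding for l1-regularised least squares."""
            L = np.linalg.norm(D, 2) ** 2           # Lipschitz constant of the gradient
            a = np.zeros(D.shape[1])
            for _ in range(n_iter):
                g = a - D.T @ (D @ a - x) / L       # gradient step
                a = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)  # soft threshold
            return a

        rng = np.random.default_rng(5)
        D = rng.normal(size=(64, 128))
        D /= np.linalg.norm(D, axis=0)              # unit-norm atoms
        x = 2.0 * D[:, 3] - 1.5 * D[:, 70]          # signal built from two atoms
        print(np.flatnonzero(np.abs(ista(D, x)) > 0.1))  # should flag atoms 3 and 70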

  3. American Society of Photogrammetry and American Congress on Surveying and Mapping, Fall Technical Meeting, ASP Technical Papers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1981-01-01

    Various topics in the field of photogrammetry are addressed. Among the subjects discussed are: remote sensing of Gulf Stream dynamics using VHRR satellite imagery; an interactive rectification system for remote sensing imagery; use of a single photo and digital terrain matrix for point positioning; crop type analysis using Landsat digital data; use of a fisheye lens in solar energy assessment; remote sensing inventory of Rocky Mountain elk habitat; Washington state's large scale ortho program; and educational image processing. Also discussed are: operational advantages of on-line photogrammetric triangulation; analysis of fracturation; field photogrammetry as a tool for measuring glacier movement; double model orthophotos used for forest inventory mapping; a map revisioning module for the Kern PG2 stereoplotter; assessing accuracy of digital land-use and terrain data; and accuracy of earthwork calculations from digital elevation data.

  4. Using Swarming Agents for Scalable Security in Large Network Environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crouse, Michael; White, Jacob L.; Fulp, Errin W.

    2011-09-23

    The difficulty of securing computer infrastructures increases as they grow in size and complexity. Network-based security solutions such as IDS and firewalls cannot scale because of exponentially increasing computational costs inherent in detecting the rapidly growing number of threat signatures. Host-based solutions like virus scanners and IDS suffer similar issues, and these are compounded when enterprises try to monitor them in a centralized manner. Swarm-based autonomous agent systems like digital ants and artificial immune systems can provide a scalable security solution for large network environments. The digital ants approach offers a biologically inspired design where each ant in the virtual colony can detect atoms of evidence that may help identify a possible threat. By assembling the atomic evidence from different ant types, the colony may detect the threat. This decentralized approach can require, on average, fewer computational resources than traditional centralized solutions; however, there are limits to its scalability. This paper describes how dividing a large infrastructure into smaller managed enclaves allows the digital ant framework to operate effectively in larger environments. Experimental results show that using smaller enclaves allows for more consistent distribution of agents and results in faster response times.

  5. Information collection and processing of dam distortion in digital reservoir system

    NASA Astrophysics Data System (ADS)

    Liang, Yong; Zhang, Chengming; Li, Yanling; Wu, Qiulan; Ge, Pingju

    2007-06-01

    The "digital reservoir" is usually understood as describing the whole reservoir with digital information technology to make it serve the human existence and development furthest. Strictly speaking, the "digital reservoir" is referred to describing vast information of the reservoir in different dimension and space-time by RS, GPS, GIS, telemetry, remote-control and virtual reality technology based on computer, multi-media, large-scale memory and wide-band networks technology for the human existence, development and daily work, life and entertainment. The core of "digital reservoir" is to realize the intelligence and visibility of vast information of the reservoir through computers and networks. The dam is main building of reservoir, whose safety concerns reservoir and people's safety. Safety monitoring is important way guaranteeing the dam's safety, which controls the dam's running through collecting the dam's information concerned and developing trend. Safety monitoring of the dam is the process from collection and processing of initial safety information to forming safety concept in the brain. The paper mainly researches information collection and processing of the dam by digital means.

  6. Partially filled electrodes for digital microfluidic devices

    NASA Astrophysics Data System (ADS)

    Pyne, D. G.; Salman, W. M.; Abdelgawad, M.; Sun, Y.

    2013-07-01

    As digital microfluidics technology evolves, the need to integrate additional elements (e.g., sensing/detection and heating elements) on the electrode increases. Consequently, the electrode area available for droplet actuation is reduced to create space for these additional elements, which undesirably reduces force generation. Electrodes cannot simply be scaled larger to compensate for this loss of force, as this would also increase droplet volume and thereby compromise the advantages sought in miniaturization. Here, we present a study that evaluates different partially filled electrode designs, numerically and with preliminary experimental verification, and suggests designs that combine high actuation forces with a large reduction in electrode area.

  7. Quantum-classical interface based on single flux quantum digital logic

    NASA Astrophysics Data System (ADS)

    McDermott, R.; Vavilov, M. G.; Plourde, B. L. T.; Wilhelm, F. K.; Liebermann, P. J.; Mukhanov, O. A.; Ohki, T. A.

    2018-04-01

    We describe an approach to the integrated control and measurement of a large-scale superconducting multiqubit array comprising up to 10^8 physical qubits using a proximal coprocessor based on the Single Flux Quantum (SFQ) digital logic family. Coherent control is realized by irradiating the qubits directly with classical bitstreams derived from optimal control theory. Qubit measurement is performed by a Josephson photon counter, which provides access to the classical result of projective quantum measurement at the millikelvin stage. We analyze the power budget and physical footprint of the SFQ coprocessor and discuss challenges and opportunities associated with this approach.

  8. Development of Youth Digital Citizenship Scale and Implication for Educational Setting

    ERIC Educational Resources Information Center

    Kim, Minjeong; Choi, Dongyeon

    2018-01-01

    Digital citizens need comprehensive knowledge of, and technological access to, the internet and digital world, and teachers have a responsibility to lead students to become digital citizens. However, existing Digital Citizenship Scales cover too broad a range and do not focus precisely on the target students, so teachers do not have clear criteria…

  9. 5D Modelling: An Efficient Approach for Creating Spatiotemporal Predictive 3D Maps of Large-Scale Cultural Resources

    NASA Astrophysics Data System (ADS)

    Doulamis, A.; Doulamis, N.; Ioannidis, C.; Chrysouli, C.; Grammalidis, N.; Dimitropoulos, K.; Potsiou, C.; Stathopoulou, E.-K.; Ioannides, M.

    2015-08-01

    Outdoor large-scale cultural sites are mostly sensitive to environmental, natural and human-made factors, implying an imminent need for spatio-temporal assessment to identify regions of potential cultural interest (material degradation, structuring, conservation). On the other hand, quite different actors are involved in Cultural Heritage research (archaeologists, curators, conservators, simple users), each with diverse needs. All these statements advocate that 5D modelling (3D geometry plus time plus levels of detail) is ideally required for the preservation and assessment of outdoor large-scale cultural sites; it is currently implemented as a simple aggregation of 3D digital models at different times and levels of detail. The main bottleneck of such an approach is its complexity, making 5D modelling impossible to validate in real-life conditions. In this paper, a cost-effective and affordable framework for 5D modelling is proposed, based on a spatial-temporal dependent aggregation of 3D digital models that incorporates a predictive assessment procedure to indicate which regions (surfaces) of an object should be reconstructed at higher levels of detail at the next time instances and which at lower ones. In this way, dynamic change-history maps are created, indicating the spatial probabilities of regions needing further 3D modelling at forthcoming instances. Using these maps, a predictive assessment can be made, that is, surfaces can be localized within the objects where a high-accuracy reconstruction process needs to be activated at forthcoming time instances. The proposed 5D Digital Cultural Heritage Model (5D-DCHM) is implemented using open interoperable standards based on the CityGML framework, which also allows the description of additional semantic metadata information. Visualization aspects are also supported to allow easy manipulation, interaction and representation of the 5D-DCHM geometry and the respective semantic information. The open-source 3DCityDB, incorporating a PostgreSQL geo-database, is used to manage and manipulate the 3D data and their semantics.

  10. Implementation of AN Unmanned Aerial Vehicle System for Large Scale Mapping

    NASA Astrophysics Data System (ADS)

    Mah, S. B.; Cryderman, C. S.

    2015-08-01

    Unmanned Aerial Vehicles (UAVs), digital cameras, powerful personal computers, and software have made it possible for geomatics professionals to capture aerial photographs and generate digital terrain models and orthophotographs without using full scale aircraft or hiring mapping professionals. This has been made possible by the availability of miniaturized computers and sensors, and software which has been driven, in part, by the demand for this technology in consumer items such as smartphones. The other force that is in play is the increasing number of Do-It-Yourself (DIY) people who are building UAVs as a hobby or for professional use. Building a UAV system for mapping is an alternative to purchasing a turnkey system. This paper describes factors to be considered when building a UAV mapping system, the choices made, and the test results of a project using this completed system.

  11. Large-scale femtoliter droplet array for digital counting of single biomolecules.

    PubMed

    Kim, Soo Hyeon; Iwai, Shino; Araki, Suguru; Sakakihara, Shouichi; Iino, Ryota; Noji, Hiroyuki

    2012-12-07

    We present a novel device employing one million femtoliter droplets immobilized on a substrate for the quantitative detection of extremely low concentrations of biomolecules in a sample. Surface-modified polystyrene beads carrying either zero or a single biomolecule-reporter enzyme complex are efficiently isolated into femtoliter droplets formed on hydrophilic-in-hydrophobic surfaces. Using a conventional micropipette, this is achieved by sequential injection, first with an aqueous solution containing beads, and then with fluorinated oil. The concentration of target biomolecules is estimated from the ratio of the number of signal-emitting droplets to the total number of trapped beads (digital counting). The performance of our digital counting device was demonstrated by detecting a streptavidin-β-galactosidase conjugate with a limit of detection (LOD) of 10 zM. The sensitivity of our device was >20-fold higher than that noted in previous studies where a smaller number of reactors (fifty thousand) was used. Such a low LOD was achieved because of the large number of droplets in an array, allowing simultaneous examination of a large number of beads. When combined with a bead-based enzyme-linked immunosorbent assay (digital ELISA), the LOD for the detection of prostate-specific antigen reached 2 aM. This value, again, was improved over that noted in a previous study, because of the decreased coefficient of variation of the background measurement determined by the Poisson noise. Our digital counting device using one million droplets has great potential as a highly sensitive, portable immunoassay device that could be used to diagnose diseases.
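
    The "digital counting" arithmetic described above is simple Poisson statistics: if a fraction p of bead-containing droplets emits signal, the mean number of enzyme-labelled targets per bead is lambda = -ln(1 - p). A small sketch with invented counts (not data from the study), assuming complete capture of targets on beads:

        import math

        # Invented example counts; the real device reads these off a droplet-array image.
        n_beads = 1_000_000      # droplets containing a bead
        n_positive = 230         # droplets emitting fluorescent signal

        p = n_positive / n_beads
        lam = -math.log(1.0 - p)         # Poisson correction for multi-occupancy
        n_molecules = lam * n_beads      # assumes every target is captured on a bead

        AVOGADRO = 6.022e23
        sample_volume_l = 100e-6         # assumed 100-microliter sample
        conc = n_molecules / (AVOGADRO * sample_volume_l)
        print(f"lambda = {lam:.2e} per bead, concentration = {conc:.2e} M")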

  12. Clinical comparative study with a large-area amorphous silicon flat-panel detector: image quality and visibility of anatomic structures on chest radiography.

    PubMed

    Fink, Christian; Hallscheidt, Peter J; Noeldge, Gerd; Kampschulte, Annette; Radeleff, Boris; Hosch, Waldemar P; Kauffmann, Günter W; Hansmann, Jochen

    2002-02-01

    The objective of this study was to compare clinical chest radiographs of a large-area, flat-panel digital radiography system and a conventional film-screen radiography system. The comparison was based on an observer preference study of image quality and visibility of anatomic structures. Routine follow-up chest radiographs were obtained from 100 consecutive oncology patients using a large-area, amorphous silicon flat-panel detector digital radiography system (dose equivalent to a 400-speed film system). Hard-copy images were compared with previous examinations of the same individuals taken on a conventional film-screen system (200-speed). Patients were excluded if changes in the chest anatomy were detected or if the time interval between the examinations exceeded 1 year. Observer preference was evaluated for the image quality and the visibility of 15 anatomic structures using a five-point scale. Dose measurements with a chest phantom showed a dose reduction of approximately 50% with the digital radiography system compared with the film-screen radiography system. The image quality and the visibility of all but one anatomic structure of the images obtained with the digital flat-panel detector system were rated significantly superior (p ≤ 0.0003) to those obtained with the conventional film-screen radiography system. The image quality and visibility of anatomic structures on the images obtained by the flat-panel detector system were perceived as equal or superior to the images from conventional film-screen chest radiography. This was true even though the radiation dose was reduced approximately 50% with the digital flat-panel detector system.

  13. Flexible, High-Speed CdSe Nanocrystal Integrated Circuits.

    PubMed

    Stinner, F Scott; Lai, Yuming; Straus, Daniel B; Diroll, Benjamin T; Kim, David K; Murray, Christopher B; Kagan, Cherie R

    2015-10-14

    We report large-area, flexible, high-speed analog and digital colloidal CdSe nanocrystal integrated circuits operating at low voltages. Using photolithography and a newly developed process to fabricate vertical interconnect access holes, we scale down device dimensions, reducing parasitic capacitances and increasing the frequency of circuit operation, and scale up device fabrication over 4 in. flexible substrates. We demonstrate amplifiers with ∼7 kHz bandwidth, ring oscillators with <10 μs stage delays, and NAND and NOR logic gates.

  14. Accuracy and consistency of weights provided by home bathroom scales

    PubMed Central

    2013-01-01

    Background: Self-reported body weight is often used for calculation of Body Mass Index because it is easy to collect. Little is known about sources of error introduced by using bathroom scales to measure weight at home. The objective of this study was to evaluate the accuracy and consistency of digital versus dial-type bathroom scales commonly used for self-reported weight. Methods: Participants brought functioning bathroom scales (n = 18 dial-type, n = 43 digital-type) to a central location. Trained researchers assessed accuracy and consistency using certified calibration weights at 10 kg, 25 kg, 50 kg, 75 kg, 100 kg, and 110 kg. Data also were collected on frequency of calibration, age and floor surface beneath the scale. Results: All participants reported using their scale on hard surface flooring. Before calibration, all digital scales displayed 0, but dial scales displayed a mean absolute initial weight of 0.95 (1.9 SD) kg. Digital scales accurately weighed test loads whereas dial-type scale weights differed significantly (p < 0.05). Imprecision of dial scales was significantly greater than that of digital scales at all weights (p < 0.05). Accuracy and precision did not vary by scale age. Conclusions: Digital home bathroom scales provide sufficiently accurate and consistent weights for public health research. Reminders to zero scales before each use may further improve accuracy of self-reported weight. PMID:24341761

  15. US GeoData Available Through the Internet

    USGS Publications Warehouse

    2000-01-01

    The U.S. Geological Survey (USGS) offers certain US GeoData data sets through the Internet. They can be retrieved using the World Wide Web or anonymous File Transfer Protocol (FTP). The data bases and their directory paths are as follows:

    * 1:24,000-scale digital line graph data in SDTS format (/pub/data/DLG/24K)
    * 1:2,000,000-scale digital line graph data in SDTS format (/pub/data/DLG/2M)
    * 1:100,000-scale digital line graph data (/pub/data/DLG/100K)
    * 1:100,000-scale land use and land cover data (/pub/data/LULC/100K)
    * 1:250,000-scale land use and land cover data (/pub/data/LULC/250K)
    * 1:24,000-scale digital elevation data (/pub/data/DEM/7.5min)
    * 1-degree digital elevation model data (/pub/data/DEM/250)
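
    For example, one of the listed data sets could be fetched over anonymous FTP with Python's standard ftplib. The host name below is a placeholder, since the announcement gives only directory paths:

        from ftplib import FTP

        HOST = "ftp.example.usgs.gov"     # placeholder; the actual host is not listed above
        PATH = "/pub/data/DEM/7.5min"     # 1:24,000-scale digital elevation data

        ftp = FTP(HOST)
        ftp.login()                       # anonymous login
        ftp.cwd(PATH)
        names = ftp.nlst()                # list the files in the directory
        print(names[:10])

        if names:                         # download the first file in binary mode
            with open(names[0], "wb") as fh:
                ftp.retrbinary("RETR " + names[0], fh.write)
        ftp.quit()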

  16. Karst map of Puerto Rico

    USGS Publications Warehouse

    Alemán González, Wilma B.

    2010-01-01

    This map is a digital compilation, combining the mapping of earlier geologists. Their work, cited on the map, contains more detailed descriptions of karst areas and landforms in Puerto Rico. This map is the basis for the Puerto Rico part of a new national karst map currently being compiled by the U.S. Geological Survey. In addition, this product is a standalone, citable source of digital karst data for Puerto Rico. Nearly 25 percent of the United States is underlain by karst terrain, and a large part of that area is undergoing urban and industrial development. Accurate delineations of karstic rocks are needed at scales suitable for national, State, and local maps. The data on this map contribute to a better understanding of subsidence hazards, groundwater contamination potential, and cave resources as well as serve as a guide to topical research on karst. Because the karst data were digitized from maps having a different scale and projection from those on the base map used for this publication, some karst features may not coincide perfectly with physiographic features portrayed on the base map.

  17. Design of a Digital-Based, Multicomponent Nutrition Guidance System for Prevention of Early Childhood Obesity

    PubMed Central

    Black, Maureen M.; Saavedra, Jose M.

    2016-01-01

    Interventions targeting parenting-focused modifiable factors to prevent obesity and promote healthy growth in the first 1000 days of life are needed. Scale-up of interventions to global populations is necessary to reverse trends in weight status among infants and toddlers, and large-scale dissemination will require an understanding of effective strategies. Utilizing nutrition education theories, this paper describes the design of a digital-based nutrition guidance system targeted to first-time mothers to prevent obesity during the first two years. The multicomponent system consists of scientifically substantiated content, tools, and telephone-based professional support delivered in an anticipatory and sequential manner via the internet, email, and text messages, focusing on educational modules addressing the modifiable factors associated with childhood obesity. Digital delivery formats leverage consumer media trends and provide the opportunity for scale-up, unavailable to previous interventions reliant on resource-heavy clinic and home-based counseling. Designed initially for use in the United States, this system's core features are applicable to all contexts and constitute an approach fostering healthy growth, not just obesity prevention. The multicomponent features, combined with a global concern for optimal growth and positive trends in mobile internet use, represent this system's future potential to effect change in nutrition practice in developing countries. PMID:27635257

  18. Distribution and arrest of vertical through-going joints in a seismic-scale carbonate platform exposure (Sorrento peninsula, Italy): insights from integrating field survey and digital outcrop model

    NASA Astrophysics Data System (ADS)

    Corradetti, A.; Tavani, S.; Parente, M.; Iannace, A.; Vinci, F.; Pirmez, C.; Torrieri, S.; Giorgioni, M.; Pignalosa, A.; Mazzoli, S.

    2018-03-01

    Through-going joints cutting across beds are often invoked to explain large-scale permeability patterns in tight carbonate reservoirs. However, despite the importance of these structures for fluid flow, only a few field studies have focused on understanding and estimating through-going joint dimensional parameters, including spacing and vertical extent in relation to stratigraphy. Recent improvements in the construction of digital models of outcrops can greatly help to overcome many logistic issues, favouring the evaluation of relationships between jointing and stratigraphy at the reservoir scale. In this study, we present the results obtained from integrating field measurements with a digital outcrop model of a carbonate platform reservoir analogue in the Sorrento peninsula (Italy). The outcrop consists of a nearly vertical cliff exposing a monocline of alternating gently dipping shallow-water limestones and dolostones, crossed by several vertical joints of different size. This study allowed us to establish that major through-going joints pass across thick beds (bed thickness > 30 cm), while they arrest against packages made of thinly stratified layers. In essence, through-going joints arrest on "weak" levels, consisting of thinly bedded layers interposed between packages made of thick beds, in the same manner as bed-confined joints arrest on less competent interlayers.

  19. Digital Rocks Portal: a Sustainable Platform for Data Management, Analysis and Remote Visualization of Volumetric Images of Porous Media

    NASA Astrophysics Data System (ADS)

    Prodanovic, M.; Esteva, M.; Ketcham, R. A.

    2017-12-01

    Nanometer- to centimeter-scale imaging such as (focused ion beam) scanning electron microscopy, magnetic resonance imaging and X-ray (micro)tomography has since the 1990s produced 2D and 3D datasets of rock microstructure that allow investigation of nonlinear flow and mechanical phenomena on length scales that are otherwise inaccessible to laboratory measurements. The numerical approaches that use such images produce various upscaled parameters required by subsurface flow and deformation simulators. All of this has revolutionized our knowledge of grain-scale phenomena. However, a lack of data-sharing infrastructure among research groups makes it difficult to integrate different length scales. We have developed a sustainable, open and easy-to-use repository called the Digital Rocks Portal (https://www.digitalrocksportal.org) that (1) organizes images and related experimental measurements of different porous materials, and (2) improves access to them for a wider community of engineering or geosciences researchers not necessarily trained in computer science or data analysis. The Digital Rocks Portal (NSF EarthCube Grant 1541008) is the first repository for imaged porous-microstructure data. It is implemented within the reliable, 24/7-maintained High Performance Computing Infrastructure supported by the Texas Advanced Computing Center (University of Texas at Austin). Long-term storage is provided through the University of Texas System Research Cyber-infrastructure initiative. We show how the data can be documented, referenced in publications via digital object identifiers, visualized, searched for and linked to other repositories. We show the recently implemented integration of remote parallel visualization, bulk upload for large datasets, as well as a preliminary flow simulation workflow with the pore structures currently stored in the repository. We discuss the issues of collecting correct metadata, data discoverability and repository sustainability.

  20. Mosaic construction, processing, and review of very large electron micrograph composites

    NASA Astrophysics Data System (ADS)

    Vogt, Robert C., III; Trenkle, John M.; Harmon, Laurel A.

    1996-11-01

    A system of programs is described for acquisition, mosaicking, cueing and interactive review of large-scale transmission electron micrograph composite images. This work was carried out as part of a final-phase clinical analysis study of a drug for the treatment of diabetic peripheral neuropathy. More than 500 nerve biopsy samples were prepared, digitally imaged, processed, and reviewed. For a given sample, typically 1000 or more 1.5-megabyte frames were acquired, for a total of between 1 and 2 gigabytes of data per sample. These frames were then automatically registered and mosaicked together into a single virtual image composite, which was subsequently used to perform automatic cueing of axons and axon clusters, as well as review and marking by qualified neuroanatomists. Statistics derived from the review process were used to evaluate the efficacy of the drug in promoting regeneration of myelinated nerve fibers. This effort demonstrates a new, entirely digital capability for doing large-scale electron micrograph studies, in which all of the relevant specimen data can be included at high magnification, as opposed to simply taking a random sample of discrete locations. It opens up the possibility of a new era in electron microscopy: one which broadens the scope of questions that this imaging modality can be used to answer.
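
    Frame-to-frame registration of the kind such a mosaicking pipeline needs is often done with phase correlation, which recovers the translation between two overlapping frames from a single FFT peak. A compact numpy sketch of that building block (the production system described above is, of course, far more elaborate):

        import numpy as np

        def phase_correlation_shift(a, b):
            """Estimate the cyclic translation that maps frame b onto frame a."""
            fa, fb = np.fft.fft2(a), np.fft.fft2(b)
            cross = fa * np.conj(fb)
            cross /= np.abs(cross) + 1e-12          # keep phase information only
            peak = np.unravel_index(np.argmax(np.fft.ifft2(cross).real), a.shape)
            # Map peak indices to signed shifts.
            return tuple(p - s if p > s // 2 else p for p, s in zip(peak, a.shape))

        rng = np.random.default_rng(0)
        frame = rng.random((256, 256))
        shifted = np.roll(frame, (17, -23), axis=(0, 1))   # simulated overlap offset
        print(phase_correlation_shift(shifted, frame))      # -> (17, -23)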

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    McKinnon, Archibald D.; Thompson, Seth R.; Doroshchuk, Ruslan A.

    Smart grid technologies are transforming the electric power grid into a grid with bi-directional flows of both power and information. Operating millions of new smart meters and smart appliances will significantly impact electric distribution systems, resulting in greater efficiency. However, the scale of the grid and the new types of information transmitted will potentially introduce several security risks that cannot be addressed by traditional, centralized security techniques. We propose a new bio-inspired cyber security approach. Social insects, such as ants and bees, have developed complex-adaptive systems that emerge from the collective application of simple, light-weight behaviors. The Digital Ants framework is a bio-inspired framework that uses mobile light-weight agents. Sensors within the framework use digital pheromones to communicate with each other and to alert each other of possible cyber security issues. All communication and coordination is both localized and decentralized, thereby allowing the framework to scale across the large numbers of devices that will exist in the smart grid. Furthermore, the sensors are light-weight and therefore suitable for implementation on devices with limited computational resources. This paper provides a brief overview of the Digital Ants framework and then presents results from test-bed demonstrations showing that Digital Ants can identify a cyber attack scenario against smart meter deployments.

  2. Integrating multisource land use and land cover data

    USGS Publications Warehouse

    Wright, Bruce E.; Tait, Mike; Lins, K.F.; Crawford, J.S.; Benjamin, S.P.; Brown, Jesslyn F.

    1995-01-01

    As part of the U.S. Geological Survey's (USGS) land use and land cover (LULC) program, the USGS in cooperation with the Environmental Systems Research Institute (ESRI) is collecting and integrating LULC data for a standard USGS 1:100,000-scale product. The LULC data collection techniques include interpreting spectrally clustered Landsat Thematic Mapper (TM) images; interpreting 1-meter resolution digital panchromatic orthophoto images; and, for comparison, aggregating locally available large-scale digital data of urban areas. The area selected is the Vancouver, WA-OR quadrangle, which has a mix of urban, rural agriculture, and forest land. Anticipated products include an integrated LULC prototype data set in a standard classification scheme referenced to the USGS digital line graph (DLG) data of the area and prototype software to develop digital LULC data sets. This project will evaluate a draft standard LULC classification system developed by the USGS for use with various source material and collection techniques. Federal, State, and local governments, and private sector groups will have an opportunity to evaluate the resulting prototype software and data sets and to provide recommendations. It is anticipated that this joint research endeavor will increase future collaboration among interested organizations, public and private, for LULC data collection using common standards and tools.

  3. Airborne Digital Sensor System and GPS-aided inertial technology for direct geopositioning in rough terrain

    USGS Publications Warehouse

    Sanchez, Richard D.

    2004-01-01

    High-resolution airborne digital cameras with onboard data collection based on the Global Positioning System (GPS) and inertial navigation system (INS) technology may offer a real-time means to gather accurate topographic map information by reducing ground control and eliminating aerial triangulation. Past evaluations of this integrated system over relatively flat terrain have proven successful. The author uses the Emerge Digital Sensor System (DSS) combined with Applanix Corporation's Position and Orientation Solutions for Direct Georeferencing to examine positional mapping accuracy in rough terrain. The positional accuracy documented in this study did not meet large-scale mapping requirements owing to an apparent system mechanical failure. Nonetheless, the findings yield important information on a new approach for mapping in Antarctica and other remote or inaccessible areas of the world.

  4. Non-parametric PCM to ADM conversion. [Pulse Code to Adaptive Delta Modulation

    NASA Technical Reports Server (NTRS)

    Locicero, J. L.; Schilling, D. L.

    1977-01-01

    An all-digital technique to convert pulse code modulated (PCM) signals into adaptive delta modulation (ADM) format is presented. The converter developed is shown to be independent of the statistical parameters of the encoded signal and can be constructed with only standard digital hardware. The structure of the converter is simple enough to be fabricated on a large scale integrated circuit where the advantages of reliability and cost can be optimized. A concise evaluation of this PCM to ADM translation technique is presented and several converters are simulated on a digital computer. A family of performance curves is given which displays the signal-to-noise ratio for sinusoidal test signals subjected to the conversion process, as a function of input signal power for several ratios of ADM rate to Nyquist rate.
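
    A minimal sketch of the kind of conversion described: each PCM sample is re-encoded as one ADM bit, with the step size adapted by a simple Jayant-style rule. The paper's converter is non-parametric and all-digital; the adaptation constants here are illustrative assumptions:

        import numpy as np

        def pcm_to_adm(pcm, step0=0.05, k=1.5, step_min=1e-3, step_max=1.0):
            """Re-encode PCM samples as adaptive delta modulation bits.

            Jayant-style adaptation: the step grows by k when consecutive bits
            agree (slope overload) and shrinks by 1/k when they alternate
            (granular noise). Constants are illustrative, not from the paper.
            """
            est, step, prev_bit = 0.0, step0, 1
            bits = np.empty(len(pcm), dtype=np.int8)
            for n, x in enumerate(pcm):
                bit = 1 if x >= est else 0
                step = min(step * k, step_max) if bit == prev_bit else max(step / k, step_min)
                est += step if bit else -step
                bits[n], prev_bit = bit, bit
            return bits

        t = np.linspace(0, 1, 8000)
        bits = pcm_to_adm(np.sin(2 * np.pi * 5 * t))   # sinusoidal test signal, as in the paper
        print(bits[:32])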

  5. Mesofluidic two stage digital valve

    DOEpatents

    Jansen, John F; Love, Lonnie J; Lind, Randall F; Richardson, Bradley S

    2013-12-31

    A mesofluidic scale digital valve system includes a first mesofluidic scale valve having a valve body including a bore, wherein the valve body is configured to cooperate with a solenoid disposed substantially adjacent to the valve body to translate a poppet carried within the bore. The mesofluidic scale digital valve system also includes a second mesofluidic scale valve disposed substantially perpendicular to the first mesofluidic scale valve. The mesofluidic scale digital valve system further includes a control element in communication with the solenoid, wherein the control element is configured to maintain the solenoid in an energized state for a fixed period of time to provide a desired flow rate through an orifice of the second mesofluidic valve.

  6. Watershed boundaries and digital elevation model of Oklahoma derived from 1:100,000-scale digital topographic maps

    USGS Publications Warehouse

    Cederstrand, J.R.; Rea, A.H.

    1995-01-01

    This document provides a general description of the procedures used to develop the data sets included on this compact disc. This compact disc contains watershed boundaries for Oklahoma, a digital elevation model, and other data sets derived from the digital elevation model. The digital elevation model was produced using the ANUDEM software package, written by Michael Hutchinson and licensed from the Centre for Resource and Environmental Studies at The Australian National University. Elevation data (hypsography) and streams (hydrography) from digital versions of the U.S. Geological Survey 1:100,000-scale topographic maps were used by the ANUDEM package to produce a hydrologically conditioned digital elevation model with a 60-meter cell size. This digital elevation model is well suited for drainage-basin delineation using automated techniques. Additional data sets include flow-direction, flow-accumulation, and shaded-relief grids, all derived from the digital elevation model, and the hydrography data set used in producing the digital elevation model. The watershed boundaries derived from the digital elevation model have been edited to be consistent with contours and streams from the U.S. Geological Survey 1:100,000-scale topographic maps. The watershed data set includes boundaries for 11-digit Hydrologic Unit Codes (watersheds) within Oklahoma, and 8-digit Hydrologic Unit Codes (cataloging units) outside Oklahoma. Cataloging-unit boundaries based on 1:250,000-scale maps outside Oklahoma for the Arkansas, Red, and White River basins are included. The other data sets cover Oklahoma, and where available, portions of 1:100,000-scale quadrangles adjoining Oklahoma.
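
    Flow-direction grids like the one on this disc are conventionally derived from a DEM with the D8 rule: each cell drains to whichever of its eight neighbours gives the steepest downhill slope. A small numpy sketch of that rule (not the ANUDEM software itself), using the 60-meter cell size mentioned above:

        import numpy as np

        # D8 neighbour offsets and the ESRI-style direction codes often used for them.
        OFFSETS = [(-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1), (-1, -1)]
        CODES   = [64, 128, 1, 2, 4, 8, 16, 32]

        def d8_flow_direction(dem, cell=60.0):
            """Steepest-descent (D8) direction code per cell; 0 = pit/no downhill neighbour."""
            rows, cols = dem.shape
            out = np.zeros(dem.shape, dtype=np.int32)
            for r in range(rows):
                for c in range(cols):
                    best_slope, best_code = 0.0, 0
                    for (dr, dc), code in zip(OFFSETS, CODES):
                        rr, cc = r + dr, c + dc
                        if 0 <= rr < rows and 0 <= cc < cols:
                            dist = cell * (2 ** 0.5 if dr and dc else 1.0)
                            slope = (dem[r, c] - dem[rr, cc]) / dist
                            if slope > best_slope:
                                best_slope, best_code = slope, code
                    out[r, c] = best_code
            return out

        dem = np.array([[10, 9, 8], [9, 7, 6], [8, 6, 4]], dtype=float)
        print(d8_flow_direction(dem))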

  7. Efficient estimation and large-scale evaluation of lateral chromatic aberration for digital image forensics

    NASA Astrophysics Data System (ADS)

    Gloe, Thomas; Borowka, Karsten; Winkler, Antje

    2010-01-01

    The analysis of lateral chromatic aberration forms another ingredient in a well-equipped toolbox of an image-forensic investigator. Previous work proposed its application to forgery detection [1] and image source identification [2]. This paper takes a closer look at the current state-of-the-art method for analysing lateral chromatic aberration and presents a new approach to estimate lateral chromatic aberration in a runtime-efficient way. Employing a set of 11 different camera models comprising 43 devices, the characteristic of lateral chromatic aberration is investigated at large scale. The reported results point to general difficulties that have to be considered in real-world investigations.
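
    To first order, lateral chromatic aberration can be modelled as a small radial magnification difference between colour channels; the displacement it causes grows with distance from the optical centre. A toy estimator in that spirit (a brute-force search over the red/green scale ratio; the paper's runtime-efficient method is different):

        import numpy as np
        from scipy import ndimage

        def scale_about_center(img, s):
            """Resample img by scale factor s about its centre (first-order LCA model)."""
            center = (np.array(img.shape) - 1) / 2.0
            offset = center * (1.0 - s)
            return ndimage.affine_transform(img, np.diag([s, s]), offset=offset, order=1)

        def estimate_lca_scale(red, green, span=0.002, steps=41):
            """Brute-force the red/green radial magnification ratio that best aligns them."""
            best_s, best_corr = 1.0, -np.inf
            for s in np.linspace(1.0 - span, 1.0 + span, steps):
                warped = scale_about_center(red, s)
                corr = np.corrcoef(warped.ravel(), green.ravel())[0, 1]
                if corr > best_corr:
                    best_s, best_corr = s, corr
            return best_s

        # Synthetic check: a red channel that is a slightly magnified copy of green.
        rng = np.random.default_rng(0)
        green = ndimage.gaussian_filter(rng.random((256, 256)), 2.0)
        red = scale_about_center(green, 1.0 / 1.0008)
        print("estimated scale ratio:", estimate_lca_scale(red, green))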

  8. Tools for Large-Scale Data Analytic Examination of Relational and Epistemic Networks in Engineering Education

    ERIC Educational Resources Information Center

    Madhavan, Krishna; Johri, Aditya; Xian, Hanjun; Wang, G. Alan; Liu, Xiaomo

    2014-01-01

    The proliferation of digital information technologies and related infrastructure has given rise to novel ways of capturing, storing and analyzing data. In this paper, we describe the research and development of an information system called Interactive Knowledge Networks for Engineering Education Research (iKNEER). This system utilizes a framework…

  9. Gaming the System: Culture, Process, and Perspectives Supporting a Game and App Design Curriculum

    ERIC Educational Resources Information Center

    Herro, Danielle

    2015-01-01

    Games and digital media experiences permeate the lives of youth. Researchers have argued the participatory attributes and cognitive benefits of gaming and media production for more than a decade, relying on socio-cultural theory to bolster their claims. Only recently have large-scale efforts ensued towards moving game play and design into formal…

  10. Optimisation and Validation of the ARAMIS Digital Image Correlation System for Use in Large-scale High-strain-rate Events

    DTIC Science & Technology

    2013-08-01

    enamel paint. Under extreme plastic deformation, the relative deformation of the coating could cause the coating to separate, resulting in loss of...point for one to be found. If a discontinuity, such as a crack, occurs through the object, separating the speckle pattern, then the strain data will only

  11. A Smart Partnership: Integrating Educational Technology for Underserved Children in India

    ERIC Educational Resources Information Center

    Charania, Amina; Davis, Niki

    2016-01-01

    This paper explores the evolution of a large multi-stakeholder partnership that has grown since 2011 to scale deep engagement with learning through technology and decrease the digital divide for thousands of underserved school children in India. Using as its basis a case study of an initiative called integrated approach to technology in education…

  12. Microcopying wildland maps for distribution and scanner digitizing

    Treesearch

    Elliot L Amidon; Joyce E. Dye

    1976-01-01

    Maps for wildland resource inventory and management purposes typically show vegetation types, soils, and other areal information. For field work, maps must be large-scale. For safekeeping and compact storage, however, they can be reduced onto film, ready to be enlarged on demand by office viewers. By meeting certain simple requirements, film images are potential input...

  13. Gender Differences in Processing Speed: A Review of Recent Research

    ERIC Educational Resources Information Center

    Roivainen, Eka

    2011-01-01

    A review of recent large-scale studies on gender differences in processing speed and on the cognitive factors assumed to affect processing speed was performed. It was found that females have an advantage in processing speed tasks involving digits and alphabets as well as in rapid naming tasks while males are faster on reaction time tests and…

  14. Creating a standardized watersheds database for the Lower Rio Grande/Río Bravo, Texas

    USGS Publications Warehouse

    Brown, J.R.; Ulery, Randy L.; Parcher, Jean W.

    2000-01-01

    This report describes the creation of a large-scale watershed database for the lower Rio Grande/Río Bravo Basin in Texas. The watershed database includes watersheds delineated to all 1:24,000-scale mapped stream confluences and other hydrologically significant points, selected watershed characteristics, and hydrologic derivative datasets. Computer technology allows generation of preliminary watershed boundaries in a fraction of the time needed for manual methods. This automated process reduces development time and results in quality improvements in watershed boundaries and characteristics. These data can then be compiled in a permanent database, eliminating the time-consuming step of data creation at the beginning of a project and providing a stable base dataset that can give users greater confidence when further subdividing watersheds. A standardized dataset of watershed characteristics is a valuable contribution to the understanding and management of natural resources. Vertical integration of the input datasets used to automatically generate watershed boundaries is crucial to the success of such an effort. The optimum situation would be to use the digital orthophoto quadrangles as the source of all the input datasets. While the hydrographic data from the digital line graphs can be revised to match the digital orthophoto quadrangles, hypsography data cannot be revised to match the digital orthophoto quadrangles. Revised hydrography from the digital orthophoto quadrangle should be used to create an updated digital elevation model that incorporates the stream channels as revised from the digital orthophoto quadrangle. Computer-generated, standardized watersheds that are vertically integrated with existing digital line graph hydrographic data will continue to be difficult to create until revisions can be made to existing source datasets. Until such time, manual editing will be necessary to make adjustments for man-made features and changes in the natural landscape that are not reflected in the digital elevation model data.

  16. An Investigation of Pre-Service Primary School Teachers' Attitudes towards Digital Technology and Digital Citizenship Levels in Terms of Some Variables

    ERIC Educational Resources Information Center

    Çiftci, Serdar; Aladag, Soner

    2018-01-01

    This study aims at investigating the relationship between pre-service primary school teachers' attitudes towards digital technology and digital citizenship scale levels. The research was designed in descriptive survey model. The data collection tools were "Attitude Scale for Digital Technology" (ASDT) developed by Cabi (2016) and…

  17. A UVM simulation environment for the study, optimization and verification of HL-LHC digital pixel readout chips

    NASA Astrophysics Data System (ADS)

    Marconi, S.; Conti, E.; Christiansen, J.; Placidi, P.

    2018-05-01

    The operating conditions of the High Luminosity upgrade of the Large Hadron Collider are very demanding for the design of next-generation hybrid pixel readout chips in terms of particle rate, radiation level and data bandwidth. For this purpose, the RD53 Collaboration has developed, for the ATLAS and CMS experiments, a dedicated simulation and verification environment using industry-consolidated tools and methodologies, such as SystemVerilog and the Universal Verification Methodology (UVM). This paper presents how the so-called VEPIX53 environment first guided the design of digital architectures optimized for processing and buffering very high particle rates, and second how it was reused for the functional verification of the first large-scale demonstrator chip designed by the collaboration, which has recently been submitted.

  18. Factorized Runge-Kutta-Chebyshev Methods

    NASA Astrophysics Data System (ADS)

    O'Sullivan, Stephen

    2017-05-01

    The second-order extended-stability Factorized Runge-Kutta-Chebyshev (FRKC2) explicit schemes for the integration of large systems of PDEs with diffusive terms are presented. The schemes are simple to implement through ordered sequences of forward Euler steps with complex stepsizes, and are easily parallelised for large-scale problems on distributed architectures. Preserving 7 digits of accuracy at 16-digit precision, the schemes are theoretically capable of maintaining internal stability at acceleration factors in excess of 6000 with respect to standard explicit Runge-Kutta methods. The extent of the stability domain is approximately the same as that of RKC schemes, and a third longer than that of RKL2 schemes. Extension of FRKC methods to fourth order, by both complex splitting and Butcher composition techniques, is also discussed. A publicly available implementation of FRKC2 schemes may be obtained from maths.dit.ie/frkc
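
    The building block named above, a forward Euler step with a complex stepsize, is easy to demonstrate on the linear test problem u' = lambda*u: composing two Euler steps with conjugate stepsizes h(1+i)/2 and h(1-i)/2 gives the amplification factor (1 + a*z)(1 + conj(a)*z) = 1 + z + z^2/2, i.e. a second-order method. A toy demonstration of that factorized-Euler idea (not the FRKC2 scheme itself, which orders many such factors for extended stability):

        import numpy as np

        def complex_euler_pair(f, u0, t0, t1, n):
            """Integrate u' = f(u) with n pairs of forward Euler steps whose complex
            stepsizes h(1+i)/2 and h(1-i)/2 combine into a second-order method."""
            h = (t1 - t0) / n
            a, abar = h * (1 + 1j) / 2, h * (1 - 1j) / 2
            u = complex(u0)
            for _ in range(n):
                u = u + a * f(u)       # first Euler sub-step, complex stepsize
                u = u + abar * f(u)    # conjugate sub-step restores a real scheme
            return u.real

        lam = -2.0
        exact = np.exp(lam)            # solution of u' = lam*u at t = 1, u(0) = 1
        for n in (10, 20, 40):
            err = abs(complex_euler_pair(lambda u: lam * u, 1.0, 0.0, 1.0, n) - exact)
            print(n, err)              # error falls ~4x per doubling: second order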

  19. Going all digital in a university hospital: a unified large-scale PACS for multiple departments and hospitals

    NASA Astrophysics Data System (ADS)

    Vogl, Raimund

    2001-08-01

    In 1997, a large PACS was first introduced at Innsbruck University Hospital in the context of a new traumatology centre. In subsequent years, this initial PACS setting, covering only one department, was expanded to most of the hospital campus, with currently some 250 viewing stations attached. Constantly connecting new modalities and viewing stations created the demand for several redesigns of the original PACS configuration to cope with the increasing data load. We give an account of the changes necessary to develop a multi-hospital PACS and the considerations that led us there. Issues of staffing for running a large-scale PACS are discussed, and we give an outlook on the new information systems currently under development for archiving and communication of general medical imaging data and for simple telemedicine networking between several large university hospitals.

  20. GIS applications for military operations in coastal zones

    USGS Publications Warehouse

    Fleming, S.; Jordan, T.; Madden, M.; Usery, E.L.; Welch, R.

    2009-01-01

    In order to successfully support current and future US military operations in coastal zones, geospatial information must be rapidly integrated and analyzed to meet ongoing force-structure evolution and new mission directives. Coastal zones in a military-operational environment are complex regions that include sea, land and air features that demand high-volume databases of extreme detail within relatively narrow geographic corridors. Static products in the form of analog maps at varying scales traditionally have been used by military commanders and their operational planners. The rapidly changing battlefield of 21st-century warfare, however, demands dynamic mapping solutions. Commercial geographic information system (GIS) software for military-specific applications is now being developed and employed with digital databases to provide customized digital maps of variable scale, content and symbolization tailored to the unique demands of military units. Research conducted by the Center for Remote Sensing and Mapping Science at the University of Georgia demonstrated the utility of GIS-based analysis and digital map creation when developing large-scale (1:10,000) products from littoral-warfare databases. The methodology employed is discussed: selection of data sources (including high-resolution commercial images and lidar), establishment of analysis/modeling parameters, conduct of vehicle mobility analysis, development of models, and generation of products (such as a continuous sea-land DEM and geo-visualization of changing shorelines with tidal levels). Based on observations and identified needs from the National Geospatial-Intelligence Agency, formerly the National Imagery and Mapping Agency, and the Department of Defense, prototype GIS models for military operations in sea, land and air environments were created from multiple data sets of a study area at US Marine Corps Base Camp Lejeune, North Carolina. Results of these models, along with methodologies for developing large-scale littoral-warfare databases, aid the National Geospatial-Intelligence Agency in meeting littoral-warfare analysis, modeling and map generation requirements for US military organizations. © 2008 International Society for Photogrammetry and Remote Sensing, Inc. (ISPRS).

  2. Digital Reading Disposition Scale: A Study of Validity and Reliability

    ERIC Educational Resources Information Center

    Bulut, Berker; Karasakaloglu, Nuri

    2018-01-01

    In this study, "a Digital Reading Disposition Scale" was developed to determine undergraduate pre-service teacher students' dispositions towards digital reading, opposed to a preference for printed reading material. Initially, a 20-items trial version of the scale was administered to a total sample of N = 301 undergraduate pre-service…

  3. Discrete-Time Demodulator Architectures for Free-Space Broadband Optical Pulse-Position Modulation

    NASA Technical Reports Server (NTRS)

    Gray, A. A.; Lee, C.

    2004-01-01

    The objective of this work is to develop discrete-time demodulator architectures for broadband optical pulse-position modulation (PPM) that are capable of processing Nyquist or near-Nyquist data rates. These architectures are motivated by the numerous advantages of realizing communications demodulators in digital very large scale integrated (VLSI) circuits. The architectures are developed within a framework that encompasses a large body of work in optical communications, synchronization, and multirate discrete-time signal processing and are constrained by the limitations of the state of the art in digital hardware. This work attempts to create a bridge between theoretical communication algorithms and analysis for deep-space optical PPM and modern digital VLSI. The primary focus of this work is on the synthesis of discrete-time processing architectures for accomplishing the most fundamental functions required in PPM demodulators, post-detection filtering, synchronization, and decision processing. The architectures derived are capable of closely approximating the theoretical performance of the continuous-time algorithms from which they are derived. The work concludes with an outline of the development path that leads to hardware.
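
    Once slot timing is known, the core PPM decision is simple: accumulate the detector samples over each of the M slots of a symbol and choose the largest. A sketch under ideal synchronization (slot boundaries assumed known, which is exactly what the synchronization architectures discussed above must supply); the modulation order and noise level are illustrative:

        import numpy as np

        def ppm_demodulate(samples, m_slots, samples_per_slot):
            """Hard-decision M-ary PPM: per-slot energy accumulation + argmax per symbol."""
            sym_len = m_slots * samples_per_slot
            n_sym = len(samples) // sym_len
            frames = samples[: n_sym * sym_len].reshape(n_sym, m_slots, samples_per_slot)
            return frames.sum(axis=2).argmax(axis=1)   # post-detection filter + decision

        # Synthetic test: 16-ary PPM, 4 samples per slot, additive noise.
        rng = np.random.default_rng(1)
        tx = rng.integers(0, 16, size=1000)
        sig = np.zeros((1000, 16, 4))
        sig[np.arange(1000), tx, :] = 1.0              # pulse energy in the chosen slot
        rx = (sig + 0.2 * rng.standard_normal(sig.shape)).ravel()
        rx_sym = ppm_demodulate(rx, 16, 4)
        print("symbol error rate:", np.mean(rx_sym != tx))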

  4. Large-scale correlations in gas traced by Mg II absorbers around low-mass galaxies

    NASA Astrophysics Data System (ADS)

    Kauffmann, Guinevere

    2018-03-01

    The physical origin of the large-scale conformity in the colours and specific star formation rates of isolated low-mass central galaxies and their neighbours on scales in excess of 1 Mpc is still under debate. One possible scenario is that gas is heated over large scales by feedback from active galactic nuclei (AGNs), leading to coherent modulation of cooling and star formation between well-separated galaxies. In this Letter, the metal-line absorption catalogue of Zhu & Ménard is used to probe gas out to large projected radii around a sample of a million galaxies with stellar masses ~10^10 M⊙ and photometric redshifts in the range 0.4 < z < 0.8, selected from Sloan Digital Sky Survey imaging data. This galaxy sample covers an effective volume of 2.2 Gpc^3. A statistically significant excess of Mg II absorbers is present around the red low-mass galaxies compared to their blue counterparts out to projected radii of 10 Mpc. In addition, the equivalent-width distribution function of Mg II absorbers around low-mass galaxies is shown to be strongly affected by the presence of a nearby (Rp < 2 Mpc) radio-loud AGN out to projected radii of 5 Mpc.

  5. Prediction of near-surface soil moisture at large scale by digital terrain modeling and neural networks.

    PubMed

    Lavado Contador, J F; Maneta, M; Schnabel, S

    2006-10-01

    The capability of Artificial Neural Network models to forecast near-surface soil moisture at fine spatial resolution has been tested for a 99.5-ha watershed located in SW Spain, using several easy-to-derive digital models of topographic and land-cover variables as inputs and a series of soil moisture measurements as the training data set. The study methods were designed to determine the potential of the neural network model as a tool to gain insight into the factors controlling soil moisture distribution, and also to optimize the data sampling scheme by finding the optimum size of the training data set. Results suggest the efficiency of the methods in forecasting soil moisture, their value as a tool to assess the optimum number of field samples, and the importance of the selected variables in explaining the final map obtained.
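
    A hedged sketch of the general approach (terrain and land-cover attributes in, point soil-moisture measurements as training targets), using scikit-learn's multilayer perceptron rather than the authors' network, with synthetic stand-in data; the learning-curve loop mirrors the paper's question of how many field samples are enough:

        import numpy as np
        from sklearn.neural_network import MLPRegressor
        from sklearn.preprocessing import StandardScaler
        from sklearn.pipeline import make_pipeline
        from sklearn.model_selection import learning_curve

        # Illustrative stand-ins for easy-to-derive terrain/land-cover inputs.
        rng = np.random.default_rng(0)
        n = 400
        X = np.column_stack([
            rng.uniform(0, 30, n),      # slope (degrees)
            rng.uniform(0, 360, n),     # aspect (degrees)
            rng.uniform(0, 50, n),      # upslope contributing area index, say
            rng.integers(0, 3, n),      # land-cover class code
        ])
        y = 0.3 - 0.004 * X[:, 0] + 0.002 * X[:, 2] + 0.02 * rng.standard_normal(n)  # toy moisture

        model = make_pipeline(StandardScaler(),
                              MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000,
                                           random_state=0))
        # Growing training sets probe when the validation score stops improving,
        # i.e. how many field samples are enough.
        sizes, _, val_scores = learning_curve(model, X, y,
                                              train_sizes=[50, 100, 200, 300], cv=4)
        for s, v in zip(sizes, val_scores.mean(axis=1)):
            print(f"{s:4d} samples -> mean R^2 = {v:.2f}")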

  6. A new scale for the assessment of conjunctival bulbar redness.

    PubMed

    Macchi, Ilaria; Bunya, Vatinee Y; Massaro-Giordano, Mina; Stone, Richard A; Maguire, Maureen G; Zheng, Yuanjie; Chen, Min; Gee, James; Smith, Eli; Daniel, Ebenezer

    2018-06-05

    Current scales for the assessment of bulbar conjunctival redness have limitations for evaluating digital images. We developed a scale suited to evaluating digital images and compared it to the Validated Bulbar Redness (VBR) scale. From a digital image database of 4889 color-corrected bulbar conjunctival images, we identified 20 images with varied degrees of redness. These images, ten each of nasal and temporal views, constitute the Digital Bulbar Redness (DBR) scale. The chromaticity of these images was assessed with an established image-processing algorithm. Using 100 unique, randomly selected images from the database, three trained, non-physician graders applied the DBR scale and the printed VBR scale. Agreement was assessed with weighted kappa statistics (K_w). The DBR scale scores provide linear increments of 10, from 10 to 100, when redness is measured objectively with an established image-processing algorithm. Exact agreement of all graders was 38%, and agreement within no more than ten units between graders was 91%. K_w for agreement between any two graders ranged from 0.57 to 0.73 for the DBR scale and from 0.38 to 0.66 for the VBR scale. The DBR scale allowed direct comparison of digital to digital images, could be used in dim lighting, had both temporal and nasal conjunctival reference images, and permitted viewing reference and test images at the same magnification. The novel DBR scale, with its objective linear chromatic steps, demonstrated improved reproducibility, fewer visualization artifacts and improved ease of use over the VBR scale for assessing conjunctival redness.
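
    Weighted kappa agreement of the kind reported above can be computed directly with scikit-learn; the scores below are invented stand-ins for two graders' DBR scores on the 10-to-100 scale:

        from sklearn.metrics import cohen_kappa_score

        # Invented example scores from two graders on the 10..100 DBR scale.
        grader_a = [30, 40, 50, 50, 60, 70, 20, 90, 40, 60]
        grader_b = [30, 50, 50, 60, 60, 70, 30, 80, 40, 70]

        # Linear weights penalize disagreements in proportion to their distance,
        # which suits an ordinal redness scale.
        kw = cohen_kappa_score(grader_a, grader_b, weights="linear")
        print(f"weighted kappa K_w = {kw:.2f}")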

  7. Clinical evaluation of a 2K x 2K workstation for primary diagnosis in pediatric radiology

    NASA Astrophysics Data System (ADS)

    Razavi, Mahmood; Sayre, James W.; Simons, Margaret A.; Hamedaninia, Azar; Boechat, Maria I.; Hall, Theodore R.; Kangarloo, Hooshang; Taira, Ricky K.; Chuang, Keh-Shih; Kashifian, Payam

    1991-07-01

    Preliminary results of a large-scale ROC study evaluating the diagnostic performance of digital hardcopy film and 2K X 2K softcopy display for pediatric chest radiographs are presented. The pediatric disease categories studied were pneumothorax, linear atelectasis, air bronchograms, and interstitial disease. Digital images were obtained directly from a computed radiography system. Results from the readings of 239 chest radiographs by 4 radiologists show no significant difference between viewing images on film and softcopy display for the disease categories pneumothorax and air bronchograms. A slight performance edge for softcopy was seen for the disease categories of interstitial disease and linear atelectasis.

  8. Rectangular Array Of Digital Processors For Planning Paths

    NASA Technical Reports Server (NTRS)

    Kemeny, Sabrina E.; Fossum, Eric R.; Nixon, Robert H.

    1993-01-01

    Prototype 24 x 25 rectangular array of asynchronous parallel digital processors rapidly finds best path across two-dimensional field, which could be patch of terrain traversed by robotic or military vehicle. Implemented as single-chip very-large-scale integrated circuit. Excepting processors on edges, each processor communicates with four nearest neighbors along paths representing travel to north, south, east, and west. Each processor contains delay generator in form of 8-bit ripple counter, preset to 1 of 256 possible values. Operation begins with choice of processor representing starting point; this processor transmits signals to its nearest neighbors, which retransmit to other neighboring processors, and process repeats until signals have propagated across entire field.
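    A software sketch of the wavefront principle the chip implements in hardware: each cell delays the signal by its preset counter value, so the earliest arrival at any cell traces the minimum-delay path, which is what Dijkstra's algorithm computes on a 4-connected grid. The grid, delays, and start/goal below are illustrative assumptions:

```python
# Earliest-arrival wavefront over a 4-connected grid of per-cell delays.
import heapq

def best_path(delays, start, goal):
    rows, cols = len(delays), len(delays[0])
    dist = {start: delays[start[0]][start[1]]}
    prev = {}
    pq = [(dist[start], start)]
    while pq:
        d, (r, c) = heapq.heappop(pq)
        if (r, c) == goal:
            break
        for nr, nc in ((r-1, c), (r+1, c), (r, c-1), (r, c+1)):  # N, S, W, E
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + delays[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = (r, c)
                    heapq.heappush(pq, (nd, (nr, nc)))
    path, node = [], goal          # walk predecessors back to the start
    while node != start:
        path.append(node)
        node = prev[node]
    return [start] + path[::-1]

terrain = [[1, 9, 1], [1, 9, 1], [1, 1, 1]]   # per-cell delay (1..256 on the chip)
print(best_path(terrain, (0, 0), (0, 2)))      # routes around the high-delay column
```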

  9. Image-Enhancement Aid For The Partially Sighted

    NASA Technical Reports Server (NTRS)

    Lawton, T. A.; Gennery, D. B.

    1989-01-01

    Digital filtering enhances ability to read and to recognize objects. Possible to construct portable vision aid by combining miniature video equipment, to observe scene and display images, with very-large-scale integrated circuits that implement real-time digital image-data processing. Afflicted observer views scene through magnifier to shift spatial frequencies downward and thereby improve perceived image. However, the less magnification needed, the larger the scene observed; thus, one measure of effectiveness of new system is amount of magnification required with and without it. In series of tests, afflicted observers needed 27 to 70 percent more magnification to recognize unfiltered words than to recognize filtered words.

  10. The MIDAS processor. [Multivariate Interactive Digital Analysis System for multispectral scanner data

    NASA Technical Reports Server (NTRS)

    Kriegler, F. J.; Gordon, M. F.; Mclaughlin, R. H.; Marshall, R. E.

    1975-01-01

    The MIDAS (Multivariate Interactive Digital Analysis System) processor is a high-speed processor designed to process multispectral scanner data (from Landsat, EOS, aircraft, etc.) quickly and cost-effectively to meet the requirements of users of remote sensor data, especially from very large areas. MIDAS consists of a fast multipipeline preprocessor and classifier, an interactive color display and color printer, and a medium-scale computer system for analysis and control. The system is designed to process data having as many as 16 spectral bands per picture element at rates of 200,000 picture elements per second into as many as 17 classes using a maximum likelihood decision rule.

  11. Benford analysis of quantum critical phenomena: First digit provides high finite-size scaling exponent while first two and further are not much better

    NASA Astrophysics Data System (ADS)

    Bera, Anindita; Mishra, Utkarsh; Singha Roy, Sudipto; Biswas, Anindya; Sen(De), Aditi; Sen, Ujjwal

    2018-06-01

    Benford's law is an empirical edict stating that lower digits appear more often than higher ones as the first few significant digits in statistics of natural phenomena and mathematical tables. A marked proportion of such analyses is restricted to the first significant digit. We employ violation of Benford's law, up to the first four significant digits, to investigate magnetization and correlation data of paradigmatic quantum many-body systems and detect cooperative phenomena, focusing on the finite-size scaling exponents thereof. We find that for the transverse-field quantum XY model, the behavior of the very first significant digit of an observable, at an arbitrary point of the parameter space, is enough to capture the quantum phase transition in the model with a relatively high scaling exponent. A higher number of significant digits does not provide an appreciable further advantage, in particular in terms of an increase in scaling exponents. Since the first significant digit of a physical quantity is relatively simple to obtain in experiments, these results have potential implications for laboratory observations in noisy environments.
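    A minimal sketch of a first-significant-digit Benford analysis, using Benford's prediction P(d) = log10(1 + 1/d) and synthetic data in place of magnetization or correlation values:

```python
# First-significant-digit frequencies vs. Benford's law; the "violation"
# is the total absolute deviation from the predicted digit distribution.
import math
from collections import Counter

def first_digit(x):
    x = abs(x)
    while x >= 10:
        x /= 10
    while 0 < x < 1:
        x *= 10
    return int(x)

data = [2 ** k for k in range(1, 200)]        # stand-in for observable values
freq = Counter(first_digit(v) for v in data)
n = sum(freq.values())
violation = sum(abs(freq.get(d, 0) / n - math.log10(1 + 1 / d)) for d in range(1, 10))
print(f"Benford violation parameter: {violation:.4f}")
```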

  12. SPIN ALIGNMENTS OF SPIRAL GALAXIES WITHIN THE LARGE-SCALE STRUCTURE FROM SDSS DR7

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Youcai; Yang, Xiaohu; Luo, Wentao

    Using a sample of spiral galaxies selected from the Sloan Digital Sky Survey Data Release 7 and Galaxy Zoo 2, we investigate the alignment of spin axes of spiral galaxies with their surrounding large-scale structure, which is characterized by the large-scale tidal field reconstructed from the data using galaxy groups above a certain mass threshold. We find that the spin axes only have weak tendencies to be aligned with (or perpendicular to) the intermediate (or minor) axis of the local tidal tensor. The signal is the strongest in a cluster environment where all three eigenvalues of the local tidal tensor are positive. Compared to the alignments between halo spins and the local tidal field obtained in N-body simulations, the above observational results are in best agreement with those for the spins of inner regions of halos, suggesting that the disk material traces the angular momentum of dark matter halos in the inner regions.

  13. Source-gated transistors for order-of-magnitude performance improvements in thin-film digital circuits

    NASA Astrophysics Data System (ADS)

    Sporea, R. A.; Trainor, M. J.; Young, N. D.; Shannon, J. M.; Silva, S. R. P.

    2014-03-01

    Ultra-large-scale integrated (ULSI) circuits have benefited from successive refinements in device architecture for enormous improvements in speed, power efficiency and areal density. In large-area electronics (LAE), however, the basic building-block, the thin-film field-effect transistor (TFT) has largely remained static. Now, a device concept with fundamentally different operation, the source-gated transistor (SGT) opens the possibility of unprecedented functionality in future low-cost LAE. With its simple structure and operational characteristics of low saturation voltage, stability under electrical stress and large intrinsic gain, the SGT is ideally suited for LAE analog applications. Here, we show using measurements on polysilicon devices that these characteristics lead to substantial improvements in gain, noise margin, power-delay product and overall circuit robustness in digital SGT-based designs. These findings have far-reaching consequences, as LAE will form the technological basis for a variety of future developments in the biomedical, civil engineering, remote sensing, artificial skin areas, as well as wearable and ubiquitous computing, or lightweight applications for space exploration.

  14. Source-gated transistors for order-of-magnitude performance improvements in thin-film digital circuits

    PubMed Central

    Sporea, R. A.; Trainor, M. J.; Young, N. D.; Shannon, J. M.; Silva, S. R. P.

    2014-01-01

    Ultra-large-scale integrated (ULSI) circuits have benefited from successive refinements in device architecture for enormous improvements in speed, power efficiency and areal density. In large-area electronics (LAE), however, the basic building-block, the thin-film field-effect transistor (TFT) has largely remained static. Now, a device concept with fundamentally different operation, the source-gated transistor (SGT) opens the possibility of unprecedented functionality in future low-cost LAE. With its simple structure and operational characteristics of low saturation voltage, stability under electrical stress and large intrinsic gain, the SGT is ideally suited for LAE analog applications. Here, we show using measurements on polysilicon devices that these characteristics lead to substantial improvements in gain, noise margin, power-delay product and overall circuit robustness in digital SGT-based designs. These findings have far-reaching consequences, as LAE will form the technological basis for a variety of future developments in the biomedical, civil engineering, remote sensing, artificial skin areas, as well as wearable and ubiquitous computing, or lightweight applications for space exploration. PMID:24599023

  15. Development of a Watershed Boundary Dataset for Mississippi

    USGS Publications Warehouse

    Van Wilson, K.; Clair, Michael G.; Turnipseed, D. Phil; Rebich, Richard A.

    2009-01-01

    The U.S. Geological Survey, in cooperation with the Mississippi Department of Environmental Quality, U.S. Department of Agriculture-Natural Resources Conservation Service, Mississippi Department of Transportation, U.S. Department of Agriculture-Forest Service, and the Mississippi Automated Resource Information System, developed a 1:24,000-scale Watershed Boundary Dataset for Mississippi including watershed and subwatershed boundaries, codes, names, and drainage areas. The Watershed Boundary Dataset for Mississippi provides a standard geographical framework for water-resources and selected land-resources planning. The original 8-digit subbasins (hydrologic unit codes) were further subdivided into 10-digit watersheds and 12-digit subwatersheds - the exceptions are the Lower Mississippi River Alluvial Plain (known locally as the Delta) and the Mississippi River inside levees, which were only subdivided into 10-digit watersheds. Also, large water bodies in the Mississippi Sound along the coast were not delineated as small as a typical 12-digit subwatershed. All of the data - including watershed and subwatershed boundaries, hydrologic unit codes and names, and drainage-area data - are stored in a Geographic Information System database.
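    A minimal sketch of the nesting of hydrologic unit codes described above: a 12-digit subwatershed code contains its 10-digit watershed and 8-digit subbasin as prefixes. The code shown is hypothetical:

```python
# Hierarchical hydrologic unit codes: subbasin (8) > watershed (10) > subwatershed (12).
huc12 = "080302080101"          # hypothetical 12-digit subwatershed code
huc10, huc8 = huc12[:10], huc12[:8]
print(f"subbasin {huc8} > watershed {huc10} > subwatershed {huc12}")
```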

  16. Digital Marine Bioprospecting: Mining New Neurotoxin Drug Candidates from the Transcriptomes of Cold-Water Sea Anemones

    PubMed Central

    Urbarova, Ilona; Karlsen, Bård Ove; Okkenhaug, Siri; Seternes, Ole Morten; Johansen, Steinar D.; Emblem, Åse

    2012-01-01

    Marine bioprospecting is the search for new marine bioactive compounds and large-scale screening in extracts represents the traditional approach. Here, we report an alternative complementary protocol, called digital marine bioprospecting, based on deep sequencing of transcriptomes. We sequenced the transcriptomes from the adult polyp stage of two cold-water sea anemones, Bolocera tuediae and Hormathia digitata. We generated approximately 1.1 million quality-filtered sequencing reads by 454 pyrosequencing, which were assembled into approximately 120,000 contigs and 220,000 single reads. Based on annotation and gene ontology analysis we profiled the expressed mRNA transcripts according to known biological processes. As a proof-of-concept we identified polypeptide toxins with a potential blocking activity on sodium and potassium voltage-gated channels from digital transcriptome libraries. PMID:23170083

  17. Citizen journalism in a time of crisis: lessons from a large-scale California wildfire

    Treesearch

    S. Gillette; J. Taylor; D.J. Chavez; R. Hodgson; J. Downing

    2007-01-01

    The accessibility of news production tools through consumer communication technology has made it possible for media consumers to become media producers. The evolution of media consumer to media producer has important implications for the shape of public discourse during a time of crisis. Citizen journalists cover crisis events using camera cell phones and digital...

  18. Avionic Data Bus Integration Technology

    DTIC Science & Technology

    1991-12-01

    Addresses the hardware-software interaction between a digital data bus and an avionic system, including Very Large Scale Integration (VLSI) ICs and multiversion programming. In 1984, the Sperry Corporation developed a fault-tolerant system which employed multiversion (N-version) programming, voting, and monitoring for error detection. N-version programming is the independent coding of a number, N, of redundant computer programs that perform the same function.

  19. Systolic VLSI Reed-Solomon Decoder

    NASA Technical Reports Server (NTRS)

    Shao, H. M.; Truong, T. K.; Deutsch, L. J.; Yuen, J. H.

    1986-01-01

    Decoder for digital communications provides high-speed, pipelined Reed-Solomon (RS) error-correction decoding of data streams. Principal new feature of proposed decoder is modification of Euclid greatest-common-divisor algorithm to avoid need for time-consuming computations of inverses of certain Galois-field quantities. Decoder architecture suitable for implementation on very-large-scale integrated (VLSI) chips with negative-channel metal-oxide/silicon circuitry.

  20. Personal-Level Factors and Google Docs Use in Monmouth County Middle Schools

    ERIC Educational Resources Information Center

    Tetreault, Steven G.

    2014-01-01

    Technology has become an essential part of the world, both in people's personal and professional lives. Digital assessments such as those being implemented in New Jersey as part of the Partnership for Assessment of Readiness for College and Careers (PARCC) will soon be instituted on a large scale; these require students to be able to utilize…

  1. Support of Helicopter 'Free Flight' Operations in the 1996 Olympics

    NASA Technical Reports Server (NTRS)

    Branstetter, James R.; Cooper, Eric G.

    1996-01-01

    The microcosm of activity surrounding the 1996 Olympic Games provided researchers an opportunity for demonstrating state-of-the-art technology in the first large-scale deployment of a prototype digital communication/navigation/surveillance system in a confined environment. At the same time it provided an ideal opportunity for transportation officials to showcase the merits of an integrated transportation system in meeting the operational needs to transport time-sensitive goods and provide public safety services under real-world conditions. Five aeronautical CNS functions using a digital datalink system were chosen for operational flight testing onboard 91 aircraft, most of them helicopters, participating in the Atlanta Short-Haul Transportation System. These included: GPS-based Automatic Dependent Surveillance, Cockpit Display of Traffic Information, Controller-Pilot Communications, Graphical Weather Information (uplink), and Automated Electronic Pilot Reporting (downlink). Atlanta provided the first opportunity to demonstrate, in an actual operating environment, key datalink functions which would enhance flight safety and situational awareness for the pilot and supplement conventional air traffic control. The knowledge gained from such a large-scale deployment will help system designers in development of a national infrastructure where aircraft would have the ability to navigate autonomously.

  2. Chosen Aspects of the Production of the Basic Map Using Uav Imagery

    NASA Astrophysics Data System (ADS)

    Kedzierski, M.; Fryskowska, A.; Wierzbicki, D.; Nerc, P.

    2016-06-01

    For several years there has been increasing interest in the use of unmanned aerial vehicles for acquiring image data from low altitude. Considering the cost-effectiveness of UAV flight time vs. conventional airplanes, UAVs are advantageous for generating accurate large-scale orthophotos. Through the development of UAV imagery, we can update large-scale basic maps. These maps are cartographic products used for registration, economic, and strategic planning; other cartographic products, for example maps used in building planning, are derived from them. The article presents an assessment of the usefulness of orthophotos based on UAV imagery for upgrading the basic map. In the research, a compact, non-metric camera mounted on a fixed-wing platform powered by an electric motor was used. The tested area covered flat, agricultural, and woodland terrain. The processing and analysis of the orthorectification were carried out with the INPHO UASMaster programme. Due to the effects of UAV instability on low-altitude imagery, the use of non-metric digital cameras, and low-accuracy GPS-INS sensors, the geometry of the images is visibly poorer compared to conventional digital aerial photos (large values of phi and kappa angles). Therefore, low-altitude images typically require large along- and across-track overlap, usually above 70%. As a result of the research, orthoimages were obtained with a resolution of 0.06 m and a horizontal accuracy of 0.10 m. Digitized basic maps were used as the reference data. The accuracy of the orthoimages vs. the basic maps was estimated based on the study and on the available reference sources. It was found that the geometric accuracy and interpretative advantages of the final orthoimages allow the updating of basic maps. It is estimated that such an update of basic maps based on UAV imagery reduces processing time by approximately 40%.

  3. An integrated approach for automated cover-type mapping of large inaccessible areas in Alaska

    USGS Publications Warehouse

    Fleming, Michael D.

    1988-01-01

    The lack of any detailed cover type maps in the state necessitated that a rapid and accurate approach be employed to develop maps of 329 million acres of Alaska within a seven-year period. This goal has been addressed by using an integrated approach to computer-aided analysis which combines efficient use of field data with the only consistent statewide spatial data sets available: Landsat multispectral scanner data, digital elevation data derived from 1:250,000-scale maps, and 1:60,000-scale color-infrared aerial photographs.

  4. US GeoData Available Through the Internet

    USGS Publications Warehouse

    ,

    2000-01-01

    The U.S. Geological Survey (USGS) offers certain US GeoData data sets through the Internet. They can be retrieved using the World Wide Web or anonymous File Transfer Protocol (FTP). The data bases and their directory paths are as follows: * 1:24,000-scale digital line graph data in SDTS format (/pub/data/DLG/24K) * 1:2,000,000-scale digital line graph data in SDTS format (/pub/data/DLG/2M) * 1:100,000-scale digital line graph data (/pub/data/DLG/100K) * 1:100,000-scale land use and land cover data (/pub/data/LULC/100K) * 1:250,000-scale land use and land cover data (/pub/data/LULC/250K) * 1-degree digital elevation model data (/pub/data/DEM/250)

  5. Using Digital Computer Field Mapping of Outcrops to Examine the Preservation of High-P Rocks During Pervasive, Retrograde Greenschist Fluid Infiltration, Tinos, Cyclades Archipelago, Greece

    NASA Astrophysics Data System (ADS)

    Breeding, C. M.; Ague, J. J.; Broecker, M.

    2001-12-01

    Digital field mapping of outcrops on the island of Tinos, Greece, was undertaken to investigate the nature of retrograde fluid infiltration during exhumation of high-P metamorphic rocks of the Attic-Cycladic blueschist belt. High-resolution digital photographs of outcrops were taken and loaded into graphics editing software on a portable, belt-mounted computer in the field. Geologic features from outcrops were drawn and labeled on the digital images using the software in real-time. The ability to simultaneously identify geologic features in outcrops and digitize those features onto digital photographs in the field allows the creation of detailed, field-verified, outcrop-scale maps that aid in geologic interpretation. During Cretaceous-Eocene subduction in the Cyclades, downgoing crustal material was metamorphosed to eclogite and blueschist facies. Subsequent Oligocene-Miocene exhumation of the high-P rocks was accompanied by pervasive, retrograde fluid infiltration resulting in nearly complete greenschist facies overprinting. On Tinos, most high-P rocks have undergone intense retrogression; however, adjacent to thick marble horizons with completely retrograded contact zones, small (sub km-scale) enclaves of high-P rocks (blueschist and minor eclogite facies) were preserved. Field observations suggest that the remnant high-P zones consist mostly of massive metabasic rocks and minor adjacent metasediments. Within the enclaves, detailed digital outcrop maps reveal that greenschist retrogression increases in intensity outward from the center, implying interaction with a fluid flowing along enclave perimeters. Permeability contrasts could not have been solely responsible for preservation of the high-P rocks, as similar rock suites distal to marble contacts were completely overprinted. We conclude that the retrograded contacts of the marble units served as high-permeability conduits for regional retrograde fluid flow. Pervasive, layer-parallel flow through metasediments would have been drawn into these more permeable flow channels. Deflections in fluid flow paths toward the high flux contacts likely caused retrograde fluids to flow around the enclaves, preserving the zones of "dry," unretrograded high-P rocks near marble horizons. Digital mapping of outcrops is a unique method for direct examination of the relationships between geologic structure, lithology, and mineral assemblage variation in the field. Outcrop mapping in the Attic-Cycladic blueschist belt has revealed that regional fluid flow along contacts can have important implications for the large-scale distribution of mineral assemblages in metamorphic terranes.

  6. The future of medical diagnostics: large digitized databases.

    PubMed

    Kerr, Wesley T; Lau, Edward P; Owens, Gwen E; Trefler, Aaron

    2012-09-01

    The electronic health record mandate within the American Recovery and Reinvestment Act of 2009 will have a far-reaching effect on medicine. In this article, we provide an in-depth analysis of how this mandate is expected to stimulate the production of large-scale, digitized databases of patient information. There is evidence to suggest that millions of patients and the National Institutes of Health will fully support the mining of such databases to better understand the process of diagnosing patients. This data mining will likely reaffirm and quantify known risk factors for many diagnoses. This quantification may be leveraged to further develop computer-aided diagnostic tools that weigh risk factors and provide decision support for health care providers. We expect that the creation of these databases will stimulate the development of computer-aided diagnostic support tools that will become an integral part of modern medicine.

  7. A fast low-power optical memory based on coupled micro-ring lasers

    NASA Astrophysics Data System (ADS)

    Hill, Martin T.; Dorren, Harmen J. S.; de Vries, Tjibbe; Leijtens, Xaveer J. M.; den Besten, Jan Hendrik; Smalbrugge, Barry; Oei, Yok-Siang; Binsma, Hans; Khoe, Giok-Djan; Smit, Meint K.

    2004-11-01

    The increasing speed of fibre-optic-based telecommunications has focused attention on high-speed optical processing of digital information. Complex optical processing requires a high-density, high-speed, low-power optical memory that can be integrated with planar semiconductor technology for buffering of decisions and telecommunication data. Recently, ring lasers with extremely small size and low operating power have been made, and we demonstrate here a memory element constructed by interconnecting these microscopic lasers. Our device occupies an area of 18 × 40 µm² on an InP/InGaAsP photonic integrated circuit, and switches within 20 ps with 5.5 fJ optical switching energy. Simulations show that the element has the potential for much smaller dimensions and switching times. Large numbers of such memory elements can be densely integrated and interconnected on a photonic integrated circuit: fast digital optical information processing systems employing large-scale integration should now be viable.

  8. Entomological Collections in the Age of Big Data.

    PubMed

    Short, Andrew Edward Z; Dikow, Torsten; Moreau, Corrie S

    2018-01-07

    With a million described species and more than half a billion preserved specimens, the large scale of insect collections is unequaled by those of any other group. Advances in genomics, collection digitization, and imaging have begun to more fully harness the power that such large data stores can provide. These new approaches and technologies have transformed how entomological collections are managed and utilized. While genomic research has fundamentally changed the way many specimens are collected and curated, advances in technology have shown promise for extracting sequence data from the vast holdings already in museums. Efforts to mainstream specimen digitization have taken root and have accelerated traditional taxonomic studies as well as distribution modeling and global change research. Emerging imaging technologies such as microcomputed tomography and confocal laser scanning microscopy are changing how morphology can be investigated. This review provides an overview of how the realization of big data has transformed our field and what may lie in store.

  9. Learning about the scale of the solar system using digital planetarium visualizations

    NASA Astrophysics Data System (ADS)

    Yu, Ka Chun; Sahami, Kamran; Dove, James

    2017-07-01

    We studied the use of a digital planetarium for teaching relative distances and sizes in introductory undergraduate astronomy classes. Inspired in part by the classic short film The Powers of Ten and large physical scale models of the Solar System that can be explored on foot, we created lectures using virtual versions of these two pedagogical approaches for classes that saw either an immersive treatment in the planetarium or a non-immersive version in the regular classroom (with N = 973 students participating in total). Students who visited the planetarium had not only the greatest learning gains, but their performance increased with time, whereas students who saw the same visuals projected onto a flat display in their classroom showed less retention over time. The gains seen in the students who visited the planetarium reveal that this medium is a powerful tool for visualizing scale over multiple orders of magnitude. However the modest gains for the students in the regular classroom also show the utility of these visualization approaches for the broader category of classroom physics simulations.

  10. TV Audience Measurement with Big Data.

    PubMed

    Hill, Shawndra

    2014-06-01

    TV audience measurement involves estimating the number of viewers tuned into a TV show at any given time as well as their demographics. First introduced shortly after commercial television broadcasting began in the late 1940s, audience measurement allowed the business of television to flourish by offering networks a way to quantify the monetary value of TV audiences for advertisers, who pay for the estimated number of eyeballs watching during commercials. The first measurement techniques suffered from multiple limitations because reliable, large-scale data were costly to acquire. Yet despite these limitations, measurement standards remained largely unchanged for decades until devices such as cable boxes, video-on-demand boxes, and cell phones, as well as web apps, Internet browser clicks, web queries, and social media activity, resulted in an explosion of digitally available data. TV viewers now leave digital traces that can be used to track almost every aspect of their daily lives, allowing the potential for large-scale aggregation across data sources for individual users and groups and enabling the tracking of more people on more dimensions for more shows. Data are now more comprehensive, available in real time, and cheaper to acquire, enabling accurate and fine-grained TV audience measurement. In this article, I discuss the evolution of audience measurement and what the recent data explosion means for the TV industry and academic research.

  11. A digital photogrammetric method for measuring horizontal surficial movements on the slumgullion earthflow, Hinsdale county, Colorado

    USGS Publications Warehouse

    Powers, P.S.; Chiarle, M.; Savage, W.Z.

    1996-01-01

    The traditional approach to making aerial photographic measurements uses analog or analytic photogrammetric equipment. We have developed a digital method for making measurements from aerial photographs which uses geographic information system (GIS) software, and primarily DOS-based personal computers. This method, which is based on the concept that a direct visual comparison can be made between images derived from two sets of aerial photographs taken at different times, was applied to the surface of the active portion of the Slumgullion earthflow in Colorado to determine horizontal displacement vectors from the movements of visually identifiable objects, such as trees and large rocks. Using this method, more of the slide surface can be mapped in a shorter period of time than using the standard photogrammetric approach. More than 800 horizontal displacement vectors were determined on the active earthflow surface using images produced by our digital photogrammetric technique and 1985 (1:12,000-scale) and 1990 (1:6,000-scale) aerial photographs. The resulting displacement field shows, with a 2-m measurement error (~10%), that the fastest moving portion of the landslide underwent 15-29 m of horizontal displacement between 1985 and 1990. Copyright © 1996 Elsevier Science Ltd.
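    A minimal sketch of one automated way to recover a displacement vector between two co-registered image patches, via FFT phase correlation on synthetic arrays; the study's GIS-based workflow matched visually identifiable objects, so this matcher is an illustration only:

```python
# Phase correlation: the peak of the normalized cross-power spectrum's
# inverse FFT gives the integer-pixel shift between two patches.
import numpy as np

def displacement(img1, img2):
    F1, F2 = np.fft.fft2(img1), np.fft.fft2(img2)
    cross = F1 * np.conj(F2)
    corr = np.fft.ifft2(cross / (np.abs(cross) + 1e-12))
    dy, dx = np.unravel_index(np.argmax(np.abs(corr)), corr.shape)
    # wrap shifts larger than half the patch to negative offsets
    if dy > img1.shape[0] // 2:
        dy -= img1.shape[0]
    if dx > img1.shape[1] // 2:
        dx -= img1.shape[1]
    return dx, dy

rng = np.random.default_rng(1)
a = rng.random((64, 64))
b = np.roll(a, shift=(3, -5), axis=(0, 1))     # "tree" moved 3 px down, 5 px left
print(displacement(b, a))                      # expected (-5, 3)
```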

  12. Carbon nanotube circuit integration up to sub-20 nm channel lengths.

    PubMed

    Shulaker, Max Marcel; Van Rethy, Jelle; Wu, Tony F; Liyanage, Luckshitha Suriyasena; Wei, Hai; Li, Zuanyi; Pop, Eric; Gielen, Georges; Wong, H-S Philip; Mitra, Subhasish

    2014-04-22

    Carbon nanotube (CNT) field-effect transistors (CNFETs) are a promising emerging technology projected to achieve over an order of magnitude improvement in energy-delay product, a metric of performance and energy efficiency, compared to silicon-based circuits. However, due to substantial imperfections inherent with CNTs, the promise of CNFETs has yet to be fully realized. Techniques to overcome these imperfections have yielded promising results, but thus far only at large technology nodes (1 μm device size). Here we demonstrate the first very large scale integration (VLSI)-compatible approach to realizing CNFET digital circuits at highly scaled technology nodes, with devices ranging from 90 nm to sub-20 nm channel lengths. We demonstrate inverters functioning at 1 MHz and a fully integrated CNFET infrared light sensor and interface circuit at 32 nm channel length. This demonstrates the feasibility of realizing more complex CNFET circuits at highly scaled technology nodes.

  13. Simultaneous Study of Intake and In-Cylinder IC Engine Flow Fields to Provide an Insight into Intake Induced Cyclic Variations

    NASA Astrophysics Data System (ADS)

    Justham, T.; Jarvis, S.; Clarke, A.; Garner, C. P.; Hargrave, G. K.; Halliwell, N. A.

    2006-07-01

    Simultaneous intake and in-cylinder digital particle image velocimetry (DPIV) experimental data is presented for a motored spark ignition (SI) optical internal combustion (IC) engine. Two individual DPIV systems were employed to study the inter-relationship between the intake and in-cylinder flow fields at an engine speed of 1500 rpm. Results for the intake runner velocity field at the time of maximum intake valve lift are compared to in-cylinder velocity fields later in the same engine cycle. Relationships between flow structures within the runner and cylinder were seen to be strong during the intake stroke but less significant during compression. Cyclic variations within the intake runner were seen to affect the large scale bulk flow motion. The subsequent decay of the large scale motions into smaller scale turbulent structures during the compression stroke appears to reduce the relationship with the intake flow variations.

  14. Exploring the Digital Natives among Pre-Service Teachers in Turkey: A Cross-Cultural Validation of the Digital Native Assessment Scale

    ERIC Educational Resources Information Center

    Teo, Timothy; Kabakçi Yurdakul, Isil; Ursavas, Ömer Faruk

    2016-01-01

    This study aims to explore the digital natives among a sample of pre-service teachers and in the process, examine the validity of a Turkish adaptation of the digital native assessment scale (DNAS) [Teo, T., & Fan, X. (2013). "Coefficient alpha and beyond: Issues and alternatives for educational research." "The Asia-Pacific…

  15. Performance Comparison of the Digital Neuromorphic Hardware SpiNNaker and the Neural Network Simulation Software NEST for a Full-Scale Cortical Microcircuit Model

    PubMed Central

    van Albada, Sacha J.; Rowley, Andrew G.; Senk, Johanna; Hopkins, Michael; Schmidt, Maximilian; Stokes, Alan B.; Lester, David R.; Diesmann, Markus; Furber, Steve B.

    2018-01-01

    The digital neuromorphic hardware SpiNNaker has been developed with the aim of enabling large-scale neural network simulations in real time and with low power consumption. Real-time performance is achieved with 1 ms integration time steps, and thus applies to neural networks for which faster time scales of the dynamics can be neglected. By slowing down the simulation, shorter integration time steps and hence faster time scales, which are often biologically relevant, can be incorporated. We here describe the first full-scale simulations of a cortical microcircuit with biological time scales on SpiNNaker. Since about half the synapses onto the neurons arise within the microcircuit, larger cortical circuits have only moderately more synapses per neuron. Therefore, the full-scale microcircuit paves the way for simulating cortical circuits of arbitrary size. With approximately 80,000 neurons and 0.3 billion synapses, this model is the largest simulated on SpiNNaker to date. The scale-up is enabled by recent developments in the SpiNNaker software stack that allow simulations to be spread across multiple boards. Comparison with simulations using the NEST software on a high-performance cluster shows that both simulators can reach a similar accuracy, despite the fixed-point arithmetic of SpiNNaker, demonstrating the usability of SpiNNaker for computational neuroscience applications with biological time scales and large network size. The runtime and power consumption are also assessed for both simulators on the example of the cortical microcircuit model. To obtain an accuracy similar to that of NEST with 0.1 ms time steps, SpiNNaker requires a slowdown factor of around 20 compared to real time. The runtime for NEST saturates around 3 times real time using hybrid parallelization with MPI and multi-threading. However, achieving this runtime comes at the cost of increased power and energy consumption. The lowest total energy consumption for NEST is reached at around 144 parallel threads and 4.6 times slowdown. At this setting, NEST and SpiNNaker have a comparable energy consumption per synaptic event. Our results widen the application domain of SpiNNaker and help guide its development, showing that further optimizations such as synapse-centric network representation are necessary to enable real-time simulation of large biological neural networks. PMID:29875620

  16. Can digital pathology result in cost savings? A financial projection for digital pathology implementation at a large integrated health care organization.

    PubMed

    Ho, Jonhan; Ahlers, Stefan M; Stratman, Curtis; Aridor, Orly; Pantanowitz, Liron; Fine, Jeffrey L; Kuzmishin, John A; Montalto, Michael C; Parwani, Anil V

    2014-01-01

    Digital pathology offers potential improvements in workflow and interpretive accuracy. Although digital pathology is currently commonly used for research and education, its clinical use has been limited to niche applications such as frozen sections and remote second opinion consultations. This is mainly due to regulatory hurdles, but also to a dearth of data supporting a positive economic cost-benefit. Large-scale adoption of digital pathology and the integration of digital slides into the routine anatomic/surgical pathology "slide less" clinical workflow will occur only if digital pathology offers a quantifiable benefit, which could come in the form of more efficient and/or higher quality care. As a large academic-based health care organization expecting to adopt digital pathology for primary diagnosis upon its regulatory approval, our institution estimated the potential operational cost savings offered by the implementation of an enterprise-wide digital pathology system (DPS). Projected cost savings were calculated for the first 5 years following implementation of a DPS based on operational data collected from the pathology department. Projected savings were based on two factors: (1) productivity and lab consolidation savings; and (2) avoided treatment costs due to improvements in the accuracy of cancer diagnoses among nonsubspecialty pathologists. Detailed analyses of incremental treatment costs due to interpretive errors, resulting in either a false-positive or false-negative diagnosis, were performed for melanoma and breast cancer and extrapolated to 10 other common cancers. When phased in over 5 years, total cost savings based on anticipated improvements in pathology productivity and histology lab consolidation were estimated at $12.4 million for an institution with 219,000 annual accessions. The main contributing factors to these savings were gains in pathologist clinical full-time equivalent capacity driven by improved pathologist productivity and workload distribution. Expanding the current localized specialty sign-out model to an enterprise-wide shared general/subspecialist sign-out model could potentially reduce the costs of incorrect treatment by $5.4 million. These calculations were based on annual over- and under-treatment costs for breast cancer and melanoma estimated at approximately $26,000 and $11,000 per case, respectively, and extrapolated to $21,500 per case for other cancer types. The projected 5-year total cost savings for our large academic-based health care organization upon fully implementing a DPS was approximately $18 million. If the costs of digital pathology acquisition and implementation do not exceed this value, the return on investment becomes attractive to hospital administrators. Furthermore, improved patient outcomes enabled by this technology strengthen the argument supporting adoption of an enterprise-wide DPS.
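    A minimal sketch reproducing the headline arithmetic of this projection, using only the figures quoted in the abstract:

```python
# Five-year savings projection: productivity/lab consolidation plus
# avoided-treatment savings, as reported in the abstract.
productivity_savings = 12.4e6     # $ over 5 years, 219,000 annual accessions
avoided_treatment = 5.4e6         # $ from enterprise-wide shared sign-out
total = productivity_savings + avoided_treatment
print(f"Projected 5-year savings: ${total / 1e6:.1f} M")   # ~$18M as reported
```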

  17. Can Digital Pathology Result In Cost Savings? A Financial Projection For Digital Pathology Implementation At A Large Integrated Health Care Organization

    PubMed Central

    Ho, Jonhan; Ahlers, Stefan M.; Stratman, Curtis; Aridor, Orly; Pantanowitz, Liron; Fine, Jeffrey L.; Kuzmishin, John A.; Montalto, Michael C.; Parwani, Anil V.

    2014-01-01

    Background: Digital pathology offers potential improvements in workflow and interpretive accuracy. Although digital pathology is currently commonly used for research and education, its clinical use has been limited to niche applications such as frozen sections and remote second opinion consultations. This is mainly due to regulatory hurdles, but also to a dearth of data supporting a positive economic cost-benefit. Large-scale adoption of digital pathology and the integration of digital slides into the routine anatomic/surgical pathology “slide less” clinical workflow will occur only if digital pathology offers a quantifiable benefit, which could come in the form of more efficient and/or higher quality care. Aim: As a large academic-based health care organization expecting to adopt digital pathology for primary diagnosis upon its regulatory approval, our institution estimated the potential operational cost savings offered by the implementation of an enterprise-wide digital pathology system (DPS). Methods: Projected cost savings were calculated for the first 5 years following implementation of a DPS based on operational data collected from the pathology department. Projected savings were based on two factors: (1) productivity and lab consolidation savings; and (2) avoided treatment costs due to improvements in the accuracy of cancer diagnoses among nonsubspecialty pathologists. Detailed analyses of incremental treatment costs due to interpretive errors, resulting in either a false-positive or false-negative diagnosis, were performed for melanoma and breast cancer and extrapolated to 10 other common cancers. Results: When phased in over 5 years, total cost savings based on anticipated improvements in pathology productivity and histology lab consolidation were estimated at $12.4 million for an institution with 219,000 annual accessions. The main contributing factors to these savings were gains in pathologist clinical full-time equivalent capacity driven by improved pathologist productivity and workload distribution. Expanding the current localized specialty sign-out model to an enterprise-wide shared general/subspecialist sign-out model could potentially reduce the costs of incorrect treatment by $5.4 million. These calculations were based on annual over- and under-treatment costs for breast cancer and melanoma estimated at approximately $26,000 and $11,000 per case, respectively, and extrapolated to $21,500 per case for other cancer types. Conclusions: The projected 5-year total cost savings for our large academic-based health care organization upon fully implementing a DPS was approximately $18 million. If the costs of digital pathology acquisition and implementation do not exceed this value, the return on investment becomes attractive to hospital administrators. Furthermore, improved patient outcomes enabled by this technology strengthen the argument supporting adoption of an enterprise-wide DPS. PMID:25250191

  18. Use of the Wechsler Adult Intelligence Scale Digit Span subtest for malingering detection: a meta-analytic review.

    PubMed

    Jasinski, Lindsey J; Berry, David T R; Shandera, Anni L; Clark, Jessica A

    2011-03-01

    Twenty-four studies utilizing the Wechsler Adult Intelligence Scale (WAIS) Digit Span subtest (either the Reliable Digit Span (RDS) or the Age-Corrected Scaled Score (DS-ACSS) variant) for malingering detection were meta-analytically reviewed to evaluate their effectiveness in detecting malingered neurocognitive dysfunction. RDS and DS-ACSS effectively discriminated between honest responders and dissimulators, with average weighted effect sizes of 1.34 and 1.08, respectively. No significant differences were found between RDS and DS-ACSS. Similarly, no differences were found between the Digit Span subtest from the WAIS or Wechsler Memory Scale (WMS). Strong specificity and moderate sensitivity were observed, and optimal cutting scores are recommended.

  19. Tapping the Vast Potential of the Data Deluge in Small-scale Food-Animal Production Businesses: Challenges to Near Real-time Data Analysis and Interpretation.

    PubMed

    Vial, Flavie; Tedder, Andrew

    2017-01-01

    Food-animal production businesses are part of a data-driven ecosystem shaped by stringent requirements for traceability along the value chain and the expanding capabilities of connected products. Within this sector, the generation of animal health intelligence, in particular, in terms of antimicrobial usage, is hindered by the lack of a centralized framework for data storage and usage. In this Perspective, we delimit the 11 processes required for evidence-based decisions and explore processes 3 (digital data acquisition) to 10 (communication to decision-makers) in more depth. We argue that small agribusinesses disproportionally face challenges related to economies of scale given the high price of equipment and services. There are two main areas of concern regarding the collection and usage of digital farm data. First, recording platforms must be developed with the needs and constraints of small businesses in mind and move away from local data storage, which hinders data accessibility and interoperability. Second, such data are unstructured and exhibit properties that can prove challenging to its near real-time preprocessing and analysis in a sector that is largely lagging behind others in terms of computing infrastructure and buying into digital technologies. To complete the digital transformation of this sector, investment in rural digital infrastructure is required alongside the development of new business models to empower small businesses to commit to near real-time data capture. This approach will deliver critical information to fill gaps in our understanding of emerging diseases and antimicrobial resistance in production animals, eventually leading to effective evidence-based policies.

  20. Beyond the School's Boundaries: PoliCultura, a Large-Scale Digital Storytelling Initiative

    ERIC Educational Resources Information Center

    Di Blas, Nicoletta; Paolini, Paolo

    2013-01-01

    Technologies are changing the way we teach and learn in many respects. A relevant and not yet fully explored aspect is that they can support, even entice, students and teachers to go beyond the school boundaries, in spatial and temporal terms. Teachers and learners can keep in touch and work together, when they are not at school; they can access…

  1. A Visual Language for Situational Awareness

    DTIC Science & Technology

    2016-12-01

    The arrival of the information age has delivered the ability to transfer larger volumes of data at far greater rates. Wireless digital infrastructure can be deployed for use in large-scale events where domestic power and private wireless networks are overloaded or unavailable. A common symbology for responders is addressed using ANSI INCITS 415 symbol sets; when combined with the power of a wireless network, a situational awareness metalanguage becomes possible.

  2. Malingering in Toxic Exposure. Classification Accuracy of Reliable Digit Span and WAIS-III Digit Span Scaled Scores

    ERIC Educational Resources Information Center

    Greve, Kevin W.; Springer, Steven; Bianchini, Kevin J.; Black, F. William; Heinly, Matthew T.; Love, Jeffrey M.; Swift, Douglas A.; Ciota, Megan A.

    2007-01-01

    This study examined the sensitivity and false-positive error rate of reliable digit span (RDS) and the WAIS-III Digit Span (DS) scaled score in persons alleging toxic exposure and determined whether error rates differed from published rates in traumatic brain injury (TBI) and chronic pain (CP). Data were obtained from the files of 123 persons…

  3. Inflight characterization and correction of Planck/HFI analog to digital converter nonlinearity

    NASA Astrophysics Data System (ADS)

    Sauvé, A.; Couchot, F.; Patanchon, G.; Montier, L.

    2016-07-01

    The Planck satellite, launched in 2009, was designed to observe the anisotropies of the Cosmic Microwave Background (CMB) with unprecedented sensitivity. Because the Analog to Digital Converters of the HFI (High Frequency Instrument) readout electronics had not been properly characterized on the ground, they were found to add a systematic nonlinearity of up to 2% of the cosmological signal. This was a limiting factor for CMB science at large angular scales. We present the in-flight analysis and the method used to characterize and correct this effect down to the 0.05% level. We also discuss how to avoid this kind of complex issue in future missions.
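    A minimal sketch of the generic idea behind correcting a static ADC nonlinearity with a lookup built from a characterized transfer curve; the HFI in-flight method is far more involved, and the 12-bit resolution, ripple shape, and code values below are illustrative assumptions only:

```python
# Lookup-based correction for a static ADC nonlinearity: replace each
# output code with the input level the characterization assigned to it.
import numpy as np

codes = np.arange(2 ** 12)                          # hypothetical 12-bit ADC
ideal = codes / 2 ** 12                             # ideal transfer curve (V)
characterized = ideal + 3e-4 * np.sin(2 * np.pi * codes / 64)  # measured curve

def correct(raw_codes):
    """Map raw ADC codes to the characterized true input levels."""
    return characterized[raw_codes]

samples = np.array([100, 1000, 3000])
print(correct(samples) - ideal[samples])            # residual nonlinearity seen
```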

  4. The Development of the Digital Addiction Scale for the University Students: Reliability and Validity Study

    ERIC Educational Resources Information Center

    Kesici, Ahmet; Tunç, Nazenin Fidan

    2018-01-01

    This study was carried out to develop a scale for determining the level of digital addiction of the youth. In this study carried out with a group of 687 students from Siirt, Dicle and Erzincan Universities, a draft scale of 28 items based on the interviews with two students who spent a long time with digital tools and their friends, and on the…

  5. Accuracy of a Digital Weight Scale Relative to the Nintendo Wii in Measuring Limb Load Asymmetry

    PubMed Central

    Kumar, NS Senthil; Omar, Baharudin; Joseph, Leonard H; Hamdan, Nor; Htwe, Ohnmar; Hamidun, Nursalbiyah

    2014-01-01

    [Purpose] The aim of the present study was to investigate the accuracy of a digital weight scale relative to the Wii in limb loading measurement during static standing. [Methods] This was a cross-sectional study conducted at a public university teaching hospital. The sample consisted of 24 participants (12 with osteoarthritis and 12 healthy) recruited through convenience sampling. Limb loading measurements were obtained using a digital weight scale and the Nintendo Wii in static standing with three trials under an eyes-open condition. The limb load asymmetry was computed as the symmetry index. [Results] The accuracy of measurement with the digital weight scale relative to the Nintendo Wii was analyzed using the receiver operating characteristic (ROC) curve and Kolmogorov-Smirnov test (K-S test). The area under the ROC curve was found to be 0.67. Logistic regression confirmed the validity of the digital weight scale relative to the Nintendo Wii. The D statistic from the K-S test was found to be 0.16, which confirmed that there was no significant difference in measurement between the equipment. [Conclusion] The digital weight scale is an accurate tool for measuring limb load asymmetry. The low price, easy availability, and maneuverability make it a good potential tool in clinical settings for measuring limb load asymmetry. PMID:25202181
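    A minimal sketch of a limb-load symmetry index; the abstract does not give its formula, so the definition below (side difference over mean load, in percent) is a common formulation assumed purely for illustration:

```python
# Symmetry index between left and right limb loads, in percent.
def symmetry_index(load_left, load_right):
    return 100.0 * (load_left - load_right) / ((load_left + load_right) / 2.0)

print(f"SI = {symmetry_index(38.0, 42.0):.1f}%")   # e.g. left 38 kg, right 42 kg
```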

  6. Accuracy of a digital weight scale relative to the nintendo wii in measuring limb load asymmetry.

    PubMed

    Kumar, Ns Senthil; Omar, Baharudin; Joseph, Leonard H; Hamdan, Nor; Htwe, Ohnmar; Hamidun, Nursalbiyah

    2014-08-01

    [Purpose] The aim of the present study was to investigate the accuracy of a digital weight scale relative to the Wii in limb loading measurement during static standing. [Methods] This was a cross-sectional study conducted at a public university teaching hospital. The sample consisted of 24 participants (12 with osteoarthritis and 12 healthy) recruited through convenience sampling. Limb loading measurements were obtained using a digital weight scale and the Nintendo Wii in static standing with three trials under an eyes-open condition. The limb load asymmetry was computed as the symmetry index. [Results] The accuracy of measurement with the digital weight scale relative to the Nintendo Wii was analyzed using the receiver operating characteristic (ROC) curve and Kolmogorov-Smirnov test (K-S test). The area under the ROC curve was found to be 0.67. Logistic regression confirmed the validity of the digital weight scale relative to the Nintendo Wii. The D statistic from the K-S test was found to be 0.16, which confirmed that there was no significant difference in measurement between the equipment. [Conclusion] The digital weight scale is an accurate tool for measuring limb load asymmetry. The low price, easy availability, and maneuverability make it a good potential tool in clinical settings for measuring limb load asymmetry.

  7. Camera-Model Identification Using Markovian Transition Probability Matrix

    NASA Astrophysics Data System (ADS)

    Xu, Guanshuo; Gao, Shang; Shi, Yun Qing; Hu, Ruimin; Su, Wei

    Detecting the (brands and) models of digital cameras from given digital images has become a popular research topic in the field of digital forensics. As most images are JPEG compressed before they are output from cameras, we propose to use an effective image statistical model to characterize the difference JPEG 2-D arrays of Y and Cb components from the JPEG images taken by various camera models. Specifically, the transition probability matrices derived from four different directional Markov processes applied to the image difference JPEG 2-D arrays are used to identify statistical differences caused by image formation pipelines inside different camera models. All elements of the transition probability matrices, after a thresholding technique, are directly used as features for classification purposes. Multi-class support vector machines (SVMs) are used as the classification tool. The effectiveness of our proposed statistical model is demonstrated by large-scale experimental results.
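    A minimal sketch of this feature construction for one direction (horizontal), assuming a threshold of T = 3 and a random block standing in for a JPEG 2-D array; the thresholding and transition-probability estimation follow the description above, but the data are synthetic:

```python
# Markov transition probability matrix of a thresholded horizontal
# difference array; its (2T+1)^2 entries become SVM features.
import numpy as np

def transition_matrix(block, T=3):
    d = block[:, :-1].astype(int) - block[:, 1:].astype(int)  # horizontal differences
    d = np.clip(d, -T, T)                                     # thresholding step
    P = np.zeros((2 * T + 1, 2 * T + 1))
    for m, n in zip(d[:, :-1].ravel(), d[:, 1:].ravel()):     # adjacent pairs
        P[m + T, n + T] += 1
    row_sums = P.sum(axis=1, keepdims=True)
    return P / np.where(row_sums == 0, 1, row_sums)           # rows: P(next | current)

rng = np.random.default_rng(0)
block = rng.integers(0, 255, size=(16, 16))
features = transition_matrix(block).ravel()                   # fed to a multi-class SVM
print(features.shape)                                         # (49,) for T = 3
```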

  8. Internet-based transfer of cardiac ultrasound images

    NASA Technical Reports Server (NTRS)

    Firstenberg, M. S.; Greenberg, N. L.; Garcia, M. J.; Morehead, A. J.; Cardon, L. A.; Klein, A. L.; Thomas, J. D.

    2000-01-01

    A drawback to large-scale multicentre studies is the time required for the centralized evaluation of diagnostic images. We evaluated the feasibility of digital transfer of echocardiographic images to a central laboratory for rapid and accurate interpretation. Ten patients undergoing trans-oesophageal echocardiographic scanning at three sites had representative single images and multiframe loops stored digitally. The images were analysed in the ordinary way. All images were then transferred via the Internet to a central laboratory and reanalysed by a different observer. The file sizes were 1.5-72 MByte and the transfer rates achieved were 0.6-4.8 Mbit/min. Quantitative measurements were similar between most on-site and central laboratory measurements (all P > 0.25), although measurements differed for left atrial width and pulmonary venous systolic velocities (both P < 0.05). Digital transfer of echocardiographic images and data to a central laboratory may be useful for multicentre trials.

  9. Word Spotting for Indic Documents to Facilitate Retrieval

    NASA Astrophysics Data System (ADS)

    Bhardwaj, Anurag; Setlur, Srirangaraj; Govindaraju, Venu

    With advances in the field of digitization of printed documents and several mass digitization projects underway, information retrieval and document search have emerged as key research areas. However, most of the current work in these areas is limited to English and a few oriental languages. The lack of efficient solutions for Indic scripts has hampered information extraction from a large body of documents of cultural and historical importance. This chapter presents two relevant topics in this area. First, we describe the use of a script-specific keyword spotting for Devanagari documents that makes use of domain knowledge of the script. Second, we address the needs of a digital library to provide access to a collection of documents from multiple scripts. This requires intelligent solutions which scale across different scripts. We present a script-independent keyword spotting approach for this purpose. Experimental results illustrate the efficacy of our methods.

  10. Individual differences influence two-digit number processing, but not their analog magnitude processing: a large-scale online study.

    PubMed

    Huber, Stefan; Nuerk, Hans-Christoph; Reips, Ulf-Dietrich; Soltanlou, Mojtaba

    2017-12-23

    Symbolic magnitude comparison is one of the most well-studied cognitive processes in research on numerical cognition. However, while the cognitive mechanisms of symbolic magnitude processing have been intensively studied, previous studies have paid less attention to the individual differences influencing symbolic magnitude comparison. Employing a two-digit number comparison task in an online setting, we replicated previous effects, including the distance effect, the unit-decade compatibility effect, and the effect of cognitive control on the adaptation to filler items, in a large-scale study of 452 adults. Additionally, we observed that the most influential individual differences were participants' first language, time spent playing computer games, and gender, followed by reported alcohol consumption, age, and mathematical ability. Participants who used a first language with a left-to-right reading/writing direction were faster than those who read and wrote in the right-to-left direction. Reported playing time for computer games was correlated with faster reaction times. Female participants showed slower reaction times and a larger unit-decade compatibility effect than male participants. Participants who reported never consuming alcohol showed overall slower response times than others. Older participants were slower, but more accurate. Finally, higher grades in mathematics were associated with faster reaction times. We conclude that typical experiments on numerical cognition that employ a keyboard as an input device can also be run in an online setting. Moreover, while individual differences have no influence on domain-specific magnitude processing (apart from age, which increases the decade distance effect), they generally influence performance on a two-digit number comparison task.

  11. A low-cost, scalable, current-sensing digital headstage for high channel count μECoG.

    PubMed

    Trumpis, Michael; Insanally, Michele; Zou, Jialin; Elsharif, Ashraf; Ghomashchi, Ali; Sertac Artan, N; Froemke, Robert C; Viventi, Jonathan

    2017-04-01

    High channel count electrode arrays allow for the monitoring of large-scale neural activity at high spatial resolution. Implantable arrays featuring many recording sites require compact, high-bandwidth front-end electronics. In the present study, we investigated the use of a small, lightweight, and low-cost digital current-sensing integrated circuit for acquiring cortical surface signals from a 61-channel micro-electrocorticographic (μECoG) array. We recorded both acute and chronic μECoG signals from rat auditory cortex using our novel digital current-sensing headstage. For direct comparison, separate recordings were made in the same anesthetized preparations using an analog voltage headstage. A model of electrode impedance explained the transformation between current- and voltage-sensed signals, and was used to reconstruct cortical potential. We evaluated the digital headstage using several metrics of the baseline and response signals. The digital current headstage recorded neural signal with similar spatiotemporal statistics and auditory frequency tuning compared to the voltage signal. The signal-to-noise ratio of auditory evoked responses (AERs) was significantly stronger in the current signal. Stimulus decoding based on true and reconstructed voltage signals did not differ significantly. Recordings from an implanted system showed AERs that were detectable and decodable for 52 days. The reconstruction filter mitigated the thermal current noise of the electrode impedance and enhanced overall SNR. We developed and validated a novel approach to headstage acquisition that used current-input circuits to independently digitize 61 channels of μECoG measurements of the cortical field. These low-cost circuits, intended to measure photo-currents in digital imaging, not only provided a signal representing the local cortical field with virtually the same sensitivity and specificity as a traditional voltage headstage but also resulted in a small, light headstage that can easily be scaled to record from hundreds of channels.
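
    The paper's fitted impedance parameters are not given in the abstract; the general reconstruction idea, multiplying the current spectrum by an assumed electrode impedance to recover voltage, can be sketched as follows (the series R-C model and its values are placeholders, not the authors' model):

        import numpy as np

        def reconstruct_voltage(i_t, fs, R=1e4, C=10e-9):
            # V(f) = I(f) * Z(f), with Z(f) = R + 1/(j*2*pi*f*C);
            # DC is treated as purely resistive.
            n = len(i_t)
            f = np.fft.rfftfreq(n, d=1.0 / fs)
            Z = np.empty(f.shape, dtype=complex)
            Z[0] = R
            Z[1:] = R + 1.0 / (1j * 2 * np.pi * f[1:] * C)
            return np.fft.irfft(np.fft.rfft(i_t) * Z, n=n)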

  12. A low-cost, scalable, current-sensing digital headstage for high channel count μECoG

    NASA Astrophysics Data System (ADS)

    Trumpis, Michael; Insanally, Michele; Zou, Jialin; Elsharif, Ashraf; Ghomashchi, Ali; Sertac Artan, N.; Froemke, Robert C.; Viventi, Jonathan

    2017-04-01

    Objective. High channel count electrode arrays allow for the monitoring of large-scale neural activity at high spatial resolution. Implantable arrays featuring many recording sites require compact, high-bandwidth front-end electronics. In the present study, we investigated the use of a small, lightweight, and low-cost digital current-sensing integrated circuit for acquiring cortical surface signals from a 61-channel micro-electrocorticographic (μECoG) array. Approach. We recorded both acute and chronic μECoG signals from rat auditory cortex using our novel digital current-sensing headstage. For direct comparison, separate recordings were made in the same anesthetized preparations using an analog voltage headstage. A model of electrode impedance explained the transformation between current- and voltage-sensed signals, and was used to reconstruct cortical potential. We evaluated the digital headstage using several metrics of the baseline and response signals. Main results. The digital current headstage recorded neural signal with similar spatiotemporal statistics and auditory frequency tuning compared to the voltage signal. The signal-to-noise ratio of auditory evoked responses (AERs) was significantly stronger in the current signal. Stimulus decoding based on true and reconstructed voltage signals did not differ significantly. Recordings from an implanted system showed AERs that were detectable and decodable for 52 days. The reconstruction filter mitigated the thermal current noise of the electrode impedance and enhanced overall SNR. Significance. We developed and validated a novel approach to headstage acquisition that used current-input circuits to independently digitize 61 channels of μECoG measurements of the cortical field. These low-cost circuits, intended to measure photo-currents in digital imaging, not only provided a signal representing the local cortical field with virtually the same sensitivity and specificity as a traditional voltage headstage but also resulted in a small, light headstage that can easily be scaled to record from hundreds of channels.

  13. A low-cost, scalable, current-sensing digital headstage for high channel count μECoG

    PubMed Central

    Trumpis, Michael; Insanally, Michele; Zou, Jialin; Elsharif, Ashraf; Ghomashchi, Ali; Artan, N. Sertac; Froemke, Robert C.; Viventi, Jonathan

    2017-01-01

    Objective High channel count electrode arrays allow for the monitoring of large-scale neural activity at high spatial resolution. Implantable arrays featuring many recording sites require compact, high-bandwidth front-end electronics. In the present study, we investigated the use of a small, lightweight, and low-cost digital current-sensing integrated circuit for acquiring cortical surface signals from a 61-channel micro-electrocorticographic (μECoG) array. Approach We recorded both acute and chronic μECoG signals from rat auditory cortex using our novel digital current-sensing headstage. For direct comparison, separate recordings were made in the same anesthetized preparations using an analog voltage headstage. A model of electrode impedance explained the transformation between current- and voltage-sensed signals, and was used to reconstruct cortical potential. We evaluated the digital headstage using several metrics of the baseline and response signals. Main results The digital current headstage recorded neural signal with similar spatiotemporal statistics and auditory frequency tuning compared to the voltage signal. The signal-to-noise ratio of auditory evoked responses (AERs) was significantly stronger in the current signal. Stimulus decoding based on true and reconstructed voltage signals did not differ significantly. Recordings from an implanted system showed AERs that were detectable and decodable for 52 days. The reconstruction filter mitigated the thermal current noise of the electrode impedance and enhanced overall SNR. Significance We developed and validated a novel approach to headstage acquisition that used current-input circuits to independently digitize 61 channels of μECoG measurements of the cortical field. These low-cost circuits, intended to measure photo-currents in digital imaging, not only provided a signal representing the local cortical field with virtually the same sensitivity and specificity as a traditional voltage headstage but also resulted in a small, light headstage that can easily be scaled to record from hundreds of channels. PMID:28102827

  14. Bathymetric comparisons adjacent to the Louisiana barrier islands: Processes of large-scale change

    USGS Publications Warehouse

    List, J.H.; Jaffe, B.E.; Sallenger, A.H.; Hansen, M.E.

    1997-01-01

    This paper summarizes the results of a comparative bathymetric study encompassing 150 km of the Louisiana barrier-island coast. Bathymetric data surrounding the islands and extending to 12 m water depth were processed for three survey periods: the 1880s, the 1930s, and the 1980s. Digital comparisons between surveys show large-scale, coherent patterns of sea-floor erosion and accretion related to the rapid erosion and disintegration of the islands. Analysis of the sea-floor data reveals two primary processes driving this change: massive longshore transport in the littoral zone and at shoreface depths, and increased sediment storage in ebb-tidal deltas. Relative sea-level rise, although extraordinarily high in the study area, is shown to be an indirect factor in the area's rapid rates of shoreline retreat.
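
    A minimal sketch of the kind of digital comparison described, computing erosion and accretion volumes from two co-registered bathymetric grids (the sign convention and variable names are assumptions):

        import numpy as np

        def erosion_accretion(z_old, z_new, cell_area_m2):
            # Elevation grids in metres (positive up); dz > 0 means accretion.
            dz = np.asarray(z_new) - np.asarray(z_old)
            accretion_m3 = dz[dz > 0].sum() * cell_area_m2
            erosion_m3 = -dz[dz < 0].sum() * cell_area_m2
            return erosion_m3, accretion_m3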

  15. A 2D Fourier tool for the analysis of photo-elastic effect in large granular assemblies

    NASA Astrophysics Data System (ADS)

    Leśniewska, Danuta

    2017-06-01

    Fourier transforms are the basic tool for constructing different types of image filters, mainly those reducing optical noise. Some DIC and PIV software also uses frequency space to obtain displacement fields from a series of digital images of a deforming body. The paper presents a series of 2D Fourier transforms of photo-elastic transmission images representing a large pseudo-2D granular assembly deforming under varying boundary conditions. The images, related to different scales, were acquired at the same image resolution but taken at different distances from the sample. Fourier transforms of images representing different stages of deformation reveal characteristic features at three scales ('macro-', 'meso-' and 'micro-'), which can serve as data for studying the internal order-disorder transition within granular materials.
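
    A minimal sketch of the transform in question, computing the centred log power spectrum of a grayscale image with NumPy:

        import numpy as np

        def log_power_spectrum(img):
            # Centred 2-D Fourier power spectrum of a grayscale image.
            F = np.fft.fftshift(np.fft.fft2(img))
            return np.log1p(np.abs(F) ** 2)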

  16. Alignment between Satellite and Central Galaxies in the SDSS DR7: Dependence on Large-scale Environment

    NASA Astrophysics Data System (ADS)

    Wang, Peng; Luo, Yu; Kang, Xi; Libeskind, Noam I.; Wang, Lei; Zhang, Youcai; Tempel, Elmo; Guo, Quan

    2018-06-01

    The alignment between satellites and central galaxies has been studied in detail in both observational and theoretical work. The widely accepted fact is that satellites preferentially reside along the major axis of their central galaxy. However, the origin and large-scale environmental dependence of this alignment are still unknown. In an attempt to determine these, we use data constructed from the Sloan Digital Sky Survey DR7 to investigate the large-scale environmental dependence of this alignment, with emphasis on its dependence on the color of the central galaxy. We find a very strong large-scale environmental dependence of the satellite-central alignment (SCA) in groups with blue centrals. Satellites of blue centrals in knots are preferentially located perpendicular to the major axes of the centrals, and the alignment angle decreases with environment, namely, when going from knots to voids. The alignment angle strongly depends on the ^{0.1}(g-r) color of the centrals. We suggest that the SCA is the result of a competition between satellite accretion within large-scale structure (LSS) and galaxy evolution inside host halos. For groups containing red central galaxies, the SCA is mainly determined by the evolution effect, while for blue-central-dominated groups, the effect of the LSS plays a more important role, especially in knots. Our results provide an explanation for how the SCA forms within different large-scale environments. The perpendicular case in groups and knots with blue centrals may also provide insight into understanding similar polar arrangements, such as the formation of the Milky Way and Centaurus A's satellite systems.

  17. Robustness of spiking Deep Belief Networks to noise and reduced bit precision of neuro-inspired hardware platforms.

    PubMed

    Stromatias, Evangelos; Neil, Daniel; Pfeiffer, Michael; Galluppi, Francesco; Furber, Steve B; Liu, Shih-Chii

    2015-01-01

    Increasingly large deep learning architectures, such as Deep Belief Networks (DBNs), are the focus of current machine learning research and achieve state-of-the-art results in different domains. However, both training and execution of large-scale deep networks require vast computing resources, leading to high power requirements and communication overheads. Ongoing work on the design and construction of spike-based hardware platforms offers an alternative for running deep neural networks with significantly lower power consumption, but has to overcome hardware limitations in terms of noise and limited weight precision, as well as noise inherent in the sensor signal. This article investigates how such hardware constraints impact the performance of spiking neural network implementations of DBNs. In particular, the influence of limited bit precision during execution and training, and the impact of silicon mismatch in the synaptic weight parameters of custom hybrid VLSI implementations, are studied. Furthermore, the network performance of spiking DBNs is characterized with regard to noise in the spiking input signal. Our results demonstrate that spiking DBNs can tolerate very low levels of hardware bit precision, down to almost two bits, and show that their performance can be improved by at least 30% through an adapted training mechanism that takes the bit precision of the target platform into account. Spiking DBNs thus present an important use case for large-scale hybrid analog-digital or digital neuromorphic platforms, such as SpiNNaker, which can execute large but precision-constrained deep networks in real time.
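
    The adapted training mechanism itself is not spelled out in the abstract; the bit-precision constraint being studied can be sketched as a uniform weight quantizer (illustrative only, not the authors' scheme):

        import numpy as np

        def quantize_weights(w, n_bits):
            # Uniform symmetric quantization of a weight array to n_bits;
            # n_bits=2 leaves three levels: {-s, 0, +s}.
            w = np.asarray(w, float)
            levels = 2 ** (n_bits - 1) - 1
            scale = np.abs(w).max() / levels
            return np.round(w / scale) * scale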

  18. Robustness of spiking Deep Belief Networks to noise and reduced bit precision of neuro-inspired hardware platforms

    PubMed Central

    Stromatias, Evangelos; Neil, Daniel; Pfeiffer, Michael; Galluppi, Francesco; Furber, Steve B.; Liu, Shih-Chii

    2015-01-01

    Increasingly large deep learning architectures, such as Deep Belief Networks (DBNs), are the focus of current machine learning research and achieve state-of-the-art results in different domains. However, both training and execution of large-scale deep networks require vast computing resources, leading to high power requirements and communication overheads. Ongoing work on the design and construction of spike-based hardware platforms offers an alternative for running deep neural networks with significantly lower power consumption, but has to overcome hardware limitations in terms of noise and limited weight precision, as well as noise inherent in the sensor signal. This article investigates how such hardware constraints impact the performance of spiking neural network implementations of DBNs. In particular, the influence of limited bit precision during execution and training, and the impact of silicon mismatch in the synaptic weight parameters of custom hybrid VLSI implementations, are studied. Furthermore, the network performance of spiking DBNs is characterized with regard to noise in the spiking input signal. Our results demonstrate that spiking DBNs can tolerate very low levels of hardware bit precision, down to almost two bits, and show that their performance can be improved by at least 30% through an adapted training mechanism that takes the bit precision of the target platform into account. Spiking DBNs thus present an important use case for large-scale hybrid analog-digital or digital neuromorphic platforms, such as SpiNNaker, which can execute large but precision-constrained deep networks in real time. PMID:26217169

  19. Data management strategies for multinational large-scale systems biology projects.

    PubMed

    Wruck, Wasco; Peuker, Martin; Regenbrecht, Christian R A

    2014-01-01

    Good accessibility of publicly funded research data is essential to secure an open scientific system and is eventually becoming mandatory [Wellcome Trust will Penalise Scientists Who Don't Embrace Open Access. The Guardian 2012]. With the use of high-throughput methods in many research areas, from physics to systems biology, large data collections are increasingly important as raw material for research. Here, we present strategies worked out by international and national institutions targeting open access to publicly funded research data via incentives or obligations to share data. Funding organizations such as the British Wellcome Trust have therefore developed data sharing policies and request commitment to data management and sharing in grant applications. Increased citation rates are a profound argument for sharing publication data. Pre-publication sharing might be rewarded by a data citation credit system via digital object identifiers (DOIs), which have initially been in use for data objects. Besides policies and incentives, good practice in data management is indispensable. However, appropriate systems for data management of large-scale projects, for example in systems biology, are hard to find. Here, we give an overview of a selection of open-source data management systems that have been employed successfully in large-scale projects.

  20. Data management strategies for multinational large-scale systems biology projects

    PubMed Central

    Peuker, Martin; Regenbrecht, Christian R.A.

    2014-01-01

    Good accessibility of publicly funded research data is essential to secure an open scientific system and is eventually becoming mandatory [Wellcome Trust will Penalise Scientists Who Don’t Embrace Open Access. The Guardian 2012]. With the use of high-throughput methods in many research areas, from physics to systems biology, large data collections are increasingly important as raw material for research. Here, we present strategies worked out by international and national institutions targeting open access to publicly funded research data via incentives or obligations to share data. Funding organizations such as the British Wellcome Trust have therefore developed data sharing policies and request commitment to data management and sharing in grant applications. Increased citation rates are a profound argument for sharing publication data. Pre-publication sharing might be rewarded by a data citation credit system via digital object identifiers (DOIs), which have initially been in use for data objects. Besides policies and incentives, good practice in data management is indispensable. However, appropriate systems for data management of large-scale projects, for example in systems biology, are hard to find. Here, we give an overview of a selection of open-source data management systems that have been employed successfully in large-scale projects. PMID:23047157

  1. Digital literacy of youth and young adults with intellectual disability predicted by support needs and social maturity.

    PubMed

    Seok, Soonhwa; DaCosta, Boaventura

    2017-01-01

    This study investigated relationships between digital propensity and support needs, as well as predictors of digital propensity in the context of support intensity, age, gender, and social maturity. A total of 118 special education teachers rated the support intensity, digital propensity, and social maturity of 352 students with intellectual disability. Scores from the Digital Propensity Index, the Supports Intensity Scale, and the Social Maturity Scale were analyzed using descriptive statistics, correlations, and multiple regression. The findings revealed significant relationships between digital propensity and support needs. In addition, significant predictors of digital propensity were found with regard to support intensity, age, gender, and social maturity.

  2. Full-field inspection of a wind turbine blade using three-dimensional digital image correlation

    NASA Astrophysics Data System (ADS)

    LeBlanc, Bruce; Niezrecki, Christopher; Avitabile, Peter; Chen, Julie; Sherwood, James; Hughes, Scott

    2011-04-01

    Increasing demand for and deployment of wind power have led to a significant increase in the number of wind-turbine blades manufactured globally. As the physical size and number of deployed turbines grow, the probability of manufacturing defects being present in composite turbine blade fleets also increases. As both capital blade costs and operation-and-maintenance costs increase for larger turbine systems, the need for large-scale inspection and monitoring of the structural health of turbine blades during manufacturing and operation becomes critical. One method for locating and quantifying manufacturing defects, while also allowing in-situ measurement of the structural health of blades, is observation of the full-field state of deformation and strain of the blade. Static tests were performed on a nine-meter CX-100 composite turbine blade to extract full-field displacement and strain measurements using three-dimensional digital image correlation (3D DIC). Measurements were taken at several angles near the blade root, including along the high-pressure surface, the low-pressure surface, and the trailing edge of the blade. The overall results indicate that the measurement approach can clearly identify failure locations and discontinuities in the blade curvature under load. Post-processing of the data using a stitching technique enables the shape and curvature of an entire large-scale wind turbine blade to be observed for the first time. The experiment demonstrates the feasibility of the approach and reveals that the technique can readily be scaled up to accommodate utility-scale blades. As long as a trackable pattern is applied to the surface of the blade, measurements can be made in-situ when a blade is on a manufacturing floor, installed in a test fixture, or installed on a rotating turbine. The results demonstrate the great potential of the optical measurement technique and its capability for use in the wind industry for large-area inspection.
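
    Full 3D DIC requires a calibrated stereo camera pair and subpixel optimization; the underlying correlation step can be sketched as an integer-pixel subset match by zero-normalized cross-correlation (window sizes and names are illustrative):

        import numpy as np

        def zncc(a, b):
            # Zero-normalized cross-correlation of two equal-size subsets.
            a = a - a.mean()
            b = b - b.mean()
            return (a * b).sum() / np.sqrt((a ** 2).sum() * (b ** 2).sum())

        def match_subset(ref, deformed, y, x, size=21, search=10):
            # Integer-pixel displacement of the subset centred at (y, x);
            # the caller must keep the search window inside both images.
            h = size // 2
            tpl = ref[y - h:y + h + 1, x - h:x + h + 1]
            best, best_uv = -2.0, (0, 0)
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    win = deformed[y + dy - h:y + dy + h + 1,
                                   x + dx - h:x + dx + h + 1]
                    score = zncc(tpl, win)
                    if score > best:
                        best, best_uv = score, (dy, dx)
            return best_uv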

  3. Very Large Scale Integrated Circuits for Military Systems.

    DTIC Science & Technology

    1981-01-01

    ABBREVIATIONS: A/D, analog-to-digital; AGC, automatic gain control; A/J, anti-jam; ASP, advanced signal processor; AU, arithmetic units; CAD, computer-aided... electronic support measures (ESM) equipments (Ref. 23); in lieu of an adequate automatic processing capability, the function is now performed manually (Ref. 24), which involves a human operator, displays, etc., and a sacrifice in performance (acquisition speed, saturation signal density). Various automatic processing...

  4. Simulation Tools for Digital LSI (Large Scale Integration) Design.

    DTIC Science & Technology

    1983-09-01

    ...if the flag is set during an execution of the code, another iteration is performed; otherwise, the subroutine is finished. The following is an extended...

  5. Full-color large-scaled computer-generated holograms for physical and non-physical objects

    NASA Astrophysics Data System (ADS)

    Matsushima, Kyoji; Tsuchiyama, Yasuhiro; Sonobe, Noriaki; Masuji, Shoya; Yamaguchi, Masahiro; Sakamoto, Yuji

    2017-05-01

    Several full-color high-definition CGHs are created for reconstructing 3D scenes that include real-existing physical objects. The fields of the physical objects are generated or captured using three techniques: a 3D scanner, synthetic-aperture digital holography, and multi-viewpoint images. Full-color reconstruction of the high-definition CGHs is realized by RGB color filters. The optical reconstructions are presented to verify these techniques.

  6. Acquisition of a High Performance Computing Instrument for Big Data Research and Education

    DTIC Science & Technology

    2015-12-03

    Security and Privacy, University of Texas at Dallas, TX, September 16-17, 2014. • Chopade, P., Zhan, J., Community Detection in Large Scale Big Data... Security and Privacy in Communication Networks, Beijing, China, September 24-26, 2014. • Pravin Chopade, Kenneth Flurchick, Justin Zhan and Marwan... Balkirat Kaur, Malcolm Blow, and Justin Zhan, Digital Image Authentication in Social Media, The Sixth ASE International Conference on Privacy...

  7. Proceedings of the Annual Meeting of the Association for Education in Journalism and Mass Communication (74th, Boston, Massachusetts, August 7-10, 1991). Part VI: Technology and the Mass Media.

    ERIC Educational Resources Information Center

    Association for Education in Journalism and Mass Communication.

    The Technology and the Media section of the proceedings contains the following 18 papers: "What's Wrong with This Picture?: Attitudes of Photographic Editors at Daily Newspapers and Their Tolerance toward Digital Manipulation" (Shiela Reaves); "Strategies for the Analysis of Large-Scale Databases in Computer-Assisted Investigative…

  8. Accuracy Validation of Large-scale Block Adjustment without Control of ZY3 Images over China

    NASA Astrophysics Data System (ADS)

    Yang, Bo

    2016-06-01

    Mapping from optical satellite images without ground control is one of the goals of photogrammetry. Using 8802 three-linear-array stereo scenes (26406 images in total) of ZY3 over China, we propose a large-scale block adjustment method for optical satellite images without ground control, based on the RPC model, in which each single image is treated as an adjustment unit. To overcome the block distortion caused by an unstable adjustment without ground control and the excessive accumulation of errors, we use virtual control points created from the initial RPC models of the images as weighted observations and add them to the adjustment model to refine the adjustment. We use 8000 uniformly distributed high-precision check points to evaluate the geometric accuracy of the DOM (Digital Ortho Model) and DSM (Digital Surface Model) products, for which the standard deviations in plane and elevation are 3.6 m and 4.2 m, respectively. The geometric accuracy is consistent across the whole block and the mosaic accuracy of neighboring DOMs is within one pixel, so seamless mosaicking is possible. This method achieves mapping accuracy better than 5 m for the whole of China from ZY3 satellite images without ground control.
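
    The abstract does not give the functional model, but the role of the virtual control points can be sketched as pseudo-observations that anchor a least-squares solution to the initial RPC parameters; a generic regularized form (not the authors' implementation):

        import numpy as np

        def adjust_with_virtual_control(A, l, x0, w_vc=1.0):
            # Solve A x ~ l (tie-point equations) augmented with the
            # pseudo-observations x ~ x0 of weight w_vc, which keep the
            # free-network solution from drifting away from the initial
            # RPC geometry.
            n = A.shape[1]
            N = A.T @ A + w_vc * np.eye(n)
            u = A.T @ l + w_vc * x0
            return np.linalg.solve(N, u)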

  9. APDA's Contribution to Current Research and Citizen Science

    NASA Astrophysics Data System (ADS)

    Barker, Thurburn; Castelaz, M. W.; Cline, J. D.; Hudec, R.

    2010-01-01

    The Astronomical Photographic Data Archive (APDA) is dedicated to the collection, restoration, preservation, and digitization of astronomical photographic data that eventually can be accessed via the Internet by the global community of scientists, researchers, and students. Located on the Pisgah Astronomical Research Institute campus, APDA now includes collections from North America totaling more than 100,000 photographic plates and films. Two new large-scale research projects and one citizen science project have now been developed from the archived data. One unique photographic data collection covering the southern hemisphere contains the signatures of diffuse interstellar bands (DIBs) within the stellar spectra on objective prism plates. We plan to digitize the spectra, identify the DIBs, and map out the large-scale spatial extent of DIBs. The goal is to understand the Galactic environments suitable for the DIB molecules. Another collection contains spectra with nearly the same dispersion as the GAIA satellite's low-dispersion slitless spectrophotometers, BP and RP. The plates will be used to develop standards for GAIA spectra. To bring the data from APDA to the general public, we have developed a citizen science project called Stellar Classification Online - Public Exploration (SCOPE). SCOPE allows citizen scientists to classify up to half a million stars on objective prism plates. We will present the status of each of these projects.

  10. The Pore-scale modeling of multiphase flows in reservoir rocks using the lattice Boltzmann method

    NASA Astrophysics Data System (ADS)

    Mu, Y.; Baldwin, C. H.; Toelke, J.; Grader, A.

    2011-12-01

    Digital rock physics (DRP) is a new technology for computing the physical and fluid-flow properties of reservoir rocks. In this approach, pore-scale images of the porous rock are obtained and processed to create a highly accurate 3D digital rock sample, and the rock properties are then evaluated by advanced numerical methods at the pore scale. Ingrain's DRP technology is a breakthrough for oil and gas companies that need large volumes of accurate results faster than current special core analysis (SCAL) laboratories can normally deliver. In this work, we compute the multiphase fluid-flow properties of 3D digital rocks using a D3Q19 immiscible lattice Boltzmann method (LBM) with two relaxation times (TRT). For efficient implementation on GPUs, we improved and reformulated the color-gradient model proposed by Gunstensen and Rothman. Furthermore, we use a single lattice with a sparse data structure: memory is allocated only for pore nodes on the GPU. We achieved more than 100 million fluid lattice updates per second (MFLUPS) for two-phase LBM on a single Fermi GPU and high parallel efficiency on multiple GPUs. We present and discuss our simulation results for important two-phase fluid-flow properties, such as capillary pressure and relative permeabilities. We also investigate the effects of resolution and wettability on multiphase flows. Comparison of direct measurement results with the LBM-based simulations shows the practical ability of DRP to predict two-phase flow properties of reservoir rock.
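
    The production code is D3Q19 with two relaxation times on GPUs; as a much-reduced illustration of the stream-collide structure common to LBM solvers, here is a single-relaxation D2Q9 step in NumPy (periodic boundaries, no two-phase physics):

        import numpy as np

        # D2Q9 lattice velocities and weights.
        c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
                      [1, 1], [-1, 1], [-1, -1], [1, -1]])
        w = np.array([4/9] + [1/9] * 4 + [1/36] * 4)

        def equilibrium(rho, u):
            cu = np.einsum('qd,xyd->xyq', c, u)
            usq = (u ** 2).sum(-1)[..., None]
            return rho[..., None] * w * (1 + 3*cu + 4.5*cu**2 - 1.5*usq)

        def step(f, tau=0.8):
            rho = f.sum(-1)
            u = np.einsum('xyq,qd->xyd', f, c) / rho[..., None]
            f += (equilibrium(rho, u) - f) / tau         # BGK collision
            for q in range(9):                           # periodic streaming
                f[..., q] = np.roll(f[..., q], tuple(c[q]), axis=(0, 1))
            return f

        # Start from rest: f = equilibrium(np.ones((64, 64)), np.zeros((64, 64, 2)))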

  11. Development of a Digital-Based Instrument to Assess Perceived Motor Competence in Children: Face Validity, Test-Retest Reliability, and Internal Consistency

    PubMed Central

    Palmer, Kara K.

    2017-01-01

    Assessing children's perceptions of their movement abilities (i.e., perceived competence) is traditionally done using picture scales such as the Pictorial Scale of Perceived Competence and Acceptance for Young Children or the Pictorial Scale of Perceived Movement Skill Competence. Pictures, however, fail to capture the temporal components of movement. To address this limitation, we created a digital-based instrument to assess perceived motor competence: the Digital Scale of Perceived Motor Competence. The purpose of this study was to determine the validity, reliability, and internal consistency of the Digital-based Scale of Perceived Motor Skill Competence. The instrument is based on the twelve fundamental motor skills from the Test of Gross Motor Development-2nd Edition, with a similar layout and item structure to the Pictorial Scale of Perceived Movement Skill Competence. Face validity of the instrument was examined in Phase I (n = 56; mean age = 8.6 ± 0.7 years, 26 girls). Test-retest reliability and internal consistency were assessed in Phase II (n = 54; mean age = 8.7 ± 0.5 years, 26 girls). Intra-class correlations (ICC) and Cronbach's alpha were computed to determine test-retest reliability and internal consistency for all twelve skills along with the locomotor and object control subscales. The Digital Scale of Perceived Motor Competence demonstrates excellent test-retest reliability (ICC = 0.83, total; ICC = 0.77, locomotor; ICC = 0.79, object control) and acceptable/good internal consistency (α = 0.62, total; α = 0.57, locomotor; α = 0.49, object control). Findings provide evidence of the reliability of the digital-based instrument of perceived motor competence for older children. PMID:29910408
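
    A minimal sketch of one of the reported statistics, Cronbach's alpha for an (n subjects x k items) score matrix:

        import numpy as np

        def cronbach_alpha(X):
            # X: scores, one row per subject, one column per item/skill.
            X = np.asarray(X, float)
            k = X.shape[1]
            item_var_sum = X.var(axis=0, ddof=1).sum()
            total_var = X.sum(axis=1).var(ddof=1)
            return k / (k - 1) * (1 - item_var_sum / total_var)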

  12. Drainage networks after wildfire

    USGS Publications Warehouse

    Kinner, D.A.; Moody, J.A.

    2005-01-01

    Predicting runoff and erosion from watersheds burned by wildfires requires an understanding of the three-dimensional structure of both hillslope and channel drainage networks. We investigate the small- and large-scale structures of drainage networks using field studies and computer analysis of a 30-m digital elevation model (DEM). Topologic variables were derived from a composite 30-m DEM, which included 14 order-6 watersheds within the Pikes Peak batholith. Both topologic and hydraulic variables were measured in the field in two smaller burned watersheds (3.7 and 7.0 hectares) located within one of the order-6 watersheds burned by the 1996 Buffalo Creek Fire in central Colorado. Horton ratios of topologic variables (stream number, drainage area, stream length, and stream slope) for small-scale and large-scale watersheds are shown to scale geometrically with stream order (i.e., to be scale invariant). However, the ratios derived for the large-scale drainage networks could not be used to predict the rill and gully drainage network structure. Hydraulic variables (width, depth, cross-sectional area, and bed roughness) for small-scale drainage networks were found to be scale invariant across 3 to 4 stream orders. The relation between hydraulic radius and cross-sectional area is similar for rills and gullies, suggesting that their geometry can be treated similarly in hydraulic modeling. Additionally, the rills and gullies have relatively small width-to-depth ratios, implying that sidewall friction may be important to the erosion and evolutionary process relative to main-stem channels.
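
    Geometric scaling with stream order means each Horton ratio appears as a constant slope on a semi-log plot; a minimal sketch estimating a ratio by log-linear fit (the numbers are invented for illustration):

        import numpy as np

        def horton_ratio(orders, values):
            # Fit log(value) vs. order; the Horton ratio is exp(|slope|).
            slope, _ = np.polyfit(orders, np.log(values), 1)
            return np.exp(abs(slope))

        # e.g. stream counts 120, 28, 7, 2 for orders 1-4 give a
        # bifurcation ratio of about 3.9.
        print(horton_ratio([1, 2, 3, 4], [120, 28, 7, 2]))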

  13. Three-dimensional hydrogeologic framework model of the Rio Grande transboundary region of New Mexico and Texas, USA, and northern Chihuahua, Mexico

    USGS Publications Warehouse

    Sweetkind, Donald S.

    2017-09-08

    As part of a U.S. Geological Survey study in cooperation with the Bureau of Reclamation, a digital three-dimensional hydrogeologic framework model was constructed for the Rio Grande transboundary region of New Mexico and Texas, USA, and northern Chihuahua, Mexico. This model was constructed to define the aquifer-system geometry and the subsurface lithologic characteristics and their distribution for use in a regional numerical hydrologic model. The model includes five hydrostratigraphic units: river channel alluvium, three informal subdivisions of Santa Fe Group basin fill, and an undivided pre-Santa Fe Group bedrock unit. Model input data were compiled from published cross sections, well data, structure contour maps, selected geophysical data, and contiguous compilations of surficial geology and structural features in the study area. These data were used to construct faulted surfaces that represent the upper and lower subsurface hydrostratigraphic unit boundaries. The digital three-dimensional hydrogeologic framework model is constructed by combining faults, the elevations of the tops of each hydrostratigraphic unit, and boundary lines depicting the subsurface extent of each hydrostratigraphic unit. The framework also compiles a digital representation of the distribution of sedimentary facies within each hydrostratigraphic unit. The digital three-dimensional hydrogeologic model reproduces with reasonable accuracy the previously published subsurface hydrogeologic conceptualization of the aquifer system and represents the large-scale geometry of the subsurface aquifers. The model is at a scale and resolution appropriate for use as the foundation for a numerical hydrologic model of the study area.

  14. The effects of sex, sexual orientation, and digit ratio (2D:4D) on mental rotation performance.

    PubMed

    Peters, Michael; Manning, John T; Reimers, Stian

    2007-04-01

    In spite of the reduced level of experimental control, this large-scale study brought some clarity to the relation between mental rotation task (MRT) performance and a number of variables for which contradictory associations had previously been reported in the literature. Clear sex differences in MRT were observed for a sample of 134,317 men and 120,783 women, with men outperforming women. There were also MRT differences as a function of sexual orientation: heterosexual men performed better than homosexual men, and homosexual women performed better than heterosexual women. Although bisexual men performed better than homosexual men but less well than heterosexual men, no significant differences were observed between bisexual and homosexual women. MRT performance in both men and women peaked in the 20-30 year range and declined significantly and markedly thereafter. Both men and women showed a significant negative correlation between left- and right-hand digit ratios and MRT scores, such that individuals with smaller digit ratios (relatively longer ring finger than index finger) performed better than individuals with larger digit ratios.

  15. Interformat reliability of digital psychiatric self-report questionnaires: a systematic review.

    PubMed

    Alfonsson, Sven; Maathz, Pernilla; Hursti, Timo

    2014-12-03

    Research on Internet-based interventions typically uses digital versions of pen-and-paper self-report symptom scales. However, adaptation into the digital format could affect the psychometric properties of established self-report scales. Several studies have investigated differences between digital and pen-and-paper versions of instruments, but no systematic review of the results had yet been done. This review aims to assess the interformat reliability of self-report symptom scales used in digital or online psychotherapy research. Three databases (MEDLINE, Embase, and PsycINFO) were systematically reviewed for studies investigating the reliability between digital and pen-and-paper versions of psychiatric symptom scales. From a total of 1504 publications, 33 were included in the review, and the interformat reliability of 40 different symptom scales was assessed. Significant differences in mean total scores between formats were found in 10 of 62 analyses. These differences were found in just a few studies, which indicates that the results were due to study and sample effects rather than unreliable instruments. The interformat reliability ranged from r=.35 to r=.99; however, the majority of instruments showed a strong correlation between format scores. The quality of the included studies varied, and several had insufficient power to detect small differences between formats. When digital versions of self-report symptom scales are compared to pen-and-paper versions, most scales show high interformat reliability. This supports the reliability of results obtained in psychotherapy research on the Internet and their comparability to traditional psychotherapy research. There are, however, some instruments that consistently show low interformat reliability, suggesting that these conclusions cannot be generalized to all questionnaires. Most studies had at least some methodological issues, insufficient statistical power being the most common. Future studies should preferably provide more detailed information about the transformation of the instrument into the digital format and the procedure for data collection.

  16. Watershed Boundary Dataset for Mississippi

    USGS Publications Warehouse

    Wilson, K. Van; Clair, Michael G.; Turnipseed, D. Phil; Rebich, Richard A.

    2009-01-01

    The U.S. Geological Survey, in cooperation with the Mississippi Department of Environmental Quality, U.S. Department of Agriculture-Natural Resources Conservation Service, Mississippi Department of Transportation, U.S. Department of Agriculture-Forest Service, and the Mississippi Automated Resource Information System, developed a 1:24,000-scale Watershed Boundary Dataset for Mississippi, including watershed and subwatershed boundaries, codes, names, and areas. The Watershed Boundary Dataset for Mississippi provides a standard geographical framework for water-resources and selected land-resources planning. The original 8-digit subbasins (hydrologic unit codes) were further subdivided into 10-digit watersheds (62.5 to 391 square miles (mi²)) and 12-digit subwatersheds (15.6 to 62.5 mi²); the exceptions are the Delta part of Mississippi and the Mississippi River inside levees, which were subdivided into 10-digit watersheds only. Also, large water bodies in the Mississippi Sound along the coast were not delineated as small as a typical 12-digit subwatershed. All of the data, including watershed and subwatershed boundaries, subdivision codes and names, and drainage-area data, are stored in a Geographic Information System database available at http://ms.water.usgs.gov/. This map shows information on drainage and hydrography in the form of U.S. Geological Survey hydrologic unit boundaries for water-resource 2-digit regions, 4-digit subregions, 6-digit basins (formerly called accounting units), 8-digit subbasins (formerly called cataloging units), 10-digit watersheds, and 12-digit subwatersheds in Mississippi. A description of the project study area, the methods used in the development of watershed and subwatershed boundaries for Mississippi, and the results are presented in Wilson and others (2008). The data presented in this map and by Wilson and others (2008) supersede the data presented for Mississippi by Seaber and others (1987) and U.S. Geological Survey (1977).
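
    Because hydrologic unit codes nest positionally, every coarser level is a prefix of the 12-digit code; a small sketch (the example code value is hypothetical):

        def huc_levels(huc12):
            # Split a 12-digit hydrologic unit code into its nested levels:
            # region (2), subregion (4), basin (6), subbasin (8),
            # watershed (10), and subwatershed (12 digits).
            assert len(huc12) == 12 and huc12.isdigit()
            return {n: huc12[:n] for n in (2, 4, 6, 8, 10, 12)}

        print(huc_levels("031700090102"))   # {2: '03', 4: '0317', ...}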

  17. Digital Image Correlation Techniques Applied to Large Scale Rocket Engine Testing

    NASA Technical Reports Server (NTRS)

    Gradl, Paul R.

    2016-01-01

    Rocket engine hot-fire ground testing is necessary to understand component performance, reliability, and engine system interactions during development. The J-2X upper stage engine completed a series of developmental hot-fire tests that derived performance of the engine and components, validated analytical models, and provided the data needed to identify where design changes, process improvements, and technology development were required. The J-2X development engines were heavily instrumented to provide the data necessary to support these activities, which enabled the team to investigate any anomalies experienced during the test program. This paper describes the development of an optical digital image correlation technique to augment the data provided by traditional strain gauges, which are prone to debonding at elevated temperatures and limited to localized measurements. The feasibility of this optical measurement system was demonstrated during full-scale hot-fire testing of J-2X, during which a digital image correlation system, incorporating a pair of high-speed cameras to measure three-dimensional, real-time displacements and strains, was installed and operated under the extreme environments present on the test stand. The camera and facility setup, pre-test calibrations, data collection, hot-fire test data collection, and post-test analysis and results are presented in this paper.

  18. Tapping the Vast Potential of the Data Deluge in Small-scale Food-Animal Production Businesses: Challenges to Near Real-time Data Analysis and Interpretation

    PubMed Central

    Vial, Flavie; Tedder, Andrew

    2017-01-01

    Food-animal production businesses are part of a data-driven ecosystem shaped by stringent requirements for traceability along the value chain and the expanding capabilities of connected products. Within this sector, the generation of animal health intelligence, in particular, in terms of antimicrobial usage, is hindered by the lack of a centralized framework for data storage and usage. In this Perspective, we delimit the 11 processes required for evidence-based decisions and explore processes 3 (digital data acquisition) to 10 (communication to decision-makers) in more depth. We argue that small agribusinesses disproportionally face challenges related to economies of scale given the high price of equipment and services. There are two main areas of concern regarding the collection and usage of digital farm data. First, recording platforms must be developed with the needs and constraints of small businesses in mind and move away from local data storage, which hinders data accessibility and interoperability. Second, such data are unstructured and exhibit properties that can prove challenging to its near real-time preprocessing and analysis in a sector that is largely lagging behind others in terms of computing infrastructure and buying into digital technologies. To complete the digital transformation of this sector, investment in rural digital infrastructure is required alongside the development of new business models to empower small businesses to commit to near real-time data capture. This approach will deliver critical information to fill gaps in our understanding of emerging diseases and antimicrobial resistance in production animals, eventually leading to effective evidence-based policies. PMID:28932740

  19. Topology Analysis of the Sloan Digital Sky Survey. I. Scale and Luminosity Dependence

    NASA Astrophysics Data System (ADS)

    Park, Changbom; Choi, Yun-Young; Vogeley, Michael S.; Gott, J. Richard, III; Kim, Juhan; Hikage, Chiaki; Matsubara, Takahiko; Park, Myeong-Gu; Suto, Yasushi; Weinberg, David H.; SDSS Collaboration

    2005-11-01

    We measure the topology of volume-limited galaxy samples selected from a parent sample of 314,050 galaxies in the Sloan Digital Sky Survey (SDSS), which is now complete enough to describe the fully three-dimensional topology and its dependence on galaxy properties. We compare the observed genus statistic G(ν_f) to predictions for a Gaussian random field and to the genus measured for mock surveys constructed from new large-volume simulations of the ΛCDM cosmology. In this analysis we carefully examine the dependence of the observed genus statistic on the Gaussian smoothing scale R_G from 3.5 to 11 h⁻¹ Mpc and on the luminosity of galaxies over the range -22.50
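
    For reference, the Gaussian random field prediction against which the observed genus curve is compared has the closed form g(ν) = A (1 - ν²) exp(-ν²/2), where the amplitude A depends on the power spectrum and the smoothing scale R_G; a one-function sketch:

        import numpy as np

        def genus_gaussian(nu, amplitude=1.0):
            # Genus per unit volume of a Gaussian random field at
            # threshold nu (in units of the field's standard deviation).
            nu = np.asarray(nu, float)
            return amplitude * (1 - nu ** 2) * np.exp(-nu ** 2 / 2)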

  20. Effects of large deep-seated landslides on hillslope morphology, western Southern Alps, New Zealand

    NASA Astrophysics Data System (ADS)

    Korup, Oliver

    2006-03-01

    Morphometric analysis and air photo interpretation highlight geomorphic imprints of large landslides (i.e., affecting ≥1 km²) on hillslopes in the western Southern Alps (WSA), New Zealand. Large landslides attain kilometer-scale runout, affect >50% of total basin relief, and in 70% of cases are slope-clearing, and thus relief-limiting. Landslide terrain shows lower mean local relief, relief variability, slope angles, steepness, and concavity than surrounding terrain. Measuring mean slope angle smoothes out local landslide morphology, masking any relationship between large landslides and possible threshold hillslopes. Large failures also occurred on low-gradient slopes, indicating persistent low-frequency/high-magnitude hillslope adjustment independent of fluvial bedrock incision. At the basin and hillslope scale, slope-area plots partly constrain the effects of landslides on geomorphic process regimes. Landslide imprints gradually blend with relief characteristics at orogen scale (10² km), while being sensitive to length scales of slope failure, topography, sampling, and digital elevation model resolution. This limits means of automated detection, and underlines the importance of local morphologic contrasts for detecting large landslides in the WSA. Landslide controls on low-order drainage include divide lowering and shifting, formation of headwater basins and hanging valleys, and stream piracy. Volumes typically mobilized, yet still stored in numerous deposits despite high denudation rates, are >10⁷ m³, theoretically equal to 10² years of basin-wide debris production from historic shallow landslides; a lack of absolute ages precludes further estimates. Deposit size and mature forest cover indicate residence times of 10¹-10⁴ years. On these timescales, large landslides require further attention in landscape evolution models of tectonically active orogens.

  1. A class of systolizable IIR digital filters and its design for proper scaling and minimum output roundoff noise

    NASA Technical Reports Server (NTRS)

    Lei, Shaw-Min; Yao, Kung

    1990-01-01

    A class of infinite impulse response (IIR) digital filters with a systolizable structure is proposed and its synthesis is investigated. The systolizable structure consists of pipelineable regular modules with local connections and is suitable for VLSI implementation. It is capable of achieving high performance as well as high throughput. This class of filter structure provides certain degrees of freedom that can be used to obtain some desirable properties for the filter. Techniques of evaluating the internal signal powers and the output roundoff noise of the proposed filter structure are developed. Based upon these techniques, a well-scaled IIR digital filter with minimum output roundoff noise is designed using a local optimization approach. The internal signals of all the modes of this filter are scaled to unity in the l2-norm sense. Compared to the Rao-Kailath (1984) orthogonal digital filter and the Gray-Markel (1973) normalized-lattice digital filter, this filter has better scaling properties and lower output roundoff noise.
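
    l2 scaling of this kind normalizes the transfer function from the input to each internal node so that its impulse response has unit energy; the norm itself can be approximated by truncated summation, for example (a sketch, not the authors' procedure):

        import numpy as np
        from scipy.signal import dimpulse

        def l2_norm(b, a, n=10000):
            # l2 norm of a digital filter's impulse response (truncated sum);
            # scaling a node by 1/l2_norm gives a unit-energy response there.
            _, (h,) = dimpulse((b, a, 1), n=n)
            return np.sqrt((np.ravel(h) ** 2).sum())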

  2. Digital Rocks Portal: a sustainable platform for imaged dataset sharing, translation and automated analysis

    NASA Astrophysics Data System (ADS)

    Prodanovic, M.; Esteva, M.; Hanlon, M.; Nanda, G.; Agarwal, P.

    2015-12-01

    Recent advances in imaging have provided a wealth of 3D datasets that reveal pore-space microstructure (nm to cm length scales) and allow investigation of nonlinear flow and mechanical phenomena from first principles using numerical approaches. This framework has popularly been called "digital rock physics". Researchers, however, have trouble storing and sharing the datasets, both because of their size and because of the lack of standardized image types and associated metadata for volumetric datasets. This impedes scientific cross-validation of the numerical approaches that characterize large-scale porous media properties, as well as the development of the multiscale approaches required for correct upscaling. A single research group typically specializes in one imaging modality and/or related modeling on a single length scale, and the lack of data-sharing infrastructure makes it difficult to integrate different length scales. We developed a sustainable, open, and easy-to-use repository called the Digital Rocks Portal that (1) organizes images and related experimental measurements of different porous materials and (2) improves access to them for a wider community of geoscience and engineering researchers not necessarily trained in computer science or data analysis. Once widely accepted, the repository will jumpstart productivity and enable scientific inquiry and engineering decisions founded on a data-driven basis. This is the first repository of its kind. We show initial results on incorporating essential software tools and pipelines that make it easier for researchers to store and reuse data, and for educators to quickly visualize and illustrate concepts to a wide audience. For data sustainability and continuous access, the portal is implemented within the reliable, 24/7-maintained high-performance computing infrastructure supported by the Texas Advanced Computing Center (TACC) at the University of Texas at Austin. Long-term storage is provided through the University of Texas System Research Cyber-infrastructure initiative.

  3. Sediment erosion and delivery from Toutle River basin after the 1980 eruption of Mount St. Helens: A 30-year perspective

    USGS Publications Warehouse

    Major, Jon J.; Mosbrucker, Adam; Spicer, Kurt R.; Crisafulli, Charles; Dale, V.

    2018-01-01

    Exceptional sediment yields persist in the Toutle River valley more than 30 years after the major 1980 eruption of Mount St. Helens. Differencing of decadal-scale digital elevation models shows that the elevated load comes largely from persistent lateral channel erosion across the debris-avalanche deposit. Since the mid-1980s, rates of channel-bed-elevation change have diminished, and magnitudes of lateral erosion have outpaced those of channel incision. A digital elevation model of difference from 1999 to 2009 shows that erosion across the debris-avalanche deposit is more spatially distributed than in a model from 1987 to 1999, in which erosion was strongly focused along specific reaches of the channel.

  4. Remote Sensing

    NASA Technical Reports Server (NTRS)

    Rickman, Douglas

    2008-01-01

    Remote sensing is measuring something without touching it. Most methods measure a portion of the electro-magnetic spectrum using energy reflected from or emitted by a material. Moving the instrument away makes it easier to see more at one time. Airplanes are good but satellites are much better. Many things can not be easily measured on the scale of an individual person. Example - measuring all the vegetation growing at one time in even the smallest country. A satellite can see things over large areas repeatedly and in a consistent way. Data from the detector is reported as digital values for a grid that covers some portion of the Earth. Because it is digital and consistent a computer can extract information or enhance the data for a specific purpose.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Demaria, N.

    This paper is a review of recent progress of the RD53 Collaboration. Results from the study of radiation effects on 65 nm CMOS have matured enough to define first strategies to adopt in the design of analog and digital circuits. Critical building blocks and analog very-front-end chains have been designed and tested before and after 5-800 Mrad. Small prototypes of 64×64 pixels with complex digital architectures have been produced, and point to addressing the main issues of dealing with extremely high pixel rates while operating at very small in-time thresholds in the analog front end. Lastly, the collaboration is now proceeding at full speed towards the design of a large-scale prototype, called RD53A, in 65 nm CMOS technology.

  6. Integrated digital inverters based on two-dimensional anisotropic ReS₂ field-effect transistors

    DOE PAGES

    Liu, Erfu; Fu, Yajun; Wang, Yaojia; ...

    2015-05-07

    Semiconducting two-dimensional transition metal dichalcogenides are emerging as top candidates for post-silicon electronics. While most of them exhibit isotropic behaviour, lowering the lattice symmetry could induce anisotropic properties, which are both scientifically interesting and potentially useful. Here we present atomically thin rhenium disulfide (ReS₂) flakes with unique distorted 1T structure, which exhibit in-plane anisotropic properties. We fabricated monolayer and few-layer ReS₂ field-effect transistors, which exhibit competitive performance with large current on/off ratios (~10⁷) and low subthreshold swings (100 mV per decade). The observed anisotropic ratio along two principle axes reaches 3.1, which is the highest among all known two-dimensional semiconducting materials. Furthermore, we successfully demonstrated an integrated digital inverter with good performance by utilizing two ReS₂ anisotropic field-effect transistors, suggesting the promising implementation of large-scale two-dimensional logic circuits. Our results underscore the unique properties of two-dimensional semiconducting materials with low crystal symmetry for future electronic applications.

  7. Large-Scale Astrophysical Visualization on Smartphones

    NASA Astrophysics Data System (ADS)

    Becciani, U.; Massimino, P.; Costa, A.; Gheller, C.; Grillo, A.; Krokos, M.; Petta, C.

    2011-07-01

    Nowadays digital sky surveys and long-duration, high-resolution numerical simulations using high performance computing and grid systems produce multidimensional astrophysical datasets in the order of several Petabytes. Sharing visualizations of such datasets within communities and collaborating research groups is of paramount importance for disseminating results and advancing astrophysical research. Moreover educational and public outreach programs can benefit greatly from novel ways of presenting these datasets by promoting understanding of complex astrophysical processes, e.g., formation of stars and galaxies. We have previously developed VisIVO Server, a grid-enabled platform for high-performance large-scale astrophysical visualization. This article reviews the latest developments on VisIVO Web, a custom designed web portal wrapped around VisIVO Server, then introduces VisIVO Smartphone, a gateway connecting VisIVO Web and data repositories for mobile astrophysical visualization. We discuss current work and summarize future developments.

  8. RELIABILITY OF THE DETECTION OF THE BARYON ACOUSTIC PEAK

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    MartInez, Vicent J.; Arnalte-Mur, Pablo; De la Cruz, Pablo

    2009-05-01

    The correlation function of the distribution of matter in the universe shows, at large scales, baryon acoustic oscillations, which were imprinted prior to recombination. This feature was first detected in the correlation function of the luminous red galaxies of the Sloan Digital Sky Survey (SDSS). Recently, the final release (DR7) of the SDSS has been made available, and the useful volume is about two times bigger than in the old sample. We present here, for the first time, the redshift-space correlation function of this sample at large scales, together with that for one shallower but denser volume-limited subsample drawn from the Two-Degree Field Redshift Survey. We test the reliability of the detection of the acoustic peak at about 100 h⁻¹ Mpc and the behavior of the correlation function at larger scales by means of careful estimation of errors. We confirm the presence of the peak in the latest data, although broader than in previous detections.
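
    Setting the error analysis aside, the redshift-space correlation function itself is typically estimated from data-data, data-random, and random-random pair counts; a minimal sketch of the standard Landy-Szalay estimator for one separation bin, ignoring the (n - 1) corrections:

        def landy_szalay(dd, dr, rr, n_data, n_random):
            # xi = (DD * f**2 - 2 * DR * f + RR) / RR, with f = n_random/n_data
            # rescaling raw pair counts to a common normalization.
            f = n_random / n_data
            return (dd * f ** 2 - 2 * dr * f + rr) / rr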

  9. Structures and Techniques For Implementing and Packaging Complex, Large Scale Microelectromechanical Systems Using Foundry Fabrication Processes.

    DTIC Science & Technology

    1996-06-01

    [Snippet from the report's front matter: a list of figures and an acronym glossary, covering arrays of LIGA-fabricated mechanical relay switches, mechanical interference between 'Pull Spring' devices, the Digital Micromirror Device (DMD™), EDP (ethylene diamine pyrocatechol and water, a silicon anisotropic etchant), microelectromechanical systems (MEMS), the MOSIS implementation service, pin grid array (PGA) packaging, and lead-zirconate-titanate (PZT).]

  10. Holographic Waveguide Array Rollable Display.

    DTIC Science & Technology

    1997-04-01

    ...scale lithography for fabrication. Projection systems offer large images, in the range of 40 - 60 inches diagonal, and both front-view and rear-view... Boulder, CO, and a 1-D array of digital micromirrors (DMD) from Texas Instruments. The linear format permits simple driving electronics and high... TI's DMD, or a CMOS-SLM. A collimated laser beam (combining three colors) or a collimated white-light beam from a high-intensity halogen lamp can be...

  11. Privacy and Biometric Passports

    PubMed Central

    Vakalis, Ioannis

    2011-01-01

    This work deals with the privacy implications and threats that can emerge with the large-scale use of electronic biometric documents, such as the recently introduced electronic passport (e-Passport). A brief introduction to privacy and personal data protection is followed by a presentation of the technical characteristics of the e-Passport. The description includes the digital data structure, and the communication and reading mechanisms of the e-Passport, indicating the possible points and methods of attack. PMID:21380483

  12. Ground-based remote sensing with long lens video camera for upper-stem diameter and other tree crown measurements

    Treesearch

    Neil A. Clark; Sang-Mook Lee

    2004-01-01

    This paper demonstrates how a digital video camera with a long lens can be used with pulse laser ranging in order to collect very large-scale tree crown measurements. The long focal length of the camera lens provides the magnification required for precise viewing of distant points with the trade-off of spatial coverage. Multiple video frames are mosaicked into a single...

  13. Producing Alaska interim land cover maps from Landsat digital and ancillary data

    USGS Publications Warehouse

    Fitzpatrick-Lins, Katherine; Doughty, Eileen Flanagan; Shasby, Mark; Loveland, Thomas R.; Benjamin, Susan

    1987-01-01

    In 1985, the U.S. Geological Survey initiated a research program to produce 1:250,000-scale land cover maps of Alaska using digital Landsat multispectral scanner data and ancillary data, and to evaluate the potential of establishing a statewide land cover mapping program using this approach. The geometrically corrected and resampled Landsat pixel data are registered to a Universal Transverse Mercator (UTM) projection, along with arc-second digital elevation model data used as an aid in the final computer classification. Area summaries of the land cover classes are extracted by merging the Landsat digital classification files with the U.S. Bureau of Land Management's Public Land Survey digital file. Registration of the digital land cover data is verified and control points are identified so that a laser plotter can produce screened film separates for printing the classification data at map scale directly from the digital file. The final land cover classification is retained both as a color map at 1:250,000 scale registered to the U.S. Geological Survey base map, with area summaries by township and range on the reverse, and as a digital file where it may be used as a category in a geographic information system.

  14. A parallel algorithm for viewshed analysis in three-dimensional Digital Earth

    NASA Astrophysics Data System (ADS)

    Feng, Wang; Gang, Wang; Deji, Pan; Yuan, Liu; Liuzhong, Yang; Hongbo, Wang

    2015-02-01

    Viewshed analysis, often supported by geographic information systems, is widely used in three-dimensional (3D) Digital Earth systems. Many of these analyses involve the siting of features and real-time decision-making. Viewshed analysis is usually performed at a large scale, which poses substantial computational challenges as geographic datasets continue to grow. Previous research on viewshed analysis has generally been limited to a single data structure (i.e., the DEM), which cannot be used to analyze viewsheds in complicated scenes. In this paper, a real-time algorithm for viewshed analysis in Digital Earth is presented using the parallel computing capability of graphics processing units (GPUs). An occlusion volume is generated for each geometric entity in the neighborhood of the viewpoint according to the line-of-sight. The region within the occlusion is marked in a stencil buffer within the programmable 3D visualization pipeline, and the marked region is rendered in red concurrently. In contrast to traditional algorithms based on line-of-sight, the new algorithm, in which the viewshed calculation is integrated with the rendering module, is more efficient and stable. This method of viewshed generation is closer to the reality of the virtual geographic environment, and no DEM interpolation, which is seen as a computational burden, is needed. The algorithm was implemented in a 3D Digital Earth system (GeoBeans3D) with the DirectX application programming interface (API) and has been widely used in a range of applications.
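
    As a point of reference for the line-of-sight approach that the GPU method is contrasted against, the brute-force DEM-based computation can be sketched as follows. This is an illustrative baseline under assumed conventions (square grid, nearest-neighbour sampling), not the authors' rendering-pipeline algorithm.

```python
import numpy as np

def viewshed(dem, vx, vy, observer_h=1.7):
    """Brute-force line-of-sight viewshed on a DEM grid: a cell is
    visible if no intermediate cell along the ray subtends a larger
    elevation angle from the observer's eye."""
    rows, cols = dem.shape
    eye = dem[vy, vx] + observer_h
    visible = np.zeros((rows, cols), dtype=bool)
    visible[vy, vx] = True
    for ty in range(rows):
        for tx in range(cols):
            if (tx, ty) == (vx, vy):
                continue
            steps = int(max(abs(tx - vx), abs(ty - vy)))
            target_angle = (dem[ty, tx] - eye) / np.hypot(tx - vx, ty - vy)
            blocked = False
            for i in range(1, steps):
                x = int(round(vx + (tx - vx) * i / steps))
                y = int(round(vy + (ty - vy) * i / steps))
                if (dem[y, x] - eye) / np.hypot(x - vx, y - vy) > target_angle:
                    blocked = True
                    break
            visible[ty, tx] = not blocked
    return visible
```

    The O(n³) cost of this baseline is precisely why the paper moves the computation into the GPU rendering pipeline, where occlusion testing is performed by the stencil buffer during rasterization.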

  15. New directions in medical e-curricula and the use of digital repositories.

    PubMed

    Fleiszer, David M; Posel, Nancy H; Steacy, Sean P

    2004-03-01

    Medical educators involved in the growth of multimedia-enhanced e-curricula are increasingly aware of the need for digital repositories to catalogue, store and ensure access to learning objects that are integrated within their online material. The experience at the Faculty of Medicine at McGill University during initial development of a mainstream electronic curriculum reflects this growing recognition that repositories can facilitate the development of a more comprehensive as well as more effective electronic curriculum. Digital repositories can also help to ensure efficient utilization of resources through the use, re-use, and reprocessing of multimedia learning objects, addressing the potential for collaboration among repositories and increasing available material exponentially. The authors review different approaches to the development of a digital repository application, as well as global and specific issues that should be examined in the initial requirements definition and development phase, to ensure current initiatives meet long-term requirements. Often, decisions regarding creation of e-curricula and associated digital repositories are left to interested faculty and their individual development teams. However, the development of an e-curriculum and digital repository is not predominantly a technical exercise, but rather one that affects global pedagogical strategies and curricular content and involves a commitment of large-scale resources. Outcomes of these decisions can have long-term consequences and, as such, should involve faculty at the highest levels, including the dean.

  16. The Digital Fish Library: Using MRI to Digitize, Database, and Document the Morphological Diversity of Fish

    PubMed Central

    Berquist, Rachel M.; Gledhill, Kristen M.; Peterson, Matthew W.; Doan, Allyson H.; Baxter, Gregory T.; Yopak, Kara E.; Kang, Ning; Walker, H. J.; Hastings, Philip A.; Frank, Lawrence R.

    2012-01-01

    Museum fish collections possess a wealth of anatomical and morphological data that are essential for documenting and understanding biodiversity. Obtaining access to specimens for research, however, is not always practical and frequently conflicts with the need to maintain the physical integrity of specimens and the collection as a whole. Non-invasive three-dimensional (3D) digital imaging therefore serves a critical role in facilitating the digitization of these specimens for anatomical and morphological analysis, as well as providing an efficient method for online storage and sharing of this imaging data. Here we describe the development of the Digital Fish Library (DFL, http://www.digitalfishlibrary.org), an online digital archive of high-resolution, high-contrast, magnetic resonance imaging (MRI) scans of the soft tissue anatomy of an array of fishes preserved in the Marine Vertebrate Collection of Scripps Institution of Oceanography. We have imaged and uploaded MRI data for over 300 marine and freshwater species, developed a data archival and retrieval system with a web-based image analysis and visualization tool, and integrated these into the public DFL website to disseminate data and associated metadata freely over the web. We show that MRI is a rapid and powerful method for accurately depicting the in-situ soft-tissue anatomy of preserved fishes in sufficient detail for large-scale comparative digital morphology. However, these 3D volumetric data require a sophisticated computational and archival infrastructure in order to be broadly accessible to researchers and educators. PMID:22493695

  17. A method for normalizing pathology images to improve feature extraction for quantitative pathology.

    PubMed

    Tam, Allison; Barker, Jocelyn; Rubin, Daniel

    2016-01-01

    With the advent of digital slide scanning technologies and the potential proliferation of large repositories of digital pathology images, many research studies can leverage these data for biomedical discovery and to develop clinical applications. However, quantitative analysis of digital pathology images is impeded by batch effects generated by varied staining protocols and staining conditions of pathological slides. To overcome this problem, this paper proposes a novel, fully automated stain normalization method to reduce batch effects and thus aid research in digital pathology applications. The method, intensity centering and histogram equalization (ICHE), normalizes a diverse set of pathology images by first scaling the centroids of the intensity histograms to a common point and then applying a modified version of contrast-limited adaptive histogram equalization. Normalization was performed on two datasets of digitized hematoxylin and eosin (H&E) slides of different tissue slices from the same lung tumor, and one immunohistochemistry dataset of digitized slides created by restaining one of the H&E datasets. The ICHE method was evaluated based on image intensity values, quantitative features, and the effect on downstream applications, such as computer-aided diagnosis. For comparison, three methods from the literature were reimplemented and evaluated using the same criteria. The authors found that ICHE not only improved performance compared with un-normalized images, but in most cases showed improvement compared with previous methods for correcting batch effects in the literature. ICHE may be a useful preprocessing step in a digital pathology image processing pipeline.
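
    The two stages of ICHE as described, centroid scaling followed by adaptive equalization, can be sketched in a few lines. This is a loose approximation under stated assumptions (a single-channel float image in [0, 1], scikit-image's stock CLAHE rather than the paper's modified version), not the authors' implementation.

```python
import numpy as np
from skimage import exposure

def iche_sketch(img, target_centroid=0.5):
    """Intensity centering + histogram equalization, roughly:
    1) scale intensities so the histogram centroid (the mean) moves to
       a common target point shared by all images in the batch;
    2) apply contrast-limited adaptive histogram equalization (CLAHE)."""
    centered = np.clip(img * (target_centroid / img.mean()), 0.0, 1.0)
    return exposure.equalize_adapthist(centered, clip_limit=0.02)
```

    Mapping every slide to the same centroid is what removes the batch-to-batch intensity shift; CLAHE then standardizes local contrast.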

  18. The Digital Fish Library: using MRI to digitize, database, and document the morphological diversity of fish.

    PubMed

    Berquist, Rachel M; Gledhill, Kristen M; Peterson, Matthew W; Doan, Allyson H; Baxter, Gregory T; Yopak, Kara E; Kang, Ning; Walker, H J; Hastings, Philip A; Frank, Lawrence R

    2012-01-01

    Museum fish collections possess a wealth of anatomical and morphological data that are essential for documenting and understanding biodiversity. Obtaining access to specimens for research, however, is not always practical and frequently conflicts with the need to maintain the physical integrity of specimens and the collection as a whole. Non-invasive three-dimensional (3D) digital imaging therefore serves a critical role in facilitating the digitization of these specimens for anatomical and morphological analysis, as well as providing an efficient method for online storage and sharing of this imaging data. Here we describe the development of the Digital Fish Library (DFL, http://www.digitalfishlibrary.org), an online digital archive of high-resolution, high-contrast, magnetic resonance imaging (MRI) scans of the soft tissue anatomy of an array of fishes preserved in the Marine Vertebrate Collection of Scripps Institution of Oceanography. We have imaged and uploaded MRI data for over 300 marine and freshwater species, developed a data archival and retrieval system with a web-based image analysis and visualization tool, and integrated these into the public DFL website to disseminate data and associated metadata freely over the web. We show that MRI is a rapid and powerful method for accurately depicting the in-situ soft-tissue anatomy of preserved fishes in sufficient detail for large-scale comparative digital morphology. However, these 3D volumetric data require a sophisticated computational and archival infrastructure in order to be broadly accessible to researchers and educators.

  19. Cyber Security and Reliability in a Digital Cloud

    DTIC Science & Technology

    2013-01-01

    a higher utilization of servers, lower professional support staff needs, economies of scale for the physical facility, and the flexibility to locate... as a system, the DoD can achieve the economies of scale typically associated with large data centers. Recommendation 3: The DoD CIO and DISA... providers will help set standards for secure cloud computing across the economy. Recommendation 7: The DoD CIO and DISA should participate in the...

  20. Building Virtual Watersheds: A Global Opportunity to Strengthen Resource Management and Conservation.

    PubMed

    Benda, Lee; Miller, Daniel; Barquin, Jose; McCleary, Richard; Cai, TiJiu; Ji, Y

    2016-03-01

    Modern land-use planning and conservation strategies at landscape to country scales worldwide require complete and accurate digital representations of river networks, encompassing all channels, including the smallest headwaters. The digital river networks, integrated with widely available digital elevation models, also need analytical capabilities to support resource management and conservation, including attributing river segments with key stream and watershed data, characterizing topography to identify landforms, discretizing land uses at scales necessary to identify human-environment interactions, and connecting channels downstream, upstream, and to terrestrial environments. We investigate the completeness and analytical capabilities of national to regional scale digital river networks available in five countries: Canada, China, Russia, Spain, and the United States, using actual resource management and conservation projects involving 12 university, agency, and NGO organizations. In addition, we review one pan-European and one global digital river network. Based on our analysis, we conclude that the majority of the regional, national, and global scale digital river networks in our sample lack network completeness, analytical capabilities, or both. To address this limitation, we outline a general framework for building digital river networks that are as complete as possible and for integrating them with available digital elevation models to create robust analytical capabilities (e.g., virtual watersheds). We believe this presents a global opportunity for in-country agencies, or international players, to support the creation of virtual watersheds to increase environmental problem solving, broaden access to the watershed sciences, and strengthen resource management and conservation in countries worldwide.

  1. Building Virtual Watersheds: A Global Opportunity to Strengthen Resource Management and Conservation

    NASA Astrophysics Data System (ADS)

    Benda, Lee; Miller, Daniel; Barquin, Jose; McCleary, Richard; Cai, TiJiu; Ji, Y.

    2016-03-01

    Modern land-use planning and conservation strategies at landscape to country scales worldwide require complete and accurate digital representations of river networks, encompassing all channels, including the smallest headwaters. The digital river networks, integrated with widely available digital elevation models, also need analytical capabilities to support resource management and conservation, including attributing river segments with key stream and watershed data, characterizing topography to identify landforms, discretizing land uses at scales necessary to identify human-environment interactions, and connecting channels downstream, upstream, and to terrestrial environments. We investigate the completeness and analytical capabilities of national to regional scale digital river networks available in five countries: Canada, China, Russia, Spain, and the United States, using actual resource management and conservation projects involving 12 university, agency, and NGO organizations. In addition, we review one pan-European and one global digital river network. Based on our analysis, we conclude that the majority of the regional, national, and global scale digital river networks in our sample lack network completeness, analytical capabilities, or both. To address this limitation, we outline a general framework for building digital river networks that are as complete as possible and for integrating them with available digital elevation models to create robust analytical capabilities (e.g., virtual watersheds). We believe this presents a global opportunity for in-country agencies, or international players, to support the creation of virtual watersheds to increase environmental problem solving, broaden access to the watershed sciences, and strengthen resource management and conservation in countries worldwide.

  2. Land-use in Amazonia and the Cerrado of Brazil: State of Knowledge and GIS Database

    NASA Technical Reports Server (NTRS)

    Nepstad, Daniel C.

    1997-01-01

    We have assembled datasets to strengthen the Large-Scale Biosphere-Atmosphere Experiment in Amazonia (LBA). These datasets can now be accessed through the Woods Hole Research Center homepage (www.whrc.org), and will soon be linked to the Pre-LBA homepages of the Brazilian Space Research Institute's Center for Weather and Climate Prediction (Instituto de Pesquisas Espaciais, Centro de Previsao de Tempo e Estudos Climaticos, INPE/CPTEC) and through the Oak Ridge National Laboratory Distributed Active Archive Center (ORNL/DAAC). Some of the datasets that we are making available involved new field research and/or the digitization of data held by Brazilian government agencies. For example, during the grant period we conducted interviews at 1,100 sawmills across Amazonia to determine their production of sawn timber and their harvest intensities. These data provide the basis for the first quantitative assessment of the area of forest affected each year by selective logging (Nepstad et al., submitted to Nature). We digitized the locations of all of the rural households in the State of Para that have been mapped by the Brazilian malaria combat agency (SUCAM). We also mapped and digitized areas of deforestation in the state of Tocantins, which is largely composed of savanna (cerrado), an ecosystem that has been routinely excluded from deforestation mapping exercises.

  3. Digital Sequences and a Time Reversal-Based Impact Region Imaging and Localization Method

    PubMed Central

    Qiu, Lei; Yuan, Shenfang; Mei, Hanfei; Qian, Weifeng

    2013-01-01

    To reduce the time and cost of damage inspection, on-line impact monitoring of aircraft composite structures is needed. A digital monitor based on an array of piezoelectric transducers (PZTs) has been developed to record impact regions on-line. It is small in size, lightweight and has low power consumption, but there are two problems with its impact alarm region localization method at the current stage. The first is that the accuracy rate of impact alarm region localization is low, especially on complex composite structures. The second is that the impact alarm region is large when a large-scale structure is monitored with a limited number of PZTs, which increases the time and cost of damage inspections. To solve these two problems, an impact alarm region imaging and localization method based on digital sequences and time reversal is proposed. In this method, the frequency band of the impact response signals is first estimated from the digital sequences. Then, characteristic signals of the impact response signals are constructed by sinusoidal modulation signals. Finally, the phase synthesis time reversal impact imaging method is adopted to obtain the impact region image. Based on the image, an error ellipse is generated to give the final impact alarm region. A validation experiment was implemented on a complex composite wing box of a real aircraft. The validation results show that the accuracy rate of impact alarm region localization is approximately 100%. The area of the impact alarm region can be reduced, and the number of PZTs needed to cover the same impact monitoring region is reduced by more than half. PMID:24084123

  4. OMERO and Bio-Formats 5: flexible access to large bioimaging datasets at scale

    NASA Astrophysics Data System (ADS)

    Moore, Josh; Linkert, Melissa; Blackburn, Colin; Carroll, Mark; Ferguson, Richard K.; Flynn, Helen; Gillen, Kenneth; Leigh, Roger; Li, Simon; Lindner, Dominik; Moore, William J.; Patterson, Andrew J.; Pindelski, Blazej; Ramalingam, Balaji; Rozbicki, Emil; Tarkowska, Aleksandra; Walczysko, Petr; Allan, Chris; Burel, Jean-Marie; Swedlow, Jason

    2015-03-01

    The Open Microscopy Environment (OME) has built and released Bio-Formats, a Java-based tool for converting proprietary file formats, and OMERO, an enterprise data management platform, under open source licenses. In this report, we describe new versions of Bio-Formats and OMERO that are specifically designed to support large, multi-gigabyte or terabyte scale datasets that are routinely collected across most domains of biological and biomedical research. Bio-Formats reads image data directly from native proprietary formats, bypassing the need for conversion into a standard format. It implements the concept of a file set, a container that defines the contents of multi-dimensional data comprised of many files. OMERO uses Bio-Formats to read files natively, and provides a flexible access mechanism that supports several different storage and access strategies. These new capabilities of OMERO and Bio-Formats make them especially useful in imaging applications like digital pathology, high content screening and light sheet microscopy that routinely create large datasets that must be managed and analyzed.

  5. Detection of BRCA1 gross rearrangements by droplet digital PCR.

    PubMed

    Preobrazhenskaya, Elena V; Bizin, Ilya V; Kuligina, Ekatherina Sh; Shleykina, Alla Yu; Suspitsin, Evgeny N; Zaytseva, Olga A; Anisimova, Elena I; Laptiev, Sergey A; Gorodnova, Tatiana V; Belyaev, Alexey M; Imyanitov, Evgeny N; Sokolenko, Anna P

    2017-10-01

    Large genomic rearrangements (LGRs) constitute a significant share of pathogenic BRCA1 mutations. Multiplex ligation-dependent probe amplification (MLPA) is a leading method for LGR detection; however, it is entirely based on the use of commercial kits, includes a relatively time-consuming hybridization step, and is not convenient for large-scale screening of recurrent LGRs. We developed and validated a droplet digital PCR (ddPCR) assay, which covers the entire coding region of the BRCA1 gene and is capable of precisely quantitating the copy number of each exon. 141 breast cancer (BC) patients, who demonstrated evident clinical features of hereditary BC but turned out to be negative for founder BRCA1/2 mutations, were subjected to LGR analysis. Four patients with LGRs were identified, with three cases of exon 8 deletion and one woman carrying a deletion of exons 5-7. Excellent concordance with the MLPA test was observed. Exon 8 copy number was tested in an additional 720 BC and 184 ovarian cancer (OC) high-risk patients, and another four cases with the deletion were revealed; MLPA re-analysis demonstrated that the exon 8 loss was part of a larger genetic alteration in two cases, while the remaining two patients had an isolated defect of exon 8. Long-range PCR and next generation sequencing of DNA samples carrying the exon 8 deletion revealed two types of recurrent LGRs. Droplet digital PCR is a reliable tool for the detection of large genomic rearrangements.
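
    The quantitation that ddPCR provides rests on Poisson statistics over droplet counts: the fraction of negative droplets estimates e^(-λ), where λ is the mean number of template copies per droplet. The following worked sketch, with illustrative numbers rather than the study's data, shows how an exon copy number is derived relative to a two-copy reference.

```python
import numpy as np

def copies_per_droplet(n_positive, n_total):
    """Poisson correction: fraction of negative droplets = exp(-lambda),
    so lambda = -ln(1 - positive fraction)."""
    return -np.log(1.0 - n_positive / n_total)

def exon_copy_number(pos_target, pos_ref, n_total, ref_copies=2):
    """Copy number of a target exon relative to a two-copy reference locus."""
    lam_target = copies_per_droplet(pos_target, n_total)
    lam_ref = copies_per_droplet(pos_ref, n_total)
    return ref_copies * lam_target / lam_ref

# A heterozygous exon 8 deletion should give a value near 1 (one copy):
print(exon_copy_number(pos_target=2600, pos_ref=5000, n_total=15000))  # ~0.94
```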

  6. An unbalanced spectra classification method based on entropy

    NASA Astrophysics Data System (ADS)

    Liu, Zhong-bao; Zhao, Wen-juan

    2017-05-01

    How to distinguish the minority spectra from the majority of spectra is an important problem in astronomy. In view of this, an unbalanced spectra classification method based on entropy (USCM) is proposed in this paper to deal with the unbalanced spectra classification problem. USCM greatly improves the performance of traditional classifiers in distinguishing the minority spectra, as it takes the data distribution into consideration in the process of classification. However, its time complexity is exponential in the training size, and therefore it can only deal with small- and medium-scale classification problems. How to solve the large-scale classification problem is thus crucial for USCM. It can be shown by mathematical computation that the dual form of USCM is equivalent to the minimum enclosing ball (MEB) problem; the core vector machine (CVM) is therefore introduced, and USCM based on CVM is proposed to deal with the large-scale classification problem. Several comparative experiments on the 4 subclasses of K-type spectra, 3 subclasses of F-type spectra and 3 subclasses of G-type spectra from the Sloan Digital Sky Survey (SDSS) verify that USCM and USCM based on CVM perform better than kNN (k nearest neighbor) and SVM (support vector machine) in mining rare spectra on small- and medium-scale datasets and large-scale datasets, respectively.
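
    The abstract does not give USCM's objective in closed form, so the sketch below uses a plainly simpler stand-in: per-class weights derived from the surprisal of the class distribution, fed to an off-the-shelf SVM. It only illustrates the general idea of letting the imbalanced data distribution upweight minority spectra; it is not the authors' formulation or its MEB dual.

```python
import numpy as np
from collections import Counter
from sklearn.svm import SVC

def surprisal_class_weights(y):
    """Weight each class by -log2(p_class), so rare (minority) spectra
    contribute more to the training objective."""
    counts, n = Counter(y), len(y)
    return {c: -np.log2(k / n) for c, k in counts.items()}

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))          # stand-in spectral features
y = np.array([0] * 180 + [1] * 20)     # heavily unbalanced classes
clf = SVC(kernel="rbf", class_weight=surprisal_class_weights(y)).fit(X, y)
```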

  7. Digital health for the End TB Strategy: developing priority products and making them work

    PubMed Central

    Timimi, Hazim; Kurosinski, Pascal; Migliori, Giovanni Battista; Van Gemert, Wayne; Denkinger, Claudia; Isaacs, Chris; Story, Alistair; Garfein, Richard S.; do Valle Bastos, Luis Gustavo; Yassin, Mohammed A.; Rusovich, Valiantsin; Skrahina, Alena; Van Hoi, Le; Broger, Tobias; Abubakar, Ibrahim; Hayward, Andrew; Thomas, Bruce V.; Temesgen, Zelalem; Quraishi, Subhi; von Delft, Dalene; Jaramillo, Ernesto; Weyer, Karin; Raviglione, Mario C.

    2016-01-01

    In 2014, the World Health Organization (WHO) developed the End TB Strategy in response to a World Health Assembly Resolution requesting Member States to end the worldwide epidemic of tuberculosis (TB) by 2035. For the strategy's objectives to be realised, the next 20 years will need novel solutions to address the challenges posed by TB to health professionals, and to affected people and communities. Information and communication technology presents opportunities for innovative approaches to support TB efforts in patient care, surveillance, programme management and electronic learning. The effective application of digital health products at a large scale and their continued development need the engagement of TB patients and their caregivers, innovators, funders, policy-makers, advocacy groups, and affected communities. In April 2015, WHO established its Global Task Force on Digital Health for TB to advocate and support the development of digital health innovations in global efforts to improve TB care and prevention. We outline the group's approach to stewarding this process in alignment with the three pillars of the End TB Strategy. The supplementary material of this article includes target product profiles, as developed by early 2016, defining nine priority digital health concepts and products that are strategically positioned to enhance TB action at the country level. PMID:27230443

  8. Exploring the dimensionality of digit span.

    PubMed

    Bowden, Stephen C; Petrauskas, Vilija M; Bardenhagen, Fiona J; Meade, Catherine E; Simpson, Leonie C

    2013-04-01

    The Digit Span subtest from the Wechsler Scales is used to measure Freedom from Distractibility or Working Memory. Some published research suggests that Digit Span forward should be interpreted differently from Digit Span backward. The present study explored the dimensionality of the Wechsler Memory Scale-III Digit Span (forward and backward) items in a sample of heterogeneous neuroscience patients (n = 267) using confirmatory factor analysis (CFA) for dichotomous items. Results suggested that four correlated factors underlie Digit Span, reflecting easy and hard items in both forward and backward presentation orders. The model for Digit Span was then cross-validated in a seizure disorders sample (n = 223) by replication of the CFA and by examination of measurement invariance. Measurement invariance tests the precise numerical generalization of trait estimation across groups. The results supported measurement invariance, and it was concluded that forward and backward digit span scores should be interpreted as measures of the same cognitive ability.

  9. The BOEING 777 - concurrent engineering and digital pre-assembly

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abarbanel, B.

    The processes created on the 777 for checking designs were called "digital pre-assembly". Using FlyThru(tm), a spin-off of a Boeing advanced computing research project, engineers were able to view up to 1500 models (15,000 solids) in 3D, traversing that data at high speed. FlyThru(tm) was rapidly deployed in 1991 to meet the 777's needs for large-scale product visualization and verification. The digital pre-assembly process has had fantastic results: the 777 has had far fewer assembly and systems problems compared to previous airplane programs. Today, FlyThru(tm) is installed on hundreds of workstations on almost every airplane program, and is being used on Space Station, F22, AWACS, and other defense projects. Its applications have gone far beyond design review. In many ways, FlyThru is a data warehouse supported by advanced tools for analysis. It is now being integrated with knowledge-based engineering geometry generation tools.

  10. High-resolution digital brain atlases: a Hubble telescope for the brain.

    PubMed

    Jones, Edward G; Stone, James M; Karten, Harvey J

    2011-05-01

    We describe the implementation of a method for digitizing brain tissue sections containing normal and experimental data at microscopic resolution, and for making the content readily accessible online. Web-accessible brain atlases and virtual microscopes for online examination can be developed using existing computer and internet technologies. The resulting databases, made up of hierarchically organized, multiresolution images, enable rapid, seamless navigation through the vast image datasets generated by high-resolution scanning. Tools for visualization and annotation of virtual microscope slides enable remote and universal data sharing. Interactive visualization of a complete series of brain sections digitized at subneuronal levels of resolution offers fine-grain and large-scale localization and quantification of many aspects of neural organization and structure. The method is straightforward and replicable; it can increase accessibility and facilitate sharing of neuroanatomical data. It provides an opportunity for capturing and preserving irreplaceable, archival neurohistological collections and making them available to all scientists in perpetuity, if resources could be obtained from hitherto uninterested agencies of scientific support. © 2011 New York Academy of Sciences.
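
    The hierarchically organized, multiresolution images mentioned above are typically stored as a tile pyramid: level 0 holds the full-resolution scan cut into fixed-size tiles, and each successive level halves the resolution until one tile remains. A minimal Pillow-based sketch (tile size, layout and filenames are assumptions) is:

```python
import os
from PIL import Image

def build_pyramid(path, tile=256, out_dir="pyramid"):
    """Cut a scanned section into a multiresolution tile pyramid,
    the structure behind seamless pan/zoom virtual-microscope viewers."""
    Image.MAX_IMAGE_PIXELS = None            # whole-section scans are huge
    img = Image.open(path).convert("RGB")
    level = 0
    while True:
        os.makedirs(f"{out_dir}/{level}", exist_ok=True)
        w, h = img.size
        for y in range(0, h, tile):
            for x in range(0, w, tile):
                img.crop((x, y, min(x + tile, w), min(y + tile, h))) \
                   .save(f"{out_dir}/{level}/{x // tile}_{y // tile}.jpg")
        if w <= tile and h <= tile:
            break
        img = img.resize((max(1, w // 2), max(1, h // 2)))
        level += 1
```

    A viewer then fetches only the tiles intersecting the current viewport at the current zoom level, which is what makes navigation through such vast datasets feel seamless.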

  11. An interdisciplinary analysis of multispectral satellite data for selected cover types in the Colorado Mountains, using automatic data processing techniques

    NASA Technical Reports Server (NTRS)

    Hoffer, R. M. (Principal Investigator)

    1975-01-01

    The author has reported the following significant results. A data set containing SKYLAB, LANDSAT, and topographic data has been overlayed, registered, and geometrically corrected to a scale of 1:24,000. After geometrically correcting both sets of data, the SKYLAB data were overlayed on the LANDSAT data. Digital topographic data were then obtained and reformatted, and a data channel containing elevation information was digitally overlayed onto the LANDSAT and SKYLAB spectral data. The 14,039 square kilometers involving 2,113,776 LANDSAT pixels represent a relatively large data set available for digital analysis. The overlayed data set enables investigators to numerically analyze and compare two sources of spectral data and topographic data from any point in the scene. This capability is new and it will permit a numerical comparison of spectral response with elevation, slope, and aspect. Utilization of the spectral and topographic data together to obtain more accurate classifications of the various cover types present is feasible.

  12. A nanocryotron comparator can connect single-flux-quantum circuits to conventional electronics

    NASA Astrophysics Data System (ADS)

    Zhao, Qing-Yuan; McCaughan, Adam N.; Dane, Andrew E.; Berggren, Karl K.; Ortlepp, Thomas

    2017-04-01

    Integration with conventional electronics offers a straightforward and economical approach to upgrading existing superconducting technologies, such as scaling up superconducting detectors into large arrays and combining single flux quantum (SFQ) digital circuits with semiconductor logic gates and memories. However, direct output signals from superconducting devices (e.g., Josephson junctions) are usually not compatible with the input requirements of conventional devices (e.g., transistors). Here, we demonstrate the use of a single three-terminal superconducting-nanowire device, called the nanocryotron (nTron), as a digital comparator to combine SFQ circuits with mature semiconductor circuits such as complementary metal oxide semiconductor (CMOS) circuits. Since SFQ circuits can digitize output signals from general superconducting devices and CMOS circuits can interface existing CMOS-compatible electronics, our results demonstrate the feasibility of a general architecture that uses an nTron as an interface to realize a ‘super-hybrid’ system consisting of superconducting detectors, superconducting quantum electronics, CMOS logic gates and memories, and other conventional electronics.

  13. Stress distribution retrieval in granular materials: A multi-scale model and digital image correlation measurements

    NASA Astrophysics Data System (ADS)

    Bruno, Luigi; Decuzzi, Paolo; Gentile, Francesco

    2016-01-01

    The promise of nanotechnology lies in the possibility of engineering matter on the nanoscale and creating technological interfaces that, because of their small scales, may directly interact with biological objects, creating new strategies for the treatment of pathologies that are otherwise beyond the reach of conventional medicine. Nanotechnology is inherently a multiscale, multiphenomena challenge. Fundamental understanding and highly accurate predictive methods are critical to the successful manufacturing of nanostructured materials, bio/mechanical devices and systems. In biomedical engineering, and in the mechanical analysis of biological tissues, classical continuum approaches are routinely utilized, even though these disregard the discrete nature of tissues, which are an interpenetrating network of a matrix (the extracellular matrix, ECM) and a generally large but finite number of cells with sizes falling in the micrometer range. Here, we introduce a nano-mechanical theory that accounts for the non-continuum nature of biological and other discrete systems. This discrete field theory, doublet mechanics (DM), is a technique to model the mechanical behavior of materials over multiple scales, ranging from some millimeters down to a few nanometers. In the paper, we use this theory to predict the response of a granular material to an external applied load. Such a representation is extremely attractive for modeling biological tissues, which may be considered as a spatial set of a large number of particulates (cells) dispersed in an extracellular matrix. Just as importantly, using digital image correlation (DIC) optical methods, we provide an experimental verification of the model.
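
    As background for the experimental side, the elementary DIC operation is subset tracking: a small reference window is located in the deformed image by maximizing a normalized cross-correlation score. The sketch below does an exhaustive integer-pixel search (production DIC adds subpixel interpolation and subset shape functions); all names are illustrative.

```python
import numpy as np

def ncc(a, b):
    """Zero-normalized cross-correlation of two equal-size subsets."""
    a, b = a - a.mean(), b - b.mean()
    return (a * b).sum() / np.sqrt((a * a).sum() * (b * b).sum())

def dic_displacement(ref, cur, x, y, half=15, search=10):
    """Track the subset centred at (x, y) from `ref` to `cur` by an
    exhaustive integer-pixel NCC search over a +/- `search` window.
    Assumes the point sits far enough from the image borders."""
    subset = ref[y - half:y + half + 1, x - half:x + half + 1]
    best, best_uv = -2.0, (0, 0)
    for v in range(-search, search + 1):
        for u in range(-search, search + 1):
            cand = cur[y + v - half:y + v + half + 1,
                       x + u - half:x + u + half + 1]
            score = ncc(subset, cand)
            if score > best:
                best, best_uv = score, (u, v)
    return best_uv  # (u, v) displacement in pixels
```

    Repeating this over a grid of points yields the full displacement field against which the doublet-mechanics predictions can be checked.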

  14. Optomechanical System Development of the AWARE Gigapixel Scale Camera

    NASA Astrophysics Data System (ADS)

    Son, Hui S.

    Electronic focal plane arrays (FPA) such as CMOS and CCD sensors have dramatically improved to the point that digital cameras have essentially phased out film (except in very niche applications such as hobby photography and cinema). However, the traditional method of mating a single lens assembly to a single detector plane, as required for film cameras, is still the dominant design used in cameras today. The use of electronic sensors and their ability to capture digital signals that can be processed and manipulated post acquisition offers much more freedom of design at system levels and opens up many interesting possibilities for the next generation of computational imaging systems. The AWARE gigapixel scale camera is one such computational imaging system. By utilizing a multiscale optical design, in which a large aperture objective lens is mated with an array of smaller, well corrected relay lenses, we are able to build an optically simple system that is capable of capturing gigapixel scale images via post acquisition stitching of the individual pictures from the array. Properly shaping the array of digital cameras allows us to form an effectively continuous focal surface using off the shelf (OTS) flat sensor technology. This dissertation details developments and physical implementations of the AWARE system architecture. It illustrates the optomechanical design principles and system integration strategies we have developed through the course of the project by summarizing the results of the two design phases for AWARE: AWARE-2 and AWARE-10. These systems represent significant advancements in the pursuit of scalable, commercially viable snapshot gigapixel imaging systems and should serve as a foundation for future development of such systems.

  15. An Analytic Creativity Assessment Scale for Digital Game Story Design: Construct Validity, Internal Consistency and Interrater Reliability

    ERIC Educational Resources Information Center

    Chuang, Tsung-Yen; Huang, Yun-Hsuan

    2015-01-01

    Mobile technology has rapidly made digital games a popular entertainment to this digital generation, and thus digital game design received considerable attention in both the game industry and design education. Digital game design involves diverse dimensions in which digital game story design (DGSD) particularly attracts our interest, as the…

  16. Electron Density Profiles of the Topside Ionosphere

    NASA Technical Reports Server (NTRS)

    Huang, Xue-Qin; Reinsch, Bodo W.; Bilitza, Dieter; Benson, Robert F.

    2002-01-01

    The existing uncertainties about the electron density profiles in the topside ionosphere, i.e., in the height region from hmF2 to ~2000 km, require the search for new data sources. The ISIS and Alouette topside sounder satellites from the sixties to the eighties recorded millions of ionograms, but most were not analyzed in terms of electron density profiles. In recent years an effort started to digitize the analog recordings to prepare the ionograms for computerized analysis. As of November 2001 about 350,000 ionograms have been digitized from the original 7-track analog tapes. These data are available in binary and CDF format from the anonymous ftp site of the National Space Science Data Center. A search site and browse capabilities on CDAWeb assist the scientific usage of these data. All information and access links can be found at http://nssdc.gsfc.nasa.gov/space/isis/isis-status.html. This paper describes the ISIS data restoration effort and shows how the digital ionograms are automatically processed into electron density profiles from satellite orbit altitude (1400 km for ISIS-2) down to the F peak. Because of the large volume of data, an automated processing algorithm is imperative. The TOPside Ionogram Scaler with True height algorithm (TOPIST) software developed for this task successfully scales ~70% of the ionograms. An editor is available to manually scale the more difficult ionograms. The automated processing of the digitized ISIS ionograms is now underway, producing a much-needed database of topside electron density profiles for ionospheric modeling covering more than one solar cycle.

  17. DeepScope: Nonintrusive Whole Slide Saliency Annotation and Prediction from Pathologists at the Microscope

    PubMed Central

    Schaumberg, Andrew J.; Sirintrapun, S. Joseph; Al-Ahmadie, Hikmat A.; Schüffler, Peter J.; Fuchs, Thomas J.

    2018-01-01

    Modern digital pathology departments have grown to produce whole-slide image data at petabyte scale, an unprecedented treasure chest for medical machine learning tasks. Unfortunately, most digital slides are not annotated at the image level, hindering large-scale application of supervised learning. Manual labeling is prohibitive, requiring pathologists with decades of training and outstanding clinical service responsibilities. This problem is further aggravated by the United States Food and Drug Administration’s ruling that primary diagnosis must come from a glass slide rather than a digital image. We present the first end-to-end framework to overcome this problem, gathering annotations in a nonintrusive manner during a pathologist’s routine clinical work: (i) microscope-specific 3D-printed commodity camera mounts are used to video record the glass-slide-based clinical diagnosis process; (ii) after routine scanning of the whole slide, the video frames are registered to the digital slide; (iii) motion and observation time are estimated to generate a spatial and temporal saliency map of the whole slide. Demonstrating the utility of these annotations, we train a convolutional neural network that detects diagnosis-relevant salient regions, then report accuracy of 85.15% in bladder and 91.40% in prostate, with 75.00% accuracy when training on prostate but predicting in bladder, despite different pathologists examining the different tissues. When training on one patient but testing on another, AUROC in bladder is 0.79±0.11 and in prostate is 0.96±0.04. Our tool is available at https://bitbucket.org/aschaumberg/deepscope PMID:29601065
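
    Step (iii) of the pipeline can be pictured as dwell-time accumulation over viewport positions that have been registered to slide coordinates. The sketch below is an illustrative stand-in (a Gaussian footprint on a downsampled slide frame), not the paper's motion and observation-time estimator.

```python
import numpy as np

def saliency_map(view_xy, dwell_s, shape, sigma=20.0):
    """Accumulate registered viewport centres into a dwell-time-weighted
    saliency heat map over a downsampled slide of size `shape`."""
    yy, xx = np.mgrid[0:shape[0], 0:shape[1]]
    heat = np.zeros(shape)
    for (x, y), t in zip(view_xy, dwell_s):
        heat += t * np.exp(-((xx - x) ** 2 + (yy - y) ** 2) / (2 * sigma ** 2))
    return heat / heat.max()

# Three fixations, with the pathologist lingering longest on the second:
print(saliency_map([(40, 50), (120, 90), (200, 30)], [0.5, 3.0, 1.0],
                   shape=(160, 240)).shape)
```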

  18. The cosmological principle is not in the sky

    NASA Astrophysics Data System (ADS)

    Park, Chan-Gyung; Hyun, Hwasu; Noh, Hyerim; Hwang, Jai-chan

    2017-08-01

    The homogeneity of matter distribution at large scales, known as the cosmological principle, is a central assumption in the standard cosmological model. The assumption is testable, however, and thus no longer needs to be a principle. Here we perform a test for spatial homogeneity using the Sloan Digital Sky Survey Luminous Red Galaxies (LRG) sample by counting galaxies within a specified volume, with the radius scale varying up to 300 h⁻¹ Mpc. We directly confront the large-scale structure data with the definition of spatial homogeneity by comparing the averages and dispersions of galaxy number counts with the ranges allowed for a random distribution with homogeneity. The LRG sample shows significantly larger dispersions of number counts than the random catalogues up to the 300 h⁻¹ Mpc scale, and even the average is located far outside the range allowed in the random distribution; the deviations are statistically impossible to realize in the random distribution. This implies that the cosmological principle does not hold even at such large scales. The same analysis of mock galaxies derived from an N-body simulation, however, suggests that the LRG sample is consistent with the current paradigm of cosmology; thus the simulation is also not homogeneous on that scale. We conclude that the cosmological principle is neither in the observed sky nor demanded to be there by the standard cosmological world model. This reveals the nature of the cosmological principle adopted in the modern cosmology paradigm, and opens a new field of research in theoretical cosmology.
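
    The counts-in-spheres test described above is simple to reproduce in outline: draw spheres of increasing radius, count galaxies inside each, and compare the mean and dispersion of the counts against a homogeneous random catalogue. The toy uniform catalogues below stand in for the LRG and random samples.

```python
import numpy as np
from scipy.spatial import cKDTree

def counts_in_spheres(points, centers, radius):
    """Number of points within `radius` of each test centre."""
    tree = cKDTree(points)
    return np.array([len(ix) for ix in tree.query_ball_point(centers, radius)])

rng = np.random.default_rng(0)
box = 1000.0                                 # toy box side, h^-1 Mpc
galaxies = rng.uniform(0, box, (20000, 3))   # stand-in for the LRG sample
randoms = rng.uniform(0, box, (20000, 3))    # homogeneous comparison set
centers = rng.uniform(300, 700, (500, 3))    # keep spheres inside the box

for R in (50.0, 100.0, 200.0):
    g = counts_in_spheres(galaxies, centers, R)
    r = counts_in_spheres(randoms, centers, R)
    print(R, g.mean(), g.std(), r.mean(), r.std())
```

    For a genuinely homogeneous sample, the dispersion of the galaxy counts converges to that of the random catalogue as the radius grows; the paper's point is that the LRG counts do not.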

  19. Performance/price estimates for cortex-scale hardware: a design space exploration.

    PubMed

    Zaveri, Mazad S; Hammerstrom, Dan

    2011-04-01

    In this paper, we revisit the concept of virtualization. Virtualization is useful for understanding and investigating the performance/price and other trade-offs related to the hardware design space. Moreover, it is perhaps the most important aspect of a hardware design space exploration. Such a design space exploration is a necessary part of the study of hardware architectures for large-scale computational models for intelligent computing, including AI, Bayesian, bio-inspired and neural models. A methodical exploration is needed to identify potentially interesting regions in the design space, and to assess the relative performance/price points of these implementations. As an example, in this paper we investigate the performance/price of (digital and mixed-signal) CMOS and hypothetical CMOL (nanogrid) technology based hardware implementations of human cortex-scale spiking neural systems. Through this analysis, and the resulting performance/price points, we demonstrate, in general, the importance of virtualization, and of doing these kinds of design space explorations. The specific results suggest that hybrid nanotechnology such as CMOL is a promising candidate to implement very large-scale spiking neural systems, providing a more efficient utilization of the density and storage benefits of emerging nano-scale technologies. In general, we believe that the study of such hypothetical designs/architectures will guide the neuromorphic hardware community towards building large-scale systems, and help guide research trends in intelligent computing, and computer engineering. Copyright © 2010 Elsevier Ltd. All rights reserved.

  20. Galaxy-scale Bars in Late-type Sloan Digital Sky Survey Galaxies Do Not Influence the Average Accretion Rates of Supermassive Black Holes

    NASA Astrophysics Data System (ADS)

    Goulding, A. D.; Matthaey, E.; Greene, J. E.; Hickox, R. C.; Alexander, D. M.; Forman, W. R.; Jones, C.; Lehmer, B. D.; Griffis, S.; Kanek, S.; Oulmakki, M.

    2017-07-01

    Galaxy-scale bars are expected to provide an effective means for driving material toward the central region in spiral galaxies, and possibly feeding supermassive black holes (BHs). Here we present a statistically complete study of the effect of bars on average BH accretion. From a well-selected sample of 50,794 spiral galaxies (with M* ~ 0.2-30 × 10¹⁰ M⊙) extracted from the Sloan Digital Sky Survey Galaxy Zoo 2 project, we separate those sources considered to contain galaxy-scale bars from those that do not. Using archival data taken by the Chandra X-ray Observatory, we identify X-ray luminous (L_X ≳ 10⁴¹ erg s⁻¹) active galactic nuclei and perform an X-ray stacking analysis on the remaining X-ray undetected sources. Through X-ray stacking, we derive a time-averaged look at accretion for galaxies at fixed stellar mass and star-formation rate, finding that the average nuclear accretion rates of galaxies with bar structures are fully consistent with those lacking bars (Ṁ_acc ≈ 3 × 10⁻⁵ M⊙ yr⁻¹). Hence, we robustly conclude that large-scale bars have little or no effect on the average growth of BHs in nearby (z < 0.15) galaxies over gigayear timescales.

  1. The workshop. [use and application of remotely sensed data

    NASA Technical Reports Server (NTRS)

    Wake, W. H.

    1981-01-01

    The plan is presented for a two-day workshop held to provide educational and training experience in the reading, interpretation, and application of LANDSAT and correlated larger-scale imagery, digital printout maps, and other collateral material for a large number of participants with widely diverse levels of expertise, backgrounds, and occupations in government, industry, and education. The need for using surface-truth field studies with correlated aerial imagery in solving real-world problems was demonstrated.

  2. New dynamic FET logic and serial memory circuits for VLSI GaAs technology

    NASA Technical Reports Server (NTRS)

    Eldin, A. G.

    1991-01-01

    The complexity of GaAs field effect transistor (FET) very large scale integration (VLSI) circuits is limited by the maximum power dissipation, while the uniformity of the device parameters determines the functional yield. In this work, digital GaAs FET circuits are presented that eliminate DC power dissipation and reduce the area to 50% of that of conventional static circuits. Their larger tolerance to device parameter variations results in higher functional yield.

  3. Multi-discipline resource inventory of soils, vegetation and geology

    NASA Technical Reports Server (NTRS)

    Simonson, G. H. (Principal Investigator); Paine, D. P.; Lawrence, R. D.; Norgren, J. A.; Pyott, W. Y.; Herzog, J. H.; Murray, R. J.; Rogers, R.

    1973-01-01

    The author has identified the following significant results. Computer classification of natural vegetation, in the vicinity of Big Summit Prairie, Crook County, Oregon was carried out using MSS digital data. Impure training sets, representing eleven vegetation types plus water, were selected from within the area to be classified. Close correlations were visually observed between vegetation types mapped from the large scale photographs and the computer classification of the ERTS data (Frame 1021-18151, 13 August 1972).

  4. Multi-scale learning based segmentation of glands in digital colonrectal pathology images.

    PubMed

    Gao, Yi; Liu, William; Arjun, Shipra; Zhu, Liangjia; Ratner, Vadim; Kurc, Tahsin; Saltz, Joel; Tannenbaum, Allen

    2016-02-01

    Digital histopathological images provide detailed spatial information of the tissue at micrometer resolution. Among the available contents in the pathology images, meso-scale information, such as the gland morphology, texture, and distribution, are useful diagnostic features. In this work, focusing on the colon-rectal cancer tissue samples, we propose a multi-scale learning based segmentation scheme for the glands in the colon-rectal digital pathology slides. The algorithm learns the gland and non-gland textures from a set of training images in various scales through a sparse dictionary representation. After the learning step, the dictionaries are used collectively to perform the classification and segmentation for the new image.
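
    A loose sketch of the learning step using off-the-shelf tools: sparse dictionaries are learned from image patches at a given scale, one dictionary per class, and a test patch is assigned to the class whose dictionary reconstructs it best. The paper's actual features and multi-scale scheme differ in detail, and all names here are illustrative.

```python
import numpy as np
from sklearn.decomposition import MiniBatchDictionaryLearning
from sklearn.feature_extraction.image import extract_patches_2d

def learn_texture_dictionary(images, patch=8, n_atoms=64, seed=0):
    """Learn a sparse dictionary of texture atoms from grayscale training
    images at one scale; repeat per scale and per class (gland / non-gland)."""
    patches = np.vstack([
        extract_patches_2d(img, (patch, patch), max_patches=2000,
                           random_state=seed).reshape(-1, patch * patch)
        for img in images
    ])
    patches = patches - patches.mean(axis=1, keepdims=True)  # drop DC offset
    return MiniBatchDictionaryLearning(n_components=n_atoms,
                                       transform_algorithm="omp",
                                       random_state=seed).fit(patches)

# Classification sketch: encode each test patch with the gland and the
# non-gland dictionaries and pick the class with lower reconstruction error.
```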

  5. Multi-scale learning based segmentation of glands in digital colonrectal pathology images

    NASA Astrophysics Data System (ADS)

    Gao, Yi; Liu, William; Arjun, Shipra; Zhu, Liangjia; Ratner, Vadim; Kurc, Tahsin; Saltz, Joel; Tannenbaum, Allen

    2016-03-01

    Digital histopathological images provide detailed spatial information of the tissue at micrometer resolution. Among the available contents in the pathology images, meso-scale information, such as the gland morphology, texture, and distribution, are useful diagnostic features. In this work, focusing on the colon-rectal cancer tissue samples, we propose a multi-scale learning based segmentation scheme for the glands in the colon-rectal digital pathology slides. The algorithm learns the gland and non-gland textures from a set of training images in various scales through a sparse dictionary representation. After the learning step, the dictionaries are used collectively to perform the classification and segmentation for the new image.

  6. LiDAR DTMs and anthropogenic feature extraction: testing the feasibility of geomorphometric parameters in floodplains

    NASA Astrophysics Data System (ADS)

    Sofia, G.; Tarolli, P.; Dalla Fontana, G.

    2012-04-01

    In floodplains, massive investments in land reclamation have always played an important role in flood protection. In these contexts, human alteration is reflected by artificial features ('anthropogenic features'), such as banks, levees or road scarps, which constantly increase and change in response to the rapid growth of human populations. For these areas, various existing and emerging applications require up-to-date, accurate and sufficiently attributed digital data, but such information is usually lacking, especially for large-scale applications. More recently, national and local mapping agencies in Europe have been moving towards the generation of digital topographic information that conforms to reality and is highly reliable and up to date. LiDAR digital terrain models (DTMs) covering large areas are readily available to public authorities, and there is a greater and more widespread interest, among agencies responsible for land management, in developing automated methods to solve geomorphological and hydrological problems. Although anthropogenic features such as levees and road scarps are artificial structures that do not strictly belong to the bare ground surface, they are implicitly embedded in DTMs. Automatic feature recognition based upon DTMs therefore offers, for large-scale applications, a quick and accurate method that requires no additional data and can help improve topographic databases, flood defense asset information and flood modeling, while overcoming some of the problems associated with traditional, field-based geomorphological mapping, such as restrictions on access and constraints of time or cost. In natural contexts, morphological indicators derived from high-resolution topography have proven reliable in practical applications, and the use of statistical operators as thresholds for these geomorphic parameters has shown high reliability for feature extraction in mountainous environments. The goal of this research is to test whether these morphological indicators and objective thresholds are also feasible in floodplains, where features assume different characteristics and other artificial disturbances may be present. In this work, three different geomorphic parameters are tested and applied at different scales on a LiDAR DTM of a typical alluvial plain in the north-east of Italy. The box-plot is applied to identify the threshold for feature extraction, and a filtering procedure is proposed to improve the quality of the final results. The effectiveness of the different geomorphic parameters is analyzed by comparing automatically derived features with surveyed ones. The results highlight the capability of high-resolution topography, geomorphic indicators and statistical thresholds for anthropogenic feature extraction and characterization in a floodplain context.
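
    The box-plot thresholding step is easy to make concrete: a geomorphic parameter is computed per cell from the DTM, and cells beyond the upper box-plot fence are flagged as candidate anthropogenic features. The array below is a random stand-in for whatever parameter (e.g., curvature) is derived from the DTM.

```python
import numpy as np

def boxplot_threshold(values, k=1.5):
    """Upper box-plot fence, Q3 + k * IQR, used as an objective,
    data-driven threshold on a geomorphic parameter."""
    q1, q3 = np.percentile(values, [25, 75])
    return q3 + k * (q3 - q1)

# Stand-in per-cell parameter; in practice computed from the LiDAR DTM
curvature = np.abs(np.random.default_rng(1).normal(0.0, 0.1, (500, 500)))
mask = curvature > boxplot_threshold(curvature.ravel())
print(f"{mask.mean():.1%} of cells flagged as candidate features")
```

    The subsequent filtering step then removes isolated false positives before comparison against the surveyed features.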

  7. Towards large-scale mapping of urban three-dimensional structure using Landsat imagery and global elevation datasets

    NASA Astrophysics Data System (ADS)

    Wang, P.; Huang, C.

    2017-12-01

    The three-dimensional (3D) structure of buildings and infrastructure is fundamental to understanding and modelling the impacts and challenges of urbanization in terms of energy use, carbon emissions, and earthquake vulnerability. However, spatially detailed maps of urban 3D structure have been scarce, particularly in fast-changing developing countries. We present here a novel methodology to map the volume of buildings and infrastructure at 30 meter resolution using a synergy of Landsat imagery and openly available global digital surface models (DSMs), including the Shuttle Radar Topography Mission (SRTM), ASTER Global Digital Elevation Map (GDEM), ALOS World 3D - 30m (AW3D30), and the recently released global DSM from the TanDEM-X mission. Our method builds on the concept of an object-based height profile to extract height metrics from the DSMs and uses a machine learning algorithm to predict height and volume from the height metrics. We have tested this algorithm across the whole of England and assessed our results using Lidar measurements in 25 English cities. Our initial assessments achieved an RMSE of 1.4 m (R² = 0.72) for building height and an RMSE of 1208.7 m³ (R² = 0.69) for building volume, demonstrating the potential for large-scale application and fully automated mapping of urban structure.
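
    The predictive step can be sketched as follows: per-object summary metrics of the normalized surface height (DSM minus terrain) are fed to a regressor trained against reference Lidar heights. The random forest and the toy data below are assumptions for illustration; the abstract does not name the specific algorithm used.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def height_metrics(ndsm_pixels):
    """Summary metrics of an object's normalized-DSM height profile,
    used as predictors of the true building height."""
    p = np.asarray(ndsm_pixels, dtype=float)
    return [p.mean(), p.std(), *np.percentile(p, [25, 50, 75, 90, 100])]

rng = np.random.default_rng(0)
objects = [rng.gamma(2.0, 4.0, rng.integers(20, 200)) for _ in range(300)]
X = np.array([height_metrics(o) for o in objects])
y = np.array([o.max() * 0.8 for o in objects])   # toy reference heights
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
# Volume then follows as predicted height times the object's footprint area.
```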

  8. High-performance holographic technologies for fluid-dynamics experiments

    PubMed Central

    Orlov, Sergei S.; Abarzhi, Snezhana I.; Oh, Se Baek; Barbastathis, George; Sreenivasan, Katepalli R.

    2010-01-01

    Modern technologies offer new opportunities for experimentalists in a variety of research areas of fluid dynamics. Improvements are now possible in the state-of-the-art in precision, dynamic range, reproducibility, motion-control accuracy, data-acquisition rate and information capacity. These improvements are required for understanding complex turbulent flows under realistic conditions, and for allowing unambiguous comparisons to be made with new theoretical approaches and large-scale numerical simulations. One of the new technologies is high-performance digital holography. State-of-the-art motion control, electronics and optical imaging allow for the realization of turbulent flows with very high Reynolds number (more than 10^7) on a relatively small laboratory scale, and quantification of their properties with high space–time resolutions and bandwidth. In-line digital holographic technology can provide complete three-dimensional mapping of the flow velocity and density fields at high data rates (over 1000 frames per second) over a relatively large spatial area with high spatial (1–10 μm) and temporal (better than a few nanoseconds) resolution, and can give accurate quantitative description of the fluid flows, including those of multi-phase and unsteady conditions. This technology can be applied in a variety of problems to study fundamental properties of flow–particle interactions, rotating flows, non-canonical boundary layers and Rayleigh–Taylor mixing. Some of these examples are discussed briefly. PMID:20211881

  9. Design, construction and commissioning of the Digital Hadron Calorimeter—DHCAL

    NASA Astrophysics Data System (ADS)

    Adams, C.; Bambaugh, A.; Bilki, B.; Butler, J.; Corriveau, F.; Cundiff, T.; Drake, G.; Francis, K.; Furst, B.; Guarino, V.; Haberichter, B.; Hazen, E.; Hoff, J.; Holm, S.; Kreps, A.; DeLurgio, P.; Matijas, Z.; Dal Monte, L.; Mucia, N.; Norbeck, E.; Northacker, D.; Onel, Y.; Pollack, B.; Repond, J.; Schlereth, J.; Skrzecz, F.; Smith, J. R.; Trojand, D.; Underwood, D.; Velasco, M.; Walendziak, J.; Wood, K.; Wu, S.; Xia, L.; Zhang, Q.; Zhao, A.

    2016-07-01

    A novel hadron calorimeter is being developed for future lepton colliding beam detectors. The calorimeter is optimized for the application of Particle Flow Algorithms (PFAs) to the measurement of hadronic jets and features a very finely segmented readout with 1 × 1 cm2 cells. The active media of the calorimeter are Resistive Plate Chambers (RPCs) with a digital, i.e. one-bit, readout. To first order the energy of incident particles in this calorimeter is reconstructed as being proportional to the number of pads with a signal over a given threshold. A large-scale prototype calorimeter with approximately 500,000 readout channels has been built and underwent extensive testing in the Fermilab and CERN test beams. This paper reports on the design, construction, and commissioning of this prototype calorimeter.
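
    The first-order reconstruction amounts to counting pads over threshold and applying one calibration constant; a minimal sketch in Python, where the threshold and GeV-per-hit values are placeholders rather than DHCAL calibration numbers.

      import numpy as np

      def dhcal_energy(pad_signals, threshold=1.0, gev_per_hit=0.1):
          # Digital (one-bit) readout: energy ~ number of pads above threshold;
          # gev_per_hit is a hypothetical calibration constant
          n_hits = np.count_nonzero(np.asarray(pad_signals) > threshold)
          return gev_per_hit * n_hits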

  10. Design of a fault tolerant airborne digital computer. Volume 2: Computational requirements and technology

    NASA Technical Reports Server (NTRS)

    Ratner, R. S.; Shapiro, E. B.; Zeidler, H. M.; Wahlstrom, S. E.; Clark, C. B.; Goldberg, J.

    1973-01-01

    This final report summarizes the work on the design of a fault tolerant digital computer for aircraft. Volume 2 is composed of two parts. Part 1 is concerned with the computational requirements associated with an advanced commercial aircraft. Part 2 reviews the technology that will be available for the implementation of the computer in the 1975-1985 period. With regard to the computation task, 26 computations have been categorized according to computational load, memory requirements, criticality, permitted down-time, and the need to save data in order to effect a roll-back. The technology part stresses the impact of large scale integration (LSI) on the realization of logic and memory. Module interconnection possibilities were also considered, so as to minimize fault propagation.

  11. Digital stereo photogrammetry for grain-scale monitoring of fluvial surfaces: Error evaluation and workflow optimisation

    NASA Astrophysics Data System (ADS)

    Bertin, Stephane; Friedrich, Heide; Delmas, Patrice; Chan, Edwin; Gimel'farb, Georgy

    2015-03-01

    Grain-scale monitoring of fluvial morphology is important for the evaluation of river system dynamics. Significant progress in remote sensing and computer performance allows rapid high-resolution data acquisition; however, applications in fluvial environments remain challenging. Even in a controlled environment, such as a laboratory, the extensive acquisition workflow is prone to the propagation of errors in digital elevation models (DEMs). This is valid for both of the common surface recording techniques: digital stereo photogrammetry and terrestrial laser scanning (TLS). The optimisation of the acquisition process, an effective way to reduce the occurrence of errors, is generally limited by the use of commercial software. Therefore, the removal of evident blunders during post-processing is regarded as standard practice, although this may introduce new errors. This paper presents a detailed evaluation of a digital stereo-photogrammetric workflow developed for fluvial hydraulic applications. The introduced workflow is user-friendly and can be adapted to various close-range measurements: imagery is acquired with two Nikon D5100 cameras and processed using non-proprietary "on-the-job" calibration and dense scanline-based stereo matching algorithms. Novel ground truth evaluation studies were designed to identify the DEM errors, which resulted from a combination of calibration errors, inaccurate image rectifications and stereo-matching errors. To ensure optimum DEM quality, we show that systematic DEM errors must be minimised by ensuring a good distribution of control points throughout the image format during calibration. DEM quality is then largely dependent on the imagery utilised. We evaluated the open access multi-scale Retinex algorithm to facilitate the stereo matching, and quantified its influence on DEM quality. Occlusions, inherent to any roughness element, are still a major limiting factor to DEM accuracy. We show that a careful selection of the camera-to-object and baseline distance reduces errors in occluded areas and that realistic ground truths help to quantify those errors.
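
    The influence of camera-to-object distance and baseline on DEM error follows from the textbook rectified-stereo relations, sketched below in Python; this is the generic relation, not the paper's specific error model.

      def depth_from_disparity(disparity_px, focal_px, baseline_m):
          # Rectified stereo: Z = f * B / d
          return focal_px * baseline_m / disparity_px

      def depth_error(z_m, focal_px, baseline_m, disparity_err_px=0.5):
          # Propagating a matching error dd gives dZ ~ Z**2 * dd / (f * B):
          # errors grow quadratically with range and shrink with longer baselines
          return z_m**2 * disparity_err_px / (focal_px * baseline_m)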

  12. A Call to Action for Research in Digital Learning: Learning without Limits of Time, Place, Path, Pace…or Evidence

    ERIC Educational Resources Information Center

    Cavanaugh, Cathy; Sessums, Christopher; Drexler, Wendy

    2015-01-01

    This essay is a call for rethinking our approach to research in digital learning. It plots a path founded in social trends and advances in education. A brief review of these trends and advances is followed by discussion of what flattened research might look like at scale. Scaling research in digital learning is crucial to advancing understanding…

  13. Geologic Map of the Tucson and Nogales Quadrangles, Arizona (Scale 1:250,000): A Digital Database

    USGS Publications Warehouse

    Peterson, J.A.; Berquist, J.R.; Reynolds, S.J.; Page-Nedell, S. S.; Digital database by Oland, Gustav P.; Hirschberg, Douglas M.

    2001-01-01

    The geologic map of the Tucson-Nogales 1:250,000 scale quadrangle (Peterson and others, 1990) was digitized by U.S. Geological Survey staff and University of Arizona contractors at the Southwest Field Office, Tucson, Arizona, in 2000 for input into a geographic information system (GIS). The database was created for use as a basemap in a decision support system designed by the National Industrial Minerals and Surface Processes project. The resulting digital geologic map database can be queried in many ways to produce a variety of geologic maps. Digital base map data files (topography, roads, towns, rivers and lakes, etc.) are not included; they may be obtained from a variety of commercial and government sources. Additionally, point features, such as strike and dip, were not captured from the original paper map and are not included in the database. This database is not meant to be used or displayed at any scale larger than 1:250,000 (for example, 1:100,000 or 1:24,000). The digital geologic map graphics and plot files that are provided in the digital package are representations of the digital database. They are not designed to be cartographic products.

  14. Digital scale converter

    DOEpatents

    Upton, Richard G.

    1978-01-01

    A digital scale converter is provided for binary coded decimal (BCD) conversion. The converter may be programmed to convert a BCD value of a first scale to the equivalent value of a second scale according to a known ratio. The value to be converted is loaded into a first BCD counter and counted down to zero while a second BCD counter registers counts from zero or an offset value depending upon the conversion. Programmable rate multipliers are used to generate pulses at selected rates to the counters for the proper conversion ratio. The value present in the second counter at the time the first counter is counted to the zero count is the equivalent value of the second scale. This value may be read out and displayed on a conventional seven-segment digital display.
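
    A software analogue of the two-counter scheme, in Python, using the Fahrenheit-to-Celsius conversion (ratio 5/9, offset 32) as an illustrative example; the patent's offset handling in the second counter is simplified here.

      def scale_convert(value, num=5, den=9, offset=32):
          # Count the offset-corrected input down to zero while a second counter
          # accumulates pulses at num/den of the clock rate, mimicking the
          # programmable rate multipliers feeding the two BCD counters
          down, up, acc = value - offset, 0, 0
          while down > 0:
              down -= 1            # one pulse to the down-counter
              acc += num
              while acc >= den:    # rate multiplier: num outputs per den inputs
                  acc -= den
                  up += 1          # the second counter registers the scaled count
          return up

      print(scale_convert(212))    # 100 (212 degrees F -> 100 degrees C)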

  15. Helicopter rotor and engine sizing for preliminary performance estimation

    NASA Technical Reports Server (NTRS)

    Talbot, P. D.; Bowles, J. V.; Lee, H. C.

    1986-01-01

    Methods are presented for estimating some of the more fundamental design variables of single-rotor helicopters (tip speed, blade area, disk loading, and installed power) based on design requirements (speed, weight, fuselage drag, and design hover ceiling). The well-known constraints of advancing-blade compressibility and retreating-blade stall are incorporated into the estimation process, based on an empirical interpretation of rotor performance data from large-scale wind-tunnel tests. Engine performance data are presented and correlated with a simple model usable for preliminary design. When approximate results are required quickly, these methods may be more convenient to use and provide more insight than large digital computer programs.
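
    The flavor of such first-cut estimates can be seen in the momentum-theory hover relation, sketched in Python below; the figure-of-merit value is an assumption, and the paper's charts embed empirical corrections that go beyond this ideal formula.

      import math

      def hover_power_w(weight_n, disk_area_m2, rho=1.225, figure_of_merit=0.7):
          # Momentum theory: P_ideal = T**1.5 / sqrt(2 * rho * A), divided by a
          # representative rotor figure of merit
          p_ideal = weight_n**1.5 / math.sqrt(2.0 * rho * disk_area_m2)
          return p_ideal / figure_of_merit

      # e.g. a 20 kN helicopter with a 12 m rotor radius -> roughly 120 kW
      print(hover_power_w(20e3, math.pi * 12.0**2) / 1e3, "kW")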

  16. A "Social Bitcoin" could sustain a democratic digital world

    NASA Astrophysics Data System (ADS)

    Kleineberg, Kaj-Kolja; Helbing, Dirk

    2016-12-01

    A multidimensional financial system could provide benefits for individuals, companies, and states. Instead of top-down control, which is destined to eventually fail in a hyperconnected world, a bottom-up creation of value can unleash creative potential and drive innovations. Multiple currency dimensions can represent different externalities and thus enable the design of incentives and feedback mechanisms that foster the ability of complex dynamical systems to self-organize and lead to a more resilient society and sustainable economy. Modern information and communication technologies play a crucial role in this process, as Web 2.0 and online social networks promote cooperation and collaboration on unprecedented scales. Within this contribution, we discuss how one dimension of a multidimensional currency system could represent socio-digital capital (Social Bitcoins) that can be generated in a bottom-up way by individuals who perform search and navigation tasks in a future version of the digital world. The incentive to mine Social Bitcoins could sustain digital diversity, which mitigates the risk of totalitarian control by powerful monopolies of information and can create new business opportunities needed in times where a large fraction of current jobs is estimated to disappear due to computerisation.

  17. An engineering methodology for implementing and testing VLSI (Very Large Scale Integrated) circuits

    NASA Astrophysics Data System (ADS)

    Corliss, Walter F., II

    1989-03-01

    The engineering methodology for producing a fully tested VLSI chip from a design layout is presented. A 16-bit correlator, NPS CORN88, which was previously designed, was used as a vehicle to demonstrate this methodology. The study of the design and simulation tools, MAGIC and MOSSIM II, was the focus of the design and validation process. The design was then implemented and the chip was fabricated by MOSIS. This fabricated chip was then used to develop a testing methodology for using the digital test facilities at NPS. NPS CORN88 was the first full custom VLSI chip, designed at NPS, to be tested with the NPS digital analysis system, the Tektronix DAS 9100 series tester. The capabilities and limitations of these test facilities are examined. NPS CORN88 test results are included to demonstrate the capabilities of the digital test system. A translator, MOS2DAS, was developed to convert the MOSSIM II simulation program to the input files required by the DAS 9100 device verification software, 91DVS. Finally, a tutorial for using the digital test facilities, including the DAS 9100 and associated support equipment, is included as an appendix.

  18. Comprehensive geo-spatial data creation for Qassim region in the KSA

    NASA Astrophysics Data System (ADS)

    Alrajhi, M.; Hawarey, M.

    2009-04-01

    The General Directorate for Surveying and Mapping (GDSM) of the Deputy Ministry for Land and Surveying (DMLS) of the Ministry of Municipal and Rural Affairs (MOMRA) in the Kingdom of Saudi Arabia (KSA) has the exclusive mandate to carry out aerial photography and produce large-scale detailed maps for about 220 cities and villages in the KSA. This presentation is about the comprehensive geo-spatial data creation for the Qassim region, North KSA. The work was founded on country-wide horizontal geodetic ground control using Global Navigation Satellite Systems (GNSS) within MOMRA's Terrestrial Reference Frame 2000 (MTRF2000), which is tied to the International Terrestrial Reference Frame 2000 (ITRF2000) at epoch 2004.0, and on vertical geodetic ground control using precise digital levelling referenced to the Jeddah 1969 mean sea level. It included aerial photography of 1,505 km2 at 1:5,500 scale, 4,081 km2 at 1:22,500 scale and 22,224 km2 at 1:45,000 scale, full aerial triangulation, production of orthophoto maps at a scale of 1:10,000 (463 sheets) for 22,224 km2, and production of GIS-oriented, highly detailed digital line maps in various formats at scales of 1:1,000 (1,534 sheets) and 1:2,500 (383 sheets) for 1,150 km2, 1:10,000 (161 sheets) for 7,700 km2, and 1:20,000 (130 sheets) for 22,000 km2. While the aerial photography lasted from February 2003 through May 2003, the line mapping continued from May 2005 until December 2008.

  19. Rational design of stealthy hyperuniform two-phase media with tunable order

    NASA Astrophysics Data System (ADS)

    DiStasio, Robert A.; Zhang, Ge; Stillinger, Frank H.; Torquato, Salvatore

    2018-02-01

    Disordered stealthy hyperuniform materials are exotic amorphous states of matter that have attracted recent attention because of their novel structural characteristics (hidden order at large length scales) and physical properties, including desirable photonic and transport properties. It is therefore useful to devise algorithms that enable one to design a wide class of such amorphous configurations at will. In this paper, we present several algorithms enabling the systematic identification and generation of discrete (digitized) stealthy hyperuniform patterns with a tunable degree of order, paving the way towards the rational design of disordered materials endowed with novel thermodynamic and physical properties. To quantify the degree of order or disorder of the stealthy systems, we utilize the discrete version of the τ order metric, which accounts for the underlying spatial correlations that exist across all relevant length scales in a given digitized two-phase (or, equivalently, a two-spin state) system of interest. Our results impinge on a myriad of fields, ranging from physics, materials science and engineering, visual perception, and information theory to modern data science.
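
    A sketch of the stealthiness test implied above, in Python: compute the structure factor of a digitized two-phase pattern by FFT and check that it is numerically suppressed for all wavevectors inside an exclusion radius. The normalization and tolerance are illustrative conventions, not the paper's exact definitions.

      import numpy as np

      def structure_factor(field):
          # S(k) of a digitized two-phase (0/1) pattern, mean subtracted
          f = field - field.mean()
          return np.abs(np.fft.fftn(f))**2 / field.size

      def is_stealthy(field, k_max, tol=1e-8):
          # Stealthy hyperuniform: S(k) ~ 0 for all 0 < |k| < k_max
          S = structure_factor(field)
          kx = np.fft.fftfreq(field.shape[0]) * 2.0 * np.pi
          ky = np.fft.fftfreq(field.shape[1]) * 2.0 * np.pi
          kmag = np.hypot(*np.meshgrid(kx, ky, indexing="ij"))
          inside = (kmag > 0) & (kmag < k_max)
          return bool(np.all(S[inside] < tol))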

  20. Implementing a National Scottish Digital Health & Wellbeing Service at Scale: A Qualitative Study of Stakeholders' Views.

    PubMed

    Agbakoba, Ruth; McGee-Lennon, Marilyn; Bouamrane, Matt-Mouley; Watson, Nicholas; Mair, Frances

    2015-01-01

    Digital technologies are being used as part of international efforts to revolutionize healthcare in order to meet increasing demands such as the rising burden of chronic disease and ageing populations. In Scotland there is a government push towards a national service, Living It Up (LiU), as a single point of reference where citizens can access information, products and services to support their health and wellbeing. The aim of the study is to examine implementation issues, including the challenges and facilitators which can help to sustain this intervention. We gathered data in three ways: a) participant observation to gain an understanding of LiU (N=16); b) in-depth interviews (N=21) with stakeholders involved in the process; and c) analysis of documentary evidence about the progress of the implementation (N=45). Barriers included the need to "work at risk" due to delays in financing, inadequate infrastructure and skill-set deficiencies, whilst facilitators included trusted relationships, champions and a push towards normalisation. The findings suggest that a Scottish ehealth service is achievable but identify key considerations for future large-scale initiatives.

  1. Laser jetting of femto-liter metal droplets for high resolution 3D printed structures

    NASA Astrophysics Data System (ADS)

    Zenou, M.; Sa'Ar, A.; Kotler, Z.

    2015-11-01

    Laser induced forward transfer (LIFT) is employed in a special, high accuracy jetting regime, by adequately matching the sub-nanosecond pulse duration to the metal donor layer thickness. Under such conditions, an effective solid nozzle is formed, providing stability and directionality to the femto-liter droplets, which are printed across a large gap in excess of 400 μm. We illustrate the wide applicability of this method by printing several 3D metal objects: first, very high aspect ratio (A/R > 20), micron-scale copper pillars in various configurations, upright and arbitrarily bent, and then a micron-scale 3D object composed of gold and copper. Such a digital printing method could serve for the generation of complex, multi-material, micron-scale 3D materials and novel structures.

  2. Spatial structures of stream and hillslope drainage networks following gully erosion after wildfire

    USGS Publications Warehouse

    Moody, J.A.; Kinner, D.A.

    2006-01-01

    The drainage networks of catchment areas burned by wildfire were analysed at several scales. The smallest scale (1-1000 m2), representative of hillslopes, and the small scale (1000 m2 to 1 km2), representative of small catchments, were characterized by the analysis of field measurements. The large scale (1-1000 km2), representative of perennial stream networks, was derived from a 30-m digital elevation model and analysed by computer analysis. Scaling laws used to describe large-scale drainage networks could be extrapolated to the small scale but could not describe the smallest scale of drainage structures observed in the hillslope region. The hillslope drainage network appears to have a second-order effect that reduces the number of order 1 and order 2 streams predicted by the large-scale channel structure. This network comprises two spatial patterns of rills with width-to-depth ratios typically less than 10. One pattern is parallel rills draining nearly planar hillslope surfaces, and the other pattern is three to six converging rills draining the critical source area uphill from an order 1 channel head. The magnitude of this critical area depends on infiltration, hillslope roughness and critical shear stress for erosion of sediment, all of which can be substantially altered by wildfire. Order 1 and 2 streams were found to constitute the interface region, which is altered by a disturbance, like wildfire, from subtle unchannelized drainages in unburned catchments to incised drainages. These drainages are characterized by gullies, also with width-to-depth ratios typically less than 10, in burned catchments. The regions (hillslope, interface and channel) had different drainage network structures to collect and transfer water and sediment. Copyright © 2005 John Wiley & Sons, Ltd.
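
    One classic example of such scaling laws is Horton's law of stream numbers, in which the count of streams falls by a roughly constant bifurcation ratio from one order to the next; a minimal Python sketch with invented counts.

      import numpy as np

      def bifurcation_ratios(stream_counts):
          # Horton's law: N_w / N_(w+1) is roughly constant (~3-5 in natural nets);
          # stream_counts[i] = number of streams of order i+1
          counts = np.asarray(stream_counts, dtype=float)
          return counts[:-1] / counts[1:]

      print(bifurcation_ratios([120, 31, 8, 2]))   # ~[3.9, 3.9, 4.0]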

  3. Additional Results of Glaze Icing Scaling in SLD Conditions

    NASA Technical Reports Server (NTRS)

    Tsao, Jen-Ching

    2016-01-01

    New guidance on acceptable means of compliance with the super-cooled large drop (SLD) conditions was issued by the U.S. Department of Transportation's Federal Aviation Administration (FAA) in its Advisory Circular AC 25-28 in November 2014. Part 25, Appendix O was developed to define a representative icing environment for super-cooled large drops. Super-cooled large drops, which include freezing drizzle and freezing rain conditions, are not included in Appendix C. This paper reports results from recent glaze icing scaling tests conducted in the NASA Glenn Icing Research Tunnel (IRT) to evaluate how well the scaling methods recommended for Appendix C conditions might apply to SLD conditions. The models were straight NACA 0012 wing sections. The reference model had a chord of 72 inches and the scale model had a chord of 21 inches. Reference tests were run with airspeeds of 100 and 130.3 knots and with MVDs of 85 and 170 microns. Two scaling methods were considered. One was based on the modified Ruff method with the scale velocity found by matching the Weber number We_L. The other was proposed and developed by Feo specifically for strong glaze icing conditions, in which the scale liquid water content and velocity were found by matching reference and scale values of the non-dimensional water-film thickness expression and the film Weber number We_f. All tests were conducted at 0 degrees angle of attack. Results are presented for stagnation freezing fractions of 0.2 and 0.3. For non-dimensional reference and scale ice shape comparison, a new post-scanning ice shape digitization procedure was developed for extracting 2-dimensional ice shape profiles at any selected span-wise location from the high fidelity 3-dimensional scanned ice shapes obtained in the IRT.
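
    The velocity step of that Weber-number match reduces to a one-line relation when reference and scale tests use the same fluid: matching We_L = rho * V**2 * L / sigma gives V_s = V_r * sqrt(L_r / L_s). A Python sketch under that same-fluid assumption:

      import math

      def scale_velocity(v_ref, chord_ref, chord_scale):
          # Same fluid on both models, so matching We_L = rho*V**2*L/sigma
          # reduces to V_s = V_r * sqrt(L_r / L_s)
          return v_ref * math.sqrt(chord_ref / chord_scale)

      # 72 in. reference model at 100 knots -> 21 in. scale model
      print(scale_velocity(100.0, 72.0, 21.0))   # ~185 knots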

  5. High-grade video compression of echocardiographic studies: a multicenter validation study of selected motion pictures expert groups (MPEG)-4 algorithms.

    PubMed

    Barbier, Paolo; Alimento, Marina; Berna, Giovanni; Celeste, Fabrizio; Gentile, Francesco; Mantero, Antonio; Montericcio, Vincenzo; Muratori, Manuela

    2007-05-01

    Large files produced by standard compression algorithms slow the spread of digital echocardiography and tele-echocardiography. We validated high-grade compression of echocardiographic video with the new Motion Pictures Expert Groups (MPEG)-4 algorithms in a multicenter study. Seven expert cardiologists blindly scored (5-point scale) 165 uncompressed and compressed 2-dimensional and color Doppler video clips, based on combined diagnostic content and image quality (uncompressed files as references). One digital video and 3 MPEG-4 algorithms (WM9, MV2, and DivX) were used, the latter at 3 compression levels (0%, 35%, and 60%). Compressed file sizes decreased from 12-83 MB to 0.03-2.3 MB (reduction ratios of 1:26 to 1:1051). The mean SD of differences was 0.81 for intraobserver variability (uncompressed and digital video files). Compared with uncompressed files, only the DivX mean score at 35% (P = .04) and 60% (P = .001) compression was significantly reduced. In subcategory analysis, these differences remained significant for gray-scale and fundamental imaging but not for color or second-harmonic tissue imaging. Original image quality, session sequence, compression grade, and bitrate were all independent determinants of mean score. Our study supports the use of MPEG-4 algorithms to greatly reduce echocardiographic file sizes, thus facilitating archiving and transmission. Quality evaluation studies should account for the many independent variables that affect image quality grading.

  6. Interoperability, Scaling, and the Digital Libraries Research Agenda.

    ERIC Educational Resources Information Center

    Lynch, Clifford; Garcia-Molina, Hector

    1996-01-01

    Summarizes reports and activities at the Information Infrastructure Technology and Applications workshop on digital libraries (Reston, Virginia, August 22, 1995). Defines digital library roles and identifies areas of needed research, including: interoperability; protocols for digital objects; collection management; interface design; human-computer…

  7. JCE Digital Library Grand Opening

    ERIC Educational Resources Information Center

    Journal of Chemical Education, 2004

    2004-01-01

    The National Science, Technology, Engineering and Mathematics Education Digital Library (NSDL), inaugurated in December 2002, was developed to promote science education on a comprehensive scale. The Journal of Chemical Education (JCE) Digital Library, incorporated into NSDL, contains its own collections of digital resources for chemistry…

  8. Readiness for Delivering Digital Health at Scale: Lessons From a Longitudinal Qualitative Evaluation of a National Digital Health Innovation Program in the United Kingdom.

    PubMed

    Lennon, Marilyn R; Bouamrane, Matt-Mouley; Devlin, Alison M; O'Connor, Siobhan; O'Donnell, Catherine; Chetty, Ula; Agbakoba, Ruth; Bikker, Annemieke; Grieve, Eleanor; Finch, Tracy; Watson, Nicholas; Wyke, Sally; Mair, Frances S

    2017-02-16

    Digital health has the potential to support care delivery for chronic illness. Despite positive evidence from localized implementations, new technologies have proven slow to become accepted, integrated, and routinized at scale. The aim of our study was to examine barriers and facilitators to implementation of digital health at scale through the evaluation of a £37m national digital health program: "Delivering Assisted Living Lifestyles at Scale" (dallas) from 2012-2015. The study was a longitudinal qualitative, multi-stakeholder, implementation study. The methods included interviews (n=125) with key implementers, focus groups with consumers and patients (n=7), project meetings (n=12), field work or observation in the communities (n=16), health professional survey responses (n=48), and cross program documentary evidence on implementation (n=215). We used a sociological theory called normalization process theory (NPT) and a longitudinal (3 years) qualitative framework analysis approach. This work did not study a single intervention or population. Instead, we evaluated the processes (of designing and delivering digital health), and our outcomes were the identified barriers and facilitators to delivering and mainstreaming services and products within the mixed sector digital health ecosystem. We identified three main levels of issues influencing readiness for digital health: macro (market, infrastructure, policy), meso (organizational), and micro (professional or public). Factors hindering implementation included: lack of information technology (IT) infrastructure, uncertainty around information governance, lack of incentives to prioritize interoperability, lack of precedence on accountability within the commercial sector, and a market perceived as difficult to navigate. Factors enabling implementation were: clinical endorsement, champions who promoted digital health, and public and professional willingness. Although there is receptiveness to digital health, barriers to mainstreaming remain. Our findings suggest greater investment in national and local infrastructure, implementation of guidelines for the safe and transparent use and assessment of digital health, incentivization of interoperability, and investment in upskilling of professionals and the public would help support the normalization of digital health. These findings will enable researchers, health care practitioners, and policy makers to understand the current landscape and the actions required in order to prepare the market and accelerate uptake and use of digital health and wellness services in context and at scale.

  9. The influence of cognitive load on spatial search performance.

    PubMed

    Longstaffe, Kate A; Hood, Bruce M; Gilchrist, Iain D

    2014-01-01

    During search, executive function enables individuals to direct attention to potential targets, remember locations visited, and inhibit distracting information. In the present study, we investigated these executive processes in large-scale search. In our tasks, participants searched a room containing an array of illuminated locations embedded in the floor. The participants' task was to press the switches at the illuminated locations on the floor so as to locate a target that changed color when pressed. The perceptual salience of the search locations was manipulated by having some locations flashing and some static. Participants were more likely to search at flashing locations, even when they were explicitly informed that the target was equally likely to be at any location. In large-scale search, attention was captured by the perceptual salience of the flashing lights, leading to a bias to explore these targets. Despite this failure of inhibition, participants were able to restrict returns to previously visited locations, a measure of spatial memory performance. Participants were more able to inhibit exploration to flashing locations when they were not required to remember which locations had previously been visited. A concurrent digit-span memory task further disrupted inhibition during search, as did a concurrent auditory attention task. These experiments extend a load theory of attention to large-scale search, which relies on egocentric representations of space. High cognitive load on working memory leads to increased distractor interference, providing evidence for distinct roles for the executive subprocesses of memory and inhibition during large-scale search.

  10. Virtual Cultural Landscape Laboratory Based on Internet GIS Technology

    NASA Astrophysics Data System (ADS)

    Bill, R.

    2012-07-01

    In recent years the transfer of old documents (books, paintings, maps, etc.) from analogue to digital form has gained enormous importance. Numerous initiatives concentrate on the digitalisation of library collections, and commercial companies such as Microsoft or Google also convert large analogue stocks (books, paintings, etc.) into digital form. Data in digital form can be made accessible much more easily to a large user community, especially to the interested scientific community. The aim of the described research project is to set up a virtual research environment for interdisciplinary research focusing on the landscape of historical Mecklenburg in the north-east of Germany. Georeferenced old maps from 1786 and 1890 covering the whole of Mecklenburg are to be combined with current geo-information, satellite and aerial imagery to support spatio-temporal research at different scales in space (regional 1:200,000 to local 1:25,000) and time (nearly 250 years in three time steps, the last 30 years also in three time slices). The Virtual Laboratory for Cultural Landscape Research (VKLandLab) is designed and developed by the Chair of Geodesy and Geoinformatics, hosted at the Computing Centre (ITMZ) and linked to the Digital Library (UB) at Rostock University. VKLandLab includes new developments such as wikis, blogs, data tagging, etc., and proven components already integrated in various data-related infrastructures, such as InternetGIS, data repositories and authentication structures. The focus is to build a data-related infrastructure and a work platform that supports students as well as researchers from different disciplines in their research in space and time.

  11. Scaling Sap Flow Results Over Wide Areas Using High-Resolution Aerial Multispectral Digital Imaging, Leaf Area Index (LAI) and MODIS Satellite Imagery in Saltcedar Stands on the Lower Colorado River

    NASA Astrophysics Data System (ADS)

    Murray, R.; Neale, C.; Nagler, P. L.; Glenn, E. P.

    2008-12-01

    Heat-balance sap flow sensors provide direct estimates of water movement through plant stems and can be used to accurately measure leaf-level transpiration (EL) and stomatal conductance (GS) over time scales ranging from 20 minutes to a month or longer in natural stands of plants. However, their use is limited to relatively small branches on shrubs or trees, as the gauged stem section needs to be uniformly heated by the heating coil to produce valid measurements. This presents a scaling problem in applying the results to whole plants, stands of plants, and larger landscape areas. We used high-resolution aerial multispectral digital imaging with green, red and NIR bands as a bridge between ground measurements of EL and GS, and MODIS satellite imagery of a flood plain on the Lower Colorado River dominated by saltcedar (Tamarix ramosissima). Saltcedar is considered to be a high-water-use plant, and saltcedar removal programs have been proposed to salvage water. Hence, knowledge of actual saltcedar ET rates is needed on western U.S. rivers. Scaling EL and GS to large landscape units requires knowledge of leaf area index (LAI) over large areas. We used a LAI model developed for riparian habitats on Bosque del Apache, New Mexico, to estimate LAI at our study site on the Colorado River. We compared the model estimates to ground measurements of LAI, determined with a Li-Cor LAI-2000 Plant Canopy Analyzer calibrated by leaf harvesting to determine the Specific Leaf Area (SLA; m2 leaf area per g dry weight of leaves) of the different species on the floodplain. LAI could be adequately predicted from NDVI from aerial multispectral imagery and could be cross-calibrated with MODIS NDVI and EVI. Hence, we were able to project point measurements of sap flow and LAI over multiple years and over large areas of floodplain using aerial multispectral imagery as a bridge between ground and satellite data. The methods are applicable to riparian corridors throughout the western U.S.
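
    A sketch of the first link in that scaling chain, in Python: NDVI from the red and NIR bands, then a placeholder linear LAI model. The coefficients shown are invented; the study's actual coefficients come from the Bosque del Apache model calibrated against LAI-2000 ground measurements.

      import numpy as np

      def ndvi(red, nir):
          # Normalized difference vegetation index from reflectance bands
          return (nir - red) / (nir + red)

      def lai_from_ndvi(ndvi_img, a=4.9, b=-0.5):
          # Hypothetical linear LAI model; real coefficients require calibration
          return np.clip(a * ndvi_img + b, 0.0, None)

      # Scaling: EL is transpiration per unit leaf area, so canopy transpiration
      # per unit ground area is approximately EL * LAI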

  12. A method for normalizing pathology images to improve feature extraction for quantitative pathology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tam, Allison; Barker, Jocelyn; Rubin, Daniel

    Purpose: With the advent of digital slide scanning technologies and the potential proliferation of large repositories of digital pathology images, many research studies can leverage these data for biomedical discovery and to develop clinical applications. However, quantitative analysis of digital pathology images is impeded by batch effects generated by varied staining protocols and staining conditions of pathological slides. Methods: To overcome this problem, this paper proposes a novel, fully automated stain normalization method to reduce batch effects and thus aid research in digital pathology applications. Their method, intensity centering and histogram equalization (ICHE), normalizes a diverse set of pathology images by first scaling the centroids of the intensity histograms to a common point and then applying a modified version of contrast-limited adaptive histogram equalization. Normalization was performed on two datasets of digitized hematoxylin and eosin (H&E) slides of different tissue slices from the same lung tumor, and one immunohistochemistry dataset of digitized slides created by restaining one of the H&E datasets. Results: The ICHE method was evaluated based on image intensity values, quantitative features, and the effect on downstream applications, such as computer-aided diagnosis. For comparison, three methods from the literature were reimplemented and evaluated using the same criteria. The authors found that ICHE not only improved performance compared with un-normalized images, but in most cases showed improvement compared with previous methods for correcting batch effects in the literature. Conclusions: ICHE may be a useful preprocessing step in a digital pathology image processing pipeline.
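
    A minimal sketch of the two ICHE stages as described, in Python with scikit-image, using the stock CLAHE implementation in place of the authors' modified version; the target mean and clip limit are illustrative.

      import numpy as np
      from skimage import exposure

      def iche_like(image, target_mean=0.6):
          # image: grayscale float array scaled to [0, 1]
          # Stage 1: shift the intensity-histogram centroid to a common point
          img = np.clip(image.astype(float) + (target_mean - image.mean()), 0.0, 1.0)
          # Stage 2: contrast-limited adaptive histogram equalization
          # (stock CLAHE; the paper applies a modified version)
          return exposure.equalize_adapthist(img, clip_limit=0.01)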

  13. Digital and social media opportunities for dietary behaviour change.

    PubMed

    McGloin, Aileen F; Eslami, Sara

    2015-05-01

    The way that people communicate, consume media and seek and receive information is changing. Forty per cent of the world's population now has an internet connection, the average global social media penetration is 39% and 1·5 billion people have internet access via mobile phone. This large-scale move in population use of digital, social and mobile media presents an unprecedented opportunity to connect with individuals on issues concerning health. The present paper aims to investigate these opportunities in relation to dietary behaviour change. Several aspects of the digital environment could support behaviour change efforts, including reach, engagement, research, segmentation, accessibility and potential to build credibility, trust, collaboration and advocacy. There are opportunities to influence behaviour online using similar techniques to traditional health promotion programmes; to positively affect health-related knowledge, skills and self-efficacy. The abundance of data on citizens' digital behaviours, whether through search behaviour, global positioning system tracking, or via demographics and interests captured through social media profiles, offer exciting opportunities for effectively targeting relevant health messages. The digital environment presents great possibilities but also great challenges. Digital communication is uncontrolled, multi-way and co-created and concerns remain in relation to inequalities, privacy, misinformation and lack of evaluation. Although web-based, social-media-based and mobile-based studies tend to show positive results for dietary behaviour change, methodologies have yet to be developed that go beyond basic evaluation criteria and move towards true measures of behaviour change. Novel approaches are necessary both in the digital promotion of behaviour change and in its measurement.

  14. Stochastic Downscaling of Digital Elevation Models

    NASA Astrophysics Data System (ADS)

    Rasera, Luiz Gustavo; Mariethoz, Gregoire; Lane, Stuart N.

    2016-04-01

    High-resolution digital elevation models (HR-DEMs) are extremely important for the understanding of small-scale geomorphic processes in Alpine environments. In the last decade, remote sensing techniques have experienced a major technological evolution, enabling fast and precise acquisition of HR-DEMs. However, sensors designed to measure elevation data still feature different spatial resolution and coverage capabilities. Terrestrial altimetry allows the acquisition of HR-DEMs with centimeter- to millimeter-level precision, but only within small spatial extents and often with dead-ground problems. Conversely, satellite radiometric sensors are able to gather elevation measurements over large areas but with limited spatial resolution. In the present study, we propose an algorithm to downscale low-resolution satellite-based DEMs using topographic patterns extracted from HR-DEMs derived, for example, from ground-based and airborne altimetry. The method consists of a multiple-point geostatistical simulation technique able to generate high-resolution elevation data from low-resolution digital elevation models (LR-DEMs). Initially, two collocated DEMs with different spatial resolutions serve as input to construct a database of topographic patterns, which is also used to infer the statistical relationships between the two scales. High-resolution elevation patterns are then retrieved from the database to downscale a LR-DEM through a stochastic simulation process. The outputs of the simulations are multiple equally probable DEMs with higher spatial resolution that also depict the large-scale geomorphic structures present in the original LR-DEM. As these multiple models reflect the uncertainty related to the downscaling, they can be employed to quantify the uncertainty of phenomena that are dependent on fine topography, such as catchment hydrological processes. The proposed methodology is illustrated for a case study in the Swiss Alps. A swissALTI3D HR-DEM (with 5 m resolution) and an SRTM-derived LR-DEM from the Western Alps are used to downscale an SRTM-based LR-DEM from the eastern part of the Alps. The results show that the method is capable of generating multiple high-resolution synthetic DEMs that reproduce the spatial structure and statistics of the original DEM.
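
    A toy, heavily simplified version of the idea in Python: learn collocated low-res/high-res pairs from the training DEMs and, for each low-res pixel of the target, sample candidate training pixels, keep the best match and paste its collocated high-res patch. A real multiple-point implementation matches whole neighbourhood patterns and handles patch overlaps; this sketch only conveys the stochastic, pattern-borrowing principle.

      import numpy as np

      def downscale(lr_target, lr_train, hr_train, f=5, n_cand=500, seed=None):
          # hr_train must be the high-res DEM collocated with lr_train, with
          # shape equal to lr_train's shape times the scale factor f
          rng = np.random.default_rng(seed)
          H, W = lr_target.shape
          ti, tj = lr_train.shape
          out = np.empty((H * f, W * f))
          for i in range(H):
              for j in range(W):
                  ci = rng.integers(0, ti, n_cand)
                  cj = rng.integers(0, tj, n_cand)
                  k = np.argmin(np.abs(lr_train[ci, cj] - lr_target[i, j]))
                  bi, bj = ci[k] * f, cj[k] * f
                  out[i*f:(i+1)*f, j*f:(j+1)*f] = hr_train[bi:bi+f, bj:bj+f]
          return out   # one realization; different seeds give equally probable DEMs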

  15. Sequences, stratigraphy and scenarios: what can we say about the fossil record of the earliest tetrapods?

    PubMed

    Friedman, Matt; Brazeau, Martin D

    2011-02-07

    Past research on the emergence of digit-bearing tetrapods has led to the widely accepted premise that this important evolutionary event occurred during the Late Devonian. The discovery of convincing digit-bearing tetrapod trackways of early Middle Devonian age in Poland has upset this orthodoxy, indicating that current scenarios which link the timing of the origin of digited tetrapods to specific events in Earth history are likely to be in error. Inspired by this find, we examine the fossil record of early digit-bearing tetrapods and their closest fish-like relatives from a statistical standpoint. We find that the Polish trackways force a substantial reconsideration of the nature of the early tetrapod record when only body fossils are considered. However, the effect is less drastic (and often not statistically significant) when other reliably dated trackways that were previously considered anachronistic are taken into account. Using two approaches, we find that 95 per cent credible and confidence intervals for the origin of digit-bearing tetrapods extend into the Early Devonian and beyond, spanning late Emsian to mid Ludlow. For biologically realistic diversity models, estimated genus-level preservation rates for Devonian digited tetrapods and their relatives range from 0.025 to 0.073 per lineage-million years, an order of magnitude lower than species-level rates for groups typically considered to have dense records. Available fossils of early digited tetrapods and their immediate relatives are adequate for documenting large-scale patterns of character acquisition associated with the origin of terrestriality, but low preservation rates coupled with clear geographical and stratigraphic sampling biases caution against building scenarios for the origin of digits and terrestrialization tied to the provenance of particular specimens or faunas.

  16. Digital Isostatic Gravity Map of the Nevada Test Site and Vicinity, Nye, Lincoln, and Clark Counties, Nevada, and Inyo County, California

    USGS Publications Warehouse

    Ponce, David A.; Mankinen, E.A.; Davidson, J.G.; Morin, R.L.; Blakely, R.J.

    2000-01-01

    An isostatic gravity map of the Nevada Test Site area was prepared from publicly available gravity data (Ponce, 1997) and from gravity data recently collected by the U.S. Geological Survey (Mankinen and others, 1999; Morin and Blakely, 1999). Gravity data were processed using standard gravity data reduction techniques. Southwest Nevada is characterized by gravity anomalies that reflect the distribution of pre-Cenozoic carbonate rocks, thick sequences of volcanic rocks, and thick alluvial basins. In addition, regional gravity data reveal the presence of linear features that reflect large-scale faults whereas detailed gravity data can indicate the presence of smaller-scale faults.

  17. Non-invasive measurement of proppant pack deformation

    DOE PAGES

    Walsh, Stuart D. C.; Smith, Megan; Carroll, Susan A.; ...

    2016-05-26

    In this study, we describe a method to non-invasively study the movement of proppant packs at the sub-fracture scale by applying three-dimensional digital image correlation techniques to X-ray tomography data. Proppant movement is tracked in a fractured core of Marcellus shale placed under a series of increasing confining pressures up to 10,000 psi. The analysis reveals the sudden failure of a region of the proppant pack, accompanied by the large-scale rearrangement of grains across the entire fracture surface. The failure of the pack coincides with the appearance of vortex-like grain motions similar to features observed in biaxial compression of two-dimensional granular assemblies.
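
    The core operation of such volume correlation is locating the displacement peak of a cross-correlation between reference and deformed subvolumes; a Python sketch via FFT, integer-voxel only (real 3D-DIC adds subvoxel interpolation and strain estimation).

      import numpy as np

      def subvolume_shift(vol_ref, vol_def):
          # Integer-voxel displacement of vol_def relative to vol_ref from the
          # peak of their FFT-based cross-correlation
          F = np.fft.fftn(vol_ref)
          G = np.fft.fftn(vol_def)
          corr = np.fft.ifftn(np.conj(F) * G).real
          peak = np.unravel_index(np.argmax(corr), corr.shape)
          # fold shifts beyond half the box size into negative displacements
          return [p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape)]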

  18. Application of a Silicon Compiler to VLSI (Very Large Scale Integrated Circuits) Design of Digital Pipelined Multipliers.

    DTIC Science & Technology

    1984-06-01

    programming environment and then dumped, as described in the Franz Lisp manual [Ref. 13]. A synopsis of the functional elements which make up this LISP...the average system usage rate. Lines 14 and 15 reflect a function of Franz Lisp wherein past used storage locations are reclaimed for the available... Franz Lisp Opus 38. Also included in this distribution are two library files containing the bonding pad layouts in CIF, and a library file

  19. VLSI (Very Large Scale Integrated) Design of a 16 Bit Very Fast Pipelined Carry Look Ahead Adder.

    DTIC Science & Technology

    1983-09-01

    the ability for systems engineers to custom design digital integrated circuits. Until recently, the design of integrated circuits has been...traditionally carried out by a select group of logic designers working in semiconductor laboratories. Systems engineers had to "make do" or "fit in" the...products of these labs to realize their designs. The systems engineers had little participation in the actual design of the chip. The MEAD and CONWAY design

  20. Wetland mapping from digitized aerial photography. [Sheboygen Marsh, Sheboygen County, Wisconsin

    NASA Technical Reports Server (NTRS)

    Scarpace, F. L.; Quirk, B. K.; Kiefer, R. W.; Wynn, S. L.

    1981-01-01

    Computer assisted interpretation of small scale aerial imagery was found to be a cost effective and accurate method of mapping complex vegetation patterns if high resolution information is desired. This type of technique is suited for problems such as monitoring changes in species composition due to environmental factors and is a feasible method of monitoring and mapping large areas of wetlands. The technique has the added advantage of being in a computer compatible form which can be transformed into any georeference system of interest.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Evans, David Edward

    A description of the development of the mc_runjob software package used to manage large-scale computing tasks for the D0 Experiment at Fermilab is presented, along with a review of the Digital Front End Trigger electronics and the software used to control them. A tracking study is performed on detector data to determine that the D0 Experiment can detect charged B mesons, and that the results are consistent with current measurements. B mesons are found by searching for the decay channel B± → J/ψ K±.

  2. Theoretical and experimental studies in support of the geophysical fluid flow experiment

    NASA Technical Reports Server (NTRS)

    Hart, J.; Toomre, J.; Gilman, P.

    1984-01-01

    Computer programming was completed for digital acquisition of temperature and velocity data generated by the Geophysical Fluid Flow Cell (GFFC) during the upcoming Spacelab 3 mission. A set of scenarios was developed which covers basic electro-hydrodynamic instability, highly supercritical convection with isothermal boundaries, convection with imposed thermal forcing, and some stably stratified runs to look at large-scale thermohaline ocean circulations. The extent to which the GFFC experimental results apply to more complicated circumstances within the Sun or giant planets was assessed.

  3. Surface features of central North America: a synoptic view from computer graphics

    USGS Publications Warehouse

    Pike, R.J.

    1991-01-01

    A digital shaded-relief image of the 48 contiguous United States shows the details of large- and small-scale landforms, including several linear trends. The features faithfully reflect tectonism, continental glaciation, fluvial activity, volcanism, and other surface-shaping events and processes. The new map not only depicts topography accurately and in its true complexity, but does so in one synoptic view that provides a regional context for geologic analysis unobscured by clouds, culture, vegetation, or artistic constraints. -Author
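
    Shaded relief of this kind is computed from a DEM with the standard Lambertian hillshade formula; a Python sketch with the conventional north-west illumination (aspect sign conventions vary between packages).

      import numpy as np

      def hillshade(dem, cellsize=30.0, azimuth_deg=315.0, altitude_deg=45.0):
          # Slope and aspect from elevation gradients, then Lambertian shading
          az, alt = np.radians(azimuth_deg), np.radians(altitude_deg)
          dzdy, dzdx = np.gradient(dem, cellsize)
          slope = np.arctan(np.hypot(dzdx, dzdy))
          aspect = np.arctan2(-dzdx, dzdy)
          shade = (np.sin(alt) * np.cos(slope)
                   + np.cos(alt) * np.sin(slope) * np.cos(az - aspect))
          return np.clip(shade, 0.0, 1.0)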

  4. Architectures and algorithms for digital image processing; Proceedings of the Meeting, Cannes, France, December 5, 6, 1985

    NASA Technical Reports Server (NTRS)

    Duff, Michael J. B. (Editor); Siegel, Howard J. (Editor); Corbett, Francis J. (Editor)

    1986-01-01

    The conference presents papers on the architectures, algorithms, and applications of image processing. Particular attention is given to a very large scale integration system for image reconstruction from projections, a prebuffer algorithm for instant display of volume data, and an adaptive image sequence filtering scheme based on motion detection. Papers are also presented on a simple, direct practical method of sensing local motion and analyzing local optical flow, image matching techniques, and an automated biological dosimetry system.

  5. A review of aspects relating to the improvement of holographic memory technology

    NASA Astrophysics Data System (ADS)

    Vyukhina, N. N.; Gibin, I. S.; Dombrovsky, V. A.; Dombrovsky, S. A.; Pankov, B. N.; Pen, E. F.; Potapov, A. N.; Sinyukov, A. M.; Tverdokhleb, P. E.; Shelkovnikov, V. V.

    1996-06-01

    Results of studying a holographic memory to write/read digital data pages are presented. The research was carried out in Novosibirsk, Russia. Great attention was paid to methods of improving recording density and the reliability of data reading, the development of 'dry' photopolymers that provide recording of superimposed three-dimensional phase holograms, and the design of parallel optical-input large-scale integration (LSI) circuits for reading and logical processing of data arriving from the holographic memory.

  6. Flow Control via a Single Spanwise Wire on the Surface of a Stationary Cylinder

    NASA Astrophysics Data System (ADS)

    Ekmekci, Alis; Rockwell, Donald

    2007-11-01

    The flow structure arising from a single spanwise wire attached along the surface of a stationary circular cylinder is investigated experimentally via a cinema technique of digital particle image velocimetry (DPIV). Consideration is given to wires that have smaller and larger scales than the thickness of the unperturbed boundary layer that develops around the cylinder prior to flow separation. The wires have diameters that are 1% and 3% of the cylinder diameter. Over a certain range of angular positions with respect to the approach flow, both small- and large-scale wires show important global effects on the entire near-wake. Two critical angles are identified on the basis of the near-wake structure. These critical angles are associated with extension and contraction of the near-wake, relative to the wake in the absence of a surface disturbance. The critical angle of the wire that yields near-wake extension is associated with bistable oscillations of the separating shear layer, at irregular time intervals much longer than the time scale associated with classical Karman vortex shedding. Moreover, for the large-scale wire, in specific cases, either attenuation or enhancement of the Karman mode of vortex formation is observed.

  7. Full-scale high-speed ``Edgerton'' retroreflective shadowgraphy of gunshots

    NASA Astrophysics Data System (ADS)

    Settles, Gary

    2005-11-01

    Almost half a century ago, H. E. "Doc" Edgerton demonstrated a simple and elegant direct-shadowgraph technique for imaging large-scale events like explosions and gunshots. Only a retroreflective screen, flashlamp illumination, and an ordinary view camera were required. Retroreflective shadowgraphy has seen occasional use since then, but its unique combination of large scale, simplicity and portability has barely been tapped. It functions well in environments hostile to most optical diagnostics, such as full-scale outdoor daylight ballistics and explosives testing. Here, shadowgrams cast upon a 2.4 m square retroreflective screen are imaged by a Photron Fastcam APX-RS digital camera that is capable of megapixel image resolution at 3000 frames/sec, up to 250,000 frames/sec at lower resolution. Microsecond frame exposures are used to examine the external ballistics of several firearms, including a high-powered rifle, an AK-47 submachine gun, and several pistols and revolvers. Muzzle blast phenomena and the mechanism of gunpowder residue deposition on the shooter's hands are clearly visualized. In particular, observing the firing of a pistol with and without a silencer (suppressor) suggests that some of the muzzle blast energy is converted by the silencer into supersonic jet noise.

  8. Stereophotogrammetry in studies of riparian vegetation dynamics

    NASA Astrophysics Data System (ADS)

    Hortobagyi, Borbala; Vautier, Franck; Corenblit, Dov; Steiger, Johannes

    2014-05-01

    Riparian vegetation responds to hydrogeomorphic disturbances and also controls sediment deposition and erosion. Spatio-temporal riparian vegetation dynamics within fluvial corridors have been quantified in many studies using aerial photographs and GIS. However, this approach does not allow the consideration of woody vegetation growth rates (i.e. the vertical dimension), which are fundamental when studying feedbacks between the processes of fluvial landform construction and vegetation establishment and succession. We built 3D photogrammetric models of vegetation height based on analogue (film) and digital aerial photographs from sites on the Allier and Garonne Rivers (France). The models were realized at two different spatial scales and with two different methods. The "large" scale corresponds to a reach of the river corridor on the Allier River (photograph taken in 2009) and the "small" scale to river bars of the Allier (photographs taken in 2002 and 2009) and Garonne Rivers (photographs taken in 2000, 2002, 2006 and 2010). At the corridor scale, we generated vegetation height models using an automatic procedure. This method is fast but can only be used with digital photographs. At the bar scale, we constructed the models manually using a 3D visualization on the screen. This technique showed good results for both digital and film photographs but is very time-consuming. A diachronic study was performed in order to investigate vegetation succession by distinguishing three different classes according to vegetation height: herbs (<1 m), shrubs (1-4 m) and trees (>4 m). Both methods, i.e. automatic and manual, were employed to study the evolution of the three vegetation classes and the recruitment of new vegetation patches. A comparison was conducted between the vegetation height given by the models (automatic and manual) and the vegetation height measured in the field. The manually produced models (small scale) had a precision of 0.5-1 m, allowing the quantification of woody vegetation growth rates. Thus, our results show that the manual method we developed is accurate enough to quantify vegetation growth rates at small scales, whereas the less accurate automatic method is appropriate for studying vegetation succession at the corridor scale. Both methods are complementary and will contribute to a further exploration of the mutual relationships between hydrogeomorphic processes, topography and vegetation dynamics within alluvial systems, adding the quantification of the vertical dimension of riparian vegetation to their spatio-temporal characteristics.
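
    The three height strata used in the diachronic analysis map directly onto a binning operation; a minimal Python sketch of the classification applied to a photogrammetric height model.

      import numpy as np

      def vegetation_class(height_m):
          # 0 = herbs (<1 m), 1 = shrubs (1-4 m), 2 = trees (>4 m)
          return np.digitize(height_m, bins=[1.0, 4.0])

      print(vegetation_class(np.array([0.3, 2.5, 7.0])))   # [0 1 2]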

  9. Development of distortion measurement system for large deployable antenna via photogrammetry in vacuum and cryogenic environment

    NASA Astrophysics Data System (ADS)

    Zhang, Pengsong; Jiang, Shanping; Yang, Linhua; Zhang, Bolun

    2018-01-01

    In order to meet the requirement of high-precision thermal distortion measurement for a Φ4.2 m deployable mesh satellite antenna in a vacuum and cryogenic environment, a large-scale antenna distortion measurement system for vacuum and cryogenic conditions is developed in this paper, based on digital close-range photogrammetry and spacecraft space-environment test technology. The Antenna Distortion Measurement System (ADMS) is the first domestically and independently developed thermal distortion measurement system for large antennas, and it solves the problem of non-contact, high-precision distortion measurement of large spacecraft structures under vacuum and cryogenic conditions. The measurement accuracy of ADMS is better than 50 μm/5 m, comparable to the international state of the art. The experimental results show that the measurement system offers great advantages for the measurement of large spacecraft structures and also has broad application prospects in space and other related fields.

  10. Integrated digital inverters based on two-dimensional anisotropic ReS2 field-effect transistors

    PubMed Central

    Liu, Erfu; Fu, Yajun; Wang, Yaojia; Feng, Yanqing; Liu, Huimei; Wan, Xiangang; Zhou, Wei; Wang, Baigeng; Shao, Lubin; Ho, Ching-Hwa; Huang, Ying-Sheng; Cao, Zhengyi; Wang, Laiguo; Li, Aidong; Zeng, Junwen; Song, Fengqi; Wang, Xinran; Shi, Yi; Yuan, Hongtao; Hwang, Harold Y.; Cui, Yi; Miao, Feng; Xing, Dingyu

    2015-01-01

    Semiconducting two-dimensional transition metal dichalcogenides are emerging as top candidates for post-silicon electronics. While most of them exhibit isotropic behaviour, lowering the lattice symmetry could induce anisotropic properties, which are both scientifically interesting and potentially useful. Here we present atomically thin rhenium disulfide (ReS2) flakes with a unique distorted 1T structure, which exhibit in-plane anisotropic properties. We fabricated monolayer and few-layer ReS2 field-effect transistors, which exhibit competitive performance with large current on/off ratios (∼10^7) and low subthreshold swings (100 mV per decade). The observed anisotropic ratio along the two principal axes reaches 3.1, which is the highest among all known two-dimensional semiconducting materials. Furthermore, we successfully demonstrated an integrated digital inverter with good performance by utilizing two ReS2 anisotropic field-effect transistors, suggesting the promising implementation of large-scale two-dimensional logic circuits. Our results underscore the unique properties of two-dimensional semiconducting materials with low crystal symmetry for future electronic applications. PMID:25947630

  11. Digital health for the End TB Strategy: developing priority products and making them work.

    PubMed

    Falzon, Dennis; Timimi, Hazim; Kurosinski, Pascal; Migliori, Giovanni Battista; Van Gemert, Wayne; Denkinger, Claudia; Isaacs, Chris; Story, Alistair; Garfein, Richard S; do Valle Bastos, Luis Gustavo; Yassin, Mohammed A; Rusovich, Valiantsin; Skrahina, Alena; Van Hoi, Le; Broger, Tobias; Abubakar, Ibrahim; Hayward, Andrew; Thomas, Bruce V; Temesgen, Zelalem; Quraishi, Subhi; von Delft, Dalene; Jaramillo, Ernesto; Weyer, Karin; Raviglione, Mario C

    2016-07-01

    In 2014, the World Health Organization (WHO) developed the End TB Strategy in response to a World Health Assembly Resolution requesting Member States to end the worldwide epidemic of tuberculosis (TB) by 2035. For the strategy's objectives to be realised, the next 20 years will need novel solutions to address the challenges posed by TB to health professionals, and to affected people and communities. Information and communication technology presents opportunities for innovative approaches to support TB efforts in patient care, surveillance, programme management and electronic learning. The effective application of digital health products at a large scale and their continued development need the engagement of TB patients and their caregivers, innovators, funders, policy-makers, advocacy groups, and affected communities. In April 2015, WHO established its Global Task Force on Digital Health for TB to advocate and support the development of digital health innovations in global efforts to improve TB care and prevention. We outline the group's approach to stewarding this process in alignment with the three pillars of the End TB Strategy. The supplementary material of this article includes target product profiles, as developed by early 2016, defining nine priority digital health concepts and products that are strategically positioned to enhance TB action at the country level.

  12. Temperature sensitivity study of eddy current and digital gauge probes for nuclear fuel rod oxide measurement

    NASA Astrophysics Data System (ADS)

    Beck, Faith R.; Lind, R. Paul; Smith, James A.

    2018-04-01

    Novel fuels are part of the nationwide effort to reduce the enrichment of uranium for energy production. Performance of such fuels is determined by irradiating their surfaces. To test irradiated samples, the instrumentation must operate remotely. The plate checker used in this experiment at Idaho National Laboratory (INL) performs non-destructive testing on fuel rod and plate geometries with two different types of sensors: eddy current probes and digital thickness gauges. The sensors measure oxide growth and total sample thickness on research fuels, respectively. Sensor measurement accuracy is crucial because even 10 microns of error is significant when determining the viability of an experimental fuel. One parameter known to affect the eddy current and thickness gauge sensors is temperature. Since both sensors' accuracies depend on the ambient temperature of the system, the plate checker has been characterized for these sensitivities. The manufacturer of the digital gauge probes has noted a rather large coefficient of thermal expansion for their linear scale. It should also be noted that the accuracy of the digital gauge probes is specified at 20°C, which is approximately 7°C cooler than the average hot-cell temperature. In this work, the effect of temperature on the eddy current and digital gauge probes is studied, and thickness measurements are given as empirical functions of temperature.
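
    As a rough illustration of why the 7°C offset matters (a hedged first-order model, not the paper's actual calibration), a linear scale with thermal-expansion coefficient α relates measured and true length approximately as:

    ```latex
    % For alpha = 10 ppm/degC and the ~7 degC hot-cell offset quoted above,
    % a 10 mm reading shifts by 10 mm x 10e-6 x 7 = 0.7 um: small against
    % the 10 micron significance threshold, but not negligible for thin
    % oxide layers. Alpha here is an assumed illustrative value.
    \[ L_{\mathrm{true}} \approx L_{\mathrm{meas}}\left[\,1 - \alpha\,(T - 20\,^{\circ}\mathrm{C})\right] \]
    ```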

  13. Development of a 3D Stream Network and Topography for Improved Large-Scale Hydraulic Modeling

    NASA Astrophysics Data System (ADS)

    Saksena, S.; Dey, S.; Merwade, V.

    2016-12-01

    Most digital elevation models (DEMs) used for hydraulic modeling do not include channel bed elevations. As a result, the DEMs are complemented with additional bathymetric data for accurate hydraulic simulations. Existing methods to acquire bathymetric information through field surveys or through conceptual models are limited to reach-scale applications. With an increasing focus on large-scale hydraulic modeling of rivers, a framework to estimate and incorporate bathymetry for an entire stream network is needed. This study proposes an interpolation-based algorithm to estimate bathymetry for a stream network by modifying the reach-based empirical River Channel Morphology Model (RCMM). The effect of a 3D stream network that includes river bathymetry is then investigated by creating a 1D hydraulic model (HEC-RAS) and a 2D hydrodynamic model (Integrated Channel and Pond Routing) for the Upper Wabash River Basin in Indiana, USA. Results show improved simulation of flood depths and storage in the floodplain. Moreover, the impact of incorporating river bathymetry is more significant in the 2D model than in the 1D model.
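
    As a toy illustration of the interpolation-based idea (not the actual RCMM formulation, which is more sophisticated), one can burn an idealized parabolic cross-section into a bare-earth DEM along the stream network:

    ```python
    import numpy as np

    def parabolic_bathymetry(width, thalweg_depth, n=21):
        """Idealized cross-section: depth varies parabolically from zero
        at the banks to a maximum at the channel centre (thalweg). A toy
        stand-in for an RCMM-style profile interpolated along the network."""
        s = np.linspace(-width / 2.0, width / 2.0, n)  # distance from centreline (m)
        return thalweg_depth * (1.0 - (2.0 * s / width) ** 2)

    # Example: a 50 m wide channel, 3 m deep at the thalweg, below a water
    # surface at 100 m elevation; bed elevations to burn into the DEM:
    depths = parabolic_bathymetry(width=50.0, thalweg_depth=3.0)
    bed_elevation = 100.0 - depths
    ```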

  14. DigitalCrust – a 4D data system of material properties for transforming research on crustal fluid flow

    USGS Publications Warehouse

    Fan, Yin; Richard, Steve; Bristol, R. Sky; Peters, Shanan; Ingebritsen, Steven E.; Moosdorf, Nils; Packman, Aaron I.; Gleeson, Tom; Zazlavsky, Ilya; Peckham, Scott; Murdoch, Larry; Cardiff, Michael; Tarboton, David; Jones, Norm; Hooper, Richard; Arrigo, Jennifer; Gochis, David; Olson, John

    2015-01-01

    Fluid circulation in the Earth's crust plays an essential role in surface, near surface, and deep crustal processes. Flow pathways are driven by hydraulic gradients but controlled by material permeability, which varies over many orders of magnitude and changes over time. Although millions of measurements of crustal properties have been made, including geophysical imaging and borehole tests, this vast amount of data and information has not been integrated into a comprehensive knowledge system. A community data infrastructure is needed to improve data access, enable large-scale synthetic analyses, and support representations of the subsurface in Earth system models. Here, we describe the motivation, vision, challenges, and an action plan for a community-governed, four-dimensional data system of the Earth's crustal structure, composition, and material properties from the surface down to the brittle–ductile transition. Such a system must not only be sufficiently flexible to support inquiries in many different domains of Earth science, but it must also be focused on characterizing the physical crustal properties of permeability and porosity, which have not yet been synthesized at a large scale. The DigitalCrust is envisioned as an interactive virtual exploration laboratory where models can be calibrated with empirical data and alternative hypotheses can be tested at a range of spatial scales. It must also support a community process for compiling and harmonizing models into regional syntheses of crustal properties. Sustained peer review from multiple disciplines will allow constant refinement in the ability of the system to inform science questions and societal challenges and to function as a dynamic library of our knowledge of Earth's crust.

  15. Applying a statewide geospatial leaching tool for assessing soil vulnerability ratings for agrochemicals across the contiguous United States.

    PubMed

    Ki, Seo Jin; Ray, Chittaranjan; Hantush, Mohamed M

    2015-06-15

    A large-scale leaching assessment tool not only illustrates soil (or groundwater) vulnerability in unmonitored areas, but can also identify areas of potential concern for agrochemical contamination. This study describes how the statewide leaching tool in Hawaii, recently modified for use with pesticides and volatile organic compounds, can be extended to a national assessment of soil vulnerability ratings. For this study, the tool was updated by extending the soil and recharge maps to cover the lower 48 states of the United States (US). In addition, digital maps of annual pesticide use (at a national scale) as well as detailed soil properties and monthly recharge rates (at high spatial and temporal resolutions) were used to examine variations in the leaching (loads) of pesticides for the upper soil horizons. Results showed that the extended tool successfully delineated areas of high to low vulnerability to selected pesticides. The leaching potential was high for picloram, medium for simazine, and low to negligible for 2,4-D and glyphosate. The mass loadings of picloram moving below 0.5 m depth increased greatly in the northwestern and central US, where its extensive use on agricultural crops has been recorded. However, in addition to the amount of pesticide used, the annual leaching load of atrazine was also affected by other factors that determine intrinsic aquifer vulnerability, such as soil and recharge properties. The spatial and temporal resolutions of the digital maps had a great effect on the leaching potential of pesticides, requiring a trade-off between data availability and accuracy. Potential applications of this tool include rapid, large-scale vulnerability assessments for emerging contaminants which are hard to quantify directly through vadose zone models due to a lack of full environmental data.

  16. USGS standard quadrangle maps for emergency response

    USGS Publications Warehouse

    Moore, Laurence R.

    2009-01-01

    The 1:24,000-scale topographic quadrangle was the primary product of the U.S. Geological Survey's (USGS) National Mapping Program from 1947-1992. This map series includes about 54,000 map sheets for the conterminous United States, and is the only uniform map series ever produced that covers this area at such a large scale. The series was partially revised under several programs, starting as early as 1968, but these programs were not adequate to keep the series current. Through the 1990s the emphasis of the USGS mapping program shifted away from topographic maps and toward more specialized digital data products. Topographic map revision dropped off rapidly after 1999, and stopped completely by 2004. Since 2001, emergency-response and homeland-security requirements have revived the question of whether a standard national topographic series is needed. Emergencies such as Hurricane Katrina in 2005 and the California wildfires of 2007-08 demonstrated that familiar maps are important to first responders. Maps that have a standard scale, extent, and grids help reduce confusion and save time in emergencies. Traditional maps are designed to allow the human brain to quickly process large amounts of information, and depend on artistic layout and design that cannot be fully automated. In spite of technical advances, creating a traditional, general-purpose topographic map is still expensive. Although the content and layout of traditional topographic maps probably are still desirable, the preferred packaging and delivery of maps has changed. Digital image files are now desired by most users, but to be useful to the emergency-response community, these files must be easy to view and easy to print without specialized geographic information system expertise or software.

  17. Wavelet-enabled progressive data Access and Storage Protocol (WASP)

    NASA Astrophysics Data System (ADS)

    Clyne, J.; Frank, L.; Lesperance, T.; Norton, A.

    2015-12-01

    Current practices for storing numerical simulation outputs hail from an era when the disparity between compute and I/O performance was not as great as it is today. The memory contents for every sample, computed at every grid point location, are simply saved at some prescribed temporal frequency. Though straightforward, this approach fails to take advantage of the coherency between neighboring grid points that invariably exists in numerical solutions to mathematical models. Exploiting such coherence is essential to digital multimedia; DVD-Video, digital cameras, and streaming movies and audio are all possible today because of transform-based compression schemes that make substantial reductions in data possible by taking advantage of the strong correlation between adjacent samples in both space and time. Such methods can also be exploited to enable progressive data refinement in a manner akin to that used in ubiquitous digital mapping applications: views from far away are shown in coarsened detail to provide context, and can be progressively refined as the user zooms in on a localized region of interest. The NSF-funded WASP project aims to provide a common, NetCDF-compatible software framework for supporting wavelet-based, multi-scale, progressive data access, enabling interactive exploration of large data sets for the geoscience communities. This presentation will provide an overview of this work in progress to develop community cyber-infrastructure for the efficient analysis of very large data sets.
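
    To make the transform-based idea concrete, here is a minimal sketch of progressive refinement with a multi-level 2-D wavelet decomposition; PyWavelets stands in for whatever codec WASP actually uses, and the data and names are illustrative:

    ```python
    import numpy as np
    import pywt  # PyWavelets, a stand-in for the project's actual codec

    # Decompose once, then reconstruct progressively: coarse approximation
    # first, refined by "transmitting" one detail level at a time.
    field = np.random.rand(256, 256)          # stand-in for one simulation slice
    coeffs = pywt.wavedec2(field, "db4", level=4)

    for keep in range(1, len(coeffs) + 1):
        partial = list(coeffs[:keep])
        for detail in coeffs[keep:]:          # detail levels not yet received
            partial.append(tuple(np.zeros_like(d) for d in detail))
        approx = pywt.waverec2(partial, "db4")[:256, :256]
        err = np.abs(approx - field).max()
        print(f"levels kept {keep}/{len(coeffs)}, max error {err:.3e}")
    ```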

  18. Student and Faculty Inter-Generational Digital Divide: Fact or Fiction?

    ERIC Educational Resources Information Center

    Salajan, Florin D.; Schonwetter, Dieter J.; Cleghorn, Blaine M.

    2010-01-01

    This article analyzes the digital native-digital immigrant dichotomy based on the results of a small-scale study conducted at the University of Toronto, Faculty of Dentistry, regarding students' and faculty members' perceptions toward the implementation of digital learning technologies in the curriculum. The first element chosen for measurement…

  19. Computer analysis of digital sky surveys using citizen science and manual classification

    NASA Astrophysics Data System (ADS)

    Kuminski, Evan; Shamir, Lior

    2015-01-01

    As current and future digital sky surveys such as SDSS, LSST, DES, Pan-STARRS and Gaia create increasingly massive databases containing millions of galaxies, there is a growing need to be able to efficiently analyze these data. An effective way to do this is through manual analysis; however, this may be insufficient considering the extremely vast pipelines of astronomical images generated by present and future surveys. Some efforts have been made to use citizen science to classify galaxies by their morphology on a larger scale than individual scientists or small groups of scientists can. While citizen science efforts such as Zooniverse have helped obtain reasonably accurate morphological information about large numbers of galaxies, they cannot scale to provide complete analysis of the billions of galaxy images that will be collected by future ventures such as LSST. Since current forms of manual classification cannot scale to the masses of data collected by digital sky surveys, it is clear that in order to keep up with the growing databases some form of automation of the data analysis will be required, working either independently or in combination with human analysis such as citizen science. Here we describe a computer vision method that can automatically analyze galaxy images and deduce galaxy morphology. Experiments using Galaxy Zoo 2 data show that the performance of the method increases as the degree of agreement between the citizen scientists gets higher, providing a cleaner dataset. For several morphological features, such as the spirality of the galaxy, the algorithm agreed with the citizen scientists on around 95% of the samples. However, the method failed to analyze some morphological features, such as the number of spiral arms, for which it provided an accuracy of just ~36%.

  20. The use of multi temporal LiDAR to assess basin-scale erosion and deposition following the catastrophic January 2011 Lockyer flood, SE Queensland, Australia

    NASA Astrophysics Data System (ADS)

    Croke, Jacky; Todd, Peter; Thompson, Chris; Watson, Fiona; Denham, Robert; Khanal, Giri

    2013-02-01

    Advances in remote sensing and digital terrain processing now allow for a sophisticated analysis of spatial and temporal changes in erosion and deposition. Digital elevation models (DEMs) can now be constructed and differenced to produce DEMs of Difference (DoD), which are used to assess net landscape change for morphological budgeting. To date this has been most effectively achieved in gravel-bed rivers over relatively small spatial scales. If the full potential of the technology is to be realised, additional studies are required at larger scales and across a wider range of geomorphic features. This study presents an assessment of the basin-scale spatial patterns of erosion, deposition, and net morphological change that resulted from a catastrophic flood event in the Lockyer Creek catchment of SE Queensland (SEQ) in January 2011. Multitemporal Light Detection and Ranging (LiDAR) DEMs were used to construct a DoD that was then combined with the one-dimensional hydraulic flow model HEC-RAS to delineate five major geomorphic landforms, including the inner-channel area, within-channel benches, macrochannel banks, and floodplain. The LiDAR uncertainties were quantified and applied together with a probabilistic representation of uncertainty thresholded at a conservative 95% confidence interval. The elevation change distribution (ECD) for the 100 km² study area indicates a magnitude of elevation change spanning almost 10 m, but the mean elevation change of 0.04 m confirms that a large part of the landscape was characterised by relatively low-magnitude changes over a large spatial area. Mean elevation changes varied by geomorphic feature and only two, the within-channel benches and macrochannel banks, were net erosional, with an estimated combined loss of 1,815,149 m³ of sediment. The floodplain was the zone of major net deposition, but mean elevation changes approached the defined critical limit of uncertainty. Areal and volumetric ECDs for this extreme event provide a representative expression of the balance between erosion and deposition, and importantly, sediment redistribution, which is extremely difficult to quantify using more traditional channel planform or cross-sectional surveys. The ability of LiDAR to make a rapid and accurate assessment of key geomorphic processes over large spatial scales contributes to our understanding of key processes and, as demonstrated here, to the assessment of major geomorphological hazards such as extreme flood events.
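
    A minimal sketch of the probabilistic thresholding step described above, assuming two co-registered LiDAR DEMs with independent per-cell uncertainties (array and parameter names are illustrative):

    ```python
    import numpy as np

    def dod_threshold(dem_new, dem_old, sigma_new, sigma_old, z=1.96):
        """DEM of Difference with probabilistic thresholding: elevation
        changes smaller than the propagated uncertainty at the chosen
        confidence level (z=1.96 for 95%) are treated as noise and zeroed."""
        dod = dem_new - dem_old
        sigma_dod = np.sqrt(sigma_new ** 2 + sigma_old ** 2)  # independent errors
        return np.where(np.abs(dod) >= z * sigma_dod, dod, 0.0)

    # Net volumetric change is then the sum of surviving cells times the
    # cell area, e.g. dod_threshold(...).sum() * 1.0 for 1 m grid cells.
    ```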

  1. Readiness for Delivering Digital Health at Scale: Lessons From a Longitudinal Qualitative Evaluation of a National Digital Health Innovation Program in the United Kingdom

    PubMed Central

    Lennon, Marilyn R; Bouamrane, Matt-Mouley; Devlin, Alison M; O'Connor, Siobhan; O'Donnell, Catherine; Chetty, Ula; Agbakoba, Ruth; Bikker, Annemieke; Grieve, Eleanor; Finch, Tracy; Watson, Nicholas; Wyke, Sally

    2017-01-01

    Background Digital health has the potential to support care delivery for chronic illness. Despite positive evidence from localized implementations, new technologies have proven slow to become accepted, integrated, and routinized at scale. Objective The aim of our study was to examine barriers and facilitators to implementation of digital health at scale through the evaluation of a £37m national digital health program: "Delivering Assisted Living Lifestyles at Scale" (dallas) from 2012-2015. Methods The study was a longitudinal, qualitative, multi-stakeholder implementation study. The methods included interviews (n=125) with key implementers, focus groups with consumers and patients (n=7), project meetings (n=12), field work or observation in the communities (n=16), health professional survey responses (n=48), and cross-program documentary evidence on implementation (n=215). We used a sociological theory called normalization process theory (NPT) and a longitudinal (3 years) qualitative framework analysis approach. This work did not study a single intervention or population. Instead, we evaluated the processes (of designing and delivering digital health), and our outcomes were the identified barriers and facilitators to delivering and mainstreaming services and products within the mixed sector digital health ecosystem. Results We identified three main levels of issues influencing readiness for digital health: macro (market, infrastructure, policy), meso (organizational), and micro (professional or public). Factors hindering implementation included: lack of information technology (IT) infrastructure, uncertainty around information governance, lack of incentives to prioritize interoperability, lack of precedence on accountability within the commercial sector, and a market perceived as difficult to navigate. Factors enabling implementation were: clinical endorsement, champions who promoted digital health, and public and professional willingness. Conclusions Although there is receptiveness to digital health, barriers to mainstreaming remain. Our findings suggest greater investment in national and local infrastructure, implementation of guidelines for the safe and transparent use and assessment of digital health, incentivization of interoperability, and investment in upskilling of professionals and the public would help support the normalization of digital health. These findings will enable researchers, health care practitioners, and policy makers to understand the current landscape and the actions required in order to prepare the market and accelerate the uptake and use of digital health and wellness services in context and at scale. PMID:28209558

  2. Mapping Vegetation Community Types in a Highly-Disturbed Landscape: Integrating Hierarchical Object-Based Image Analysis with Digital Surface Models

    NASA Astrophysics Data System (ADS)

    Snavely, Rachel A.

    Focusing on the semi-arid and highly disturbed landscape of San Clemente Island, California, this research tests the effectiveness of incorporating a hierarchical object-based image analysis (OBIA) approach with high-spatial-resolution imagery and light detection and ranging (LiDAR) derived canopy height surfaces for mapping vegetation communities. The study is part of a large-scale research effort conducted by researchers at San Diego State University's (SDSU) Center for Earth Systems Analysis Research (CESAR) and Soil Ecology and Restoration Group (SERG) to develop an updated vegetation community map which will support both conservation and management decisions on Naval Auxiliary Landing Field (NALF) San Clemente Island. Trimble's eCognition Developer software was used to develop and generate vegetation community maps for two study sites, with and without vegetation height data as input. Overall and class-specific accuracies were calculated and compared across the two classifications. The highest overall accuracy (approximately 80%) was observed for the classification integrating airborne visible and near-infrared imagery of very high spatial resolution with a LiDAR-derived canopy height model. Accuracies for individual vegetation classes differed between the two classification methods, but were highest when incorporating the LiDAR digital surface data. The addition of a canopy height model, however, yielded little difference in classification accuracies for areas of very dense shrub cover. Overall, the results show the utility of the OBIA approach for mapping vegetation with high spatial resolution imagery, and emphasize the advantage of both multi-scale analysis and digital surface data for accurately characterizing highly disturbed landscapes. The integrated imagery and digital canopy height model approach presented both advantages and limitations, which have to be considered prior to its operational use in mapping vegetation communities.

  3. New Data on the Topside Electron Density Distribution

    NASA Technical Reports Server (NTRS)

    Huang, Xue-Qin; Reinisch, Bodo; Bilitza, Dieter; Benson, Robert F.

    2001-01-01

    The existing uncertainties about the electron density profiles in the topside ionosphere, i.e., in the height region from hmF2 to approx. 2000 km, require the search for new data sources. The ISIS and Alouette topside sounder satellites from the sixties to the eighties recorded millions of ionograms, and most were not analyzed in terms of electron density profiles. In recent years an effort started to digitize the analog recordings to prepare the ionograms for computerized analysis. As of November 2001 about 350,000 ionograms have been digitized from the original 7-track analog tapes. These data are available in binary and CDF format from the anonymous ftp site of the National Space Science Data Center. A search site and browse capabilities on CDAWeb assist the scientific usage of these data. All information and access links can be found at http://nssdc.gsfc.nasa.gov/space/isis/isis-status.html. This paper describes the ISIS data restoration effort and shows how the digital ionograms are automatically processed into electron density profiles from satellite orbit altitude (1400 km for ISIS-2) down to the F peak. Because of the large volume of data an automated processing algorithm is imperative. The automatic TOPside Ionogram Scaler with True height algorithm (TOPIST) software developed for this task is successfully scaling approx. 70% of the ionograms. An 'editing process' is available to manually scale the more difficult ionograms. The automated processing of the digitized ISIS ionograms is now underway, producing a much-needed database of topside electron density profiles for ionospheric modeling covering more than one solar cycle. The ISIS data restoration efforts are supported through NASA's Applied Systems and Information Research Program.

  4. A Case Study of the De Novo Evolution of a Complex Odometric Behavior in Digital Organisms

    PubMed Central

    Grabowski, Laura M.; Bryson, David M.; Dyer, Fred C.; Pennock, Robert T.; Ofria, Charles

    2013-01-01

    Investigating the evolution of animal behavior is difficult. The fossil record leaves few clues that would allow us to recapitulate the path that evolution took to build a complex behavior, and the large population sizes and long time scales required prevent us from re-evolving such behaviors in a laboratory setting. We present results of a study in which digital organisms–self-replicating computer programs that are subject to mutations and selection–evolved in different environments that required information about past experience for fitness-enhancing behavioral decisions. One population evolved a mechanism for step-counting, a surprisingly complex odometric behavior that was only indirectly related to enhancing fitness. We examine in detail the operation of the evolved mechanism and the evolutionary transitions that produced this striking example of a complex behavior. PMID:23577113

  5. Recent progress of RD53 Collaboration towards next generation Pixel Read-Out Chip for HL-LHC

    DOE PAGES

    Demaria, N.

    2016-12-21

    This paper is a review of recent progress of the RD53 Collaboration. Results obtained on the study of radiation effects on 65 nm CMOS have matured enough to define first strategies to adopt in the design of analog and digital circuits. Critical building blocks and analog very-front-end chains have been designed and tested before and after 5–800 Mrad. Small prototypes of 64×64 pixels with complex digital architectures have been produced, and they aim to address the main issues of dealing with extremely high pixel rates while operating at very small in-time thresholds in the analog front end. Lastly, the collaboration is now proceeding at full speed towards the design of a large-scale prototype, called RD53A, in 65 nm CMOS technology.

  6. Hysteresis-Free Carbon Nanotube Field-Effect Transistors.

    PubMed

    Park, Rebecca S; Hills, Gage; Sohn, Joon; Mitra, Subhasish; Shulaker, Max M; Wong, H-S Philip

    2017-05-23

    While carbon nanotube (CNT) field-effect transistors (CNFETs) promise high-performance and energy-efficient digital systems, large hysteresis degrades these potential CNFET benefits. As hysteresis is caused by traps surrounding the CNTs, previous works have shown that clean interfaces that are free of traps are important to minimize hysteresis. Our previous findings on the sources and physics of hysteresis in CNFETs enabled us to understand the influence of gate dielectric scaling on hysteresis. To begin with, we validate through simulations how scaling the gate dielectric thickness results in greater-than-expected benefits in reducing hysteresis. Leveraging this insight, we experimentally demonstrate reducing hysteresis to <0.5% of the gate-source voltage sweep range using a very large-scale integration compatible and solid-state technology, simply by fabricating CNFETs with a thin effective oxide thickness of 1.6 nm. However, even with negligible hysteresis, large subthreshold swing is still observed in the CNFETs with multiple CNTs per transistor. We show that the cause of large subthreshold swing is due to threshold voltage variation between individual CNTs. We also show that the source of this threshold voltage variation is not explained solely by variations in CNT diameters (as is often ascribed). Rather, other factors unrelated to the CNTs themselves (i.e., process variations, random fixed charges at interfaces) are a significant factor in CNT threshold voltage variations and thus need to be further improved.

  7. Large-scale Estimates of Leaf Area Index from Active Remote Sensing Laser Altimetry

    NASA Astrophysics Data System (ADS)

    Hopkinson, C.; Mahoney, C.

    2016-12-01

    Leaf area index (LAI) is a key parameter that describes the spatial distribution of foliage within forest canopies, which in turn controls numerous relationships between the ground, canopy, and atmosphere. The retrieval of LAI has been demonstrated successfully with in-situ digital hemispherical photography (DHP) and airborne laser scanning (ALS) data; however, field and ALS acquisitions are often spatially limited (100s of km²) and costly. Large-scale (>1000s of km²) retrievals have been demonstrated by optical sensors; however, accuracies remain uncertain due to the sensors' inability to penetrate the canopy. The spaceborne Geoscience Laser Altimeter System (GLAS) provides a possible solution for retrieving large-scale derivations whilst simultaneously penetrating the canopy. LAI retrieved by multiple DHP from 6 Australian sites, representing a cross-section of Australian ecosystems, was employed to model ALS LAI, which in turn was used to infer LAI from GLAS data at 5 other sites. An optimally filtered GLAS dataset was then employed in conjunction with a host of supplementary data to build a Random Forest (RF) model to infer predictions (and uncertainties) of LAI at a 250 m resolution across the forested regions of Australia. Predictions were validated against ALS-based LAI from 20 sites (R²=0.64, RMSE=1.1 m² m⁻²); MODIS-based LAI was also assessed against these sites (R²=0.30, RMSE=1.78 m² m⁻²) to demonstrate the strength of the GLAS-based predictions. The large-scale nature of the current predictions was also leveraged to demonstrate large-scale relationships of LAI with other environmental characteristics, such as canopy height, elevation, and slope. Wide-scale quantification of LAI is key to the assessment and modification of forest management strategies across Australia. Such work also assists Australia's Terrestrial Ecosystem Research Network in fulfilling its government-issued mandates.
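
    A minimal sketch of the gridding step, with a Random Forest fit on GLAS-derived LAI and ancillary predictors and the ensemble spread used as a simple uncertainty proxy (feature names, shapes, and synthetic data are illustrative, not the study's actual predictor set):

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(0)
    X_train = rng.random((5000, 4))    # e.g. canopy height, elevation, slope, ...
    y_train = rng.random(5000) * 6.0   # GLAS-derived LAI (m2 m-2), synthetic here

    rf = RandomForestRegressor(n_estimators=200, random_state=0)
    rf.fit(X_train, y_train)

    X_grid = rng.random((10000, 4))    # predictors for each 250 m grid cell
    per_tree = np.stack([tree.predict(X_grid) for tree in rf.estimators_])
    lai_mean = per_tree.mean(axis=0)   # gridded LAI prediction
    lai_sd = per_tree.std(axis=0)      # ensemble spread as an uncertainty proxy
    ```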

  8. Digital tissue and what it may reveal about the brain.

    PubMed

    Morgan, Josh L; Lichtman, Jeff W

    2017-10-30

    Imaging as a means of scientific data storage has evolved rapidly over the past century from hand drawings, to photography, to digital images. Only recently can sufficiently large datasets be acquired, stored, and processed such that tissue digitization can actually reveal more than direct observation of tissue. One field where this transformation is occurring is connectomics: the mapping of neural connections in large volumes of digitized brain tissue.

  9. High Scalability Video ISR Exploitation

    DTIC Science & Technology

    2012-10-01

    Surveillance, ARGUS) on the National Image Interpretability Rating Scale (NIIRS) at level 6. Ultra-high quality cameras like the Digital Cinema 4K (DC-4K), which recognizes objects smaller than people, will be available for purchase for use in the field. However, even if such a UAV sensor with a DC-4K was flown...

  10. Highly linear, sensitive analog-to-digital converter

    NASA Technical Reports Server (NTRS)

    Cox, J.; Finley, W. R.

    1969-01-01

    Analog-to-digital converter converts a 10 volt full-scale input signal into a 13 bit digital output. Advantages include high sensitivity, linearity, low quantizing error, high resistance to mechanical shock and vibration loads, and temporary data storage capabilities.
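
    For context, the quantization step implied by these figures follows directly from the full-scale range and word length:

    ```latex
    \[ \Delta V \;=\; \frac{V_{\mathrm{FS}}}{2^{13}} \;=\; \frac{10\ \mathrm{V}}{8192} \;\approx\; 1.22\ \mathrm{mV} \]
    ```

    so the quoted high sensitivity corresponds to roughly millivolt-level resolution.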

  11. Examining unusual digit span performance in a population of postsecondary students assessed for academic difficulties.

    PubMed

    Harrison, Allyson G; Rosenblum, Yoni; Currie, Shannon

    2010-09-01

    Methods of identifying poor test-related motivation using the Wechsler Adult Intelligence Scale Digit Span subtest are based on identification of performance patterns that are implausible if the test taker is investing full effort. No studies to date, however, have examined the specificity of such measures, particularly when evaluating persons with either known or suspected learning or attention disorders. This study investigated performance of academically challenged students on three measures embedded in the Wechsler Adult Intelligence Scale-III, namely, low Digit Span, high Vocabulary minus Digit Span (Voc-DS), and low Reliable Digit Span scores. Evaluating subjects believed to be investing full effort in testing, it was found that both Digit Span and Reliable Digit Span had high specificity, although both showed relatively lower sensitivity. In contrast, Voc-DS was especially weak in both sensitivity and specificity, with an apparent false-positive rate of 28%. Use of Voc-DS is therefore not appropriate for those with a history of learning or attention problems.
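
    For readers unfamiliar with the index, Reliable Digit Span is usually scored as the longest forward span at which both trials are recalled correctly plus the longest such backward span. A small sketch of that scoring rule (the trial data layout here is illustrative):

    ```python
    def reliable_digit_span(forward, backward):
        """Sum of the longest spans with BOTH trials passed, forward plus
        backward. Each argument maps span length -> (trial1_ok, trial2_ok)."""
        def longest(trials):
            return max((n for n, (t1, t2) in trials.items() if t1 and t2),
                       default=0)
        return longest(forward) + longest(backward)

    fwd = {4: (True, True), 5: (True, True), 6: (True, False)}
    bwd = {3: (True, True), 4: (True, True), 5: (False, False)}
    print(reliable_digit_span(fwd, bwd))  # 5 + 4 = 9
    ```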

  12. Carbon nanotube transistor based high-frequency electronics

    NASA Astrophysics Data System (ADS)

    Schroter, Michael

    At the nanoscale, carbon nanotubes (CNTs) have higher carrier mobility and carrier velocity than most incumbent semiconductors. Thus CNT-based field-effect transistors (FETs) are being considered as strong candidates for replacing existing MOSFETs in digital applications. In addition, the predicted high intrinsic transit frequency and the more recent finding of ways to achieve highly linear transfer characteristics have inspired investigations of analog high-frequency (HF) applications. High linearity is extremely valuable for an energy-efficient usage of the frequency spectrum, particularly in mobile communications. Compared to digital applications, the much more relaxed constraints for CNT placement and lithography, combined with already achieved operating frequencies of at least 10 GHz for fabricated devices, make an early entry into the low-GHz HF market more feasible than into large-scale digital circuits. Such a market entry would be extremely beneficial for funding the development of production CNTFET-based process technology. This talk will provide an overview of the present status and feasibility of HF CNTFET technology from an engineering point of view, including device modeling, experimental results, and existing roadblocks.

  14. The Engineer Topographic Laboratories (ETL) hybrid optical/digital image processor

    NASA Astrophysics Data System (ADS)

    Benton, J. R.; Corbett, F.; Tuft, R.

    1980-01-01

    An optical-digital processor for generalized image enhancement and filtering is described. The optical subsystem is a two-PROM Fourier filter processor. Input imagery is isolated, scaled, and imaged onto the first PROM; this input plane acts like a liquid gate and serves as an incoherent-to-coherent converter. The image is transformed onto a second PROM which also serves as a filter medium; filters are written onto the second PROM with a laser scanner in real time. A solid state CCTV camera records the filtered image, which is then digitized and stored in a digital image processor. The operator can then manipulate the filtered image using the gray scale and color remapping capabilities of the video processor as well as the digital processing capabilities of the minicomputer.

  15. Multiple Intelligence and Digital Learning Awareness of Prospective B.Ed Teachers

    ERIC Educational Resources Information Center

    Gracious, F. L. Antony; Shyla, F. L. Jasmine Anne

    2012-01-01

    The present study, Multiple Intelligence and Digital Learning Awareness of Prospective B.Ed Teachers, probed the relationship between the multiple intelligence and the digital learning awareness of prospective B.Ed teachers. Data for the study were collected using a self-made Multiple Intelligence Inventory and a Digital Learning Awareness Scale.…

  16. Financing a large-scale picture archival and communication system.

    PubMed

    Goldszal, Alberto F; Bleshman, Michael H; Bryan, R Nick

    2004-01-01

    An attempt to finance a large-scale multi-hospital picture archival and communication system (PACS) based solely on cost savings from current film operations is reported. A modified Request for Proposal described the technical requirements, PACS architecture, and performance targets. The Request for Proposal was complemented by a set of desired financial goals, the main one being the ability to use film savings to pay for the implementation and operation of the PACS. Financing of the enterprise-wide PACS was completed through an operating lease agreement including all PACS equipment, implementation, service, and support for an 8-year term, much like a complete outsourcing arrangement. Equipment refreshes, both hardware and software, are included. Our agreement also linked the management of the digital imaging operation (PACS) and the traditional film printing, shifting the operational risks of continued printing and the costs related to implementation delays to the PACS vendor. An additional optimization step eliminated the negative film-budget variances at the beginning of the project, when PACS costs tend to be higher than film and film-related expenses. An enterprise-wide PACS has been adopted to achieve clinical workflow improvements and cost savings. PACS financing was based solely on film savings, which covered the entire digital solution (PACS) and any residual film printing. These goals were achieved while eliminating any over-budget scenarios, providing a non-negative cash flow in each year of the 8-year term.

  17. Dynamic displacement measurement of large-scale structures based on the Lucas-Kanade template tracking algorithm

    NASA Astrophysics Data System (ADS)

    Guo, Jie; Zhu, Chang'an

    2016-01-01

    The development of optics and computer technologies enables the application of vision-based techniques that use digital cameras to the displacement measurement of large-scale structures. Compared with traditional contact measurements, the vision-based technique allows for remote measurement, is non-intrusive, and does not introduce additional mass onto the structure. In this study, a high-speed camera system is developed to complete the displacement measurement in real time. The system consists of a high-speed camera and a notebook computer. The high-speed camera can capture images at a speed of hundreds of frames per second. To process the captured images in the computer, the Lucas-Kanade template tracking algorithm from the field of computer vision is introduced. Additionally, a modified inverse compositional algorithm is proposed to reduce the computing time of the original algorithm and further improve its efficiency. The modified algorithm can accomplish one displacement extraction within 1 ms without having to install any pre-designed target panel onto the structure in advance. The accuracy and efficiency of the system in the remote measurement of dynamic displacement are demonstrated in experiments on a motion platform and on a sound barrier on a suspension viaduct. Experimental results show that the proposed algorithm can extract an accurate displacement signal and accomplish the vibration measurement of large-scale structures.
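
    A minimal sketch of the inverse-compositional Lucas-Kanade idea for a pure-translation warp, where the template gradient and Hessian are precomputed so each iteration costs only one image warp (the paper's modified algorithm is more elaborate; the translation-only warp and all names are assumptions):

    ```python
    import numpy as np
    from scipy.ndimage import map_coordinates, sobel

    def ic_lk_translation(image, template, p0=(0.0, 0.0), n_iters=50, tol=1e-3):
        """Inverse-compositional LK for a translation-only warp (dx, dy)."""
        T = template.astype(float)
        h, w = T.shape
        ys, xs = np.mgrid[0:h, 0:w].astype(float)
        # For translation the warp Jacobian is the identity, so the
        # steepest-descent images are just the template gradients.
        gx = sobel(T, axis=1) / 8.0
        gy = sobel(T, axis=0) / 8.0
        sd = np.stack([gx.ravel(), gy.ravel()], axis=1)  # N x 2
        H_inv = np.linalg.inv(sd.T @ sd)                 # 2 x 2 Hessian, fixed
        p = np.array(p0, dtype=float)
        for _ in range(n_iters):
            warped = map_coordinates(image.astype(float),
                                     [ys + p[1], xs + p[0]], order=1)
            dp = H_inv @ (sd.T @ (warped - T).ravel())
            p -= dp  # inverse-compositional update, translation case
            if np.hypot(*dp) < tol:
                break
        return p
    ```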

  18. Incremental terrain processing for large digital elevation models

    NASA Astrophysics Data System (ADS)

    Ye, Z.

    2012-12-01

    Efficient analysis of large digital elevation models (DEMs) requires generation of additional DEM artifacts such as flow direction, flow accumulation and other DEM derivatives. When the DEMs to analyze have a large number of grid cells (usually > 1,000,000,000), the generation of these DEM derivatives is either impractical (it takes too long) or impossible (software is incapable of processing such a large number of cells). Different strategies and algorithms can be put in place to alleviate this situation. This paper describes an approach where the overall DEM is partitioned into smaller processing units that can be efficiently processed. The processed DEM derivatives for each partition can then be either mosaicked back into a single large entity or managed at the partition level. For dendritic terrain morphologies, the way in which partitions are derived and the order in which they are processed depend on the river and catchment patterns. These patterns are not available until the flow pattern of the whole region is created, which in turn cannot be established upfront due to the size issues. This paper describes a procedure that solves this problem: (1) Resample the original large DEM grid so that the total number of cells is reduced to a level for which the drainage pattern can be established. (2) Run standard terrain preprocessing operations on the resampled DEM to generate the river and catchment system. (3) Define the processing units and their processing order based on the river and catchment system created in step (2). (4) Based on the processing order, apply the analysis (i.e., the flow accumulation operation) to each of the processing units at the full-resolution DEM. (5) As each processing unit is processed, compare the resulting drainage pattern with the drainage pattern established at the coarser scale and adjust the drainage boundaries and rivers if necessary. A sketch of the partition-ordering step (3) is given below.
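
    The partition-ordering step is essentially a topological sort of the coarse-scale drainage graph, since every upstream unit must be processed before the unit it drains into; a minimal sketch (partition ids and the `downstream` mapping are illustrative):

    ```python
    from collections import defaultdict, deque

    def processing_order(downstream):
        """Order partitions so each upstream unit precedes the unit it
        drains into. `downstream` maps partition id -> the id it flows
        into, or None for an outlet (Kahn's topological sort)."""
        indeg = defaultdict(int)
        for u, d in downstream.items():
            indeg.setdefault(u, 0)
            if d is not None:
                indeg[d] += 1
        queue = deque(u for u, n in indeg.items() if n == 0)  # headwaters
        order = []
        while queue:
            u = queue.popleft()
            order.append(u)
            d = downstream.get(u)
            if d is not None:
                indeg[d] -= 1
                if indeg[d] == 0:
                    queue.append(d)
        return order

    # Partitions A and B drain into C, which drains to the outlet:
    print(processing_order({"A": "C", "B": "C", "C": None}))  # ['A', 'B', 'C']
    ```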

  19. Modern Methods for fast generation of digital holograms

    NASA Astrophysics Data System (ADS)

    Tsang, P. W. M.; Liu, J. P.; Cheung, K. W. K.; Poon, T.-C.

    2010-06-01

    With the advancement of computers, digital holography (DH) has become an area of research that has gained much popularity. Research findings derived from this technology enable holograms representing three-dimensional (3-D) scenes to be acquired with optical means, or generated with numerical computation. In both cases, the holograms are in the form of numerical data that can be recorded, transmitted, and processed with digital techniques. On top of that, the availability of high-capacity digital storage and wide-band communication technologies also casts light on the emergence of real-time video holographic systems, enabling animated 3-D content to be encoded as holographic data and distributed via existing media. At present, development in DH has reached a reasonable degree of maturity, but at the same time the heavy computation involved imposes difficulty in practical applications. In this paper, a summary of a number of successful accomplishments that have been made recently in overcoming this problem is presented. Subsequently, we propose an economical framework that is suitable for real-time generation and transmission of holographic video signals over existing distribution media. The proposed framework includes an aspect of extending the depth range of the object scene, which is important for the display of large-scale objects.
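
    As a concrete flavour of the computation whose cost such work attacks, here is a minimal point-source sketch of numerical hologram generation (wavelength, pitch, and geometry are illustrative; real CGH pipelines use far larger grids and the fast factorizations the paper surveys):

    ```python
    import numpy as np

    # Point-source computer-generated hologram: every object point emits a
    # spherical wave, and the hologram records its interference with a
    # plane reference wave.
    wavelength = 633e-9                    # metres (assumed HeNe-like source)
    pitch = 8e-6                           # hologram pixel pitch, metres
    N = 512
    k = 2.0 * np.pi / wavelength
    x = (np.arange(N) - N / 2) * pitch
    X, Y = np.meshgrid(x, x)

    points = [(0.0, 0.0, 0.05), (2e-4, -1e-4, 0.06)]   # (x, y, depth z) in metres
    field = np.zeros((N, N), dtype=complex)
    for px, py, pz in points:
        r = np.sqrt((X - px) ** 2 + (Y - py) ** 2 + pz ** 2)
        field += np.exp(1j * k * r) / r    # spherical wave contribution

    hologram = np.abs(field + 1.0) ** 2    # interference with unit reference
    ```

    The cost grows as O(N² × number of object points) per frame, which is exactly the burden that fast generation methods aim to reduce.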

  20. Digital Object Identifiers (DOIs): usage and adoption in the U.S. Geological Survey (USGS)

    NASA Astrophysics Data System (ADS)

    Frame, M. T.; Palanisamy, G.

    2013-12-01

    Addressing grand environmental science challenges requires unprecedented access to easily understood data that cross the breadth of temporal, spatial, and thematic scales. From a scientist's perspective, the big challenges lie in discovering the relevant data, dealing with extreme data heterogeneity and large data volumes, and converting data to information and knowledge. Historical linkages between derived products (e.g., publications) and associated datasets have not existed in the earth science community. The USGS Core Science Analytics and Synthesis, in collaboration with DOE's Oak Ridge National Laboratory (ORNL) Mercury Consortium (funded by NASA, USGS and DOE), established a Digital Object Identifier (DOI) service for USGS data, metadata, and other media. This service is offered in partnership with the University of California Digital Library's EZID service. USGS scientists, data managers, and other professionals can generate globally unique, persistent and resolvable identifiers for any kind of digital object. Additional efforts to assign DOIs to historical data and publications have also been underway. These DOI identifiers are being used to cite data in journal articles, web-accessible datasets, and other media for distribution, integration, and in support of improved data management practices. The session will discuss the current DOI efforts within USGS, including a discussion of adoption, challenges, and future efforts necessary to improve access, reuse, sharing, and discoverability of USGS data and information.

  1. Geology and mineral resource assessment of the Venezuelan Guayana Shield at 1:500,000 scale; a digital representation of maps published by the U.S. Geological Survey

    USGS Publications Warehouse

    Schruben, Paul G.; Wynn, J.C.; Gray, Floyd; Cox, D.P.; Sterwart, J.H.; Brooks, W.E.

    1997-01-01

    This CD-ROM contains vector-based digital maps of the geology and resource assessment of the Venezuela Guayana Shield originally published as paper maps in 1993 in U. S. Geological Survey Bulletin 2062, at a scale of 1:1 million and revised in 1993-95 as separate maps at a scale of 1:500,000. Although the maps on this disc can be displayed at different scales, they are not intended to be used at any scale more detailed than 1:500,000.

  2. Building the interspace: Digital library infrastructure for a University Engineering Community

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schatz, B.

    A large-scale digital library is being constructed and evaluated at the University of Illinois, with the goal of bringing professional search and display to Internet information services. A testbed planned to grow to 10K documents and 100K users is being constructed in the Grainger Engineering Library Information Center, as a joint effort of the University Library and the National Center for Supercomputing Applications (NCSA), with evaluation and research by the Graduate School of Library and Information Science and the Department of Computer Science. The electronic collection will be articles from engineering and science journals and magazines, obtained directly from publishers in SGML format and displayed containing all text, figures, tables, and equations. The publisher partners include IEEE Computer Society, AIAA (Aerospace Engineering), American Physical Society, and Wiley & Sons. The software will be based upon NCSA Mosaic as a network engine connected to commercial SGML displayers and full-text searchers. The users will include faculty/students across the midwestern universities in the Big Ten, with evaluations via interviews, surveys, and transaction logs. Concurrently, research into scaling the testbed is being conducted. This includes efforts in computer science, information science, library science, and information systems. These efforts will evaluate different semantic retrieval technologies, including automatic thesaurus and subject classification graphs. New architectures will be designed and implemented for a next generation digital library infrastructure, the Interspace, which supports interaction with information spread across information spaces within the Net.

  3. Iterative current mode per pixel ADC for 3D SoftChip implementation in CMOS

    NASA Astrophysics Data System (ADS)

    Lachowicz, Stefan W.; Rassau, Alexander; Lee, Seung-Minh; Eshraghian, Kamran; Lee, Mike M.

    2003-04-01

    Mobile multimedia communication has rapidly become a significant area of research and development, constantly challenging boundaries on a variety of technological fronts. The processing requirements for the capture, conversion, compression, decompression, enhancement, display, etc. of increasingly higher quality multimedia content place heavy demands even on current ULSI (ultra large scale integration) systems, particularly for mobile applications where area and power are primary considerations. The ADC presented in this paper is designed for a vertically integrated (3D) system comprising two distinct layers bonded together using indium bump technology. The top layer is a CMOS imaging array containing analogue-to-digital converters and a buffer memory. The bottom layer takes the form of a configurable array processor (CAP), a highly parallel array of soft-programmable processors capable of carrying out complex processing tasks directly on data stored in the top plane. This paper presents an ADC scheme for the image capture plane. The analogue photocurrent or sampled voltage is transferred to the ADC via a column or a column/row bus. In the proposed system, an array of analogue-to-digital converters is distributed so that a one-bit cell is associated with one sensor. The analogue-to-digital converters are algorithmic current-mode converters. Eight such cells are cascaded to form an 8-bit converter. Additionally, each photo-sensor is equipped with a current memory cell, and multiple conversions are performed with scaled values of the photocurrent for colour processing.
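
    A behavioural sketch of the cyclic (algorithmic) conversion idea described above: each of the cascaded one-bit cells compares the residue against the reference, subtracts on a 1, and passes a doubled residue to the next cell. This models the principle only, not the current-mode transistor circuit; names and values are illustrative.

    ```python
    def algorithmic_adc(i_in, i_ref, n_bits=8):
        """Bit-serial algorithmic conversion: compare, subtract, double.
        i_in is the input current (normalized), i_ref the full-scale
        reference; returns MSB-first bits."""
        bits = []
        residue = float(i_in)
        for _ in range(n_bits):
            if residue >= i_ref / 2.0:
                bits.append(1)
                residue -= i_ref / 2.0
            else:
                bits.append(0)
            residue *= 2.0  # gain-of-two stage feeding the next cell
        return bits

    print(algorithmic_adc(0.3, 1.0))  # -> [0, 1, 0, 0, 1, 1, 0, 0] for 0.3 of full scale
    ```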

  4. Reconstruction of halo power spectrum from redshift-space galaxy distribution: cylinder-grouping method and halo exclusion effect

    NASA Astrophysics Data System (ADS)

    Okumura, Teppei; Takada, Masahiro; More, Surhud; Masaki, Shogo

    2017-07-01

    The peculiar velocity field measured by redshift-space distortions (RSD) in galaxy surveys provides a unique probe of the growth of large-scale structure. However, systematic effects arise when including satellite galaxies in the clustering analysis. Since satellite galaxies tend to reside in massive haloes with a greater halo bias, the inclusion boosts the clustering power. In addition, virial motions of the satellite galaxies cause a significant suppression of the clustering power due to non-linear RSD effects. We develop a novel method to recover the redshift-space power spectrum of haloes from the observed galaxy distribution by minimizing the contamination of satellite galaxies. The cylinder-grouping method (CGM) we study effectively excludes satellite galaxies from a galaxy sample. However, we find that this technique produces apparent anisotropies in the reconstructed halo distribution over all the scales which mimic RSD. On small scales, the apparent anisotropic clustering is caused by exclusion of haloes within the anisotropic cylinder used by the CGM. On large scales, the misidentification of different haloes in the large-scale structures, aligned along the line of sight, into the same CGM group causes the apparent anisotropic clustering via their cross-correlation with the CGM haloes. We construct an empirical model for the CGM halo power spectrum, which includes correction terms derived using the CGM window function at small scales as well as the linear matter power spectrum multiplied by a simple anisotropic function at large scales. We apply this model to a mock galaxy catalogue at z = 0.5, designed to resemble Sloan Digital Sky Survey-III Baryon Oscillation Spectroscopic Survey (BOSS) CMASS galaxies, and find that our model can predict both the monopole and quadrupole power spectra of the host haloes up to k < 0.5 h Mpc⁻¹ to within 5 per cent.
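
    In schematic form, the empirical model described above combines a rescaled linear spectrum on large scales with a window-function correction on small scales; one hedged way of writing it (not the paper's exact notation) is:

    ```latex
    \[ P^{\mathrm{CGM}}_{\mathrm{halo}}(k,\mu) \;\simeq\; b^2\, D(k,\mu)\, P_{\mathrm{lin}}(k) \;+\; C_{\mathrm{win}}(k,\mu) \]
    % b: halo bias; P_lin: linear matter power spectrum; D(k, mu): a simple
    % anisotropic function of the line-of-sight angle cosine mu; C_win: the
    % small-scale correction derived from the CGM window function. All
    % symbols are illustrative placeholders for the terms named above.
    ```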

  5. Automated Processing of ISIS Topside Ionograms into Electron Density Profiles

    NASA Technical Reports Server (NTRS)

    Reinisch, Bodo W.; Huang, Xueqin; Bilitza, Dieter; Hills, H. Kent

    2004-01-01

    Modeling of the topside ionosphere has for the most part relied on just a few years of data from topside sounder satellites. The widely used Bent et al. (1972) model, for example, is based on only 50,000 Alouette 1 profiles. The International Reference Ionosphere (IRI) (Bilitza, 1990, 2001) uses an analytical description of the graphs and tables provided by Bent et al. (1972). The Alouette 1, 2 and ISIS 1, 2 topside sounder satellites of the sixties and seventies were ahead of their time in terms of the sheer volume of data obtained and in terms of the computer and software requirements for data analysis. As a result, only a small percentage of the collected topside ionograms was converted into electron density profiles. Recently, a NASA-funded data restoration project has undertaken and is continuing the process of digitizing the Alouette/ISIS ionograms from the analog 7-track tapes. Our project involves the automated processing of these digital ionograms into electron density profiles. The project accomplished a set of important goals that will have a major impact on understanding and modeling of the topside ionosphere: (1) The TOPside Ionogram Scaling and True height inversion (TOPIST) software was developed for the automated scaling and inversion of topside ionograms. (2) The TOPIST software was applied to the over 300,000 ISIS-2 topside ionograms that had been digitized in the framework of a separate AISRP project (PI: R.F. Benson). (3) The new TOPIST-produced database of global electron density profiles for the topside ionosphere was made publicly available through NASA's National Space Science Data Center (NSSDC) ftp archive at nssdcftp.gsfc.nasa.gov. (4) Earlier Alouette 1, 2 and ISIS 1, 2 data sets of electron density profiles from manual scaling of selected sets of ionograms were converted from a highly compressed binary format into a user-friendly ASCII format and made publicly available through nssdcftp.gsfc.nasa.gov. The new database for the topside ionosphere established as a result of this project has stimulated a multitude of new studies directed towards a better description and prediction of the topside ionosphere. Marinov et al. (2004) developed a new model for the upper ion transition height (oxygen to hydrogen and helium) and Bilitza (2004) deduced a correction term for the IRI topside electron density model. Kutiev et al. (2005) used these data to develop a new model for the topside ionosphere scale height (TISH) as a function of month, local time, latitude, longitude, and solar flux F10.7. Comparisons by Belehaki et al. (2005) show that TISH is in general agreement with scale heights deduced from ground ionosondes, but the model predicts post-midnight and afternoon maxima whereas the ionosonde data show a noon maximum. Webb and Benson (2005) reported on their effort to deduce changes in the plasma temperature and ion composition from changes in the topside electron density profile as recorded by topside sounders. Limitations and possible improvements of the IRI topside model were discussed by Coisson et al. (2005), including the possible use of the NeQuick model. Our project progressed in close collaboration and coordination with the GSFC team involved in the ISIS digitization effort. The digitization project was highly successful, producing a large amount of digital topside ionograms. Several no-cost extensions of the TOPIST project were necessary to keep up with the pace and volume of the digitization effort.

  6. The value of the Wechsler Intelligence Scale for Children-Fourth Edition Digit Span as an embedded measure of effort: an investigation into children with dual diagnoses.

    PubMed

    Loughan, Ashlee R; Perna, Robert; Hertza, Jeremy

    2012-11-01

    The Test of Memory Malingering (TOMM) is a measure of test-taking effort which has traditionally been utilized with adults but has more recently demonstrated utility with children. The purpose of this study was to investigate whether the Wechsler Intelligence Scale for Children-Fourth Edition (WISC-IV) Digit Span, commonly used in neuropsychological evaluations, can also function as an embedded measure of effort in children with dual diagnoses, a population yet to be investigated. Participants (n = 51) who completed neuropsychological evaluations including the TOMM, WISC-IV, Wisconsin Card Sorting Test, Children's Memory Scale, and Delis-Kaplan Executive Function System were divided into two groups, Optimal Effort and Suboptimal Effort, based on their TOMM Trial 2 scores. Findings suggest that a Digit Span scaled score of ≤4 is a useful cutoff, yielding specificity of 91% and sensitivity of 43%. This study supports previous research that the WISC-IV Digit Span has good utility in determining optimal effort, even in children with dual diagnoses or comorbidities.
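
    A hedged sketch of how such cutoff statistics are computed against a criterion grouping; the variable names and group labels below are hypothetical placeholders, not the study's data.

      def cutoff_metrics(scaled_scores, suboptimal_group, cutoff=4):
          # Flag 'suboptimal effort' when the Digit Span scaled score falls
          # at or below the cutoff, then compare the flags with the
          # TOMM-derived group membership (True = Suboptimal Effort).
          flagged = [s <= cutoff for s in scaled_scores]
          tp = sum(f and g for f, g in zip(flagged, suboptimal_group))
          tn = sum(not f and not g for f, g in zip(flagged, suboptimal_group))
          n_pos = sum(suboptimal_group)
          sensitivity = tp / n_pos
          specificity = tn / (len(suboptimal_group) - n_pos)
          return sensitivity, specificity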

  7. Digital versus analog complete-arch impressions for single-unit premolar implant crowns: Operating time and patient preference.

    PubMed

    Schepke, Ulf; Meijer, Henny J A; Kerdijk, Wouter; Cune, Marco S

    2015-09-01

    Digital impression-making techniques are supposedly more patient-friendly and less time-consuming than analog techniques, but evidence is lacking to substantiate this assumption. The purpose of this in vivo within-subject comparison study was to examine patient perception and time consumption for 2 complete-arch impression-making methods: a digital and an analog technique. Fifty participants with a single missing premolar were included. Treatment consisted of implant therapy. Three months after implant placement, complete-arch digital (Cerec Omnicam; Sirona) and analog impressions (semi-individual tray, Impregum; 3M ESPE) were made, and the participant's opinion was evaluated with a standard questionnaire addressing several domains (inconvenience, shortness of breath, fear of repeating the impression, and feelings of helplessness during the procedure) on a visual analog scale. All participants were asked which procedure they preferred. Operating time was measured with a stopwatch. Impressions made for maxillary and mandibular implants were also compared. The data were analyzed with paired and independent-sample t tests, and effect sizes were calculated. Statistically significant differences were found in favor of the digital procedure in all subjective domains (P<.001), with medium to large effect sizes. Over 80% of participants preferred the digital procedure to the analog procedure. The mean duration of digital impression making was 6 minutes and 39 seconds (SD=1:51) versus 12 minutes and 13 seconds (SD=1:24) for the analog impression (P<.001, effect size=2.7). Digital impression making for the restoration of a single implant crown takes less time than analog impression making. Furthermore, participants preferred the digital scan and reported less inconvenience, less shortness of breath, less fear of repeating the impression, and fewer feelings of helplessness during the procedure. Copyright © 2015 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.
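
    The statistics named above can be reproduced on any paired dataset along the following lines; the durations here are invented placeholders, not the study's measurements.

      import numpy as np
      from scipy import stats

      digital = np.array([405., 388., 420., 391., 399.])  # seconds, hypothetical
      analog  = np.array([740., 718., 733., 725., 731.])  # seconds, hypothetical

      t, p = stats.ttest_rel(digital, analog)     # paired t test
      diff = digital - analog
      d = diff.mean() / diff.std(ddof=1)          # Cohen's d for paired samples
      print(t, p, d)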

  8. Towards machine learned quality control: A benchmark for sharpness quantification in digital pathology.

    PubMed

    Campanella, Gabriele; Rajanna, Arjun R; Corsale, Lorraine; Schüffler, Peter J; Yagi, Yukako; Fuchs, Thomas J

    2018-04-01

    Pathology is on the verge of a profound change from an analog and qualitative to a digital and quantitative discipline. This change is mostly driven by the high-throughput scanning of microscope slides in modern pathology departments, reaching tens of thousands of digital slides per month. The resulting vast digital archives form the basis of clinical use in digital pathology and allow large-scale machine learning in computational pathology. One of the most crucial bottlenecks of high-throughput scanning is quality control (QC). Currently, digital slides are screened manually to detect out-of-focus regions, to compensate for the limitations of scanner software. We present a solution to this problem by introducing a benchmark dataset for blur detection and an in-depth comparison of state-of-the-art sharpness descriptors and their prediction performance within a random forest framework. Furthermore, we show that convolutional neural networks, such as residual networks, can be used to train blur detectors from scratch. We thoroughly evaluate the accuracy of feature-based and deep-learning-based approaches for sharpness classification (99.74% accuracy) and regression (MSE 0.004) and additionally compare them to domain experts in a comprehensive human perception study. Our pipeline outputs spatial heatmaps that quantify and localize blurred areas on a slide. Finally, we tested the proposed framework in the clinical setting and demonstrated superior performance over the state-of-the-art QC pipeline comprising commercial software and human expert inspection, reducing the error rate from 17% to 4.7%. Copyright © 2017. Published by Elsevier Ltd.
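
    As a concrete example of the kind of classical sharpness descriptor such a benchmark compares, a variance-of-Laplacian focus measure can be scored per tile to build a blur heatmap; this is a generic sketch, not the paper's pipeline.

      import numpy as np

      def laplacian_sharpness(tile):
          # Variance of a discrete Laplacian response over a grayscale
          # float tile; low values indicate blur.  Edges wrap (np.roll),
          # which is acceptable for a tile-level focus score.
          lap = (np.roll(tile, 1, 0) + np.roll(tile, -1, 0) +
                 np.roll(tile, 1, 1) + np.roll(tile, -1, 1) - 4.0 * tile)
          return lap.var()

      # A slide-level heatmap is then the descriptor evaluated on a grid
      # of tiles: heatmap[i, j] = laplacian_sharpness(tile_at(i, j)).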

  9. A CCD experimental platform for large telescope in Antarctica based on FPGA

    NASA Astrophysics Data System (ADS)

    Zhu, Yuhua; Qi, Yongjun

    2014-07-01

    The CCD, as a detector, is one of the important components of astronomical telescopes. A large telescope in Antarctica requires a CCD detector system with large size, high sensitivity, and low noise. Because of the extremely low temperatures and unattended operation, system maintenance and software and hardware upgrades become hard problems. This paper introduces a general CCD controller experimental platform based on a field-programmable gate array (FPGA), which is, in effect, a large-scale field-reconfigurable array. Taking advantage of how easily such a system can be modified, the driving circuit, digital signal processing module, network communication interface, control algorithm validation, and remote reconfiguration module can all be realized on it. With the concept of integrated hardware and software, the paper discusses the key technologies for building a scientific CCD system suited to the special working environment in Antarctica, focusing on the method of remote reconfiguration of the controller via network, and offers a feasible hardware and software solution.

  10. Clustering in the SDSS Redshift Survey

    NASA Astrophysics Data System (ADS)

    Zehavi, I.; Blanton, M. R.; Frieman, J. A.; Weinberg, D. H.; SDSS Collaboration

    2002-05-01

    We present measurements of clustering in the Sloan Digital Sky Survey (SDSS) galaxy redshift survey. Our current sample consists of roughly 80,000 galaxies with redshifts in the range 0.02 < z < 0.2, covering about 1200 square degrees. We measure the clustering in redshift space and in real space. The two-dimensional correlation function ξ(r_p, π) shows clear signatures of redshift distortions, both the small-scale "fingers-of-God" effect and the large-scale compression. The inferred real-space correlation function is well described by a power law. The SDSS is especially suitable for investigating the dependence of clustering on galaxy properties, due to the wealth of information in the photometric survey. We focus on the dependence of clustering on color and on luminosity.
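
    The abstract does not state which pair-count estimator is used; a standard choice for such correlation measurements is the Landy-Szalay estimator, sketched here for a single separation bin under that assumption.

      def landy_szalay(dd, dr, rr, n_data, n_rand):
          # xi = (DD - 2 DR + RR) / RR, with raw pair counts rescaled by
          # the random-to-data ratio f (approximate normalization that
          # ignores the N-1 terms).
          f = n_rand / n_data
          return (dd * f**2 - 2.0 * dr * f + rr) / rr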

  11. Preliminary integrated geologic map databases for the United States: Digital data for the geology of southeast Alaska

    USGS Publications Warehouse

    Gehrels, George E.; Berg, Henry C.

    2006-01-01

    The growth in the use of Geographic Information Systems (GIS) has highlighted the need for digital geologic maps that have been attributed with information about age and lithology. Such maps can be conveniently used to generate derivative maps for manifold special purposes such as mineral-resource assessment, metallogenic studies, tectonic studies, and environmental research. This report is part of a series of integrated geologic map databases that cover the entire United States. Three national-scale geologic maps that portray most or all of the United States already exist; for the conterminous U.S., King and Beikman (1974a,b) compiled a map at a scale of 1:2,500,000, Beikman (1980) compiled a map for Alaska at 1:2,500,000 scale, and for the entire U.S., Reed and others (2005a,b) compiled a map at a scale of 1:5,000,000. A digital version of the King and Beikman map was published by Schruben and others (1994). Reed and Bush (2004) produced a digital version of the Reed and others (2005a) map for the conterminous U.S. The present series of maps is intended to provide the next step in increased detail. State geologic maps that range in scale from 1:100,000 to 1:1,000,000 are available for most of the country, and digital versions of these state maps are the basis of this product. The digital geologic maps presented here are in a standardized format as ARC/INFO export files and as ArcView shape files. Data tables that relate the map units to detailed lithologic and age information accompany these GIS files. The map is delivered as a set of 1:250,000-scale quadrangle files. To the best of our ability, these quadrangle files are edge-matched with respect to geology. When the maps are merged, the combined attribute tables can be used directly with the merged maps to make derivative maps.

  12. Digital Data for the reconnaissance geologic map for the Kuskokwim Bay Region of Southwest Alaska

    USGS Publications Warehouse

    Wilson, Frederic H.; Hults, Chad P.; Mohadjer, Solmaz; Coonrad, Warren L.; Shew, Nora B.; Labay, Keith A.

    2008-01-01

    The growth in the use of Geographic Information Systems (GIS) has highlighted the need for digital geologic maps that have been attributed with information about age and lithology. Such maps can be conveniently used to generate derivative maps for manifold special purposes such as mineral-resource assessment, metallogenic studies, tectonic studies, and environmental research. This report is part of a series of integrated geologic map databases that cover the entire United States. Three national-scale geologic maps that portray most or all of the United States already exist; for the conterminous U.S., King and Beikman (1974a,b) compiled a map at a scale of 1:2,500,000, Beikman (1980) compiled a map for Alaska at 1:2,500,000 scale, and for the entire U.S., Reed and others (2005a,b) compiled a map at a scale of 1:5,000,000. A digital version of the King and Beikman map was published by Schruben and others (1994). Reed and Bush (2004) produced a digital version of the Reed and others (2005a) map for the conterminous U.S. The present series of maps is intended to provide the next step in increased detail. State geologic maps that range in scale from 1:100,000 to 1:1,000,000 are available for most of the country, and digital versions of these state maps are the basis of this product. The digital geologic maps presented here are in a standardized format as ARC/INFO export files and as ArcView shape files. Data tables that relate the map units to detailed lithologic and age information accompany these GIS files. The map is delivered as a set of 1:250,000-scale quadrangle files. To the best of our ability, these quadrangle files are edge-matched with respect to geology. When the maps are merged, the combined attribute tables can be used directly with the merged maps to make derivative maps.

  13. Publications - MP 141 | Alaska Division of Geological & Geophysical Surveys

    Science.gov Websites

    DGGS MP 141 publication details. Title: Quaternary faults and folds in Alaska: A digital database. Citation: Combellick, R.A., 2012, Quaternary faults and folds in Alaska: A digital database, in Koehler, R.D., Quaternary faults, scale 1:3,700,000 (63.0 M). Digital geospatial data.

  14. Initial Everglades Depth Estimation Network (EDEN) Digital Elevation Model Research and Development

    USGS Publications Warehouse

    Jones, John W.; Price, Susan D.

    2007-01-01

    The Everglades Depth Estimation Network (EDEN) offers a consistent and documented dataset that can be used to guide large-scale field operations, to integrate hydrologic and ecological responses, and to support biological and ecological assessments that measure ecosystem responses to the Comprehensive Everglades Restoration Plan (Telis, 2006). To produce historic and near-real-time maps of water depths, the EDEN requires a system-wide digital elevation model (DEM) of the ground surface. Accurate Everglades wetland ground surface elevation data were non-existent before the U.S. Geological Survey (USGS) undertook the collection of highly accurate surface elevations at the regional scale. These form the foundation for EDEN DEM development. This development process is iterative as additional high accuracy elevation data (HAED) are collected, water surfacing algorithms improve, and additional ground-based ancillary data become available. Models are tested using withheld HAED and independently measured water depth data, and by using DEM data in EDEN adaptive management applications. Here the collection of HAED is briefly described before the approach to DEM development and the current EDEN DEM are detailed. Finally, future research directions for continued model development, testing, and refinement are provided.

  15. Self-organized synchronization of digital phase-locked loops with delayed coupling in theory and experiment

    PubMed Central

    Wetzel, Lucas; Jörg, David J.; Pollakis, Alexandros; Rave, Wolfgang; Fettweis, Gerhard; Jülicher, Frank

    2017-01-01

    Self-organized synchronization occurs in a variety of natural and technical systems but has so far only attracted limited attention as an engineering principle. In distributed electronic systems, such as antenna arrays and multi-core processors, a common time reference is key to coordinate signal transmission and processing. Here we show how the self-organized synchronization of mutually coupled digital phase-locked loops (DPLLs) can provide robust clocking in large-scale systems. We develop a nonlinear phase description of individual and coupled DPLLs that takes into account filter impulse responses and delayed signal transmission. Our phase model permits analytical expressions for the collective frequencies of synchronized states, the analysis of stability properties and the time scale of synchronization. In particular, we find that signal filtering introduces stability transitions that are not found in systems without filtering. To test our theoretical predictions, we designed and carried out experiments using networks of off-the-shelf DPLL integrated circuitry. We show that the phase model can quantitatively predict the existence, frequency, and stability of synchronized states. Our results demonstrate that mutually delay-coupled DPLLs can provide robust and self-organized synchronous clocking in electronic systems. PMID:28207779
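
    A stripped-down numerical sketch of a delay-coupled phase model in the spirit described above (two oscillators, sinusoidal coupling, no filter dynamics); all parameter values are illustrative, and the paper's full DPLL model additionally includes the filter impulse responses.

      import numpy as np

      omega = 2 * np.pi * 1.0    # free-running angular frequency (rad/s)
      K, tau = 1.5, 0.8          # coupling strength and transmission delay
      dt, n = 1e-3, 50000
      d = int(tau / dt)

      phi = np.zeros((2, n))
      phi[1, 0] = 0.5            # start the two loops out of phase
      for t in range(n - 1):
          for i in (0, 1):
              # Each loop sees its partner's phase delayed by tau:
              c = K * np.sin(phi[1 - i, t - d] - phi[i, t]) if t >= d else 0.0
              phi[i, t + 1] = phi[i, t] + dt * (omega + c)

      # Residual phase difference and collective frequency of the locked state:
      print(phi[0, -1] - phi[1, -1])
      print((phi[0, -1] - phi[0, -1001]) / (1000 * dt))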

  16. Incoherent optical generalized Hough transform: pattern recognition and feature extraction applications

    NASA Astrophysics Data System (ADS)

    Fernández, Ariel; Ferrari, José A.

    2017-05-01

    Pattern recognition and feature extraction are image processing applications of great interest in defect inspection and robot vision, among others. In comparison to purely digital methods, the attractiveness of optical processors for pattern recognition lies in their highly parallel operation and real-time processing capability. This work presents an optical implementation of the generalized Hough transform (GHT), a well-established technique for recognition of geometrical features in binary images. Detection of a geometric feature under the GHT is accomplished by mapping the original image to an accumulator space; the large computational requirements for this mapping make the optical implementation an attractive alternative to digital-only methods. We explore an optical setup where the transformation is obtained and the size and orientation parameters can be controlled, allowing for dynamic scale- and orientation-variant pattern recognition. A compact system for the above purposes results from the use of an electrically tunable lens for scale control and a pupil mask implemented on a high-contrast spatial light modulator for orientation/shape variation of the template. Real-time operation can also be achieved. In addition, by thresholding the GHT and optically inverse transforming, the previously detected features of interest can be extracted.
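
    For reference, the digital counterpart of the mapping performed optically is the familiar R-table voting scheme; the sketch below assumes the R-table has already been built from the template and is keyed by quantized gradient orientation.

      import numpy as np

      def quantize(theta, n_bins=32):
          # Map a gradient angle (radians) to one of n_bins orientation bins.
          return int((theta % np.pi) / np.pi * n_bins) % n_bins

      def ght_accumulate(edge_points, gradient_angles, r_table, shape):
          # Each edge point votes for candidate reference-point locations
          # (offsets stored in the R-table under its orientation bin);
          # peaks in the accumulator mark detected template instances.
          acc = np.zeros(shape, dtype=int)
          for (x, y), theta in zip(edge_points, gradient_angles):
              for dx, dy in r_table.get(quantize(theta), ()):
                  xc, yc = x + dx, y + dy
                  if 0 <= xc < shape[0] and 0 <= yc < shape[1]:
                      acc[xc, yc] += 1
          return acc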

  17. Digital image transformation and rectification of spacecraft and radar images

    NASA Technical Reports Server (NTRS)

    Wu, S. S. C.

    1985-01-01

    The application of digital processing techniques to spacecraft television pictures and radar images is discussed. The use of digital rectification to produce contour maps from spacecraft pictures is described; images with azimuth and elevation angles are converted into point-perspective frame pictures. The digital correction of the slant angle of radar images to ground scale is examined. The development of orthophoto and stereoscopic shaded relief maps from digital terrain and digital image data is analyzed. Digital image transformations and rectifications are utilized on Viking Orbiter and Lander pictures of Mars.

  18. Improving the Automatic Inversion of Digital ISIS-2 Ionogram Reflection Traces into Topside Vertical Electron-Density Profiles

    NASA Technical Reports Server (NTRS)

    Benson, R. F.; Truhlik, V.; Huang, X.; Wang, Y.; Bilitza, D.

    2011-01-01

    The topside-sounders on the four satellites of the International Satellites for Ionospheric Studies (ISIS) program were designed as analog systems. The resulting ionograms were displayed on 35-mm film for analysis by visual inspection. Each of these satellites, launched between 1962 and 1971, produced data for 10 to 20 years. A number of the original telemetry tapes from this large data set have been converted directly into digital records. Software, known as the TOPside Ionogram Scaler with True-height (TOPIST) algorithm, has been produced that enables the automatic inversion of ISIS-2 ionogram reflection traces into topside vertical electron-density profiles Ne(h). More than a million digital Alouette/ISIS topside ionograms have been produced, and over 300,000 are from ISIS 2. Many of these ISIS-2 ionograms correspond to a passive mode of operation for the detection of natural radio emissions and thus do not contain ionospheric reflection traces. TOPIST, however, is not able to produce Ne(h) profiles from all of the ISIS-2 ionograms with reflection traces because some of them did not contain frequency information. This information was missing due to difficulties encountered during the analog-to-digital conversion process in the detection of the ionogram frame-sync pulse and/or the frequency markers. Of the many digital topside ionograms that TOPIST was able to process, over 200 were found where direct comparisons could be made with Ne(h) profiles that were produced by manual scaling in the early days of the ISIS program. While many of these comparisons indicated excellent agreement (<10% average difference over the entire profile), there were also many cases with large differences (more than a factor of two). Here we report on two approaches to improve the automatic inversion process: (1) improving the quality of the digital ionogram database by remedying the missing frequency-information problem when possible, and (2) using the above-mentioned comparisons as teaching examples of how to improve the original TOPIST software.

  19. Simultaneous in-plane and out-of-plane displacement measurement based on a dual-camera imaging system and its application to inspection of large-scale space structures

    NASA Astrophysics Data System (ADS)

    Ri, Shien; Tsuda, Hiroshi; Yoshida, Takeshi; Umebayashi, Takashi; Sato, Akiyoshi; Sato, Eiichi

    2015-07-01

    Optical methods providing full-field deformation data are of potentially enormous interest to mechanical engineers. In this study, an in-plane and out-of-plane displacement measurement method based on a dual-camera imaging system is proposed. The in-plane and out-of-plane displacements are determined simultaneously using in-plane displacement data observed by two digital cameras at different view angles. The fundamental measurement principle and experimental results of an accuracy confirmation are presented. In addition, we applied this method to displacement measurement in a static loading and bending test of a solid rocket motor case (CFRP material; 2.2 m diameter and 2.3 m long) for an up-to-date Epsilon rocket developed by JAXA. The effectiveness and measurement accuracy are confirmed by comparison with a conventional displacement sensor. This method could be useful for diagnosing the reliability of large-scale space structures in rocket development.
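
    A minimal sketch of the kind of decomposition involved, assuming each camera's apparent in-plane displacement mixes the true in-plane component u and out-of-plane component w through its view angle; the paper's actual formulation and calibration may differ.

      import numpy as np

      def decompose(m1, m2, theta1, theta2):
          # Assumed model: m_i = u*cos(theta_i) + w*sin(theta_i), where m_i
          # is the displacement observed by camera i at view angle theta_i
          # (radians).  Two views give a 2x2 linear system in (u, w).
          A = np.array([[np.cos(theta1), np.sin(theta1)],
                        [np.cos(theta2), np.sin(theta2)]])
          u, w = np.linalg.solve(A, np.array([m1, m2]))
          return u, w  # in-plane, out-of-plane displacement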

  20. The large-scale three-point correlation function of the SDSS BOSS DR12 CMASS galaxies

    NASA Astrophysics Data System (ADS)

    Slepian, Zachary; Eisenstein, Daniel J.; Beutler, Florian; Chuang, Chia-Hsun; Cuesta, Antonio J.; Ge, Jian; Gil-Marín, Héctor; Ho, Shirley; Kitaura, Francisco-Shu; McBride, Cameron K.; Nichol, Robert C.; Percival, Will J.; Rodríguez-Torres, Sergio; Ross, Ashley J.; Scoccimarro, Román; Seo, Hee-Jong; Tinker, Jeremy; Tojeiro, Rita; Vargas-Magaña, Mariana

    2017-06-01

    We report a measurement of the large-scale three-point correlation function of galaxies using the largest data set for this purpose to date, 777 202 luminous red galaxies in the Sloan Digital Sky Survey Baryon Acoustic Oscillation Spectroscopic Survey (SDSS BOSS) DR12 CMASS sample. This work exploits the novel algorithm of Slepian & Eisenstein to compute the multipole moments of the 3PCF in O(N^2) time, with N the number of galaxies. Leading-order perturbation theory models the data well in a compressed basis where one triangle side is integrated out. We also present an accurate and computationally efficient means of estimating the covariance matrix. With these techniques, the redshift-space linear and non-linear bias are measured, with 2.6 per cent precision on the former if σ8 is fixed. The data also indicate a 2.8σ preference for the BAO, confirming the presence of BAO in the three-point function.

  1. Quadtree of TIN: a new algorithm of dynamic LOD

    NASA Astrophysics Data System (ADS)

    Zhang, Junfeng; Fei, Lifan; Chen, Zhen

    2009-10-01

    Currently, real-time visualization of large-scale digital elevation models mainly employs either regular GRID structures based on quadtrees or triangle-simplification methods based on the irregular triangulated network (TIN). Compared with GRID, TIN is a more refined means of representing the terrain surface in the computer, but its data structure is complex, and it is difficult to realize view-dependent level-of-detail (LOD) representation quickly. GRID is a simple way to realize terrain LOD but produces a higher triangle count. A new algorithm that takes full advantage of the merits of both methods is presented in this paper. It combines TIN with a quadtree structure to realize view-dependent LOD control over irregular sampling point sets, holding detail according to viewpoint distance and the geometric error of the terrain. Experiments indicate that this approach can generate an efficient quadtree triangulation hierarchy over any irregular sampling point set and achieves dynamic, visual multi-resolution performance for large-scale terrain in real time.
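
    A sketch of the view-dependent refinement test described above, in which a node is split only while its geometric error, seen from the current viewpoint, exceeds a tolerance; the node fields are assumptions for illustration, not taken from the paper.

      import math

      def select_lod(node, viewpoint, tolerance):
          # Refine while (geometric error / viewer distance) exceeds the
          # screen-space tolerance; otherwise render the coarse node.
          dist = math.dist(node.center, viewpoint)
          if node.is_leaf or node.geometric_error / max(dist, 1e-9) < tolerance:
              return [node]
          selected = []
          for child in node.children:
              selected.extend(select_lod(child, viewpoint, tolerance))
          return selected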

  2. Morphologic Evolution of the Mount St. Helens Crater Area, Washington

    NASA Technical Reports Server (NTRS)

    Beach, G. L.

    1985-01-01

    The large rockslide-avalanche that preceded the eruption of Mount St. Helens on 18 May 1980 removed approximately 2.8 cubic km of material from the summit and north flank of the volcano, forming a horseshoe-shaped crater 2.0 km wide and 3.9 km long. A variety of erosional and depositional processes, notably mass wasting and gully development, acted to modify the topographic configuration of the crater area. To document this morphologic evolution, a series of annual large-scale topographic maps is being produced as a base for comparative geomorphic analysis. Four topographic maps of the Mount St. Helens crater area at a scale of 1:4000 were produced by the National Mapping Division of the U.S. Geological Survey. Stereo aerial photography for the maps was obtained on 23 October 1980, 10 September 1981, 1 September 1982, and 17 August 1983. To quantify topographic changes in the study area, each topographic map is being digitized and corresponding X, Y, and Z values from successive maps are being computer-compared.

  3. High-resolution mapping of bifurcations in nonlinear biochemical circuits

    NASA Astrophysics Data System (ADS)

    Genot, A. J.; Baccouche, A.; Sieskind, R.; Aubert-Kato, N.; Bredeche, N.; Bartolo, J. F.; Taly, V.; Fujii, T.; Rondelez, Y.

    2016-08-01

    Analog molecular circuits can exploit the nonlinear nature of biochemical reaction networks to compute low-precision outputs with fewer resources than digital circuits. This analog computation is similar to that employed by gene-regulation networks. Although digital systems have a tractable link between structure and function, the nonlinear and continuous nature of analog circuits yields an intricate functional landscape, which makes their design counter-intuitive, their characterization laborious and their analysis delicate. Here, using droplet-based microfluidics, we map with high resolution and dimensionality the bifurcation diagrams of two synthetic, out-of-equilibrium and nonlinear programs: a bistable DNA switch and a predator-prey DNA oscillator. The diagrams delineate where function is optimal, dynamics bifurcates and models fail. Inverse problem solving on these large-scale data sets indicates interference from enzymatic coupling. Additionally, data mining exposes the presence of rare, stochastically bursting oscillators near deterministic bifurcations.

  4. Digitization of multistep organic synthesis in reactionware for on-demand pharmaceuticals.

    PubMed

    Kitson, Philip J; Marie, Guillaume; Francoia, Jean-Patrick; Zalesskiy, Sergey S; Sigerson, Ralph C; Mathieson, Jennifer S; Cronin, Leroy

    2018-01-19

    Chemical manufacturing is often done at large facilities that require a sizable capital investment and then produce key compounds for a finite period. We present an approach to the manufacturing of fine chemicals and pharmaceuticals in a self-contained plastic reactionware device. The device was designed and constructed by using a chemical to computer-automated design (ChemCAD) approach that enables the translation of traditional bench-scale synthesis into a platform-independent digital code. This in turn guides production of a three-dimensional printed device that encloses the entire synthetic route internally via simple operations. We demonstrate the approach for the γ-aminobutyric acid receptor agonist, (±)-baclofen, establishing a concept that paves the way for the local manufacture of drugs outside of specialist facilities. Copyright © 2018 The Authors, some rights reserved; exclusive licensee American Association for the Advancement of Science. No claim to original U.S. Government Works.

  5. Investigation of application of two-degree-of-freedom dry tuned-gimbal gyroscopes to strapdown navigation systems. [for use in VTOL aircraft

    NASA Technical Reports Server (NTRS)

    1974-01-01

    The work is described which was accomplished during the investigation of the application of dry-tuned gimbal gyroscopes to strapdown navigation systems. A conventional strapdown configuration, employing analog electronics in conjunction with digital attitude and navigation computation, was examined using various levels of redundancy and both orthogonal and nonorthogonal sensor orientations. It is concluded that the cost and reliability performance constraints which had been established could not be met simultaneously with such a system. This conclusion led to the examination of an alternative system configuration which utilizes an essentially new strapdown system concept. This system employs all-digital signal processing in conjunction with the newly-developed large scale integration (LSI) electronic packaging techniques and a new two-degree-of-freedom dry tuned-gimbal instrument which is capable of providing both angular rate and acceleration information. Such a system is capable of exceeding the established performance goals.

  6. Design of a high-speed digital processing element for parallel simulation

    NASA Technical Reports Server (NTRS)

    Milner, E. J.; Cwynar, D. S.

    1983-01-01

    A prototype of a custom-designed computer to be used as a processing element in a multiprocessor-based jet engine simulator is described. The purpose of the custom design was to give the computer the speed and versatility required to simulate a jet engine in real time. Real-time simulations are needed for closed-loop testing of digital electronic engine controls. The prototype computer has a microcycle time of 133 nanoseconds. This speed was achieved by prefetching the next instruction while the current one is executing, transporting data using high-speed data busses, and using state-of-the-art components such as a very large scale integration (VLSI) multiplier. Included are discussions of processing element requirements, design philosophy, the architecture of the custom-designed processing element, the comprehensive instruction set, the diagnostic support software, and the development status of the custom design.

  7. Low power sensor network for wireless condition monitoring

    NASA Astrophysics Data System (ADS)

    Richter, Ch.; Frankenstein, B.; Schubert, L.; Weihnacht, B.; Friedmann, H.; Ebert, C.

    2009-03-01

    For comprehensive fatigue tests and surveillance of large-scale structures, a vibration monitoring system working in the Hz and sub-Hz frequency range was realized and tested. The system is based on a wireless sensor network and focuses especially on the realization of low-power measurement, signal processing, and communication. During development, we met the challenge of synchronizing the wirelessly connected sensor nodes with sufficient accuracy. The sensor nodes were realized by compact, sensor-near signal-processing structures containing components for analog preprocessing of acoustic signals, their digitization, algorithms for data reduction, and network communication. The core component is a digital microcontroller which performs the basic algorithms necessary for data acquisition, synchronization, and filtering. As a first application, the system was installed in a rotor blade of a wind power turbine in order to monitor the eigenmodes over a longer period of time. Currently the sensor nodes are battery-powered.

  8. Tissue microarrays and digital image analysis.

    PubMed

    Ryan, Denise; Mulrane, Laoighse; Rexhepaj, Elton; Gallagher, William M

    2011-01-01

    Tissue microarrays (TMAs) have recently emerged as very valuable tools for high-throughput pathological assessment, especially in the cancer research arena. This important technology, however, has yet to fully penetrate into the area of toxicology. Here, we describe the creation of TMAs representative of samples produced from conventional toxicology studies within a large-scale, multi-institutional pan-European project, PredTox. PredTox, short for Predictive Toxicology, formed part of an EU FP6 Integrated Project, Innovative Medicines for Europe (InnoMed), and aimed to study pre-clinically 16 compounds of known liver and/or kidney toxicity. In more detail, TMAs were constructed from materials corresponding to the full face sections of liver and kidney from rats treated with different drug candidates by members of the consortium. We also describe the process of digital slide scanning of kidney and liver sections, in the context of creating an online resource of histopathological data.

  9. Power in the loop real time simulation platform for renewable energy generation

    NASA Astrophysics Data System (ADS)

    Li, Yang; Shi, Wenhui; Zhang, Xing; He, Guoqing

    2018-02-01

    Nowadays, large-scale renewable energy sources are being connected to the power system, and real-time simulation platforms are widely used to carry out research on integration control algorithms, power system stability, etc. Compared to traditional pure digital simulation and hardware-in-the-loop simulation, power-in-the-loop simulation has higher accuracy and reliability. In this paper, a power-in-the-loop analog-digital hybrid simulation platform has been built; it can be used not only for a single generation unit connecting to the grid but also for multiple renewable generation units. A wind generator inertia control experiment was carried out on the platform. The structure of the inertia control platform was investigated, and the results verify that the platform meets the needs of renewable power-in-the-loop real-time simulation.

  10. On the Photometric Calibration of FORS2 and the Sloan Digital Sky Survey

    NASA Astrophysics Data System (ADS)

    Bramich, D.; Moehler, S.; Coccato, L.; Freudling, W.; Garcia-Dabó, C. E.; Müller, P.; Saviane, I.

    2012-09-01

    An accurate absolute calibration of photometric data to place them on a standard magnitude scale is very important for many science goals. Absolute calibration requires the observation of photometric standard stars and analysis of the observations with an appropriate photometric model including all relevant effects. In the FORS Absolute Photometry (FAP) project, we have developed a standard star observing strategy and modelling procedure that enables calibration of science target photometry to better than 3% accuracy on photometrically stable nights given sufficient signal-to-noise. In the application of this photometric modelling to large photometric databases, we have investigated the Sloan Digital Sky Survey (SDSS) and found systematic trends in the published photometric data. The amplitudes of these trends are similar to the reported typical precision (~1% and ~2%) of the SDSS photometry in the griz- and u-bands, respectively.

  11. Recently active traces of the Bartlett Springs Fault, California: a digital database

    USGS Publications Warehouse

    Lienkaemper, James J.

    2010-01-01

    The purpose of this map is to show the location of and evidence for recent movement on active fault traces within the Bartlett Springs Fault Zone, California. The location and recency of the mapped traces are primarily based on geomorphic expression of the fault as interpreted from large-scale aerial photography. In a few places, evidence of fault creep and offset Holocene strata in trenches and natural exposures has confirmed the activity of some of these traces. This publication is formatted both as a digital database for use within a geographic information system (GIS) and, for broader public access, as map images that may be browsed online or downloaded as a summary map. The report text describes the types of scientific observations used to make the map, gives references pertaining to the fault and the evidence of faulting, and provides guidance on the use and limitations of the map.

  12. Enhanced job control language procedures for the SIMSYS2D two-dimensional water-quality simulation system

    USGS Publications Warehouse

    Karavitis, G.A.

    1984-01-01

    The SIMSYS2D two-dimensional water-quality simulation system is a large-scale digital modeling software system used to simulate flow and transport of solutes in freshwater and estuarine environments. Due to the size, processing requirements, and complexity of the system, there is a need to easily move the system and its associated files between computer sites when required. A series of job control language (JCL) procedures was written to allow transferability between IBM and IBM-compatible computers. (USGS)

  13. LSI/VLSI design for testability analysis and general approach

    NASA Technical Reports Server (NTRS)

    Lam, A. Y.

    1982-01-01

    The incorporation of testability characteristics into large-scale digital design is necessary not only for effective device testing but also for enhancing device reliability. There are at least three major DFT techniques, namely self-checking, LSSD, and partitioning, each of which can be incorporated into a logic design to achieve a specific set of testability and reliability requirements. A detailed analysis of the design theory, implementation, fault coverage, hardware requirements, application limitations, etc., of each of these techniques is also presented.

  14. Distributed sensor networks: a cellular nonlinear network perspective.

    PubMed

    Haenggi, Martin

    2003-12-01

    Large-scale networks of integrated wireless sensors are becoming increasingly tractable. Advances in hardware technology and engineering design have led to dramatic reductions in size, power consumption, and cost for digital circuitry and wireless communications. Networking, self-organization, and distributed operation are crucial ingredients to harness the sensing, computing, and communication capabilities of the nodes into a complete system. This article shows that such networks can be considered cellular nonlinear networks (CNNs), and that their analysis and design may greatly benefit from the rich theoretical results available for CNNs.

  15. Educational Reports That Scale across Users and Data

    ERIC Educational Resources Information Center

    Rolleston, Rob; Howe, Richard; Sprague, Mary Ann

    2015-01-01

    The field of education is undergoing fundamental change with the growing use of data. Fine-scale data collection at the item-response level is now possible. Xerox has developed a system that bridges the paper-to-digital divide by providing the well-established and easy-to-use paper interface to students, but digitizes the responses for scoring,…

  16. Comparison of 7.5-minute and 1-degree digital elevation models

    NASA Technical Reports Server (NTRS)

    Isaacson, Dennis L.; Ripple, William J.

    1995-01-01

    We compared two digital elevation models (DEM's) for the Echo Mountain SE quadrangle in the Cascade Mountains of Oregon. Comparisons were made between 7.5-minute (1:24,000-scale) and 1-degree (1:250,000-scale) images using the variables of elevation, slope aspect, and slope gradient. Both visual and statistical differences are presented.

  17. Comparison of 7.5-minute and 1-degree digital elevation models

    NASA Technical Reports Server (NTRS)

    Isaacson, Dennis L.; Ripple, William J.

    1990-01-01

    Two digital elevation models are compared for the Echo Mountain SE quadrangle in the Cascade Mountains of Oregon. Comparisons were made between 7.5-minute (1:24,000-scale) and 1-degree (1:250,000-scale) images using the variables of elevation, slope aspect, and slope gradient. Both visual and statistical differences are presented.

  18. Dimension scaling effects on the yield sensitivity of HEMT digital circuits

    NASA Technical Reports Server (NTRS)

    Sarker, Jogendra C.; Purviance, John E.

    1992-01-01

    In our previous work, using a graphical tool (yield factor histograms), we studied the yield sensitivity of high electron mobility transistors (HEMTs) and of HEMT circuit performance under variation of process parameters. This work studies the scaling effects of process parameters on the yield sensitivity of HEMT digital circuits. Results from two HEMT circuits are presented.

  19. A Networked Sensor System for the Analysis of Plot-Scale Hydrology.

    PubMed

    Villalba, German; Plaza, Fernando; Zhong, Xiaoyang; Davis, Tyler W; Navarro, Miguel; Li, Yimei; Slater, Thomas A; Liang, Yao; Liang, Xu

    2017-03-20

    This study presents the latest updates to the Audubon Society of Western Pennsylvania (ASWP) testbed, a $50,000 USD, 104-node outdoor multi-hop wireless sensor network (WSN). The network collects environmental data from over 240 sensors, including the EC-5, MPS-1 and MPS-2 soil moisture and soil water potential sensors and self-made sap flow sensors, across a heterogeneous deployment comprised of MICAz, IRIS and TelosB wireless motes. A low-cost sensor board and software driver was developed for communicating with the analog and digital sensors. Innovative techniques (e.g., balanced energy efficient routing and heterogeneous over-the-air mote reprogramming) maintained high success rates (>96%) and enabled effective software updating, throughout the large-scale heterogeneous WSN. The edaphic properties monitored by the network showed strong agreement with data logger measurements and were fitted to pedotransfer functions for estimating local soil hydraulic properties. Furthermore, sap flow measurements, scaled to tree stand transpiration, were found to be at or below potential evapotranspiration estimates. While outdoor WSNs still present numerous challenges, the ASWP testbed proves to be an effective and (relatively) low-cost environmental monitoring solution and represents a step towards developing a platform for monitoring and quantifying statistically relevant environmental parameters from large-scale network deployments.

  20. A Networked Sensor System for the Analysis of Plot-Scale Hydrology

    PubMed Central

    Villalba, German; Plaza, Fernando; Zhong, Xiaoyang; Davis, Tyler W.; Navarro, Miguel; Li, Yimei; Slater, Thomas A.; Liang, Yao; Liang, Xu

    2017-01-01

    This study presents the latest updates to the Audubon Society of Western Pennsylvania (ASWP) testbed, a $50,000 USD, 104-node outdoor multi-hop wireless sensor network (WSN). The network collects environmental data from over 240 sensors, including the EC-5, MPS-1 and MPS-2 soil moisture and soil water potential sensors and self-made sap flow sensors, across a heterogeneous deployment comprised of MICAz, IRIS and TelosB wireless motes. A low-cost sensor board and software driver was developed for communicating with the analog and digital sensors. Innovative techniques (e.g., balanced energy efficient routing and heterogeneous over-the-air mote reprogramming) maintained high success rates (>96%) and enabled effective software updating, throughout the large-scale heterogeneous WSN. The edaphic properties monitored by the network showed strong agreement with data logger measurements and were fitted to pedotransfer functions for estimating local soil hydraulic properties. Furthermore, sap flow measurements, scaled to tree stand transpiration, were found to be at or below potential evapotranspiration estimates. While outdoor WSNs still present numerous challenges, the ASWP testbed proves to be an effective and (relatively) low-cost environmental monitoring solution and represents a step towards developing a platform for monitoring and quantifying statistically relevant environmental parameters from large-scale network deployments. PMID:28335534

  1. Intrauterine Exposure to Methylmercury and Neurocognitive Functions: Minamata Disease.

    PubMed

    Yorifuji, Takashi; Kato, Tsuguhiko; Kado, Yoko; Tokinobu, Akiko; Yamakawa, Michiyo; Tsuda, Toshihide; Sanada, Satoshi

    2015-01-01

    A large-scale food poisoning caused by methylmercury was identified in Minamata, Japan, in the 1950s. The severe intrauterine exposure cases are well known, although the possible impact of low-to-moderate methylmercury exposure in utero is rarely investigated. We examined neurocognitive functions among 22 participants in Minamata, mainly using an intelligence quotient test (Wechsler Adult Intelligence Scale III), in 2012/2013. The participants tended to score low on the processing speed (PS) index relative to full-scale IQ, and discrepancies between PS and other scores within each participant were observed. The lower score on PS was due to deficits in digit symbol-coding and symbol search and was associated with methylmercury concentration in umbilical cords. The residents of Minamata who experienced low-to-moderate methylmercury exposure, including prenatal exposure, manifested deficits in their cognitive functions, processing speed in particular.

  2. Development of a 3D printer using scanning projection stereolithography

    PubMed Central

    Lee, Michael P.; Cooper, Geoffrey J. T.; Hinkley, Trevor; Gibson, Graham M.; Padgett, Miles J.; Cronin, Leroy

    2015-01-01

    We have developed a system for the rapid fabrication of low cost 3D devices and systems in the laboratory with micro-scale features yet cm-scale objects. Our system is inspired by maskless lithography, where a digital micromirror device (DMD) is used to project patterns with resolution up to 10 µm onto a layer of photoresist. Large area objects can be fabricated by stitching projected images over a 5cm2 area. The addition of a z-stage allows multiple layers to be stacked to create 3D objects, removing the need for any developing or etching steps but at the same time leading to true 3D devices which are robust, configurable and scalable. We demonstrate the applications of the system by printing a range of micro-scale objects as well as a fully functioning microfluidic droplet device and test its integrity by pumping dye through the channels. PMID:25906401

  3. Coarsening mechanism of phase separation caused by a double temperature quench in an off-symmetric binary mixture.

    PubMed

    Sigehuzi, Tomoo; Tanaka, Hajime

    2004-11-01

    We study the phase-separation behavior of an off-symmetric fluid mixture induced by a "double temperature quench." We first quench a system into the unstable region. After a large phase-separated structure is formed, we again quench the system more deeply and follow the pattern-evolution process. The second quench makes the domains formed by the first quench unstable and leads to double phase separation; that is, small droplets are formed inside the large domains created by the first quench. The complex coarsening behavior of this hierarchical structure having two characteristic length scales is studied in detail by digital image analysis. We find three distinct time regimes in the time evolution of the structure factor of the system. In the first regime, small droplets coarsen with time inside large domains. There a large domain containing small droplets can be regarded as an isolated system. Later, however, the coarsening of small droplets stops when they start to interact via diffusion with the large domain containing them. Finally, small droplets disappear due to the Lifshitz-Slyozov mechanism. Thus the observed behavior can be explained by the crossover of the nature of a large domain from an isolated to an open system; this is a direct consequence of the existence of the two characteristic length scales.
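
    The structure factor referred to above is routinely obtained from digitized images by radially averaging the Fourier power spectrum; a generic sketch follows (not the authors' code), with peak positions tracking the characteristic length scales.

      import numpy as np

      def structure_factor(image):
          # Radially averaged power spectrum S(k) of a 2-D field.
          f = np.fft.fftshift(np.fft.fft2(image - image.mean()))
          power = np.abs(f) ** 2
          cy, cx = power.shape[0] // 2, power.shape[1] // 2
          y, x = np.indices(power.shape)
          k = np.hypot(y - cy, x - cx).astype(int)   # integer radial bins
          s = np.bincount(k.ravel(), weights=power.ravel())
          counts = np.maximum(np.bincount(k.ravel()), 1)
          return s / counts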

  4. Reproducible Operating Margins on a 72800-Device Digital Superconducting Chip (Open Access)

    DTIC Science & Technology

    2015-10-28

    ... superconductor digital logic. Keywords: flux trapping, yield, digital. Superconductor digital technology offers fundamental advantages over conventional ... trapping in the superconductor films can degrade or preclude correct circuit operation. Scaling superconductor technology is now possible due to recent ... advances in circuit design embodied in reciprocal quantum logic (RQL) [2, 3] and recent advances in superconductor integrated circuit fabrication, which ...

  5. Detection of baryon acoustic oscillation features in the large-scale three-point correlation function of SDSS BOSS DR12 CMASS galaxies

    NASA Astrophysics Data System (ADS)

    Slepian, Zachary; Eisenstein, Daniel J.; Brownstein, Joel R.; Chuang, Chia-Hsun; Gil-Marín, Héctor; Ho, Shirley; Kitaura, Francisco-Shu; Percival, Will J.; Ross, Ashley J.; Rossi, Graziano; Seo, Hee-Jong; Slosar, Anže; Vargas-Magaña, Mariana

    2017-08-01

    We present the large-scale three-point correlation function (3PCF) of the Sloan Digital Sky Survey DR12 Constant stellar Mass (CMASS) sample of 777 202 Luminous Red Galaxies, the largest-ever sample used for a 3PCF or bispectrum measurement. We make the first high-significance (4.5σ) detection of baryon acoustic oscillations (BAO) in the 3PCF. Using these acoustic features in the 3PCF as a standard ruler, we measure the distance to z = 0.57 to 1.7 per cent precision (statistical plus systematic). We find DV = 2024 ± 29 Mpc (stat) ± 20 Mpc (sys) for our fiducial cosmology (consistent with Planck 2015) and bias model. This measurement extends the use of the BAO technique from the two-point correlation function (2PCF) and power spectrum to the 3PCF and opens an avenue for deriving additional cosmological distance information from future large-scale structure redshift surveys such as DESI. Our measured distance scale from the 3PCF is fairly independent from that derived from the pre-reconstruction 2PCF and is equivalent to increasing the length of BOSS by roughly 10 per cent; reconstruction appears to lower the independence of the distance measurements. Fitting a model including tidal tensor bias yields a moderate-significance (2.6σ) detection of this bias with a value in agreement with the prediction from local Lagrangian biasing.

  6. Programmable high-output-impedance, large-voltage compliance, microstimulator for low-voltage biomedical applications.

    PubMed

    Farahmand, Sina; Maghami, Mohammad Hossein; Sodagar, Amir M

    2012-01-01

    This paper reports on the design of a programmable, high output impedance, large voltage compliance microstimulator for low-voltage biomedical applications. A 6-bit binary-weighted digital to analog converter (DAC) is used to generate biphasic stimulus current pulses. A compact current mirror with large output voltage compliance and high output resistance conveys the current pulses to the target tissue. Designed and simulated in a standard 0.18µm CMOS process, the microstimulator circuit is capable of delivering a maximum stimulation current of 160µA to a 10-kΩ resistive load. Operated at a 1.8-V supply voltage, the output stage exhibits a voltage compliance of 1.69V and output resistance of 160MΩ at full scale stimulus current. Layout of the core microelectrode circuit measures 25.5µm×31.5µm.
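
    The relation between the 6-bit code and the stimulus current follows directly from binary weighting; the unit current below is inferred from the stated 160 µA full scale and is illustrative only.

      def dac_current(code, i_lsb):
          # 6-bit binary-weighted DAC: each set bit b contributes 2**b unit
          # currents, so full scale is (2**6 - 1) * i_lsb.
          return sum(((code >> b) & 1) << b for b in range(6)) * i_lsb

      i_lsb = 160e-6 / 63                  # ~2.54 uA per unit source (inferred)
      print(dac_current(0b111111, i_lsb))  # -> 160 uA at full scale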

  7. All-Digital Time-Domain CMOS Smart Temperature Sensor with On-Chip Linearity Enhancement.

    PubMed

    Chen, Chun-Chi; Chen, Chao-Lieh; Lin, Yi

    2016-01-30

    This paper proposes the first all-digital on-chip linearity enhancement technique for improving the accuracy of the time-domain complementary metal-oxide semiconductor (CMOS) smart temperature sensor. To facilitate on-chip application and intellectual property reuse, an all-digital time-domain smart temperature sensor was implemented using 90 nm Field Programmable Gate Arrays (FPGAs). Although the inverter-based temperature sensor has a smaller circuit area and lower complexity, two-point calibration must be used to achieve an acceptable inaccuracy. With the help of a calibration circuit, the influence of process variations was reduced greatly for one-point calibration support, reducing the test costs and time. However, the sensor response still exhibited a large curvature, which substantially affected the accuracy of the sensor. Thus, an on-chip linearity-enhanced circuit is proposed to linearize the curve and achieve a new linearity-enhanced output. The sensor was implemented on eight different Xilinx FPGA using 118 slices per sensor in each FPGA to demonstrate the benefits of the linearization. Compared with the unlinearized version, the maximal inaccuracy of the linearized version decreased from 5 °C to 2.5 °C after one-point calibration in a range of -20 °C to 100 °C. The sensor consumed 95 μW using 1 kSa/s. The proposed linearity enhancement technique significantly improves temperature sensing accuracy, avoiding costly curvature compensation while it is fully synthesizable for future Very Large Scale Integration (VLSI) system.
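
    A numerical sketch of the two calibration steps described: a one-point calibration that removes offset at a single reference temperature using the nominal gain, followed by a fixed polynomial correction that removes most of the residual curvature. The sensor response model is synthetic, not the paper's circuit.

      import numpy as np

      temps = np.linspace(-20, 100, 25)
      raw = 1000 + 8.0 * temps + 0.02 * temps**2     # synthetic curved response

      # One-point calibration at 25 degC with the nominal design gain:
      ref_t = 25.0
      ref_raw = 1000 + 8.0 * ref_t + 0.02 * ref_t**2
      linear_est = ref_t + (raw - ref_raw) / 8.0     # offset removed, still curved

      # Linearity enhancement: a quadratic correction, characterized once per
      # design, maps the curved estimate onto temperature at run time.
      coeff = np.polyfit(linear_est, temps, 2)
      corrected = np.polyval(coeff, linear_est)
      print(np.abs(corrected - temps).max())         # residual error (degC)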

  8. All-Digital Time-Domain CMOS Smart Temperature Sensor with On-Chip Linearity Enhancement

    PubMed Central

    Chen, Chun-Chi; Chen, Chao-Lieh; Lin, Yi

    2016-01-01

    This paper proposes the first all-digital on-chip linearity enhancement technique for improving the accuracy of the time-domain complementary metal-oxide semiconductor (CMOS) smart temperature sensor. To facilitate on-chip application and intellectual property reuse, an all-digital time-domain smart temperature sensor was implemented using 90 nm Field Programmable Gate Arrays (FPGAs). Although the inverter-based temperature sensor has a smaller circuit area and lower complexity, two-point calibration must be used to achieve acceptable inaccuracy. With the help of a calibration circuit, the influence of process variations was greatly reduced for one-point calibration support, reducing test costs and time. However, the sensor response still exhibited a large curvature, which substantially affected the accuracy of the sensor. Thus, an on-chip linearity-enhancing circuit is proposed to linearize the curve and achieve a new linearity-enhanced output. The sensor was implemented on eight different Xilinx FPGAs, using 118 slices per sensor in each FPGA, to demonstrate the benefits of the linearization. Compared with the unlinearized version, the maximal inaccuracy of the linearized version decreased from 5 °C to 2.5 °C after one-point calibration over a range of −20 °C to 100 °C. The sensor consumed 95 μW at 1 kSa/s. The proposed linearity enhancement technique significantly improves temperature-sensing accuracy, avoiding costly curvature compensation, and is fully synthesizable for future Very Large Scale Integration (VLSI) systems. PMID:26840316

  9. Red, Straight, no bends: primordial power spectrum reconstruction from CMB and large-scale structure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ravenni, Andrea; Verde, Licia; Cuesta, Antonio J., E-mail: andrea.ravenni@pd.infn.it, E-mail: liciaverde@icc.ub.edu, E-mail: ajcuesta@icc.ub.edu

    2016-08-01

    We present a minimally parametric, model-independent reconstruction of the shape of the primordial power spectrum. Our smoothing-spline technique is well suited to searching for smooth features such as deviations from scale invariance, and deviations from a power law such as running of the spectral index or small-scale power suppression. We use a comprehensive set of state-of-the-art cosmological data: Planck observations of the temperature and polarisation anisotropies of the cosmic microwave background, WiggleZ and Sloan Digital Sky Survey Data Release 7 galaxy power spectra, and the Canada-France-Hawaii Lensing Survey correlation function. This reconstruction strongly supports the evidence for a power-law primordial power spectrum with a red tilt and disfavours deviations from a power-law spectrum, including small-scale power suppression such as that induced by significantly massive neutrinos. This offers a powerful confirmation of the inflationary paradigm, justifying the adoption of the inflationary prior in cosmological analyses.

  10. Red, Straight, no bends: primordial power spectrum reconstruction from CMB and large-scale structure

    NASA Astrophysics Data System (ADS)

    Ravenni, Andrea; Verde, Licia; Cuesta, Antonio J.

    2016-08-01

    We present a minimally parametric, model-independent reconstruction of the shape of the primordial power spectrum. Our smoothing-spline technique is well suited to searching for smooth features such as deviations from scale invariance, and deviations from a power law such as running of the spectral index or small-scale power suppression. We use a comprehensive set of state-of-the-art cosmological data: Planck observations of the temperature and polarisation anisotropies of the cosmic microwave background, WiggleZ and Sloan Digital Sky Survey Data Release 7 galaxy power spectra, and the Canada-France-Hawaii Lensing Survey correlation function. This reconstruction strongly supports the evidence for a power-law primordial power spectrum with a red tilt and disfavours deviations from a power-law spectrum, including small-scale power suppression such as that induced by significantly massive neutrinos. This offers a powerful confirmation of the inflationary paradigm, justifying the adoption of the inflationary prior in cosmological analyses.
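
    For readers unfamiliar with the technique, a minimal smoothing-spline sketch using SciPy's UnivariateSpline; the data, smoothing parameter, and tilt value are illustrative, not the authors' pipeline:

      import numpy as np
      from scipy.interpolate import UnivariateSpline

      # Toy illustration of a minimally parametric smoothing-spline reconstruction:
      # recover a smooth trend (a mild power law in log-log space) from noisy samples.
      rng = np.random.default_rng(0)
      ln_k = np.linspace(-4, 0, 60)                   # log wavenumber (arbitrary units)
      ln_p = 0.96 * ln_k + rng.normal(0, 0.05, 60)    # noisy "spectrum" with slope 0.96

      # The smoothing parameter s sets the bias-variance trade-off; in practice it
      # would be chosen by cross-validation or an information criterion.
      spline = UnivariateSpline(ln_k, ln_p, s=0.2)
      print(spline.derivative()(ln_k[30]))            # local slope, near the input tilt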

  11. Valleys' Asymmetric Characteristics of the Loess Plateau in Northwestern Shanxi Based on DEM

    NASA Astrophysics Data System (ADS)

    Duan, J.

    2016-12-01

    The valleys of the Loess Plateau in northwestern Shanxi show great asymmetry. Using multi-scale DEMs, high-resolution satellite images, and digital terrain analysis methods, this study puts forward a quantitative index to describe this asymmetric morphology, and several typical areas are selected to test and verify its spatial variability. The results show: (1) in terms of spatial distribution, the Pianguanhe, Xianchuanhe, and Yangjiachuan basins show the most significant asymmetric characteristics; (2) in terms of scale, large-scale valleys exhibit randomness, equilibrium, and relative symmetry, while small-scale valleys show directionality and asymmetry; (3) the asymmetric morphology is orientation-dependent, most obviously for east-west trending valleys. Combined with field surveys, the formation mechanism can be interpreted as follows: (1) loess is unevenly distributed in the valleys; (2) differences in the distribution of vegetation, water, heat, and other factors produce differences in water erosion capability, which lead to the asymmetric characteristics.

  12. The Angular Correlation Function of Galaxies from Early Sloan Digital Sky Survey Data

    NASA Astrophysics Data System (ADS)

    Connolly, Andrew J.; Scranton, Ryan; Johnston, David; Dodelson, Scott; Eisenstein, Daniel J.; Frieman, Joshua A.; Gunn, James E.; Hui, Lam; Jain, Bhuvnesh; Kent, Stephen; Loveday, Jon; Nichol, Robert C.; O'Connell, Liam; Postman, Marc; Scoccimarro, Roman; Sheth, Ravi K.; Stebbins, Albert; Strauss, Michael A.; Szalay, Alexander S.; Szapudi, István; Tegmark, Max; Vogeley, Michael S.; Zehavi, Idit; Annis, James; Bahcall, Neta; Brinkmann, J.; Csabai, István; Doi, Mamoru; Fukugita, Masataka; Hennessy, G. S.; Hindsley, Robert; Ichikawa, Takashi; Ivezić, Željko; Kim, Rita S. J.; Knapp, Gillian R.; Kunszt, Peter; Lamb, D. Q.; Lee, Brian C.; Lupton, Robert H.; McKay, Timothy A.; Munn, Jeff; Peoples, John; Pier, Jeff; Rockosi, Constance; Schlegel, David; Stoughton, Christopher; Tucker, Douglas L.; Yanny, Brian; York, Donald G.

    2002-11-01

    The Sloan Digital Sky Survey is one of the first multicolor photometric and spectroscopic surveys designed to measure the statistical properties of galaxies within the local universe. In this paper we present some of the initial results on the angular two-point correlation function measured from the early SDSS galaxy data, over the magnitude interval 18 < r* < 22.

  13. The Digital Slide Archive: A Software Platform for Management, Integration, and Analysis of Histology for Cancer Research.

    PubMed

    Gutman, David A; Khalilia, Mohammed; Lee, Sanghoon; Nalisnik, Michael; Mullen, Zach; Beezley, Jonathan; Chittajallu, Deepak R; Manthey, David; Cooper, Lee A D

    2017-11-01

    Tissue-based cancer studies can generate large amounts of histology data in the form of glass slides. These slides contain important diagnostic, prognostic, and biological information and can be digitized into expansive, high-resolution whole-slide images using slide-scanning devices. Effectively utilizing digital pathology data in cancer research requires the ability to manage, visualize, share, and perform quantitative analysis on these large amounts of image data, tasks that are often complex and difficult for investigators given the current state of commercial digital pathology software. In this article, we describe the Digital Slide Archive (DSA), an open-source web-based platform for digital pathology. DSA allows investigators to manage large collections of histologic images and integrate them with clinical and genomic metadata. The open-source model enables DSA to be extended to provide additional capabilities. Cancer Res; 77(21); e75-78. ©2017 American Association for Cancer Research.

  14. Compressed digital holography: from micro towards macro

    NASA Astrophysics Data System (ADS)

    Schretter, Colas; Bettens, Stijn; Blinder, David; Pesquet-Popescu, Béatrice; Cagnazzo, Marco; Dufaux, Frédéric; Schelkens, Peter

    2016-09-01

    Digital holography increasingly draws on signal processing methods from software-driven computer engineering and applied mathematics. The compressed sensing theory in particular established a practical framework for reconstructing the scene content using few linear combinations of complex measurements and a sparse prior for regularizing the solution. Compressed sensing found direct applications in digital holography for microscopy. Indeed, the wave propagation phenomenon in free space mixes in a natural way the spatial distribution of point sources from the 3-dimensional scene. As the 3-dimensional scene is mapped to a 2-dimensional hologram, the hologram samples form a compressed representation of the scene as well. This overview paper discusses contributions in the field of compressed digital holography at the micro scale; an outlook on future extensions towards the real-size macro scale follows. Thanks to advances in sensor technologies, increasing computing power, and recent improvements in sparse digital signal processing, holographic modalities are on the verge of practical high-quality visualization at a macroscopic scale, where much higher-resolution holograms must be acquired and processed on the computer.
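
    A toy illustration of the compressed-sensing idea invoked above: iterative soft-thresholding (ISTA) recovering a sparse signal from few linear measurements. This is a generic sketch, not holographic reconstruction code:

      import numpy as np

      def ista(A, y, lam=0.02, n_iter=500):
          """Iterative soft-thresholding for min ||Ax - y||^2 / 2 + lam * ||x||_1."""
          L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
          x = np.zeros(A.shape[1])
          for _ in range(n_iter):
              g = x - A.T @ (A @ x - y) / L      # gradient step on the data term
              x = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)  # soft threshold
          return x

      rng = np.random.default_rng(1)
      A = rng.normal(size=(40, 100)) / np.sqrt(40)   # 40 measurements of a length-100 signal
      x_true = np.zeros(100)
      x_true[[5, 37, 80]] = [1.0, -0.8, 0.5]
      x_hat = ista(A, A @ x_true)
      print(np.flatnonzero(np.abs(x_hat) > 0.1))     # should recover the support [5, 37, 80]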

  15. Preservation in the Age of Google: Digitization, Digital Preservation, and Dilemmas

    ERIC Educational Resources Information Center

    Conway, Paul

    2010-01-01

    The cultural heritage preservation community now functions largely within the environment of digital technologies. This article begins by juxtaposing definitions of the terms "digitization for preservation" and "digital preservation" within a sociotechnical environment for which Google serves as a relevant metaphor. It then reviews two reports…

  16. VizieR Online Data Catalog: Tully-Fisher relation for SDSS galaxies (Reyes+, 2011)

    NASA Astrophysics Data System (ADS)

    Reyes, R.; Mandelbaum, R.; Gunn, J. E.; Pizagno, J.; Lackner, C. N.

    2012-05-01

    In this paper, we derive scaling relations between photometric observable quantities and disc galaxy rotation velocity Vrot or Tully-Fisher relations (TFRs). Our methodology is dictated by our purpose of obtaining purely photometric, minimal-scatter estimators of Vrot applicable to large galaxy samples from imaging surveys. To achieve this goal, we have constructed a sample of 189 disc galaxies at redshifts z<0.1 with long-slit Hα spectroscopy from Pizagno et al. (2007, Cat. J/AJ/134/945) and new observations. By construction, this sample is a fair subsample of a large, well-defined parent disc sample of ~170000 galaxies selected from the Sloan Digital Sky Survey Data Release 7 (SDSS DR7). (4 data files).

  17. A digital gigapixel large-format tile-scan camera.

    PubMed

    Ben-Ezra, M

    2011-01-01

    Although the resolution of single-lens reflex (SLR) and medium-format digital cameras has increased in recent years, applications in cultural-heritage preservation and computational photography require even higher resolutions. Addressing this issue, a large-format camera's large image plane can achieve very high resolution without compromising pixel size and can thus provide high-quality, high-resolution images. This digital large-format tile-scan camera can acquire high-quality, high-resolution images of static scenes. It employs unique calibration techniques and a simple algorithm for focal-stack processing of very large images with significant magnification variations. The camera automatically collects overlapping focal stacks and processes them into a high-resolution, extended-depth-of-field image.
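
    A minimal sketch of the generic focal-stack merge idea: per pixel, keep the slice with the highest local sharpness. This naive variant, using the smoothed absolute Laplacian as the sharpness metric, is an assumption, not the camera's actual algorithm:

      import numpy as np
      from scipy import ndimage

      def extended_depth_of_field(stack):
          """Naive focal-stack merge: per pixel, take the slice with the highest
          local sharpness (smoothed absolute Laplacian).
          stack: array of shape (n_slices, H, W), grayscale."""
          sharp = np.stack([
              ndimage.uniform_filter(np.abs(ndimage.laplace(s.astype(float))), size=9)
              for s in stack
          ])
          best = np.argmax(sharp, axis=0)                    # (H, W) sharpest-slice index
          return np.take_along_axis(stack, best[None], axis=0)[0]

      stack = np.random.rand(5, 64, 64)                      # placeholder focal stack
      print(extended_depth_of_field(stack).shape)            # -> (64, 64)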

  18. Gaussian pre-filtering for uncertainty minimization in digital image correlation using numerically-designed speckle patterns

    NASA Astrophysics Data System (ADS)

    Mazzoleni, Paolo; Matta, Fabio; Zappa, Emanuele; Sutton, Michael A.; Cigada, Alfredo

    2015-03-01

    This paper discusses the effect of pre-processing image blurring on the uncertainty of two-dimensional digital image correlation (DIC) measurements for the specific case of numerically-designed speckle patterns having particles with well-defined and consistent shape, size and spacing. Such patterns are more suitable for large measurement surfaces on large-scale specimens than traditional spray-painted random patterns without well-defined particles. The methodology consists of numerical simulations where Gaussian digital filters with varying standard deviation are applied to a reference speckle pattern. To simplify the pattern application process for large areas and increase contrast to reduce measurement uncertainty, the speckle shape, mean size and on-center spacing were selected to be representative of numerically-designed patterns that can be applied on large surfaces through different techniques (e.g., spray-painting through stencils). Such 'designer patterns' are characterized by well-defined regions of non-zero frequency content and non-zero peaks, and are fundamentally different from typical spray-painted patterns whose frequency content exhibits near-zero peaks. The effect of blurring filters is examined for constant, linear, quadratic and cubic displacement fields. Maximum strains between ±250 and ±20,000 με are simulated, thus covering a relevant range for structural materials subjected to service and ultimate stresses. The robustness of the simulation procedure is verified experimentally using a physical speckle pattern subjected to constant displacements. The stability of the relation between standard deviation of the Gaussian filter and measurement uncertainty is assessed for linear displacement fields at varying image noise levels, subset size, and frequency content of the speckle pattern. It is shown that bias error as well as measurement uncertainty are minimized through Gaussian pre-filtering. This finding does not apply to typical spray-painted patterns without well-defined particles, for which image blurring is only beneficial in reducing bias errors.
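
    The pre-processing step itself is simple; a minimal sketch using SciPy, where the choice of sigma is exactly the quantity the paper argues must be tuned to the pattern frequency content and image noise:

      import numpy as np
      from scipy import ndimage

      # Minimal sketch of Gaussian pre-filtering before DIC matching; sigma is the
      # standard deviation of the Gaussian kernel, to be tuned, not fixed a priori.
      def prefilter(image: np.ndarray, sigma: float) -> np.ndarray:
          return ndimage.gaussian_filter(image.astype(float), sigma=sigma)

      speckle = np.random.rand(256, 256)      # placeholder speckle image
      for sigma in (0.5, 1.0, 2.0):
          blurred = prefilter(speckle, sigma)
          print(sigma, blurred.std())         # stronger blur -> lower image contrast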

  19. Access to digital technology among families coming to urban pediatric primary care clinics.

    PubMed

    Demartini, Tori L; Beck, Andrew F; Klein, Melissa D; Kahn, Robert S

    2013-07-01

    Digital technologies offer new platforms for health promotion and disease management. Few studies have evaluated the use of digital technology among families receiving care in an urban pediatric primary care setting. A self-administered survey was given to a convenience sample of caregivers bringing their children to 2 urban pediatric primary care centers in spring 2012. The survey assessed access to home Internet, e-mail, smartphone, and social media (Facebook and Twitter). A "digital technology" scale (0-4) quantified the number of available digital technologies and connections. Frequency of daily use and interest in receiving medical information digitally were also assessed. The survey was completed by 257 caregivers. The sample was drawn from a clinical population that was 73% African American and 92% Medicaid insured with a median patient age of 2.9 years (interquartile range 0.8-7.4). Eighty percent of respondents reported having Internet at home, and 71% had a smartphone. Ninety-one percent reported using e-mail, 78% Facebook, and 27% Twitter. Ninety-seven percent scored ≥1 on the digital technology scale; 49% had a digital technology score of 4. The digital technology score was associated with daily use of digital media in a graded fashion (P < .0001). More than 70% of respondents reported that they would use health care information supplied digitally if approved by their child's medical provider. Caregivers in an urban pediatric primary care setting have access to and frequently use digital technologies. Digital connections may help reach a traditionally hard-to-reach population.

  20. Student-directed investigation of natural phenomena: Using digital simulations to achieve NGSS-aligned 3D learning in middle school

    NASA Astrophysics Data System (ADS)

    Selvans, M. M.; Spafford, C. D.

    2016-12-01

    Many Earth Science phenomena cannot be observed directly because they happen slowly (e.g., Plate Motion) or at large spatial scales (e.g., Weather Patterns). Such topics are investigated by scientists through analysis of large data sets, numerical modeling, and laboratory studies that isolate aspects of the overall phenomena. Middle school students have limited time and lab equipment in comparison, but can employ authentic science practices through investigations using interactive digital simulations (sims). Designing a sim aligned to the Next Generation Science Standards (NGSS) allows students to explore and connect to science ideas in a seamless and supportive way that also deepens their understanding of the phenomena. We helped develop seven units, including the two above, that cover the middle school Earth Science Disciplinary Core Ideas and give students exposure to the other two dimensions of the NGSS (science practices and cross-cutting concepts). These units are developed by the Learning Design Group and Amplify Science. Sims are key to how students engage in 3D learning in these units. For example, in the Rock Transformations Sim students can investigate the ideas that energy from the sun and from Earth's interior can transform rock, and that the transformation processes change the Earth's surface at varying time and spatial scales (ESS2.A). Students can choose and selectively apply transformation processes (melting, weathering, etc.) or energy sources to rock in a cross-section landscape to explore their effects. Students are able to plan steps for making a particular rock transformation happen and carry out their own investigations. A benefit of using a digital platform for student learning is the ability to embed formative assessment. When students plan and carry out missions to achieve specific objectives, the digital platform can capture a record of their actions to measure how they apply science ideas from instruction. Data of these actions, combined with data from other embedded assessments and the teacher's own observations, can be used to provide feedback to teachers about support that can benefit specific students. We will highlight the features of sims in our units that allow middle school students to investigate natural phenomena and support teachers in facilitating 3D learning.

  1. A psychophysical comparison of two methods for adaptive histogram equalization.

    PubMed

    Zimmerman, J B; Cousins, S B; Hartzell, K M; Frisse, M E; Kahn, M G

    1989-05-01

    Adaptive histogram equalization (AHE) is a method for adaptive contrast enhancement of digital images. It is an automatic, reproducible method for the simultaneous viewing of contrast within a digital image with a large dynamic range. Recent experiments have shown that in specific cases, there is no significant difference in the ability of AHE and linear intensity windowing to display gray-scale contrast. More recently, a variant of AHE which limits the allowed contrast enhancement of the image has been proposed. This contrast-limited adaptive histogram equalization (CLAHE) produces images in which the noise content of an image is not excessively enhanced, but in which sufficient contrast is provided for the visualization of structures within the image. Images processed with CLAHE have a more natural appearance and facilitate the comparison of different areas of an image. However, the reduced contrast enhancement of CLAHE may hinder the ability of an observer to detect the presence of some significant gray-scale contrast. In this report, a psychophysical observer experiment was performed to determine whether there is a significant difference in the ability of AHE and CLAHE to depict gray-scale contrast. Observers were presented with computed tomography (CT) images of the chest processed with AHE and CLAHE. Subtle artificial lesions were introduced into some images. The observers were asked to rate their confidence regarding the presence of the lesions; these rating-scale data were analyzed using receiver operating characteristic (ROC) curve techniques. The ROC curves were compared for significant differences in the observers' performances. No significant difference was found in the abilities of AHE and CLAHE to depict contrast information.
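
    CLAHE, as described above, is available in common imaging libraries; a minimal OpenCV sketch with illustrative parameter values:

      import cv2
      import numpy as np

      # Minimal CLAHE sketch with OpenCV; clipLimit bounds the per-tile contrast
      # enhancement, which is exactly the noise-limiting idea described above.
      gray = np.random.randint(0, 256, (512, 512), dtype=np.uint8)  # stand-in for a CT slice

      clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))   # illustrative values
      enhanced = clahe.apply(gray)
      print(enhanced.shape, enhanced.dtype)

      # A large clipLimit approaches unconstrained AHE and amplifies noise;
      # a very small one approaches the identity mapping.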

  2. A Generalizable Framework for Multi-Scale Auditing of Digital Learning Provision in Higher Education

    ERIC Educational Resources Information Center

    Ross, Samuel R. P-J.; Volz, Veronica; Lancaster, Matthew K.; Divan, Aysha

    2018-01-01

    It is increasingly important that higher education institutions be able to audit and evaluate the scope and efficacy of their digital learning resources across various scales. To date there has been little effort to address this need for a validated, appropriate, and simple-to-execute method that will facilitate such an audit, whether it be at the…

  3. Scalable hybrid computation with spikes.

    PubMed

    Sarpeshkar, Rahul; O'Halloran, Micah

    2002-09-01

    We outline a hybrid analog-digital scheme for computing with three important features that enable it to scale to systems of large complexity: First, like digital computation, which uses several one-bit precise logical units to collectively compute a precise answer to a computation, the hybrid scheme uses several moderate-precision analog units to collectively compute a precise answer to a computation. Second, frequent discrete signal restoration of the analog information prevents analog noise and offset from degrading the computation. And, third, a state machine enables complex computations to be created using a sequence of elementary computations. A natural choice for implementing this hybrid scheme is one based on spikes because spike-count codes are digital, while spike-time codes are analog. We illustrate how spikes afford easy ways to implement all three components of scalable hybrid computation. First, as an important example of distributed analog computation, we show how spikes can create a distributed modular representation of an analog number by implementing digital carry interactions between spiking analog neurons. Second, we show how signal restoration may be performed by recursive spike-count quantization of spike-time codes. And, third, we use spikes from an analog dynamical system to trigger state transitions in a digital dynamical system, which reconfigures the analog dynamical system using a binary control vector; such feedback interactions between analog and digital dynamical systems create a hybrid state machine (HSM). The HSM extends and expands the concept of a digital finite-state-machine to the hybrid domain. We present experimental data from a two-neuron HSM on a chip that implements error-correcting analog-to-digital conversion with the concurrent use of spike-time and spike-count codes. We also present experimental data from silicon circuits that implement HSM-based pattern recognition using spike-time synchrony. We outline how HSMs may be used to perform learning, vector quantization, spike pattern recognition and generation, and how they may be reconfigured.

  4. Dagik Earth: A Digital Globe Project for Classrooms, Science Museums, and Research Institutes

    NASA Astrophysics Data System (ADS)

    Saito, A.; Tsugawa, T.

    2017-12-01

    Digital globe systems are a powerful tool for helping audiences understand phenomena on the Earth and planets in an intuitive way. Geo-cosmos at Miraikan, Japan, uses a 6-m spherical LED display and is one of the largest digital globe systems. Science on a Sphere (SOS) by NOAA is the digital globe system most widely used in science museums around the world. These systems are so expensive that the use of digital globes has mainly been limited to large-scale science museums. Dagik Earth is a project that promotes educational programs using low-cost digital globes, aimed especially at classroom use. The cost of a Dagik Earth digital globe starts from several US dollars if a PC and PC projector are available. It uses white spheres, such as balloons and balance balls, as the screen. The software is provided by the project free of charge for educational use and runs on Windows, Mac, and iOS devices. There are English and Chinese versions of the PC software besides the Japanese version. The number of registered users of Dagik Earth is about 1,400 in Japan; about 60% of them belong to schools, 30% to universities and research institutes, and 8% to science museums. In schools, it is used in classes by teachers and in science activities by students. Several teachers have used the system for five years or more. In one student activity, Dagik Earth contents on a typhoon, a solar eclipse, and a satellite launch were created and presented at a school festival, a good example of the use of Dagik Earth for STEM education. In the presentation, the system and activities of Dagik Earth will be presented, and the future expansion of the project will be discussed.

  5. On Reading and Digital Media: Rejoinder to "Digital Technology and Student Cognitive Development: The Neuroscience of the University Classroom"

    ERIC Educational Resources Information Center

    Williams-Pierce, Caroline

    2016-01-01

    This commentary serves as an introduction to multiple scholarly fields about the value of digital media for providing contexts for and provoking learning. The author proposes that rather than considering a dichotomy between reading physical books and reading digital media, as encouraged by Cavanaugh et al. (2015), instead consider a scale of sorts…

  6. 3-D Printing as a Tool to Investigate the Effects of Changes in Rock Microstructures on Permeability

    NASA Astrophysics Data System (ADS)

    Head, D. A.; Vanorio, T.

    2016-12-01

    Rocks are naturally heterogeneous; two rock samples with identical bulk properties can vary widely in microstructure. Understanding the evolutionary trends of rock properties requires the ability to connect time-lapse measurements of properties at different scales: the macro-scale used in laboratory and field analyses, capturing bulk-scale changes, and the micro-scale used in imaging and digital techniques, capturing changes to the pore space. However, measuring those properties at different scales is very challenging, and sometimes impossible. The advent of modern 3D printing has provided an unprecedented opportunity to link those scales by combining the strengths of digital and experimental rock physics. To determine the feasibility of this technique, we characterized the resolution capabilities of two different 3D printers. To calibrate our digital models against our printed models, we created a sample with an analytically solvable permeability. This allowed us to directly compare analytic calculation, numerical simulation, and laboratory measurement of the permeability of the exact same sample. Next we took a CT-scanned model of a natural carbonate pore space, then iteratively manipulated it digitally, 3D printed it, and measured its flow properties in the laboratory. This approach allowed us to access multiple scales digitally and experimentally, to test hypotheses about how changes in rock microstructure due to compaction and dissolution affect bulk transport properties, and to connect laboratory measurements of porosity and permeability to quantities that are traditionally impossible to measure in the laboratory, such as changes in surface area and tortuosity. As 3D printing technology continues to advance, we expect this technique to contribute to our ability to characterize the properties of remote and/or delicate samples, as well as to test the impact of microstructural alteration on bulk physical properties in the lab in a highly consistent, repeatable manner.
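
    The analytically solvable benchmark idea can be illustrated with the classic case of a straight circular tube, where Hagen-Poiseuille flow plus Darcy's law give k = R²/8; the radius below is illustrative, not the authors' sample geometry:

      import math

      # Analytic benchmark for a straight circular tube of radius R and length L:
      #   Hagen-Poiseuille flow rate:  Q = pi * R^4 * dP / (8 * mu * L)
      #   Darcy's law over A = pi*R^2: Q = k * A * dP / (mu * L)   =>   k = R^2 / 8
      R = 50e-6          # tube radius in meters (illustrative value)
      k = R**2 / 8       # intrinsic permeability, m^2
      print(f"k = {k:.3e} m^2 = {k / 9.869e-13:.1f} darcy")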

  7. Methods in Astronomical Image Processing

    NASA Astrophysics Data System (ADS)

    Jörsäter, S.

    Contents: A Brief Introductory Note; History of Astronomical Imaging; Astronomical Image Data; Images in Various Formats; Digitized Image Data; Digital Image Data; Philosophy of Astronomical Image Processing; Properties of Digital Astronomical Images; Human Image Processing; Astronomical vs. Computer Science Image Processing; Basic Tools of Astronomical Image Processing; Display Applications; Calibration of Intensity Scales; Calibration of Length Scales; Image Re-shaping; Feature Enhancement; Noise Suppression; Noise and Error Analysis; Image Processing Packages: Design of AIPS and MIDAS; AIPS; MIDAS; Reduction of CCD Data; Bias Subtraction; Clipping; Preflash Subtraction; Dark Subtraction; Flat Fielding; Sky Subtraction; Extinction Correction; Deconvolution Methods; Rebinning/Combining; Summary and Prospects for the Future.

  8. Digit Symbol Performance in Mild Dementia and Depression.

    ERIC Educational Resources Information Center

    Hart, Robert P.; And Others

    1987-01-01

    Patients with mild dementia of the Alzheimer's type (DAT), patients with major depression, and normal control subjects completed the Wechsler Adult Intelligence Scale (WAIS) Digit Symbol test of incidental memory. Though mild DAT and depressed patients had equivalent deficits in psychomotor speed, DAT patients recalled fewer digit-symbol items.…

  9. Digitized Special Collections and Multiple User Groups

    ERIC Educational Resources Information Center

    Gueguen, Gretchen

    2010-01-01

    Many organizations have evolved since their early attempts to mount digital exhibits on the Web and are experimenting with ways to increase the scale of their digitized collections by utilizing archival finding aid description rather than resource-intensive collections and exhibits. This article examines usability research to predict how such…

  10. The Rise of the Digital Public Library

    ERIC Educational Resources Information Center

    McKendrick, Joseph

    2012-01-01

    There is a growing shift to digital offerings among public libraries. Libraries increasingly are fulfilling roles as technology hubs for their communities, with high demand for technology and career development training resources. Ebooks and other digital materials are on the rise, while print is being scaled back. More libraries are turning to…

  11. Digital Learning in Schools: Conceptualizing the Challenges and Influences on Teacher Practice

    ERIC Educational Resources Information Center

    Blundell, Christopher; Lee, Kar-Tin; Nykvist, Shaun

    2016-01-01

    Digital technologies are an important requirement for curriculum expectations, including general ICT capability and STEM education. These technologies are also positioned as mechanisms for educational reform via transformation of teacher practice. It seems, however, that wide-scale transformation of teacher practice and digital learning remain…

  12. Field-based Digital Mapping of the November 3, 2002 Susitna Glacier Fault Rupture - Integrating remotely sensed data, GIS, and photo-linking technologies

    NASA Astrophysics Data System (ADS)

    Staft, L. A.; Craw, P. A.

    2003-12-01

    In July 2003, the U.S. Geological Survey and the Alaska Division of Geological & Geophysical Surveys (DGGS) conducted field studies on the Susitna Glacier Fault (SGF), which ruptured on November 3, 2002 during the M 7.9 Denali fault earthquake. The DGGS assumed responsibility for Geographic Information System (GIS) and data management, integrating remotely sensed imagery, GPS data, GIS, and photo-linking software to aid in planning and documenting fieldwork. Pre-field preparation included acquisition of over 150 1:6,000-scale true-color aerial photographs taken shortly after the SGF rupture, 1:63,360-scale color-infrared (CIR) 1980 aerial photographs, and digital geographic information including a 15-minute Digital Elevation Model (DEM), 1:63,360-scale Digital Raster Graphics (DRG), and LandSat 7 satellite imagery. Using Orthomapper software, we orthorectified and mosaicked seven CIRs, creating a georeferenced digital photo base of the study area. We used this base to reference the 1:6,000-scale aerial photography, to view locations of field sites downloaded from GPS, and to locate linked digital photographs taken in the field. Photos were linked using GPS-Photo Link software, which "links" digital photographs to GPS data by correlating time stamps from the GPS track log or waypoint file with those of the digital photos, using the correlated point data to create a photo-location ESRI shapefile. When this file is opened in ArcMap or ArcView with the GPS-Photo Link utility enabled, a thumbnail image of the linked photo appears when the cursor is over the photo location. Viewing photographed features and scarp-profile locations in GIS allowed us to evaluate data coverage of the rupture daily. Using remotely sensed imagery in the field with GIS gave us the versatility to display data on a variety of bases, including topographic maps, air photos, and satellite imagery, during fieldwork. In the field, we downloaded, processed, and reviewed data as it was collected, taking major steps toward final digital map production. Using the described techniques greatly enhanced our ability to analyze and interpret field data; the resulting digital data structure allows us to efficiently gather, disseminate, and archive critical field data.
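
    A minimal sketch of the timestamp-correlation idea behind the photo linking described above: for each photo, pick the GPS track point closest in time. Function names and data are hypothetical, not the GPS-Photo Link internals:

      import bisect

      # Link photos to positions by timestamp: for each photo, pick the GPS track
      # point whose time is closest (hypothetical data, sorted by time).
      track = [  # (unix_time, lat, lon)
          (1057800000, 63.201, -146.102),
          (1057800060, 63.203, -146.099),
          (1057800180, 63.207, -146.090),
      ]

      def locate_photo(photo_time: int):
          """Return the track point nearest in time to the photo's timestamp."""
          times = [t for t, _, _ in track]
          i = bisect.bisect_left(times, photo_time)
          candidates = track[max(0, i - 1):i + 1] or [track[-1]]
          return min(candidates, key=lambda p: abs(p[0] - photo_time))

      print(locate_photo(1057800070))   # -> nearest track point, at t=1057800060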

  13. Effects of axisymmetric contractions on turbulence of various scales

    NASA Technical Reports Server (NTRS)

    Tan-Atichat, J.; Nagib, H. M.; Drubka, R. E.

    1980-01-01

    Digitally acquired and processed results from an experimental investigation of grid generated turbulence of various scales through and downstream of nine matched cubic contour contractions ranging in area ratio from 2 to 36, and in length to inlet diameter ratio from 0.25 to 1.50 are reported. An additional contraction with a fifth order contour was also utilized for studying the shape effect. Thirteen homogeneous and nearly isotropic test flow conditions with a range of turbulence intensities, length scales and Reynolds numbers were generated and used to examine the sensitivity of the contractions to upstream turbulence. The extent to which the turbulence is altered by the contraction depends on the incoming turbulence scales, the total strain experienced by the fluid, as well as the contraction ratio and the strain rate. Varying the turbulence integral scale influences the transverse turbulence components more than the streamwise component. In general, the larger the turbulence scale, the lesser the reduction in the turbulence intensity of the transverse components. Best agreement with rapid distortion theory was obtained for large scale turbulence, where viscous decay over the contraction length was negligible, or when a first order correction for viscous decay was applied to the results.

  14. Design, Modeling, and Fabrication of Chemical Vapor Deposition Grown MoS2 Circuits with E-Mode FETs for Large-Area Electronics.

    PubMed

    Yu, Lili; El-Damak, Dina; Radhakrishna, Ujwal; Ling, Xi; Zubair, Ahmad; Lin, Yuxuan; Zhang, Yuhao; Chuang, Meng-Hsi; Lee, Yi-Hsien; Antoniadis, Dimitri; Kong, Jing; Chandrakasan, Anantha; Palacios, Tomas

    2016-10-12

    Two-dimensional electronics based on single-layer (SL) MoS2 offers significant advantages for realizing large-scale flexible systems owing to its ultrathin nature, good transport properties, and stable crystalline structure. In this work, we utilize a gate-first process technology for the fabrication of highly uniform enhancement-mode FETs with large mobility and excellent subthreshold swing. To enable large-scale MoS2 circuits, we also develop Verilog-A compact models that accurately predict the performance of the fabricated MoS2 FETs, as well as a parametrized layout cell for the FET to facilitate the design and layout process using computer-aided design (CAD) tools. Using this CAD flow, we designed combinational logic gates and sequential circuits (AND, OR, NAND, NOR, XNOR, latch, edge-triggered register) as well as a switched-capacitor dc-dc converter, which were then fabricated using the proposed flow and show excellent performance. The fabricated integrated circuits constitute the basis of a standard-cell digital library that is crucial for electronic circuit design using hardware description languages. The proposed design flow provides a platform for the co-optimization of device fabrication technology and circuit design for future ubiquitous flexible and transparent electronics using two-dimensional materials.

  15. Digital Competence at the Beginning of Upper Secondary School: Identifying Factors Explaining Digital Inclusion

    ERIC Educational Resources Information Center

    Hatlevik, Ove Edvard; Christophersen, Knut-Andreas

    2013-01-01

    During the last decade, information and communication technology has been given an increasingly large importance in our society. There seems to be a consensus regarding the necessity of supporting and developing school-based digital competence. In order to sustain digital inclusion, schools need to identify digital deficiencies and digital…

  16. Multiple-digit resurfacing using a thin latissimus dorsi perforator flap.

    PubMed

    Kim, Sang Wha; Lee, Ho Jun; Kim, Jeong Tae; Kim, Youn Hwan

    2014-01-01

    Traumatic digit defects of high complexity and with inadequate local tissue represent challenging surgical problems. Recently, perforator flaps have been proposed for reconstructing large defects of the hand because of their thinness and pliability and minimal donor site morbidity. Here, we illustrate the use of thin latissimus dorsi perforator flaps for resurfacing multiple defects of distal digits. We describe the cases of seven patients with large defects, including digits, circumferential defects and multiple-digit defects, who underwent reconstruction with thin latissimus dorsi perforator flaps between January 2008 and March 2012. Single-digit resurfacing procedures were excluded. The mean age was 56.3 years and the mean flap size was 160.4 cm². All the flaps survived completely. Two patients had minor complications including partial flap loss and scar contracture. The mean follow-up period was 11.7 months. The ideal flap for digit resurfacing should be thin and amenable to moulding, have a long pedicle for microanastomosis and have minimal donor site morbidity. Thin flaps can be harvested by excluding the deep adipose layer, and their high pliability enables resurfacing without multiple debulking procedures. The latissimus dorsi perforator flap may be the best flap for reconstructing complex defects of the digits, such as large, multiple-digit or circumferential defects, which require complete wrapping of volar and dorsal surfaces. Copyright © 2013 British Association of Plastic, Reconstructive and Aesthetic Surgeons. Published by Elsevier Ltd. All rights reserved.

  17. Test of Gravity on Large Scales with Weak Gravitational Lensing and Clustering Measurements of SDSS Luminous Red Galaxies

    NASA Astrophysics Data System (ADS)

    Reyes, Reinabelle; Mandelbaum, R.; Seljak, U.; Gunn, J.; Lombriser, L.

    2009-01-01

    We perform a test of gravity on large scales (5-50 Mpc/h) using 70,000 luminous red galaxies (LRGs) from the Sloan Digital Sky Survey (SDSS) DR7 with redshifts 0.16 < z < 0.47.

  18. Schiaparelli Hemisphere

    NASA Image and Video Library

    1996-06-03

    This mosaic is composed of about 100 red- and violet- filter Viking Orbiter images, digitally mosaiced in an orthographic projection at a scale of 1 km/pixel. The images were acquired in 1980 during mid northern summer on Mars (Ls = 89 degrees). The center of the image is near the impact crater Schiaparelli (latitude -3 degrees, longitude 343 degrees). The limits of this mosaic are approximately latitude -60 to 60 degrees and longitude 280 to 30 degrees. The color variations have been enhanced by a factor of two, and the large-scale brightness variations (mostly due to sun-angle variations) have been normalized by large-scale filtering. The large circular area with a bright yellow color (in this rendition) is known as Arabia. The boundary between the ancient, heavily-cratered southern highlands and the younger northern plains occurs far to the north (latitude 40 degrees) on this side of the planet, just north of Arabia. The dark streaks with bright margins emanating from craters in the Oxia Palus region (to the left of Arabia) are caused by erosion and/or deposition by the wind. The dark blue area on the far right, called Syrtis Major Planum, is a low-relief volcanic shield of probable basaltic composition. Bright white areas to the south, including the Hellas impact basin at the lower right, are covered by carbon dioxide frost. http://photojournal.jpl.nasa.gov/catalog/PIA00004

  19. Asymptotic theory of time varying networks with burstiness and heterogeneous activation patterns

    NASA Astrophysics Data System (ADS)

    Burioni, Raffaella; Ubaldi, Enrico; Vezzani, Alessandro

    2017-05-01

    The recent availability of large-scale, time-resolved, high-quality digital datasets has allowed for a deeper understanding of the structure and properties of many real-world networks. The empirical evidence of a temporal dimension prompted the switch of paradigm from a static representation of networks to a time-varying one. In this work we briefly review the framework of time-varying networks in real-world social systems, focusing especially on the activity-driven paradigm. We develop a framework that allows for the encoding of three generative mechanisms that seem to play a central role in the evolution of social networks: the individual's propensity to engage in social interactions, its strategy in allocating these interactions among its alters, and the burstiness of interactions amongst social actors. The functional forms and probability distributions encoding these mechanisms are typically data-driven. A natural question is whether different classes of strategies and burstiness distributions, with different local-scale behavior but analogous asymptotics, can lead to the same long-time, large-scale structure of the evolving networks. We consider the problem in its full generality, by investigating and solving the system dynamics in the asymptotic limit, for general classes of tie-allocation mechanisms and waiting-time probability distributions. We show that the asymptotic network evolution is driven by a few characteristics of these functional forms, which can be extracted from direct measurements on large datasets.
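
    A toy generator for the baseline activity-driven paradigm mentioned above: at each step, node i activates with probability a_i and wires m random links. All parameters are illustrative; the paper's generalizations replace the uniform partner choice and memoryless timing:

      import random

      # Toy activity-driven temporal network. At each time step, node i fires with
      # probability a_i and creates m links to uniformly chosen partners
      # (memoryless baseline; all values below are illustrative assumptions).
      N, m, steps = 1000, 2, 50
      activities = [random.paretovariate(2.1) * 1e-3 for _ in range(N)]  # heavy-tailed a_i

      edges_per_step = []
      for _ in range(steps):
          edges = set()
          for i, a in enumerate(activities):
              if random.random() < min(a, 1.0):
                  for _ in range(m):
                      j = random.randrange(N)
                      if j != i:
                          edges.add((min(i, j), max(i, j)))
          edges_per_step.append(edges)

      print(sum(len(e) for e in edges_per_step) / steps)   # mean instantaneous edge count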

  20. Digital Geologic Map of the Rosalia 1:100,000 Quadrangle, Washington and Idaho: A Digital Database for the 1990 S.Z. Waggoner Map

    USGS Publications Warehouse

    Derkey, Pamela D.; Johnson, Bruce R.; Lackaff, Beatrice B.; Derkey, Robert E.

    1998-01-01

    The geologic map of the Rosalia 1:100,000-scale quadrangle was compiled in 1990 by S.Z. Waggoner of the Washington state Division of Geology and Earth Resources. This data was entered into a geographic information system (GIS) as part of a larger effort to create regional digital geology for the Pacific Northwest. The intent was to provide a digital geospatial database for a previously published black and white paper geologic map. This database can be queried in many ways to produce a variety of geologic maps. Digital base map data files are not included: they may be obtained from a variety of commercial and government sources. This database is not meant to be used or displayed at any scale larger than 1:100,000 (e.g., 1:62,500 or 1:24,000) as it has been somewhat generalized to fit the 1:100,000 scale map. The map area is located in eastern Washington and extends across the state border into western Idaho. This open-file report describes the methods used to convert the geologic map data into a digital format, documents the file structures, and explains how to download the digital files from the U.S. Geological Survey public access World Wide Web site on the Internet. We wish to thank J. Eric Schuster of the Washington Division of Geology and Earth Resources for providing the original stable-base mylar and the funding for it to be scanned. We also thank Dick Blank and Barry Moring of the U.S. Geological Survey for reviewing the manuscript and digital files, respectively.

  1. References and benchmarks for pore-scale flow simulated using micro-CT images of porous media and digital rocks

    NASA Astrophysics Data System (ADS)

    Saxena, Nishank; Hofmann, Ronny; Alpak, Faruk O.; Berg, Steffen; Dietderich, Jesse; Agarwal, Umang; Tandon, Kunj; Hunter, Sander; Freeman, Justin; Wilson, Ove Bjorn

    2017-11-01

    We generate a novel reference dataset to quantify the impact of numerical solvers, boundary conditions, and simulation platforms. We consider a variety of microstructures ranging from idealized pipes to digital rocks. Pore throats of the digital rocks considered are large enough to be well resolved with state-of-the-art micro-computerized tomography technology. Permeability is computed using multiple numerical engines, 12 in total, including Lattice-Boltzmann, computational fluid dynamics, voxel-based, fast semi-analytical, and known empirical models. Thus, we provide a measure of the uncertainty associated with flow computations of digital media. Moreover, the reference and standards dataset generated is the first of its kind and can be used to test and improve new fluid flow algorithms. We find that there is overall good agreement between solvers for idealized cross-section-shape pipes. As expected, the disagreement increases with increasing complexity of the pore space. Numerical solutions for pipes with sinusoidal variation of cross section show larger variability compared to pipes of constant cross-section shape. We notice relatively larger variability in the computed permeability of digital rocks, with a coefficient of variation of up to 25% in computed values between the various solvers. Still, these differences are small given other subsurface uncertainties. The observed differences between solvers can be attributed to several causes, including differences in boundary conditions, numerical convergence criteria, and parameterization of the fundamental physics equations. Solvers that perform additional meshing of irregular pore shapes require an additional step in practical workflows, which involves skill and can introduce further uncertainty. Computation times for digital rocks vary from minutes to several days depending on the algorithm and available computational resources. We find that more stringent convergence criteria can improve solver accuracy, but at the expense of longer computation time.

  2. 2017 Guralp Affinity Digitizer Evaluation.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Merchant, Bion J.

    Sandia National Laboratories has tested and evaluated two Guralp Affinity digitizers. The Affinity digitizers are intended to record sensor output for seismic and infrasound monitoring applications. The purpose of this digitizer evaluation is to measure performance characteristics in such areas as power consumption, input impedance, sensitivity, full scale, self-noise, dynamic range, system noise, response, passband, and timing. The Affinity digitizers are being evaluated for potential use in the International Monitoring System (IMS) of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO).

  3. Bayesian analysis of the dynamic cosmic web in the SDSS galaxy survey

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Leclercq, Florent; Wandelt, Benjamin; Jasche, Jens, E-mail: florent.leclercq@polytechnique.org, E-mail: jasche@iap.fr, E-mail: wandelt@iap.fr

    Recent application of the Bayesian algorithm BORG to the Sloan Digital Sky Survey (SDSS) main sample galaxies resulted in the physical inference of the formation history of the observed large-scale structure from its origin to the present epoch. In this work, we use these inferences as inputs for a detailed probabilistic cosmic web-type analysis. To do so, we generate a large set of data-constrained realizations of the large-scale structure using a fast, fully non-linear gravitational model. We then perform a dynamic classification of the cosmic web into four distinct components (voids, sheets, filaments, and clusters) on the basis of the tidal field. Our inference framework automatically and self-consistently propagates typical observational uncertainties to web-type classification. As a result, this study produces accurate cosmographic classification of large-scale structure elements in the SDSS volume. By also providing the history of these structure maps, the approach allows an analysis of the origin and growth of the early traces of the cosmic web present in the initial density field and of the evolution of global quantities such as the volume and mass filling fractions of different structures. For the problem of web-type classification, the results described in this work constitute the first connection between theory and observations at non-linear scales that includes a physical model of structure formation and a demonstrated capability of uncertainty quantification. A connection between cosmology and information theory using real data also naturally emerges from our probabilistic approach. Our results constitute quantitative chrono-cosmography of the complex web-like patterns underlying the observed galaxy distribution.
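
    A minimal sketch of tidal-field web-type classification in the T-web spirit: count eigenvalues of the tidal tensor above a threshold, with 0/1/2/3 positive eigenvalues mapping to void/sheet/filament/cluster. The tensor and threshold are illustrative, not BORG outputs:

      import numpy as np

      # T-web style classification: count eigenvalues of the (symmetric 3x3) tidal
      # tensor above a threshold lambda_th. Values below are illustrative.
      LABELS = ["void", "sheet", "filament", "cluster"]

      def classify(tidal_tensor: np.ndarray, lambda_th: float = 0.0) -> str:
          eigenvalues = np.linalg.eigvalsh(tidal_tensor)
          return LABELS[int(np.sum(eigenvalues > lambda_th))]

      T = np.array([[0.8, 0.1,  0.0],
                    [0.1, 0.3,  0.0],
                    [0.0, 0.0, -0.5]])
      print(classify(T))   # two positive eigenvalues -> "filament"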

  4. Bidding process in online auctions and winning strategy: Rate equation approach

    NASA Astrophysics Data System (ADS)

    Yang, I.; Kahng, B.

    2006-06-01

    Online auctions have expanded rapidly over the last decade and have become a fascinating new type of business or commercial transaction in this digital era. Here we introduce a master equation for the bidding process that takes place in online auctions. We find that the number of distinct bidders who bid k times up to the t-th bidding, called k-frequent bidders, progresses as n_k(t) ~ t k^(-2.4). The rate of successfully transmitted bids by k-frequent bidders is likely to scale as q_k(t) ~ k^(-1.4), independent of t for large t. This theoretical prediction is close to the empirical data. These results imply that bidding at the last moment is a rational and effective strategy to win in an eBay auction.

  5. A novel cosmetic antifungal/anti-inflammatory topical gel for the treatment of mild to moderate seborrheic dermatitis of the face: an open-label trial utilizing clinical evaluation and erythema-directed digital photography.

    PubMed

    Dall' Oglio, Federica; Tedeschi, Aurora; Fusto, Carmelinda M; Lacarrubba, Francesco; Dinotta, Franco; Micali, Giuseppe

    2017-10-01

    Topical cosmetic agents may play a role in the management of facial seborrheic dermatitis by reducing inflammation and scale production. Advanced digital photography, equipped with technology able to provide a detailed evaluation of red skin components corresponding to vascular flare (erythema-directed digital photography), is a useful tool for evaluating erythema in patients affected by inflammatory dermatoses. The aim of this study was to assess the efficacy and safety of a new cosmetic topical gel containing piroctone olamine, lactoferrin, glycero-phospho-inositol, and Aloe vera for the treatment of facial seborrheic dermatitis by clinical and advanced digital photography evaluation. An open-label, prospective clinical trial was conducted on 25 patients with mild to moderate facial seborrheic dermatitis. Subjects were instructed to apply the gel twice daily for 45 days. Clinical efficacy was evaluated at baseline and at days 15 and 45 by measuring the degree of desquamation (by clinical examination) and erythema (by digital photography via the VISIA-CR™ system equipped with RBX™), each using a 5-point severity scale, and pruritus (by a subject-completed Visual Analogue Scale from 0 to 100 mm). Finally, at baseline and at the end of the study, an Investigator Global Assessment (IGA) was performed using a 5-point scale (from 0 = worsening to 4 = excellent response). At the end of treatment, a significant reduction (P<0.001) in all considered parameters was observed. Moreover, an excellent response (>80% improvement) was recorded in 47.9% of patients, with no case of worsening. No signs of local intolerance were documented. The tested cosmetic topical gel was effective in treating mild to moderate seborrheic dermatitis of the face. Erythema-directed digital photography may represent a noteworthy tool for the therapeutic monitoring of facial seborrheic dermatitis and an important adjunct in dermatologic clinical practice.

  6. Semi-Automated Digital Image Analysis of Pick’s Disease and TDP-43 Proteinopathy

    PubMed Central

    Irwin, David J.; Byrne, Matthew D.; McMillan, Corey T.; Cooper, Felicia; Arnold, Steven E.; Lee, Edward B.; Van Deerlin, Vivianna M.; Xie, Sharon X.; Lee, Virginia M.-Y.; Grossman, Murray; Trojanowski, John Q.

    2015-01-01

    Digital image analysis of histology sections provides reliable, high-throughput methods for neuropathological studies, but data are scant in frontotemporal lobar degeneration (FTLD), which poses an added challenge for study due to its morphologically diverse pathologies. Here, we describe a novel method of semi-automated digital image analysis in FTLD subtypes including Pick’s disease (PiD, n=11), with tau-positive intracellular inclusions and neuropil threads, and TDP-43 pathology type C (FTLD-TDPC, n=10), defined by TDP-43-positive aggregates predominantly in large dystrophic neurites. To do this, we examined three FTLD-associated cortical regions by immunohistochemistry: mid-frontal gyrus (MFG), superior temporal gyrus (STG), and anterior cingulate gyrus (ACG). We used a color deconvolution process to isolate signal from the chromogen and applied both object-detection and intensity-thresholding algorithms to quantify pathological burden. We found the object-detection algorithms had good agreement with gold-standard manual quantification of tau- and TDP-43-positive inclusions. Our sampling method was reliable across three separate investigators, and we obtained similar results in a pilot analysis using open-source software. Regional comparisons using these algorithms find differences in regional anatomic disease burden between PiD and FTLD-TDP not detected using traditional ordinal-scale data, suggesting digital image analysis is a powerful tool for clinicopathological studies in morphologically diverse FTLD syndromes. PMID:26538548
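
    A minimal sketch of the color-deconvolution-then-threshold pattern described above, using scikit-image's built-in stain separation; the image and threshold are illustrative, and this is not the authors' software:

      import numpy as np
      from scipy import ndimage
      from skimage.color import rgb2hed

      # Sketch of chromogen isolation by color deconvolution: unmix an RGB tile into
      # hematoxylin / eosin / DAB channels, then threshold and count objects.
      rgb = np.random.rand(256, 256, 3)         # stand-in for an IHC image tile
      hed = rgb2hed(rgb)                        # channels: hematoxylin, eosin, DAB
      dab = hed[..., 2]

      mask = dab > dab.mean() + 2 * dab.std()   # illustrative intensity threshold
      labels, n_objects = ndimage.label(mask)   # object detection on the binary mask
      burden = mask.mean()                      # fraction of tile flagged as positive
      print(n_objects, round(burden, 4))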

  7. Semi-Automated Digital Image Analysis of Pick's Disease and TDP-43 Proteinopathy.

    PubMed

    Irwin, David J; Byrne, Matthew D; McMillan, Corey T; Cooper, Felicia; Arnold, Steven E; Lee, Edward B; Van Deerlin, Vivianna M; Xie, Sharon X; Lee, Virginia M-Y; Grossman, Murray; Trojanowski, John Q

    2016-01-01

    Digital image analysis of histology sections provides reliable, high-throughput methods for neuropathological studies, but data are scant in frontotemporal lobar degeneration (FTLD), which poses an added challenge for study due to its morphologically diverse pathologies. Here, we describe a novel method of semi-automated digital image analysis in FTLD subtypes including Pick's disease (PiD, n=11), with tau-positive intracellular inclusions and neuropil threads, and TDP-43 pathology type C (FTLD-TDPC, n=10), defined by TDP-43-positive aggregates predominantly in large dystrophic neurites. To do this, we examined three FTLD-associated cortical regions by immunohistochemistry: mid-frontal gyrus (MFG), superior temporal gyrus (STG), and anterior cingulate gyrus (ACG). We used a color deconvolution process to isolate signal from the chromogen and applied both object-detection and intensity-thresholding algorithms to quantify pathological burden. We found the object-detection algorithms had good agreement with gold-standard manual quantification of tau- and TDP-43-positive inclusions. Our sampling method was reliable across three separate investigators, and we obtained similar results in a pilot analysis using open-source software. Regional comparisons using these algorithms find differences in regional anatomic disease burden between PiD and FTLD-TDP not detected using traditional ordinal-scale data, suggesting digital image analysis is a powerful tool for clinicopathological studies in morphologically diverse FTLD syndromes. © The Author(s) 2015.

  8. Multiresolution image registration in digital x-ray angiography with intensity variation modeling.

    PubMed

    Nejati, Mansour; Pourghassem, Hossein

    2014-02-01

    Digital subtraction angiography (DSA) is a widely used technique for visualization of vessel anatomy in diagnosis and treatment. However, due to unavoidable patient motions, both external and internal, the subtracted angiography images often suffer from motion artifacts that adversely affect the quality of the medical diagnosis. To cope with this problem and improve the quality of DSA images, registration algorithms are often employed before subtraction. In this paper, a novel elastic registration algorithm for registration of digital X-ray angiography images, particularly of the coronary region, is proposed. This algorithm includes a multiresolution search strategy in which a global transformation is calculated iteratively based on local searches in coarse and fine sub-image blocks. The local searches are accomplished in a differential multiscale framework that allows us to capture both large- and small-scale transformations. The local registration transformation also explicitly accounts for local variations in the image intensities, which are incorporated into our model as changes of local contrast and brightness. These local transformations are then smoothly interpolated using a thin-plate spline interpolation function to obtain the global model. Experimental results with several clinical datasets demonstrate the effectiveness of our algorithm in motion artifact reduction.
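
    The final step of the algorithm, smoothing local block transformations into a global model with thin-plate splines, can be sketched with SciPy's general-purpose RBF interpolator. The array names are hypothetical, and the local contrast/brightness terms are omitted; this is an illustration of the interpolation idea, not the paper's implementation.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

def global_deformation(block_centers: np.ndarray,
                       block_shifts: np.ndarray,
                       query_points: np.ndarray) -> np.ndarray:
    """Interpolate per-block (dx, dy) shifts to arbitrary pixel coordinates.

    block_centers: (P, 2) centers of the sub-image blocks.
    block_shifts:  (P, 2) displacements estimated by the local searches.
    query_points:  (Q, 2) pixel coordinates at which to evaluate the field.
    """
    tps = RBFInterpolator(block_centers, block_shifts,
                          kernel="thin_plate_spline")
    return tps(query_points)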

  9. Digital representation of exposures of Precambrian bedrock in parts of Dickinson and Iron Counties, Michigan, and Florence and Marinette Counties, Wisconsin

    USGS Publications Warehouse

    Cannon, William F.; Schulte, Ruth; Bickerstaff, Damon

    2018-04-04

    The U.S. Geological Survey (USGS) conducted a program of bedrock geologic mapping in much of the central and western Upper Peninsula of Michigan from the 1940s until the late 1990s. Geologic studies in this region are hampered by a scarcity of bedrock exposures because of a nearly continuous blanket of unconsolidated sediments resulting from glaciation of the region during the Pleistocene ice ages. The USGS mapping, done largely at a scale of 1:24,000, routinely recorded the location and extent of exposed bedrock to provide both an indication of where direct observations were made and a guide for future investigations to expedite location of observable rock exposures. The locations of outcrops were generally shown as colored or patterned overlays on printed geologic maps. Although those maps have been scanned and are available as Portable Document Format (PDF) files, no further digital portrayal of the outcrops had been done. We have conducted a prototype study of digitizing and improving locational accuracy of the outcrop locations in parts of Dickinson County, Michigan, to form a data layer that can be used with other data layers in geographic information system applications.

  10. Review and comparison of non-conventional imaging systems for three-dimensional digitization of transparent objects

    NASA Astrophysics Data System (ADS)

    Mériaudeau, Fabrice; Rantoson, Rindra; Fofi, David; Stolz, Christophe

    2012-04-01

    Fashion and design greatly influence the conception of manufactured products, which now exhibit complex forms and shapes. Two-dimensional quality control procedures (e.g., shape, textures, colors, and 2D geometry) are progressively being replaced by 3D inspection methods (e.g., 3D geometry, colors, and texture on the 3D shape), therefore requiring a digitization of the object surface. Three-dimensional surface acquisition is a topic that has been studied extensively, and a significant number of techniques for acquiring 3D shapes have been proposed, leading to a wide range of commercial solutions available on the market. These systems span micro-scale objects (shape-from-focus and shape-from-defocus techniques) to objects several meters in size (time-of-flight techniques). Nevertheless, the use of such systems still encounters difficulties when dealing with non-diffuse (non-Lambertian) surfaces, as is the case for transparent, semi-transparent, or highly reflective materials (e.g., glass, crystals, plastics, and shiny metals). We review and compare various systems and approaches recently developed for 3D digitization of transparent objects.

  11. Commercial vs professional UAVs for mapping

    NASA Astrophysics Data System (ADS)

    Nikolakopoulos, Konstantinos G.; Koukouvelas, Ioannis

    2017-09-01

    The continuous advancements in the technology behind Unmanned Aerial Vehicles (UAVs), together with the steady decrease in their cost and the availability of photogrammetric software, make UAVs an excellent tool for large-scale mapping. In addition, the use of UAVs significantly reduces costs, time consumption, and problems of terrain accessibility. However, despite the growing number of UAV applications, there has been little quantitative assessment of UAV performance and of the quality of the derived products (orthophotos and Digital Surface Models). Here, we present results from field experiments designed to evaluate the accuracy of photogrammetrically derived digital surface models (DSMs) developed from imagery acquired with onboard digital cameras. We also compare high-resolution and moderate-resolution imagery for large-scale geomorphic mapping. The data analyzed in this study come from a small commercial UAV and a professional UAV. The test area was mapped over the same photogrammetric grid by the two UAVs. 3D models, DSMs and orthophotos were created using specialized software. These products were compared to in situ survey measurements, and the results are presented in this paper.
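
    The accuracy evaluation described here reduces to comparing DSM elevations against in situ survey check points. A minimal sketch of that comparison, with hypothetical array names standing in for the sampled DSM and survey data:

```python
import numpy as np

def dsm_rmse(dsm_elev: np.ndarray, survey_elev: np.ndarray) -> float:
    """Root-mean-square error of DSM elevations at surveyed check points."""
    return float(np.sqrt(np.mean((dsm_elev - survey_elev) ** 2)))

# e.g. dsm_rmse(dsm_sampled_at_checkpoints, checkpoint_heights)
```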

  12. Stellar and planetary remnants in digital sky surveys

    NASA Astrophysics Data System (ADS)

    Girven, Jonathan

    Large scale digital sky surveys have produced an unprecedented volume of uniform data covering both vast proportions of the sky and a wide range of wavelengths, from the ultraviolet to the near-infrared. The challenge facing astronomers today is how to use this multitude of information to extract trends, outliers and rare objects. For example, a large sample of single white dwarf stars has the potential to probe the Galaxy through the luminosity function. The aim of this work was to study stellar and planetary remnants in these surveys. In the last few decades, it has been shown that a handful of white dwarfs have remnants of planetary systems around them, in the form of a dusty disc. These are currently providing the best constraints on the composition of extra-solar planetary systems. Finding significant numbers of dusty discs is only possible in large scale digital sky surveys. I utilised the SDSS DR7 and colour-colour diagrams to identify DA white dwarfs from optical photometry. This nearly doubled the number of spectroscopically confirmed DA white dwarfs in the SDSS compared with DR4 [Eisenstein et al., 2006], and introduced nearly 10,000 photometric-only DA white dwarf candidates. I further cross-matched our white dwarf catalogue with UKIDSS LAS DR8 to carry out the currently largest and deepest untargeted search for low-mass companions to, and dust discs around, DA white dwarfs. Simultaneously, I analysed Spitzer observations of 15 white dwarfs with metal-polluted atmospheres, all but one having helium-dominated atmospheres. Three of these stars were found to have an infrared excess consistent with a dusty disc. I used the total sample to estimate a typical disc lifetime of log[tdisc(yr)] = 5.6+1.1, which is compatible with the relatively large range estimated from different theoretical models. Subdwarf population synthesis models predicted a vast population of subdwarfs with F- to K-type companions, produced in the efficient RLOF formation channel. I used a cross-match of ultraviolet, optical and infrared surveys to search for this unseen population. I selected a complementary sample to those found from radial velocity surveys, offering direct tests of binary evolution pathways. Finally, I present a method to use common proper motion white dwarf pairs to constrain the initial-final mass relation, which is extremely uncertain at low masses. In the example I show, one of the stars is a magnetic white dwarf with B ≈ 6 MG, making this a rare and intriguing system from a magnetic white dwarf formation point of view.

  13. Robotically Assembled Aerospace Structures: Digital Material Assembly using a Gantry-Type Assembler

    NASA Technical Reports Server (NTRS)

    Trinh, Greenfield; Copplestone, Grace; O'Connor, Molly; Hu, Steven; Nowak, Sebastian; Cheung, Kenneth; Jenett, Benjamin; Cellucci, Daniel

    2017-01-01

    This paper evaluates the development of automated assembly techniques for discrete lattice structures using a multi-axis gantry type CNC machine. These lattices are made of discrete components called digital materials. We present the development of a specialized end effector that works in conjunction with the CNC machine to assemble these lattices. With this configuration we are able to place voxels at a rate of 1.5 per minute. The scalability of digital material structures due to the incremental modular assembly is one of its key traits and an important metric of interest. We investigate the build times of a 5x5 beam structure on the scale of 1 meter (325 parts), 10 meters (3,250 parts), and 30 meters (9,750 parts). Utilizing the current configuration with a single end effector, performing serial assembly with a globally fixed feed station at the edge of the build volume, the build time increases according to a scaling law of n^4, where n is the build scale. Build times can be reduced significantly by integrating feed systems into the gantry itself, resulting in a scaling law of n^3. A completely serial assembly process will encounter time limitations as build scale increases. Automated assembly for digital materials can assemble high performance structures from discrete parts, and techniques such as built in feed systems, parallelization, and optimization of the fastening process will yield much higher throughput.

  14. Robotically Assembled Aerospace Structures: Digital Material Assembly using a Gantry-Type Assembler

    NASA Technical Reports Server (NTRS)

    Trinh, Greenfield; Copplestone, Grace; O'Connor, Molly; Hu, Steven; Nowak, Sebastian; Cheung, Kenneth; Jenett, Benjamin; Cellucci, Daniel

    2017-01-01

    This paper evaluates the development of automated assembly techniques for discrete lattice structures using a multi-axis gantry type CNC machine. These lattices are made of discrete components called "digital materials." We present the development of a specialized end effector that works in conjunction with the CNC machine to assemble these lattices. With this configuration we are able to place voxels at a rate of 1.5 per minute. The scalability of digital material structures due to the incremental modular assembly is one of its key traits and an important metric of interest. We investigate the build times of a 5x5 beam structure on the scale of 1 meter (325 parts), 10 meters (3,250 parts), and 30 meters (9,750 parts). Utilizing the current configuration with a single end effector, performing serial assembly with a globally fixed feed station at the edge of the build volume, the build time increases according to a scaling law of n^4, where n is the build scale. Build times can be reduced significantly by integrating feed systems into the gantry itself, resulting in a scaling law of n^3. A completely serial assembly process will encounter time limitations as build scale increases. Automated assembly for digital materials can assemble high performance structures from discrete parts, and techniques such as built in feed systems, parallelization, and optimization of the fastening process will yield much higher throughput.
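
    A back-of-the-envelope rendering of the quoted scaling laws. Only the n^4 and n^3 exponents come from the abstract; the absolute time constant is an assumption calibrated from the stated 1.5 voxels-per-minute rate at the 325-part (1 m) scale.

```python
# Illustrative only: the exponents n^4 (fixed feed) and n^3 (onboard feed)
# are from the abstract; the 1 m calibration point is an assumed reading.
t_1m_hours = 325 / 1.5 / 60            # ~3.6 h to place 325 voxels at 1.5/min

for n in (1, 10, 30):                  # build scale in meters
    fixed_feed = t_1m_hours * n**4     # globally fixed feed station: O(n^4)
    onboard = t_1m_hours * n**3        # feed integrated into the gantry: O(n^3)
    print(f"{n:2d} m: fixed feed {fixed_feed:10.0f} h, onboard {onboard:8.0f} h")
```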

  15. Preliminary integrated geologic map databases for the United States: Digital data for the reconnaissance bedrock geologic map for the northern Alaska peninsula area, southwest Alaska

    USGS Publications Warehouse

    ,

    2006-01-01

    The growth in the use of Geographic Information Systems (GIS) has highlighted the need for digital geologic maps that have been attributed with information about age and lithology. Such maps can be conveniently used to generate derivative maps for manifold special purposes such as mineral-resource assessment, metallogenic studies, tectonic studies, and environmental research. This report is part of a series of integrated geologic map databases that cover the entire United States. Three national-scale geologic maps that portray most or all of the United States already exist; for the conterminous U.S., King and Beikman (1974a,b) compiled a map at a scale of 1:2,500,000, Beikman (1980) compiled a map for Alaska at 1:2,500,000 scale, and for the entire U.S., Reed and others (2005a,b) compiled a map at a scale of 1:5,000,000. A digital version of the King and Beikman map was published by Schruben and others (1994). Reed and Bush (2004) produced a digital version of the Reed and others (2005a) map for the conterminous U.S. The present series of maps is intended to provide the next step in increased detail. State geologic maps that range in scale from 1:100,000 to 1:1,000,000 are available for most of the country, and digital versions of these state maps are the basis of this product. The digital geologic maps presented here are in a standardized format as ARC/INFO export files and as ArcView shape files. Data tables that relate the map units to detailed lithologic and age information accompany these GIS files. The map is delivered as a set of 1:250,000-scale quadrangle files. To the best of our ability, these quadrangle files are edge-matched with respect to geology. When the maps are merged, the combined attribute tables can be used directly with the merged maps to make derivative maps.

  16. Preliminary integrated geologic map databases for the United States: Digital data for the reconnaissance geologic map of the western Aleutian Islands, Alaska

    USGS Publications Warehouse

    ,

    2006-01-01

    The growth in the use of Geographic Information Systems (GIS) has highlighted the need for digital geologic maps that have been attributed with information about age and lithology. Such maps can be conveniently used to generate derivative maps for manifold special purposes such as mineral-resource assessment, metallogenic studies, tectonic studies, and environmental research. This report is part of a series of integrated geologic map databases that cover the entire United States. Three national-scale geologic maps that portray most or all of the United States already exist; for the conterminous U.S., King and Beikman (1974a,b) compiled a map at a scale of 1:2,500,000, Beikman (1980) compiled a map for Alaska at 1:2,500,000 scale, and for the entire U.S., Reed and others (2005a,b) compiled a map at a scale of 1:5,000,000. A digital version of the King and Beikman map was published by Schruben and others (1994). Reed and Bush (2004) produced a digital version of the Reed and others (2005a) map for the conterminous U.S. The present series of maps is intended to provide the next step in increased detail. State geologic maps that range in scale from 1:100,000 to 1:1,000,000 are available for most of the country, and digital versions of these state maps are the basis of this product. The digital geologic maps presented here are in a standardized format as ARC/INFO export files and as ArcView shape files. Data tables that relate the map units to detailed lithologic and age information accompany these GIS files. The map is delivered as a set of 1:250,000-scale quadrangle files. To the best of our ability, these quadrangle files are edge-matched with respect to geology. When the maps are merged, the combined attribute tables can be used directly with the merged maps to make derivative maps.

  17. Digital data for the geology of the Southern Brooks Range, Alaska

    USGS Publications Warehouse

    Till, Alison B.; Dumoulin, Julie A.; Harris, Anita G.; Moore, Thomas E.; Bleick, Heather A.; Siwiec, Benjamin; Labay, Keith A.; Wilson, Frederic H.; Shew, Nora B.

    2008-01-01

    The growth in the use of Geographic Information Systems (GIS) has highlighted the need for digital geologic maps that have been attributed with information about age and lithology. Such maps can be conveniently used to generate derivative maps for manifold special purposes such as mineral-resource assessment, metallogenic studies, tectonic studies, and environmental research. This report is part of a series of integrated geologic map databases that cover the entire United States. Three national-scale geologic maps that portray most or all of the United States already exist; for the conterminous U.S., King and Beikman (1974a,b) compiled a map at a scale of 1:2,500,000, Beikman (1980) compiled a map for Alaska at 1:2,500,000 scale, and for the entire U.S., Reed and others (2005a,b) compiled a map at a scale of 1:5,000,000. A digital version of the King and Beikman map was published by Schruben and others (1994). Reed and Bush (2004) produced a digital version of the Reed and others (2005a) map for the conterminous U.S. The present series of maps is intended to provide the next step in increased detail. State geologic maps that range in scale from 1:100,000 to 1:1,000,000 are available for most of the country, and digital versions of these state maps are the basis of this product. The digital geologic maps presented here are in a standardized format as ARC/INFO export files and as ArcView shape files. The files named __geol contain geologic polygons and line (contact) attributes; files named __fold contain fold axes; files named __lin contain lineaments; and files named __dike contain dikes as lines. Data tables that relate the map units to detailed lithologic and age information accompany these GIS files. The map is delivered as a set of 1:250,000-scale quadrangle files. To the best of our ability, these quadrangle files are edge-matched with respect to geology. When the maps are merged, the combined attribute tables can be used directly with the merged maps to make derivative maps.

  18. Digital Data for the reconnaissance geologic map for Prince William Sound and the Kenai Peninsula, Alaska

    USGS Publications Warehouse

    Wilson, Frederic H.; Hults, Chad P.; Labay, Keith A.; Shew, Nora B.

    2007-01-01

    The growth in the use of Geographic Information Systems (GIS) has highlighted the need for digital geologic maps that have been attributed with information about age and lithology. Such maps can be conveniently used to generate derivative maps for manifold special purposes such as mineral-resource assessment, metallogenic studies, tectonic studies, and environmental research. This report is part of a series of integrated geologic map databases that cover the entire United States. Three national-scale geologic maps that portray most or all of the United States already exist; for the conterminous U.S., King and Beikman (1974a,b) compiled a map at a scale of 1:2,500,000, Beikman (1980) compiled a map for Alaska at 1:2,500,000 scale, and for the entire U.S., Reed and others (2005a,b) compiled a map at a scale of 1:5,000,000. A digital version of the King and Beikman map was published by Schruben and others (1994). Reed and Bush (2004) produced a digital version of the Reed and others (2005a) map for the conterminous U.S. The present series of maps is intended to provide the next step in increased detail. State geologic maps that range in scale from 1:100,000 to 1:1,000,000 are available for most of the country, and digital versions of these state maps are the basis of this product. The digital geologic maps presented here are in a standardized format as ARC/INFO export files and as ArcView shape files. The files named __geol contain geologic polygons and line (contact) attributes; files named __fold contain fold axes; files named __lin contain lineaments; and files named __dike contain dikes as lines. Data tables that relate the map units to detailed lithologic and age information accompany these GIS files. The map is delivered as a set of 1:250,000-scale quadrangle files. To the best of our ability, these quadrangle files are edge-matched with respect to geology. When the maps are merged, the combined attribute tables can be used directly with the merged maps to make derivative maps.

  19. Preliminary integrated geologic map databases for the United States: Digital data for the generalized bedrock geologic map, Yukon Flats region, east-central Alaska

    USGS Publications Warehouse

    Till, Alison B.; Dumoulin, Julie A.; Phillips, Jeffrey D.; Stanley, Richard G.; Crews, Jessie

    2006-01-01

    The growth in the use of Geographic Information Systems (GIS) has highlighted the need for digital geologic maps that have been attributed with information about age and lithology. Such maps can be conveniently used to generate derivative maps for manifold special purposes such as mineral-resource assessment, metallogenic studies, tectonic studies, and environmental research. This report is part of a series of integrated geologic map databases that cover the entire United States. Three national-scale geologic maps that portray most or all of the United States already exist; for the conterminous U.S., King and Beikman (1974a,b) compiled a map at a scale of 1:2,500,000, Beikman (1980) compiled a map for Alaska at 1:2,500,000 scale, and for the entire U.S., Reed and others (2005a,b) compiled a map at a scale of 1:5,000,000. A digital version of the King and Beikman map was published by Schruben and others (1994). Reed and Bush (2004) produced a digital version of the Reed and others (2005a) map for the conterminous U.S. The present series of maps is intended to provide the next step in increased detail. State geologic maps that range in scale from 1:100,000 to 1:1,000,000 are available for most of the country, and digital versions of these state maps are the basis of this product. The digital geologic maps presented here are in a standardized format as ARC/INFO export files and as ArcView shape files. Data tables that relate the map units to detailed lithologic and age information accompany these GIS files. The map is delivered as a set of 1:250,000-scale quadrangle files. To the best of our ability, these quadrangle files are edge-matched with respect to geology. When the maps are merged, the combined attribute tables can be used directly with the merged maps to make derivative maps.

  20. Preliminary integrated geologic map databases for the United States: Digital data for the reconnaissance geologic map of the lower Yukon River region, Alaska

    USGS Publications Warehouse

    ,

    2006-01-01

    The growth in the use of Geographic Information Systems (GIS) has highlighted the need for digital geologic maps that have been attributed with information about age and lithology. Such maps can be conveniently used to generate derivative maps for manifold special purposes such as mineral-resource assessment, metallogenic studies, tectonic studies, and environmental research. This report is part of a series of integrated geologic map databases that cover the entire United States. Three national-scale geologic maps that portray most or all of the United States already exist; for the conterminous U.S., King and Beikman (1974a,b) compiled a map at a scale of 1:2,500,000, Beikman (1980) compiled a map for Alaska at 1:2,500,000 scale, and for the entire U.S., Reed and others (2005a,b) compiled a map at a scale of 1:5,000,000. A digital version of the King and Beikman map was published by Schruben and others (1994). Reed and Bush (2004) produced a digital version of the Reed and others (2005a) map for the conterminous U.S. The present series of maps is intended to provide the next step in increased detail. State geologic maps that range in scale from 1:100,000 to 1:1,000,000 are available for most of the country, and digital versions of these state maps are the basis of this product. The digital geologic maps presented here are in a standardized format as ARC/INFO export files and as ArcView shape files. Data tables that relate the map units to detailed lithologic and age information accompany these GIS files. The map is delivered as a set of 1:250,000-scale quadrangle files. To the best of our ability, these quadrangle files are edge-matched with respect to geology. When the maps are merged, the combined attribute tables can be used directly with the merged maps to make derivative maps.

  1. PG&E's Seismic Network Goes Digital With Strong Motion: Successes and Challenges

    NASA Astrophysics Data System (ADS)

    Stanton, M. A.; Cullen, J.; McLaren, M. K.

    2008-12-01

    Pacific Gas and Electric Company (PG&E) is in year 3 of a 5-year project to upgrade the Central Coast Seismic Network (CCSN) from analog to digital. Located along the south-central California coast, the CCSN began operation in 1987 with 20 analog stations: 15 vertical-component and 5 dual-gain 3-component S-13 sensors. The analog signals travel over FM radio telemetry links and voice channels via PG&E's microwave network to our facility in San Francisco (SF), where the A/D conversion is performed on a computer running Earthworm v7.1, which also transmits the data to the USGS in Menlo Park. At the conversion point, the dynamic ranges of the vertical and dual-gain sensors are 40-50 dB and 60-70 dB, respectively. Dynamic range exceedance (data clipping) generally occurs for a M2.5 or greater event within about 40 km of a station. The motivations to upgrade the seismic network were the need for higher dynamic range and the need to retire obsolete analog transmission equipment. The upgraded digital stations consist of the existing velocity sensors, a 131A-02/3 accelerometer, and a Reftek 130-01 Broadband Seismic Recorder for digital data recording and transmission to SF. Vertical-only stations have one component of velocity and 3 components of acceleration; dual-gain sites have 3 components of velocity and 3 of acceleration. To date we have successfully upgraded 6 sites; 3 more will be installed by the end of 2008. Some of the advantages of going digital are 1) data are recorded both at each site and in SF, 2) substantially increased dynamic range of the velocity sensors, to 120 dB, as observed by on-scale, close-by recordings from a M3.9 San Simeon aftershock on 04/29/2008, 3) accelerometers for on-scale recording of large earthquakes, and 4) the ability to contribute our strong motion data to USGS ShakeMaps. A significant challenge has been consistent radio communications. To resolve this issue we are installing point-to-multipoint Motorola Canopy spread spectrum radios at the stations and communication towers.
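
    For readers unfamiliar with the dB figures quoted above: dynamic range expresses the ratio between the largest and smallest resolvable signal amplitudes as 20*log10(ratio), so 120 dB is an amplitude ratio of one million. A quick illustration (the bit-depth correspondence is a general fact about ideal digitizers, not a claim about PG&E's hardware):

```python
import math

# Dynamic range of an ideal n-bit digitizer: 20*log10(2**n) dB.
for bits in (8, 12, 16, 24):
    print(f"{bits:2d} bits -> {20 * math.log10(2**bits):5.1f} dB")

# A 120 dB range corresponds to an amplitude ratio of one million.
print(10 ** (120 / 20))  # 1000000.0
```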

  2. A simple landslide susceptibility analysis for hazard and risk assessment in developing countries

    NASA Astrophysics Data System (ADS)

    Guinau, M.; Vilaplana, J. M.

    2003-04-01

    In recent years, a number of techniques and methodologies have been developed for mitigating natural disasters. The complexity of these methodologies and the scarcity of material and data series justify the need for simple methodologies to obtain the information necessary for minimising the effects of catastrophic natural phenomena. Working with polygonal maps in a GIS allowed us to develop a simple methodology, applied in an area of 473 km² in the Departamento de Chinandega (NW Nicaragua). This area was severely affected by a large number of landslides (mainly debris flows) triggered by the Hurricane Mitch rainfalls in October 1998. With the aid of aerial photography interpretation at 1:40,000 scale, amplified to 1:20,000, and detailed field work, a landslide map at 1:10,000 scale was constructed. The failure zones of landslides were digitized in order to obtain a failure zone digital map. A terrain unit digital map, in which a series of physical-environmental terrain factors are represented, was also used. Dividing the studied area into two zones (A and B) with homogeneous physical and environmental characteristics allowed us to develop the proposed methodology and to validate it. In zone A, the failure zone digital map is superimposed onto the terrain unit digital map to establish the relationship between the different terrain factors and the failure zones. The numerical expression of this relationship enables us to classify the terrain by its landslide susceptibility. In zone B, this numerical relationship was employed to obtain a landslide susceptibility map, obviating the need for a failure zone map. The validity of the methodology can be tested in this area by using the degree of superposition of the susceptibility map and the failure zone map. The implementation of the methodology in tropical countries with physical and environmental characteristics similar to those of the study area allows a landslide susceptibility analysis in areas where landslide records do not exist. This analysis is essential to landslide hazard and risk assessment, which is necessary to determine actions for mitigating landslide effects, e.g. land planning, emergency aid actions, etc.
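
    One way to read the zone-A step is as a landslide density computed per terrain unit, which then serves as the susceptibility weight transferred to zone B. A minimal sketch under that interpretation, with hypothetical raster arrays; the paper's actual numerical relationship may differ.

```python
import numpy as np

def unit_susceptibility(terrain_units: np.ndarray,
                        failure_mask: np.ndarray) -> dict[int, float]:
    """Map each terrain-unit id to the fraction of its cells in failure zones."""
    return {int(u): float(failure_mask[terrain_units == u].mean())
            for u in np.unique(terrain_units)}

def susceptibility_map(terrain_units: np.ndarray,
                       weights: dict[int, float]) -> np.ndarray:
    """Zone B: look up each cell's unit weight to build the susceptibility map."""
    return np.vectorize(weights.get, otypes=[float])(terrain_units)
```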

  3. Digitally enhanced GLORIA images for petroleum exploration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prindle, R.O.; Lanz, K

    1990-05-01

    This poster presentation graphically depicts the geological and structural information that can be derived from digitally enhanced Geological Long Range Inclined Asdic (GLORIA) sonar images. This presentation illustrates the advantages of scale enlargement as an interpreter's tool in an offshore area within the Eel River Basin, Northern California. Sonographs were produced from digital tapes originally collected for the exclusive economic zone (EEZ)-SCAN 1984 survey, which was published in the Atlas of the Western Conterminous US at a scale of 1:500,000. This scale is suitable for displaying regional offshore tectonic features but does not have the resolution required for the detailed geological mapping necessary for petroleum exploration. Application of digital enhancement techniques, which utilize contrast stretching and assign false colors to wide-swath sonar imagery (approximately 40 km) with 50-m resolution, enables the acquisition and interpretation of significantly more geological and structural data. This, combined with a scale enlargement to 1:100,000 and high-contrast contact prints (vs. the offset prints of the atlas), increases the resolution and sharpness of bathymetric features so that many more subtle features may be mapped in detail. A tectonic interpretation of these digitally enhanced GLORIA sonographs from the Eel River basin is presented, displaying anticlines, lineaments, ridge axes, pathways of sediment flow, and subtle doming. Many of these features are not present on published bathymetric maps and have not been derived from seismic data because the plan-view spatial resolution of seismic data is much less than that available from the GLORIA imagery.

  4. Review of integrated digital systems: evolution and adoption

    NASA Astrophysics Data System (ADS)

    Fritz, Lawrence W.

    The factors influencing the evolution of photogrammetric and remote sensing technology toward fully integrated digital systems are reviewed. These factors include societal pressures for new, more timely digital products from the Spatial Information Sciences and the adoption of rapid technological advancements in digital processing hardware and software. Current major developments in leading government mapping agencies of the USA, such as the Digital Production System (DPS) modernization programme at the Defense Mapping Agency, and the Automated Nautical Charting System II (ANCS-II) programme and Integrated Digital Photogrammetric Facility (IDPF) at NOAA/National Ocean Service, illustrate the significant benefits to be realized. These programmes are examples of different levels of integrated systems designed to produce digital products. They provide insights into the management complexities to be considered for very large integrated digital systems. In recognition of computer industry trends, a knowledge-based architecture for managing the complexity of the very large spatial information systems of the future is proposed.

  5. Is 9 louder than 1? Audiovisual cross-modal interactions between number magnitude and judged sound loudness.

    PubMed

    Alards-Tomalin, Doug; Walker, Alexander C; Shaw, Joshua D M; Leboe-McGowan, Launa C

    2015-09-01

    The cross-modal impact of number magnitude (i.e., Arabic digits) on perceived sound loudness was examined. Participants compared a target sound's intensity level against a previously heard reference sound (which they judged as quieter or louder). Paired with each target sound was a task-irrelevant Arabic digit that varied in magnitude, being either small (1, 2, 3) or large (7, 8, 9). The degree to which the sound and the digit were synchronized was manipulated, with the digit and sound occurring simultaneously in Experiment 1 and the digit preceding the sound in Experiment 2. First, when target sounds and digits occurred simultaneously, sounds paired with large digits were categorized as loud more frequently than sounds paired with small digits. Second, when the events were separated, number magnitude ceased to bias sound intensity judgments. In Experiment 3, the events were still separated, but participants held the number in short-term memory; in this instance the bias returned. Copyright © 2015 Elsevier B.V. All rights reserved.

  6. Perceptions of Prospective Teachers on Digital Literacy

    ERIC Educational Resources Information Center

    Çam, Emre; Kiyici, Mübin

    2017-01-01

    The aim of the quantitative study is to identify the digital literacy levels of prospective teachers in terms of several variables. The sample consisted of 354 prospective teachers studying in different departments of Sakarya University College of Education. The 30-item instrument used to gather the data was the "Digital Literacy Scale"…

  7. Full-scale system impact analysis: Digital document storage project

    NASA Technical Reports Server (NTRS)

    1989-01-01

    The Digital Document Storage (DDS) Full Scale System can provide cost-effective electronic document storage, retrieval, hard copy reproduction, and remote access for users of NASA Technical Reports. The desired functionality of the DDS system is highly dependent on the assumed requirements for remote access used in this impact analysis. It is highly recommended that NASA proceed with a phased communications requirements analysis to ensure that adequate communications service can be supplied at a reasonable cost, in order to validate recent working assumptions upon which the success of the DDS Full Scale System depends.

  8. Past and present cosmic structure in the SDSS DR7 main sample

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jasche, J.; Leclercq, F.; Wandelt, B.D., E-mail: jasche@iap.fr, E-mail: florent.leclercq@polytechnique.org, E-mail: wandelt@iap.fr

    2015-01-01

    We present a chrono-cosmography project, aiming at the inference of the four dimensional formation history of the observed large scale structure from its origin to the present epoch. To do so, we perform a full-scale Bayesian analysis of the northern galactic cap of the Sloan Digital Sky Survey (SDSS) Data Release 7 main galaxy sample, relying on a fully probabilistic, physical model of the non-linearly evolved density field. Besides inferring initial conditions from observations, our methodology naturally and accurately reconstructs non-linear features at the present epoch, such as walls and filaments, corresponding to high-order correlation functions generated by late-time structure formation. Our inference framework self-consistently accounts for typical observational systematic and statistical uncertainties such as noise, survey geometry and selection effects. We further account for luminosity dependent galaxy biases and automatic noise calibration within a fully Bayesian approach. As a result, this analysis provides highly-detailed and accurate reconstructions of the present density field on scales larger than ∼ 3 Mpc/h, constrained by SDSS observations. This approach also leads to the first quantitative inference of plausible formation histories of the dynamic large scale structure underlying the observed galaxy distribution. The results described in this work constitute the first full Bayesian non-linear analysis of the cosmic large scale structure with the demonstrated capability of uncertainty quantification. Some of these results will be made publicly available along with this work. The level of detail of inferred results and the high degree of control on observational uncertainties pave the path towards high precision chrono-cosmography, the subject of simultaneously studying the dynamics and the morphology of the inhomogeneous Universe.

  9. Zooming into local active galactic nuclei: the power of combining SDSS-IV MaNGA with higher resolution integral field unit observations

    NASA Astrophysics Data System (ADS)

    Wylezalek, Dominika; Schnorr Müller, Allan; Zakamska, Nadia L.; Storchi-Bergmann, Thaisa; Greene, Jenny E.; Müller-Sánchez, Francisco; Kelly, Michael; Liu, Guilin; Law, David R.; Barrera-Ballesteros, Jorge K.; Riffel, Rogemar A.; Thomas, Daniel

    2017-05-01

    Ionized gas outflows driven by active galactic nuclei (AGN) are ubiquitous in high-luminosity AGN with outflow speeds apparently correlated with the total bolometric luminosity of the AGN. This empirical relation and theoretical work suggest that in the range Lbol ~ 10^43-45 erg s^-1 there must exist a threshold luminosity above which the AGN becomes powerful enough to launch winds that will be able to escape the galaxy potential. In this paper, we present pilot observations of two AGN in this transitional range that were taken with the Gemini North Multi-Object Spectrograph integral field unit (IFU). Both sources have also previously been observed within the Sloan Digital Sky Survey-IV (SDSS) Mapping Nearby Galaxies at Apache Point Observatory (MaNGA) survey. While the MaNGA IFU maps probe the gas fields on galaxy-wide scales and show that some regions are dominated by AGN ionization, the new Gemini IFU data zoom into the centre with four times better spatial resolution. In the object with the lower Lbol we find evidence of a young or stalled biconical AGN-driven outflow where none was obvious at the MaNGA resolution. In the object with the higher Lbol we trace the large-scale biconical outflow into the nuclear region and connect the outflow from small to large scales. These observations suggest that AGN luminosity and galaxy potential are crucial in shaping wind launching and propagation in low-luminosity AGN. The transition from small and young outflows to galaxy-wide feedback can only be understood by combining large-scale IFU data that trace the galaxy velocity field with higher resolution, small-scale IFU maps.

  10. A miniature high-efficiency fully digital adaptive voltage scaling buck converter

    NASA Astrophysics Data System (ADS)

    Li, Hangbiao; Zhang, Bo; Luo, Ping; Zhen, Shaowei; Liao, Pengfei; He, Yajuan; Li, Zhaoji

    2015-09-01

    A miniature high-efficiency fully digital adaptive voltage scaling (AVS) buck converter is proposed in this paper. Pulse skip modulation with flexible duty cycle (FD-PSM) is used in the AVS controller, which simplifies the circuit architecture (<170 gates) and greatly reduces die area and power consumption. The converter is implemented in a 0.13-μm one-poly-eight-metal (1P8M) complementary metal oxide semiconductor process, and the active on-chip area of the controller is only 0.003 mm². The measurement results show that as the operating frequency of the digital load scales dynamically from 25.6 MHz to 112.6 MHz, the supply voltage is scaled adaptively from 0.84 V to 1.95 V. The controller dissipates only 17.2 μW while the supply voltage of the load is 1 V and the operating frequency is 40 MHz.

  11. Remote Patient Monitoring via Non-Invasive Digital Technologies: A Systematic Review

    PubMed Central

    Tran, Melody; Angelaccio, Michele; Arcona, Steve

    2017-01-01

    Abstract Background: We conducted a systematic literature review to identify key trends associated with remote patient monitoring (RPM) via noninvasive digital technologies over the last decade. Materials and Methods: A search was conducted in EMBASE and Ovid MEDLINE. Citations were screened for relevance against predefined selection criteria based on the PICOTS (Population, Intervention, Comparator, Outcomes, Timeframe, and Study Design) format. We included studies published between January 1, 2005 and September 15, 2015 that used RPM via noninvasive digital technology (smartphones/personal digital assistants [PDAs], wearables, biosensors, computerized systems, or multiple components of the formerly mentioned) in evaluating health outcomes compared to standard of care or another technology. Studies were quality appraised according to Critical Appraisal Skills Programme. Results: Of 347 articles identified, 62 met the selection criteria. Most studies were randomized control trials with older adult populations, small sample sizes, and limited follow-up. There was a trend toward multicomponent interventions (n = 26), followed by smartphones/PDAs (n = 12), wearables (n = 11), biosensor devices (n = 7), and computerized systems (n = 6). Another key trend was the monitoring of chronic conditions, including respiratory (23%), weight management (17%), metabolic (18%), and cardiovascular diseases (16%). Although substantial diversity in health-related outcomes was noted, studies predominantly reported positive findings. Conclusions: This review will help decision makers develop a better understanding of the current landscape of peer-reviewed literature, demonstrating the utility of noninvasive RPM in various patient populations. Future research is needed to determine the effectiveness of RPM via noninvasive digital technologies in delivering patient healthcare benefits and the feasibility of large-scale implementation. PMID:27116181

  12. Improving the Automatic Inversion of Digital Alouette/ISIS Ionogram Reflection Traces into Topside Electron Density Profiles

    NASA Technical Reports Server (NTRS)

    Benson, Robert F.; Truhlik, Vladimir; Huang, Xueqin; Wang, Yongli; Bilitza, Dieter

    2012-01-01

    The topside sounders of the International Satellites for Ionospheric Studies (ISIS) program were designed as analog systems. The resulting ionograms were displayed on 35 mm film for analysis by visual inspection. Each of these satellites, launched between 1962 and 1971, produced data for 10 to 20 years. A number of the original telemetry tapes from this large data set have been converted directly into digital records. Software, known as the Topside Ionogram Scaler With True-Height (TOPIST) algorithm, has been produced and used for the automatic inversion of the ionogram reflection traces on more than 100,000 ISIS-2 digital topside ionograms into topside vertical electron density profiles Ne(h). Here we present some topside ionospheric solar cycle variations deduced from the TOPIST database to illustrate the scientific benefit of improving and expanding the topside ionospheric Ne(h) database. The profile improvements will be based on improvements in the TOPIST software motivated by direct comparisons between TOPIST profiles and profiles produced by manual scaling in the early days of the ISIS program. The database expansion will be based on new software designed to overcome limitations in the original digital topside ionogram database caused by difficulties, encountered during the analog-to-digital conversion process, in detecting the ionogram frame sync pulse and/or the frequency markers. This improved and expanded TOPIST topside Ne(h) database will greatly enhance investigations into both short-term ionospheric changes, e.g., the observed topside ionospheric responses to magnetic storms induced by interplanetary magnetic clouds, and long-term changes such as solar cycle variations.

  13. Storage and distribution of pathology digital images using integrated web-based viewing systems.

    PubMed

    Marchevsky, Alberto M; Dulbandzhyan, Ronda; Seely, Kevin; Carey, Steve; Duncan, Raymond G

    2002-05-01

    Health care providers have expressed increasing interest in incorporating digital images of gross pathology specimens and photomicrographs into routine pathology reports. This article describes the multiple technical and logistical challenges involved in integrating the various components needed to develop a system for integrated Web-based viewing, storage, and distribution of digital images in a large health system. An Oracle version 8.1.6 database was developed to store, index, and deploy pathology digital photographs via our Intranet; the database allows for retrieval of images by patient demographics or by SNOMED code information. The setting is the Intranet of a large health system, accessible from multiple computers located within the medical center and at distant private physician offices. The images can be viewed from any workstation of the health system that has authorized access to our Intranet, using a standard browser or a browser configured with an external viewer or inexpensive plug-in software, such as Prizm 2.0. The images can be printed on paper or transferred to film using a digital film recorder. Digital images can also be displayed at pathology conferences by using wireless local area network (LAN) and secure remote technologies. The standardization of technologies and the adoption of a Web interface for all our computer systems allow us to distribute digital images from a pathology database to a potentially large group of users distributed in multiple locations throughout a large medical center.

  14. Calibration sets and the accuracy of vibrational scaling factors: A case study with the X3LYP hybrid functional

    NASA Astrophysics Data System (ADS)

    Teixeira, Filipe; Melo, André; Cordeiro, M. Natália D. S.

    2010-09-01

    A linear least-squares methodology was used to determine the vibrational scaling factors for the X3LYP density functional. Uncertainties for these scaling factors were calculated according to the method devised by Irikura et al. [J. Phys. Chem. A 109, 8430 (2005)]. The calibration set was systematically partitioned according to several of its descriptors and the scaling factors for X3LYP were recalculated for each subset. The results show that the scaling factors are only significant up to the second digit, irrespective of the calibration set used. Furthermore, multivariate statistical analysis allowed us to conclude that the scaling factors and the associated uncertainties are independent of the size of the calibration set and strongly suggest the practical impossibility of obtaining vibrational scaling factors with more than two significant digits.

  15. Calibration sets and the accuracy of vibrational scaling factors: a case study with the X3LYP hybrid functional.

    PubMed

    Teixeira, Filipe; Melo, André; Cordeiro, M Natália D S

    2010-09-21

    A linear least-squares methodology was used to determine the vibrational scaling factors for the X3LYP density functional. Uncertainties for these scaling factors were calculated according to the method devised by Irikura et al. [J. Phys. Chem. A 109, 8430 (2005)]. The calibration set was systematically partitioned according to several of its descriptors and the scaling factors for X3LYP were recalculated for each subset. The results show that the scaling factors are only significant up to the second digit, irrespective of the calibration set used. Furthermore, multivariate statistical analysis allowed us to conclude that the scaling factors and the associated uncertainties are independent of the size of the calibration set and strongly suggest the practical impossibility of obtaining vibrational scaling factors with more than two significant digits.
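
    The closed-form least-squares fit behind such scaling factors minimizes sum((lam*omega - nu)^2) over the calibration set, giving lam = sum(omega*nu) / sum(omega^2). A minimal sketch of that fit; the uncertainty expression below is a simplified stand-in for the Irikura et al. treatment, not a reproduction of it.

```python
import numpy as np

def scaling_factor(omega: np.ndarray, nu: np.ndarray) -> tuple[float, float]:
    """Least-squares factor lam minimizing sum((lam*omega - nu)**2).

    omega: computed harmonic frequencies; nu: experimental fundamentals.
    """
    lam = float(np.sum(omega * nu) / np.sum(omega**2))
    # Simplified uncertainty proxy (an assumption; see lead-in above).
    u = float(np.sqrt(np.sum((nu - lam * omega) ** 2) / np.sum(omega**2)))
    return lam, u
```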

  16. Virtual Exploration of Earth's Evolution

    NASA Astrophysics Data System (ADS)

    Anbar, A. D.; Bruce, G.; Semken, S. C.; Summons, R. E.; Buxner, S.; Horodyskyj, L.; Kotrc, B.; Swann, J.; Klug Boonstra, S. L.; Oliver, C.

    2014-12-01

    Traditional introductory STEM courses often reinforce misconceptions because the large scale of many classes forces a structured, lecture-centric model of teaching that emphasizes delivery of facts rather than exploration, inquiry, and scientific reasoning. This problem is especially acute in teaching about the co-evolution of Earth and life, where classroom learning and textbook teaching are far removed from the immersive and affective aspects of field-based science, and where the challenges of taking large numbers of students into the field make it difficult to expose them to the complex context of the geologic record. We are exploring the potential of digital technologies and online delivery to address this challenge, using immersive and engaging virtual environments that are more like games than lectures, grounded in active learning, and deliverable at scale via the internet. The goal is to invert the traditional lecture-centric paradigm by placing lectures at the periphery and inquiry-driven, integrative virtual investigations at the center, and to do so at scale. To this end, we are applying a technology platform we devised, supported by NASA and the NSF, that integrates a variety of digital media in a format that we call an immersive virtual field trip (iVFT). In iVFTs, students engage directly with virtual representations of real field sites, interacting non-linearly at a variety of scales via game-like exploration while guided by an adaptive tutoring system. This platform has already been used to develop pilot iVFTs useful in teaching anthropology, archeology, ecology, and geoscience. With support from the Howard Hughes Medical Institute, we are now developing and evaluating a coherent suite of ~12 iVFTs that span the sweep of life's history on Earth, from the 3.8 Ga metasediments of West Greenland to ancient hominid sites in East Africa. These iVFTs will teach fundamental principles of geology and practices of scientific inquiry, and expose students to the evidence from which evolutionary and paleoenvironmental inferences are derived. In addition to making these iVFTs available to the geoscience community for EPO, we will evaluate the comparative effectiveness of iVFTs and traditional lecture and lab approaches in achieving geoscience learning objectives.

  17. Mosaic of Digital Raster Soviet Topographic Maps of Afghanistan

    USGS Publications Warehouse

    Chirico, Peter G.; Warner, Michael B.

    2005-01-01

    EXPLANATION The data contained in this publication include scanned, geographically referenced digital raster graphics (DRGs) of Soviet 1:200,000-scale topographic map quadrangles. The original Afghanistan topographic map series at 1:200,000 scale, covering the entire country, was published by the Soviet military between 1985 and 1991 (MTDGS, 85-91). Hard copies of these original paper maps were scanned using a large-format scanner, reprojected into Geographic Coordinate System (GCS) coordinates, and then clipped to remove the map collars to create a seamless topographic map base for the entire country. An index of all available topographic map sheets is displayed here: Index_Geo_DD.pdf. This publication also includes the original topographic map quadrangles projected in the Universal Transverse Mercator (UTM) projection. The country of Afghanistan spans three UTM zones: Zone 41, Zone 42, and Zone 43. Maps are stored as GeoTIFFs in their respective UTM zone projection. Indexes of all available topographic map sheets in their respective UTM zones are displayed here: Index_UTM_Z41.pdf, Index_UTM_Z42.pdf, Index_UTM_Z43.pdf. An Adobe Acrobat PDF file of the U.S. Department of the Army's Technical Manual 30-548 is available (U.S. Army, 1958). This document has been translated into English for assistance in reading Soviet topographic map symbols.

  18. Using Neural Networks to Classify Digitized Images of Galaxies

    NASA Astrophysics Data System (ADS)

    Goderya, S. N.; McGuire, P. C.

    2000-12-01

    Automated classification of galaxies into Hubble types is of paramount importance for studying the large scale structure of the Universe, particularly as survey projects like the Sloan Digital Sky Survey complete their data acquisition of one million galaxies. At present it is not possible to find robust and efficient artificial-intelligence-based galaxy classifiers. In this study we summarize progress made in the development of automated galaxy classifiers using neural networks as machine learning tools. We explore the Bayesian linear algorithm, the higher order probabilistic network, the multilayer perceptron neural network, and the Support Vector Machine classifier. The performance of any machine classifier is dependent on the quality of the parameters that characterize the different groups of galaxies. Our effort is to develop geometric and invariant-moment-based parameters as input to the machine classifiers instead of the raw pixel data. Such an approach reduces the dimensionality of the classifier considerably, removes the effects of scaling and rotation, and makes it easier to solve for the unknown parameters in the galaxy classifier. To judge the quality of training and classification we develop the concept of Mathews coefficients for the galaxy classification community. Mathews coefficients are single numbers that quantify classifier performance even with unequal prior probabilities of the classes.
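
    The "invariant moment" features mentioned here are commonly realized as Hu's seven moment invariants, which are unchanged under translation, scaling, and rotation of the image. A generic scikit-image sketch of that standard construction; it is not necessarily the exact parameter set the authors used.

```python
import numpy as np
from skimage.measure import moments_central, moments_hu, moments_normalized

def hu_features(image: np.ndarray) -> np.ndarray:
    """Seven translation/scale/rotation-invariant moments, log-compressed."""
    mu = moments_central(image)      # central moments: translation-invariant
    nu = moments_normalized(mu)      # normalized moments: scale-invariant
    hu = moments_hu(nu)              # Hu combinations: rotation-invariant
    return -np.sign(hu) * np.log10(np.abs(hu) + 1e-30)  # tame the dynamic range
```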

  19. A 16 channel discriminator VME board with enhanced triggering capabilities

    NASA Astrophysics Data System (ADS)

    Borsato, E.; Garfagnini, A.; Menon, G.

    2012-08-01

    Electronics and data acquisition systems used in small and large scale laboratories often have to handle analog signals of varying polarity, amplitude, and duration, which have to be digitized to be used as trigger signals to validate the acquired data. In the specific case of experiments dealing with ionizing radiation, ancillary particle detectors (for instance, plastic scintillators or Resistive Plate Chambers) are used to trigger on and select the impinging particles for the experiment. A novel approach using commercial LVDS line receivers as discriminator devices is presented. Such devices, with proper calibration, can handle positive and negative analog signals over a wide dynamic range (from 20 mV to 800 mV signal amplitude). The clear advantages with respect to conventional discriminator devices are reduced cost, the high reliability of a mature technology, and the possibility of a high integration scale. Moreover, commercial discriminator boards with positive input signals and a wide threshold swing are not available on the market. The present paper describes the design and characterization of a VME board capable of handling 16 differential or single-ended input channels. The output digital signals, available independently for each input, can be combined in the board into three independent trigger logic units which provide additional outputs for the end user.

  20. Reliability and agreement in the use of four- and six-point ordinal scales for the assessment of erythema in digital images of canine skin.

    PubMed

    Hill, Peter B

    2015-06-01

    Grading of erythema in clinical practice is a subjective assessment that cannot be confirmed using a definitive test; nevertheless, erythema scores are typically measured in clinical trials assessing the response to treatment interventions. Most commonly, ordinal scales are used for this purpose, but the optimal number of categories in such scales has not been determined. This study aimed to compare the reliability and agreement of a four-point and a six-point ordinal scale for the assessment of erythema in digital images of canine skin. Fifteen digital images showing varying degrees of erythema were assessed by specialist dermatologists and laypeople, using either the four-point or the six-point scale. Reliability between the raters was assessed using intraclass correlation coefficients and Cronbach's α. Agreement was assessed using the variation ratio (the percentage of respondents who chose the mode, the most common answer). Intraobserver variability was assessed by comparing the results of two grading sessions, at least 6 weeks apart. Both scales demonstrated high reliability, with intraclass correlation coefficient values and Cronbach's α above 0.99. However, the four-point scale demonstrated significantly superior agreement, with variation ratios for the four-point scale averaging 74.8%, compared with 56.2% for the six-point scale. Intraobserver consistency for the four-point scale was very high. Although both scales demonstrated high reliability, the four-point scale was superior in terms of agreement. For the assessment of erythema in clinical trials, a four-point ordinal scale is recommended. © 2014 ESVD and ACVD.
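
    The agreement statistic used in this study, the variation ratio as defined in the abstract (the percentage of raters who chose the modal grade for an image), is simple enough to state in a few lines. A sketch with made-up grades:

```python
from collections import Counter

def variation_ratio(grades: list[int]) -> float:
    """Percentage of raters who selected the modal (most common) grade."""
    mode_count = Counter(grades).most_common(1)[0][1]
    return 100.0 * mode_count / len(grades)

# Ten hypothetical raters scoring one image on a four-point scale:
print(variation_ratio([2, 2, 2, 3, 2, 1, 2, 2, 3, 2]))  # 70.0
```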
