Science.gov

Sample records for additional big images

  1. Cincinnati Big Area Additive Manufacturing (BAAM)

    SciTech Connect

    Duty, Chad E.; Love, Lonnie J.

    2015-03-04

    Oak Ridge National Laboratory (ORNL) worked with Cincinnati Incorporated (CI) to demonstrate Big Area Additive Manufacturing, which increases the speed of the additive manufacturing (AM) process by over 1000X, increases the size of parts by over 10X, and reduces cost by over 100X. ORNL worked with CI to transition the Big Area Additive Manufacturing (BAAM) technology from a proof-of-principle (TRL 2-3) demonstration to a prototype product stage (TRL 7-8).

  2. Big data in oncologic imaging.

    PubMed

    Regge, Daniele; Mazzetti, Simone; Giannini, Valentina; Bracco, Christian; Stasi, Michele

    2016-09-13

    Cancer is a complex disease, and unfortunately understanding how the components of the cancer system work does not explain the behavior of the system as a whole. In the words of the Greek philosopher Aristotle, "the whole is greater than the sum of its parts." Thanks to improved information technology infrastructures, it is now possible to store data from each individual cancer patient, including clinical data, medical images, laboratory tests, and pathological and genomic information. Indeed, medical archive storage constitutes approximately one-third of total global storage demand, and a large part of these data are medical images. The opportunity now is to draw insight from the whole to the benefit of each individual patient. In oncology, big data analysis is still in its infancy, but several useful applications can be envisaged, including the development of imaging biomarkers to predict disease outcome, assessment of the risk of X-ray dose exposure or of renal damage following the administration of contrast agents, and tracking and optimization of patient workflow. The aim of this review is to present current evidence of how big data derived from medical images may impact the diagnostic pathway of the oncologic patient.

  3. AirMSPI PODEX BigSur Terrain Images

    Atmospheric Science Data Center

    2013-12-13

    Browse images from the PODEX 2013 campaign: Big Sur target (Big Sur, California), 02/03/2013, terrain-projected. For more information, see the Data Product Specifications (DPS).

  4. BigView Image Viewing on Tiled Displays

    NASA Technical Reports Server (NTRS)

    Sandstrom, Timothy

    2007-01-01

    BigView allows for interactive panning and zooming of images of arbitrary size on desktop PCs running Linux. Additionally, it can work in a multi-screen environment where multiple PCs cooperate to view a single, large image. Using this software, one can explore on relatively modest machines images such as the Mars Orbiter Camera mosaic (92,160 x 33,280 pixels). The images must first be converted into paged format, where the image is stored in 256 x 256 pixel pages to allow rapid movement of pixels into texture memory. The format contains an image pyramid: a set of scaled versions of the original image. Each scaled image is 1/2 the size of the previous, starting with the original down to the smallest, which fits into a single 256 x 256 page.
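
    The paged, pyramidal layout described above is easy to illustrate. The sketch below is an assumption for illustration only (the helper name pyramid_levels is hypothetical, not BigView's actual code): it enumerates the pyramid levels and page counts for an image stored in 256 x 256 pixel pages, halving the resolution until the whole image fits in one page.

```python
import math

def pyramid_levels(width, height, page=256):
    """Enumerate pyramid levels for a paged image, halving each level
    until the whole image fits in a single page (as described above)."""
    levels = []
    w, h = width, height
    while True:
        pages_x = math.ceil(w / page)
        pages_y = math.ceil(h / page)
        levels.append((w, h, pages_x * pages_y))
        if pages_x == 1 and pages_y == 1:
            break
        w, h = max(1, w // 2), max(1, h // 2)
    return levels

# Mars Orbiter Camera mosaic cited in the abstract (92,160 x 33,280 pixels)
for w, h, n_pages in pyramid_levels(92160, 33280):
    print(f"{w:>6} x {h:>6} pixels -> {n_pages} pages")
```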

  5. AirMSPI PODEX Big Sur Ellipsoid Images

    Atmospheric Science Data Center

    2013-12-11

    Browse images from the PODEX 2013 campaign: Big Sur target, 02/03/2013, ellipsoid-projected. For more information, see the Data Product Specifications (DPS).

  6. Population-based imaging biobanks as source of big data.

    PubMed

    Gatidis, Sergios; Heber, Sophia D; Storz, Corinna; Bamberg, Fabian

    2016-09-09

    Advances in the computational sciences over the last decades have enabled the introduction of novel methodological approaches in biomedical research. Acquiring extensive and comprehensive data about a research subject and subsequently extracting significant information has opened new possibilities for gaining insight into biological and medical processes. This so-called big data approach has recently entered medical imaging, and numerous epidemiological studies have been implementing advanced imaging to identify imaging biomarkers that provide information about physiological processes, including normal development and aging as well as the development of pathological disease states. The purpose of this article is to present existing epidemiological imaging studies and to discuss opportunities, methodological and organizational aspects, and challenges that population imaging poses to the field of big data research.

  7. Small Art Images--Big Art Learning

    ERIC Educational Resources Information Center

    Stephens, Pam

    2005-01-01

    When small art images are incorporated into the curriculum, students are afforded opportunities to slow down, observe minute details, and communicate ideas about art and artists. This sort of purposeful art contemplation takes students beyond the day-to-day educational practice. It is through these sorts of art activities that students develop…

  8. Big area additive manufacturing of high performance bonded NdFeB magnets

    SciTech Connect

    Li, Ling; Tirado, Angelica; Nlebedim, I. C.; Rios, Orlando; Post, Brian; Kunc, Vlastimil; Lowden, R. R.; Lara-Curzio, Edgar; Fredette, Robert; Ormerod, John; Lograsso, Thomas A.; Paranthaman, M. Parans

    2016-10-31

    Additive manufacturing allows for the production of complex parts with minimum material waste, offering an effective technique for fabricating permanent magnets which frequently involve critical rare earth elements. In this report, we demonstrate a novel method - Big Area Additive Manufacturing (BAAM) - to fabricate isotropic near-net-shape NdFeB bonded magnets with magnetic and mechanical properties comparable or better than those of traditional injection molded magnets. The starting polymer magnet composite pellets consist of 65 vol% isotropic NdFeB powder and 35 vol% polyamide (Nylon-12). The density of the final BAAM magnet product reached 4.8 g/cm3, and the room temperature magnetic properties are: intrinsic coercivity Hci = 688.4 kA/m, remanence Br = 0.51 T, and energy product (BH)max = 43.49 kJ/m3 (5.47 MGOe). In addition, tensile tests performed on four dog-bone shaped specimens yielded an average ultimate tensile strength of 6.60 MPa and an average failure strain of 4.18%. Scanning electron microscopy images of the fracture surfaces indicate that the failure is primarily related to the debonding of the magnetic particles from the polymer binder. As a result, the present method significantly simplifies manufacturing of near-net-shape bonded magnets, enables efficient use of rare earth elements thus contributing towards enriching the supply of critical materials.

  9. Big area additive manufacturing of high performance bonded NdFeB magnets

    DOE PAGES

    Li, Ling; Tirado, Angelica; Nlebedim, I. C.; ...

    2016-10-31

    Additive manufacturing allows for the production of complex parts with minimum material waste, offering an effective technique for fabricating permanent magnets which frequently involve critical rare earth elements. In this report, we demonstrate a novel method - Big Area Additive Manufacturing (BAAM) - to fabricate isotropic near-net-shape NdFeB bonded magnets with magnetic and mechanical properties comparable or better than those of traditional injection molded magnets. The starting polymer magnet composite pellets consist of 65 vol% isotropic NdFeB powder and 35 vol% polyamide (Nylon-12). The density of the final BAAM magnet product reached 4.8 g/cm3, and the room temperature magnetic properties are: intrinsic coercivity Hci = 688.4 kA/m, remanence Br = 0.51 T, and energy product (BH)max = 43.49 kJ/m3 (5.47 MGOe). In addition, tensile tests performed on four dog-bone shaped specimens yielded an average ultimate tensile strength of 6.60 MPa and an average failure strain of 4.18%. Scanning electron microscopy images of the fracture surfaces indicate that the failure is primarily related to the debonding of the magnetic particles from the polymer binder. As a result, the present method significantly simplifies manufacturing of near-net-shape bonded magnets, enables efficient use of rare earth elements thus contributing towards enriching the supply of critical materials.

  10. Big Area Additive Manufacturing of High Performance Bonded NdFeB Magnets

    PubMed Central

    Li, Ling; Tirado, Angelica; Nlebedim, I. C.; Rios, Orlando; Post, Brian; Kunc, Vlastimil; Lowden, R. R.; Lara-Curzio, Edgar; Fredette, Robert; Ormerod, John; Lograsso, Thomas A.; Paranthaman, M. Parans

    2016-01-01

    Additive manufacturing allows for the production of complex parts with minimum material waste, offering an effective technique for fabricating permanent magnets which frequently involve critical rare earth elements. In this report, we demonstrate a novel method - Big Area Additive Manufacturing (BAAM) - to fabricate isotropic near-net-shape NdFeB bonded magnets with magnetic and mechanical properties comparable or better than those of traditional injection molded magnets. The starting polymer magnet composite pellets consist of 65 vol% isotropic NdFeB powder and 35 vol% polyamide (Nylon-12). The density of the final BAAM magnet product reached 4.8 g/cm3, and the room temperature magnetic properties are: intrinsic coercivity Hci = 688.4 kA/m, remanence Br = 0.51 T, and energy product (BH)max = 43.49 kJ/m3 (5.47 MGOe). In addition, tensile tests performed on four dog-bone shaped specimens yielded an average ultimate tensile strength of 6.60 MPa and an average failure strain of 4.18%. Scanning electron microscopy images of the fracture surfaces indicate that the failure is primarily related to the debonding of the magnetic particles from the polymer binder. The present method significantly simplifies manufacturing of near-net-shape bonded magnets, enables efficient use of rare earth elements thus contributing towards enriching the supply of critical materials. PMID:27796339

  11. Big Area Additive Manufacturing of High Performance Bonded NdFeB Magnets.

    PubMed

    Li, Ling; Tirado, Angelica; Nlebedim, I C; Rios, Orlando; Post, Brian; Kunc, Vlastimil; Lowden, R R; Lara-Curzio, Edgar; Fredette, Robert; Ormerod, John; Lograsso, Thomas A; Paranthaman, M Parans

    2016-10-31

    Additive manufacturing allows for the production of complex parts with minimum material waste, offering an effective technique for fabricating permanent magnets which frequently involve critical rare earth elements. In this report, we demonstrate a novel method - Big Area Additive Manufacturing (BAAM) - to fabricate isotropic near-net-shape NdFeB bonded magnets with magnetic and mechanical properties comparable or better than those of traditional injection molded magnets. The starting polymer magnet composite pellets consist of 65 vol% isotropic NdFeB powder and 35 vol% polyamide (Nylon-12). The density of the final BAAM magnet product reached 4.8 g/cm(3), and the room temperature magnetic properties are: intrinsic coercivity Hci = 688.4 kA/m, remanence Br = 0.51 T, and energy product (BH)max = 43.49 kJ/m(3) (5.47 MGOe). In addition, tensile tests performed on four dog-bone shaped specimens yielded an average ultimate tensile strength of 6.60 MPa and an average failure strain of 4.18%. Scanning electron microscopy images of the fracture surfaces indicate that the failure is primarily related to the debonding of the magnetic particles from the polymer binder. The present method significantly simplifies manufacturing of near-net-shape bonded magnets, enables efficient use of rare earth elements thus contributing towards enriching the supply of critical materials.

  12. Big Area Additive Manufacturing of High Performance Bonded NdFeB Magnets

    NASA Astrophysics Data System (ADS)

    Li, Ling; Tirado, Angelica; Nlebedim, I. C.; Rios, Orlando; Post, Brian; Kunc, Vlastimil; Lowden, R. R.; Lara-Curzio, Edgar; Fredette, Robert; Ormerod, John; Lograsso, Thomas A.; Paranthaman, M. Parans

    2016-10-01

    Additive manufacturing allows for the production of complex parts with minimum material waste, offering an effective technique for fabricating permanent magnets which frequently involve critical rare earth elements. In this report, we demonstrate a novel method - Big Area Additive Manufacturing (BAAM) - to fabricate isotropic near-net-shape NdFeB bonded magnets with magnetic and mechanical properties comparable or better than those of traditional injection molded magnets. The starting polymer magnet composite pellets consist of 65 vol% isotropic NdFeB powder and 35 vol% polyamide (Nylon-12). The density of the final BAAM magnet product reached 4.8 g/cm3, and the room temperature magnetic properties are: intrinsic coercivity Hci = 688.4 kA/m, remanence Br = 0.51 T, and energy product (BH)max = 43.49 kJ/m3 (5.47 MGOe). In addition, tensile tests performed on four dog-bone shaped specimens yielded an average ultimate tensile strength of 6.60 MPa and an average failure strain of 4.18%. Scanning electron microscopy images of the fracture surfaces indicate that the failure is primarily related to the debonding of the magnetic particles from the polymer binder. The present method significantly simplifies manufacturing of near-net-shape bonded magnets, enables efficient use of rare earth elements thus contributing towards enriching the supply of critical materials.

  13. Development of imaging biomarkers and generation of big data.

    PubMed

    Alberich-Bayarri, Ángel; Hernández-Navarro, Rafael; Ruiz-Martínez, Enrique; García-Castro, Fabio; García-Juan, David; Martí-Bonmatí, Luis

    2017-02-21

    Several image processing algorithms have emerged to cover unmet clinical needs, but their application to the radiological routine with a clear clinical impact is still not straightforward. Moving from local infrastructures to big ones, such as medical imaging biobanks (millions of studies) or even federations of medical imaging biobanks (in some cases totaling hundreds of millions of studies), requires the integration of automated pipelines for fast analysis of pooled data to extract clinically relevant conclusions, not uniquely linked to medical imaging but in combination with other information such as genetic profiling. A general strategy for the development of imaging biomarkers and their integration in the cloud for quantitative management and exploitation in large databases is herein presented. The proposed platform has been successfully launched and is currently being validated among the early-adopter community of radiologists, clinicians, and medical imaging researchers.

  14. Neural Computations for Biosonar Imaging in the Big Brown Bat

    NASA Astrophysics Data System (ADS)

    Saillant, Prestor Augusto

    1995-11-01

    The study of the intimate relationship between space and time has taken many forms, ranging from the Theory of Relativity down to the problem of avoiding traffic jams. However, nowhere has this relationship been more fully developed and exploited than in dolphins and bats, which have the ability to utilize biosonar. This thesis describes research on the behavioral and computational basis of echolocation carried out in order to explore the neural mechanisms which may account for the space-time constructs which are of psychological importance to the big brown bat. The SCAT (Spectrogram Correlation and Transformation) computational model was developed to provide a framework for understanding the computational requirements of FM echolocation as determined from psychophysical experiments (i.e., high resolution imaging) and neurobiological constraints (Saillant et al., 1993). The second part of the thesis consisted in developing a new behavioral paradigm for simultaneously studying acoustic behavior and flight behavior of big brown bats in pursuit of stationary or moving targets. In the third part of the thesis a complete acoustic "artificial bat" was constructed, making use of the SCAT process. The development of the artificial bat allowed us to begin experimentation with real world echoes from various targets, in order to gain a better appreciation for the additional complexities and sources of information encountered by bats in flight. Finally, the continued development of the SCAT model has allowed a deeper understanding of the phenomenon of "time expansion" and of the phenomenon of phase sensitivity in the ultrasonic range. Time expansion, first predicted through the use of the SCAT model, and later found in auditory local evoked potential recordings, opens up a new realm of information processing and representation in the brain which as of yet has not been considered. It seems possible, from the work in the auditory system, that time expansion may provide a novel

  15. High performance poly(etherketoneketone) (PEKK) composite parts fabricated using Big Area Additive Manufacturing (BAAM) processes

    SciTech Connect

    Kunc, Vlastimil; Kishore, Vidya; Chen, Xun; Ajinjeru, Christine; Duty, Chad; Hassen, Ahmed A

    2016-09-01

    ORNL collaborated with Arkema Inc. to investigate poly(etherketoneketone) (PEKK) and its composites as potential feedstock materials for the Big Area Additive Manufacturing (BAAM) system. In this work, thermal and rheological properties were investigated and characterized in order to identify suitable processing conditions and material flow behavior for the BAAM process.

  16. Utility of Big Area Additive Manufacturing (BAAM) For The Rapid Manufacture of Customized Electric Vehicles

    SciTech Connect

    Love, Lonnie J.

    2015-08-01

    This Oak Ridge National Laboratory (ORNL) Manufacturing Development Facility (MDF) technical collaboration project was conducted in two phases as a CRADA with Local Motors Inc. Phase 1 was previously reported as Advanced Manufacturing of Complex Cyber Mechanical Devices through Community Engagement and Micro-manufacturing and demonstrated the integration of components onto a prototype body part for a vehicle. Phase 2 was reported as Utility of Big Area Additive Manufacturing (BAAM) for the Rapid Manufacture of Customized Electric Vehicles and demonstrated the high-profile live printing of an all-electric vehicle using ORNL's Big Area Additive Manufacturing (BAAM) technology. This demonstration generated considerable national attention and successfully demonstrated the capabilities of the BAAM system as developed by ORNL and Cincinnati, Inc. and the feasibility of additive manufacturing of a full-scale electric vehicle as envisioned by the CRADA partner Local Motors, Inc.

  17. Material Development for Tooling Applications Using Big Area Additive Manufacturing (BAAM)

    SciTech Connect

    Duty, Chad E.; Drye, Tom; Franc, Alan

    2015-03-01

    Techmer Engineered Solutions (TES) is working with Oak Ridge National Laboratory (ORNL) to develop materials and evaluate their use for ORNL's recently developed Big Area Additive Manufacturing (BAAM) system for tooling applications. The first phase of the project established the performance of some commercially available polymer compositions deposited with the BAAM system. Carbon fiber reinforced ABS demonstrated a tensile strength of nearly 10 ksi, which is sufficient for a number of low temperature tooling applications.

  18. Body image and personality among British men: associations between the Big Five personality domains, drive for muscularity, and body appreciation.

    PubMed

    Benford, Karis; Swami, Viren

    2014-09-01

    The present study examined associations between the Big Five personality domains and measures of men's body image. A total of 509 men from the community in London, UK, completed measures of drive for muscularity, body appreciation, the Big Five domains, and subjective social status, and provided their demographic details. The results of a hierarchical regression showed that, once the effects of participant body mass index (BMI) and subjective social status had been accounted for, men's drive for muscularity was significantly predicted by Neuroticism (β=.29). In addition, taking into account the effects of BMI and subjective social status, men's body appreciation was significantly predicted by Neuroticism (β=-.35) and Extraversion (β=.12). These findings highlight potential avenues for the development of intervention approaches based on the relationship between the Big Five personality traits and body image.

  19. a Hadoop-Based Distributed Framework for Efficient Managing and Processing Big Remote Sensing Images

    NASA Astrophysics Data System (ADS)

    Wang, C.; Hu, F.; Hu, X.; Zhao, S.; Wen, W.; Yang, C.

    2015-07-01

    Various sensors on airborne and satellite platforms are producing large volumes of remote sensing images for mapping, environmental monitoring, disaster management, military intelligence, and other applications. However, it is challenging to efficiently store, query, and process such big data due to data- and computing-intensive issues. In this paper, a Hadoop-based framework is proposed to manage and process big remote sensing data in a distributed and parallel manner. In particular, remote sensing data can be directly fetched from other data platforms into the Hadoop Distributed File System (HDFS). The Orfeo Toolbox, a ready-to-use tool for large image processing, is integrated into MapReduce to provide a rich set of image processing operations. With the integration of HDFS, the Orfeo Toolbox, and MapReduce, remote sensing images can be processed directly and in parallel in a scalable computing environment. The experimental results show that the proposed framework can efficiently manage and process such big remote sensing data.
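
    A pipeline of this shape (images in HDFS, per-image processing fanned out by MapReduce) is often prototyped with Hadoop Streaming. The mapper below is a minimal, hypothetical sketch only: process_tile is a placeholder standing in for an Orfeo Toolbox operation, and the paper's actual integration may differ.

```python
#!/usr/bin/env python3
# Hypothetical Hadoop Streaming mapper: each input line is an HDFS path to a
# remote sensing tile; the mapper applies an image operation and emits a
# tab-separated key/value record for the reducer stage.
import sys

def process_tile(hdfs_path):
    # Placeholder for an Orfeo Toolbox operation (e.g., orthorectification or
    # segmentation); here it just returns the path length as a dummy statistic.
    return len(hdfs_path)

for line in sys.stdin:
    tile_path = line.strip()
    if not tile_path:
        continue
    result = process_tile(tile_path)
    print(f"{tile_path}\t{result}")
```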

  20. Big Surveys, Big Data Centres

    NASA Astrophysics Data System (ADS)

    Schade, D.

    2016-06-01

    Well-designed astronomical surveys are powerful and have consistently been keystones of scientific progress. The Byurakan Surveys, using a Schmidt telescope with an objective prism, produced a list of about 3000 UV-excess Markarian galaxies, but these objects have stimulated an enormous amount of further study and appear in over 16,000 publications. The CFHT Legacy Surveys used a wide-field imager to cover thousands of square degrees, and those surveys are mentioned in over 1100 publications since 2002. Both ground- and space-based astronomy have been increasing their investments in survey work. Survey instrumentation strives toward fair samples and large sky coverage and therefore strives to produce massive datasets. Thus we are faced with the "big data" problem in astronomy. Survey datasets require specialized approaches to data management, and big data places additional challenging requirements on it. If the term "big data" is defined as data collections that are too large to move, then there are profound implications for the infrastructure that supports big data science. The current model of data centres is obsolete. In the era of big data the central problem is how to create architectures that effectively manage the relationship between data collections, networks, processing capabilities, and software, given the science requirements of the projects that need to be executed. A stand-alone data silo cannot support big data science. I'll describe the current efforts of the Canadian community to deal with this situation and our successes and failures. I'll talk about how we are planning in the next decade to try to create a workable and adaptable solution to support big data science.

  1. The caBIG annotation and image Markup project.

    PubMed

    Channin, David S; Mongkolwat, Pattanasak; Kleper, Vladimir; Sepukar, Kastubh; Rubin, Daniel L

    2010-04-01

    Image annotation and markup are at the core of medical interpretation in both the clinical and the research setting. Digital medical images are managed with the DICOM standard format. While DICOM contains a large amount of metadata about how, where, and of whom the image was acquired, it says little about the content or meaning of the pixel data. An image annotation is the explanatory or descriptive information about the pixel data of an image that is generated by a human or machine observer. An image markup is the set of graphical symbols placed over the image to depict an annotation. While DICOM is the standard for medical image acquisition, manipulation, transmission, storage, and display, there are no standards for image annotation and markup. Many systems expect annotation to be reported verbally, while markups are stored in graphical overlays or proprietary formats. This makes it difficult to extract and compute with both of them. The goal of the Annotation and Image Markup (AIM) project is to develop a mechanism for modeling, capturing, and serializing image annotation and markup data that can be adopted as a standard by the medical imaging community. The AIM project produces both human- and machine-readable artifacts. This paper describes the AIM information model, schemas, software libraries, and tools so as to prepare researchers and developers for their use of AIM.

  2. Big-data x-ray phase contrast imaging simulation challenges

    NASA Astrophysics Data System (ADS)

    Jimenez, Edward S.; Dagel, Amber L.

    2015-08-01

    This position paper describes a potential implementation of a large-scale grating-based X-ray Phase Contrast Imaging (XPCI) system simulation tool along with the associated challenges in its implementation. The work proposes an implementation based on the approach of Peterzol et al., where each grating is treated as an object imaged in the field of view. Two main challenges exist. The first is the required sampling and information management in object space due to the micron-scale periods of each grating propagating over significant distances. The second is maintaining algorithmic numerical stability for imaging systems relevant to industrial applications. We present preliminary results for a numerical stability study using a simplified algorithm that performs Talbot imaging in a big-data context.

  3. Stereoscopic high-speed imaging using additive colors

    PubMed Central

    Sankin, Georgy N.; Piech, David; Zhong, Pei

    2012-01-01

    An experimental system for digital stereoscopic imaging produced by using a high-speed color camera is described. Two bright-field image projections of a three-dimensional object are captured utilizing additive-color backlighting (blue and red). The two images are simultaneously combined on a two-dimensional image sensor using a set of dichromatic mirrors, and stored for off-line separation of each projection. This method has been demonstrated in analyzing cavitation bubble dynamics near boundaries. This technique may be useful for flow visualization and in machine vision applications. PMID:22559533

  4. Workshop Report on Additive Manufacturing for Large-Scale Metal Components - Development and Deployment of Metal Big-Area-Additive-Manufacturing (Large-Scale Metals AM) System

    SciTech Connect

    Babu, Sudarsanam Suresh; Love, Lonnie J.; Peter, William H.; Dehoff, Ryan

    2016-05-01

    Additive manufacturing (AM) is considered an emerging technology that is expected to transform the way industry can make low-volume, high-value complex structures. This disruptive technology promises to replace legacy manufacturing methods for the fabrication of existing components in addition to bringing new innovation for new components with increased functional and mechanical properties. This report outlines the outcome of a workshop on large-scale metal additive manufacturing held at Oak Ridge National Laboratory (ORNL) on March 11, 2016. The charter for the workshop was outlined by the Department of Energy (DOE) Advanced Manufacturing Office program manager. The status and impact of Big Area Additive Manufacturing (BAAM) for polymer matrix composites was presented as the background motivation for the workshop. Subsequently, the extension of the underlying technology to low-cost metals was proposed with the following goals: (i) high deposition rates (approaching 100 lb/h); (ii) low cost (<$10/lb) for steel, iron, aluminum, and nickel, as well as higher-cost titanium; (iii) large components (major axis greater than 6 ft); and (iv) compliance with property requirements. The above concept was discussed in depth by representatives from different industrial sectors including welding, metal fabrication machinery, energy, construction, aerospace, and heavy manufacturing. In addition, DOE's newly launched High Performance Computing for Manufacturing (HPC4MFG) program was reviewed. This program will apply thermo-mechanical models to elucidate a deeper understanding of the interactions between design, process, and materials during additive manufacturing. Following these presentations, all the attendees took part in a brainstorming session where everyone identified the top 10 challenges in large-scale metal AM from their own perspective. The feedback was analyzed and grouped into different categories including (i) CAD to part software, (ii) selection of energy source, (iii

  5. Imaging requirements for medical applications of additive manufacturing.

    PubMed

    Huotilainen, Eero; Paloheimo, Markku; Salmi, Mika; Paloheimo, Kaija-Stiina; Björkstrand, Roy; Tuomi, Jukka; Markkola, Antti; Mäkitie, Antti

    2014-02-01

    Additive manufacturing (AM), formerly known as rapid prototyping, is steadily shifting its focus from industrial prototyping to medical applications as AM processes, bioadaptive materials, and medical imaging technologies develop, and the benefits of the techniques gain wider knowledge among clinicians. This article gives an overview of the main requirements for medical imaging affected by needs of AM, as well as provides a brief literature review from existing clinical cases concentrating especially on the kind of radiology they required. As an example application, a pair of CT images of the facial skull base was turned into 3D models in order to illustrate the significance of suitable imaging parameters. Additionally, the model was printed into a preoperative medical model with a popular AM device. Successful clinical cases of AM are recognized to rely heavily on efficient collaboration between various disciplines - notably operating surgeons, radiologists, and engineers. The single main requirement separating tangible model creation from traditional imaging objectives such as diagnostics and preoperative planning is the increased need for anatomical accuracy in all three spatial dimensions, but depending on the application, other specific requirements may be present as well. This article essentially intends to narrow the potential communication gap between radiologists and engineers who work with projects involving AM by showcasing the overlap between the two disciplines.

  6. Scalable splitting algorithms for big-data interferometric imaging in the SKA era

    NASA Astrophysics Data System (ADS)

    Onose, Alexandru; Carrillo, Rafael E.; Repetti, Audrey; McEwen, Jason D.; Thiran, Jean-Philippe; Pesquet, Jean-Christophe; Wiaux, Yves

    2016-11-01

    In the context of next-generation radio telescopes, like the Square Kilometre Array (SKA), the efficient processing of large-scale data sets is extremely important. Convex optimization tasks under the compressive sensing framework have recently emerged and provide both enhanced image reconstruction quality and scalability to increasingly larger data sets. We focus herein mainly on scalability and propose two new convex optimization algorithmic structures able to solve the convex optimization tasks arising in radio-interferometric imaging. They rely on proximal splitting and forward-backward iterations and can be seen, by analogy with the CLEAN major-minor cycle, as running sophisticated CLEAN-like iterations in parallel in multiple data, prior, and image spaces. Both methods support any convex regularization function, in particular the well-studied ℓ1 priors promoting image sparsity in an adequate domain. Tailored for big data, they employ parallel and distributed computations to achieve scalability in terms of memory and computational requirements. One of them also exploits randomization, over data blocks at each iteration, offering further flexibility. We present simulation results showing the feasibility of the proposed methods as well as their advantages compared to state-of-the-art algorithmic solvers. Our MATLAB code is available online on GitHub.
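
    For readers unfamiliar with the "forward-backward iterations" mentioned above, the generic proximal-gradient scheme for minimizing a smooth data-fidelity term f plus a convex regularizer g (such as an ℓ1 prior) is sketched below; this is the textbook form, not necessarily the exact algorithm developed in the paper.

```latex
% Generic forward-backward (proximal gradient) iteration for
% minimizing f(x) + g(x), with f differentiable and g convex; gamma is a step size.
\[
  x^{(k+1)} \;=\; \operatorname{prox}_{\gamma g}\!\bigl( x^{(k)} - \gamma \nabla f(x^{(k)}) \bigr),
  \qquad
  \operatorname{prox}_{\gamma g}(z) \;=\; \arg\min_{u}\; g(u) + \tfrac{1}{2\gamma}\,\|u - z\|_2^2 .
\]
```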

  7. Comparison of additive image fusion vs. feature-level image fusion techniques for enhanced night driving

    NASA Astrophysics Data System (ADS)

    Bender, Edward J.; Reese, Colin E.; Van Der Wal, Gooitzen S.

    2003-02-01

    The Night Vision & Electronic Sensors Directorate (NVESD) has conducted a series of image fusion evaluations under the Head-Tracked Vision System (HTVS) program. The HTVS is a driving system for both wheeled and tracked military vehicles, wherein dual-waveband sensors are directed in a more natural head-slewed imaging mode. The HTVS consists of thermal and image-intensified TV sensors, a high-speed gimbal, a head-mounted display, and a head tracker. A series of NVESD field tests over the past two years has investigated the degree to which additive (A+B) image fusion of these sensors enhances overall driving performance. Additive fusion employs a single (but user adjustable) fractional weighting for all the features of each sensor's image. More recently, NVESD and Sarnoff Corporation have begun a cooperative effort to evaluate and refine Sarnoff's "feature-level" multi-resolution (pyramid) algorithms for image fusion. This approach employs digital processing techniques to select at each image point only the sensor with the strongest features, and to utilize only those features to reconstruct the fused video image. This selection process is performed simultaneously at multiple scales of the image, which are combined to form the reconstructed fused image. All image fusion techniques attempt to combine the "best of both sensors" in a single image. Typically, thermal sensors are better for detecting military threats and targets, while image-intensified sensors provide more natural scene cues and detect cultural lighting. This investigation will address the differences between additive fusion and feature-level image fusion techniques for enhancing the driver's overall situational awareness.

  8. A Big Data Analytics Pipeline for the Analysis of TESS Full Frame Images

    NASA Astrophysics Data System (ADS)

    Wampler-Doty, Matthew; Pierce Doty, John

    2015-12-01

    We present a novel method for producing a catalogue of extra-solar planets and transients using the full frame image data from TESS. Our method involves (1) creating a fast Monte Carlo simulation of the TESS science instruments, (2) using the simulation to create a labeled dataset consisting of exoplanets with various orbital durations as well as transients (such as tidal disruption events), (3) using supervised machine learning to find optimal matched filters, Support Vector Machines (SVMs) and statistical classifiers (i.e. naïve Bayes and Markov Random Fields) to detect astronomical objects of interest and (4) “Big Data” analysis to produce a catalogue based on the TESS data. We will apply the resulting methods to all stars in the full frame images. We hope that by providing libraries that conform to industry standards of Free Open Source Software we may invite researchers from the astronomical community as well as the wider data-analytics community to contribute to our effort.

  9. Image-based query-by-example for big databases of galaxy images

    NASA Astrophysics Data System (ADS)

    Shamir, Lior; Kuminski, Evan

    2017-01-01

    Very large astronomical databases containing millions or even billions of galaxy images have become increasingly important tools in astronomy research. However, in many cases their very large size makes it difficult to analyze these data manually, reinforcing the need for computer algorithms that can automate the data analysis process. An example of such a task is the identification of galaxies of a certain morphology of interest. For instance, if a rare galaxy is identified, it is reasonable to expect that more galaxies of similar morphology exist in the database, but it is virtually impossible to search these databases manually to identify such galaxies. Here we describe a computer vision and pattern recognition methodology that receives a galaxy image as input and automatically searches a large dataset of galaxies to return a list of galaxies that are visually similar to the query galaxy. The returned list is not necessarily complete or clean, but it provides a substantial reduction of the original database into a smaller dataset in which the frequency of objects visually similar to the query galaxy is much higher. Experimental results show that the algorithm can identify rare galaxies such as ring galaxies among datasets of 10,000 astronomical objects.
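
    Once image features have been computed, the retrieval step of such a query-by-example system reduces to a nearest-neighbour search in feature space. The sketch below is a generic illustration under that assumption; the function name and distance measure are hypothetical and do not reflect the authors' specific feature set or similarity weighting.

```python
import numpy as np

def query_by_example(query_features, database_features, k=100):
    """Return indices of the k database galaxies whose feature vectors are
    closest (Euclidean distance) to the query galaxy's feature vector."""
    d = np.linalg.norm(database_features - query_features, axis=1)
    return np.argsort(d)[:k]

# Usage with synthetic data: 10,000 galaxies, 64 image features each
rng = np.random.default_rng(0)
db = rng.normal(size=(10_000, 64))
query = db[42] + 0.01 * rng.normal(size=64)
print(query_by_example(query, db, k=5))
```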

  10. Study on clear stereo image pair acquisition method for small objects with big vertical size in SLM vision system.

    PubMed

    Wang, Yuezong; Jin, Yan; Wang, Lika; Geng, Benliang

    2016-05-01

    Microscopic vision systems with a stereo light microscope (SLM) have been applied to surface profile measurement. If the vertical size of a small object exceeds the depth range, its images will contain both clear and fuzzy regions. Hence, in order to obtain clear stereo images, we propose a microscopic sequence image fusion method which is suitable for the SLM vision system. First, a solution to capture and align the image sequence is designed, which outputs aligned stereo images. Second, we decompose the stereo image sequence by wavelet analysis, obtaining a series of high- and low-frequency coefficients at different resolutions. Fused stereo images are then output based on the high- and low-frequency coefficient fusion rules proposed in this article. The results show that Δw1 (Δw2) and ΔZ of stereo images in a sequence have a linear relationship; hence, a procedure for image alignment is necessary before image fusion. In contrast with other image fusion methods, our method can output clear fused stereo images with better performance, which is suitable for the SLM vision system and very helpful for avoiding image blur caused by the large vertical size of small objects.
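
    The wavelet-based fusion idea described above can be sketched as a focus-stack fusion over an already-aligned image sequence: average the low-frequency (approximation) band and keep the maximum-magnitude high-frequency (detail) coefficients. This is an illustrative sketch with assumed fusion rules and a hypothetical function name, not the article's exact rules.

```python
import numpy as np
import pywt

def fuse_focus_stack(images, wavelet="db2", level=3):
    """Fuse a sequence of aligned grayscale images into one all-in-focus image:
    average the approximation coefficients, keep max-magnitude detail coefficients."""
    decomps = [pywt.wavedec2(img.astype(float), wavelet, level=level) for img in images]
    fused = [np.mean([d[0] for d in decomps], axis=0)]  # approximation (low-frequency) band
    for lvl in range(1, level + 1):
        bands = []
        for b in range(3):  # horizontal, vertical, diagonal detail bands
            stack = np.stack([d[lvl][b] for d in decomps])
            idx = np.argmax(np.abs(stack), axis=0)
            bands.append(np.take_along_axis(stack, idx[None], axis=0)[0])
        fused.append(tuple(bands))
    return pywt.waverec2(fused, wavelet)
```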

  11. An Improved InSAR Image Co-Registration Method for Pairs with Relatively Big Distortions or Large Incoherent Areas

    PubMed Central

    Chen, Zhenwei; Zhang, Lei; Zhang, Guo

    2016-01-01

    Co-registration is one of the most important steps in interferometric synthetic aperture radar (InSAR) data processing. The standard offset-measurement method based on cross-correlating uniformly distributed patches takes no account of the specific geometric transformation between images or the characteristics of ground scatterers. Hence, it is inefficient and difficult to obtain satisfactory co-registration results for image pairs with relatively big distortion or large incoherent areas. Given this, an improved co-registration strategy is proposed in this paper which takes both the geometric features and image content into consideration. Firstly, some geometric transformations including scale, flip, rotation, and shear between images were eliminated based on the geometrical information, and the initial co-registration polynomial was obtained. Then the registration points were automatically detected by integrating the signal-to-clutter-ratio (SCR) thresholds and the amplitude information, and a further co-registration process was performed to refine the polynomial. Several comparison experiments were carried out using 2 TerraSAR-X data from the Hong Kong airport and 21 PALSAR data from the Donghai Bridge. Experimental results demonstrate that the proposed method improves the accuracy and efficiency of co-registration and its processing ability in cases of big distortion between images or large incoherent areas in the images. For most co-registrations, the proposed method can enhance the reliability and applicability of co-registration and thus promote the automation to a higher level. PMID:27649207

  12. An Improved InSAR Image Co-Registration Method for Pairs with Relatively Big Distortions or Large Incoherent Areas.

    PubMed

    Chen, Zhenwei; Zhang, Lei; Zhang, Guo

    2016-09-17

    Co-registration is one of the most important steps in interferometric synthetic aperture radar (InSAR) data processing. The standard offset-measurement method based on cross-correlating uniformly distributed patches takes no account of the specific geometric transformation between images or the characteristics of ground scatterers. Hence, it is inefficient and difficult to obtain satisfactory co-registration results for image pairs with relatively big distortion or large incoherent areas. Given this, an improved co-registration strategy is proposed in this paper which takes both the geometric features and image content into consideration. Firstly, some geometric transformations including scale, flip, rotation, and shear between images were eliminated based on the geometrical information, and the initial co-registration polynomial was obtained. Then the registration points were automatically detected by integrating the signal-to-clutter-ratio (SCR) thresholds and the amplitude information, and a further co-registration process was performed to refine the polynomial. Several comparison experiments were carried out using 2 TerraSAR-X data from the Hong Kong airport and 21 PALSAR data from the Donghai Bridge. Experimental results demonstrate that the proposed method improves the accuracy and efficiency of co-registration and its processing ability in cases of big distortion between images or large incoherent areas in the images. For most co-registrations, the proposed method can enhance the reliability and applicability of co-registration and thus promote the automation to a higher level.

  13. Additives

    NASA Technical Reports Server (NTRS)

    Smalheer, C. V.

    1973-01-01

    The chemistry of lubricant additives is discussed to show what the additives are chemically and what functions they perform in the lubrication of various kinds of equipment. Current theories regarding the mode of action of lubricant additives are presented. The additive groups discussed include the following: (1) detergents and dispersants, (2) corrosion inhibitors, (3) antioxidants, (4) viscosity index improvers, (5) pour point depressants, and (6) antifouling agents.

  14. Big Data: Big Confusion? Big Challenges?

    DTIC Science & Technology

    2015-05-01

    Big Data: Big Confusion? Big Challenges? Mary Maureen... 12th Annual Acquisition Research Symposium, May 2015.

  15. Big Images and Big Ideas!

    ERIC Educational Resources Information Center

    McCullagh, John; Greenwood, Julian

    2011-01-01

    In this digital age, is primary science being left behind? Computer microscopes provide opportunities to transform science lessons into highly exciting learning experiences and to shift enquiry and discovery back into the hands of the children. A class of 5- and 6-year-olds was just one group of children involved in the Digitally Resourced…

  16. Direct laser additive fabrication system with image feedback control

    DOEpatents

    Griffith, Michelle L.; Hofmeister, William H.; Knorovsky, Gerald A.; MacCallum, Danny O.; Schlienger, M. Eric; Smugeresky, John E.

    2002-01-01

    A closed-loop, feedback-controlled direct laser fabrication system is disclosed. The feedback refers to the actual growth conditions obtained by real-time analysis of thermal radiation images. The resulting system can fabricate components with severalfold improvement in dimensional tolerances and surface finish.

  17. Optimal addition of images for detection and photometry

    NASA Technical Reports Server (NTRS)

    Fischer, Philippe; Kochanski, Greg P.

    1994-01-01

    In this paper we describe weighting techniques used for the optimal coaddition of charge-coupled device (CCD) frames with differing characteristics. Optimal means maximum signal-to-noise ratio (S/N) for stellar objects. We derive formulas for four applications: (1) object detection via matched filter, (2) object detection identical to DAOFIND, (3) aperture photometry, and (4) ALLSTAR profile-fitting photometry. We have included examples involving 21 frames for which either the sky brightness or image resolution varied by a factor of 3. The gains in S/N were modest for most of the examples, except for DAOFIND detection with varying image resolution, which exhibited a substantial S/N increase. Even though the only consideration was maximizing S/N, the image resolution was seen to improve for most of the variable-resolution examples. Also discussed are empirical fits for the weighting and the availability of the program, WEIGHT, used to generate the weighting for the individual frames. Finally, we include appendices describing the effects of clipping algorithms and a scheme for star/galaxy and cosmic-ray/star discrimination.
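
    As a general point of reference for the weighting idea (a standard inverse-variance result stated under simplified assumptions, not the paper's exact application-specific formulas): when coadding frames whose point-source flux is scaled by a factor s_i (transparency/exposure) and whose background noise variance is sigma_i^2, the S/N of the weighted sum is maximized by

```latex
% S/N-optimal coaddition weight for frame i, assuming a fixed measurement
% aperture, flux scaling s_i, and background-dominated noise variance sigma_i^2:
\[
  w_i \;\propto\; \frac{s_i}{\sigma_i^{2}} .
\]
% When the seeing also varies, frames with broader point-spread functions are
% further down-weighted; the paper derives the appropriate forms per application.
```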

  18. Neutrons Image Additive Manufactured Turbine Blade in 3-D

    SciTech Connect

    2016-04-29

    The video displays the Inconel 718 turbine blade made by additive manufacturing. First, a gray-scale neutron computed tomogram (CT) is displayed with transparency in order to show the internal structure. Then the neutron CT is overlaid with the engineering drawing that was used to print the part, allowing a comparison of external and internal structures. This provides a map of the accuracy of the printed turbine (printing tolerance). Internal surface roughness can also be observed.

  19. Fingerprinting protocol for images based on additive homomorphic property.

    PubMed

    Kuribayashi, Minoru; Tanaka, Hatsukazu

    2005-12-01

    The homomorphic property of public-key cryptosystems is applied in several cryptographic protocols, such as electronic cash, voting systems, bidding protocols, etc. Several fingerprinting protocols also exploit the property to achieve an asymmetric system. However, their enciphering rate is extremely low and the implementation of the watermarking technique is difficult. In this paper, we propose a new fingerprinting protocol applying the additive homomorphic property of the Okamoto-Uchiyama encryption scheme. Exploiting the property ingeniously, the enciphering rate of our fingerprinting scheme can be close to that of the corresponding cryptosystem. We study the problem of implementing the watermarking technique and propose a successful method to embed encrypted information without knowing the plaintext value. Security is also protected for both the buyer and the merchant in our scheme.
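
    The additive homomorphic property being exploited is, in generic form (stated here as background, not as the paper's full protocol): for an additively homomorphic encryption E such as Okamoto-Uchiyama, and plaintexts m1, m2 within the message space,

```latex
% Additive homomorphism (up to the scheme's randomization and message-space bounds):
\[
  E(m_1)\cdot E(m_2) \;=\; E(m_1 + m_2), \qquad E(m_1)^{k} \;=\; E(k\,m_1) \;\; (k \in \mathbb{Z}),
\]
% which lets a merchant combine an encrypted fingerprint with encrypted image data
% without ever learning the buyer's fingerprint in the clear.
```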

  20. Big data for health.

    PubMed

    Andreu-Perez, Javier; Poon, Carmen C Y; Merrifield, Robert D; Wong, Stephen T C; Yang, Guang-Zhong

    2015-07-01

    This paper provides an overview of recent developments in big data in the context of biomedical and health informatics. It outlines the key characteristics of big data and how medical and health informatics, translational bioinformatics, sensor informatics, and imaging informatics will benefit from an integrated approach of piecing together different aspects of personalized information from a diverse range of data sources, both structured and unstructured, covering genomics, proteomics, metabolomics, as well as imaging, clinical diagnosis, and long-term continuous physiological sensing of an individual. It is expected that recent advances in big data will expand our knowledge for testing new hypotheses about disease management from diagnosis to prevention to personalized treatment. The rise of big data, however, also raises challenges in terms of privacy, security, data ownership, data stewardship, and governance. This paper discusses some of the existing activities and future opportunities related to big data for health, outlining some of the key underlying issues that need to be tackled.

  1. Big Society, Big Deal?

    ERIC Educational Resources Information Center

    Thomson, Alastair

    2011-01-01

    Political leaders like to put forward guiding ideas or themes which pull their individual decisions into a broader narrative. For John Major it was Back to Basics, for Tony Blair it was the Third Way and for David Cameron it is the Big Society. While Mr. Blair relied on Lord Giddens to add intellectual weight to his idea, Mr. Cameron's legacy idea…

  2. Rapid and retrievable recording of big data of time-lapse 3D shadow images of microbial colonies.

    PubMed

    Ogawa, Hiroyuki; Nasu, Senshi; Takeshige, Motomu; Saito, Mikako; Matsuoka, Hideaki

    2015-05-15

    We formerly developed an automatic colony count system based on time-lapse shadow image analysis (TSIA). Here this system has been upgraded and applied to practical rapid decision-making. A microbial sample was spread on/in an agar plate 90 mm in diameter as homogeneously as possible. For several strains, we found that most colonies appeared within a limited time span. Consequently, the number of colonies reached a steady level (Nstdy) and then remained unchanged until the end of a long culture time, which gave the confirmed value (Nconf). The equivalence of Nstdy and Nconf, as well as the difference between the times required to determine Nstdy and Nconf, were statistically significant at p < 0.001. Nstdy meets the requirements of practical routines treating a large number of plates. The difference between Nstdy and Nconf, if any, may be elucidated by means of the retrievable big data. Therefore Nconf is valid for official documentation.

  3. Artificial intelligence in medicine and cardiac imaging: harnessing big data and advanced computing to provide personalized medical diagnosis and treatment.

    PubMed

    Dilsizian, Steven E; Siegel, Eliot L

    2014-01-01

    Although advances in information technology in the past decade have come in quantum leaps in nearly every aspect of our lives, they seem to be coming at a slower pace in the field of medicine. However, the implementation of electronic health records (EHR) in hospitals is increasing rapidly, accelerated by the meaningful use initiatives associated with the Center for Medicare & Medicaid Services EHR Incentive Programs. The transition to electronic medical records and availability of patient data has been associated with increases in the volume and complexity of patient information, as well as an increase in medical alerts, with resulting "alert fatigue" and increased expectations for rapid and accurate diagnosis and treatment. Unfortunately, these increased demands on health care providers create greater risk for diagnostic and therapeutic errors. In the near future, artificial intelligence (AI)/machine learning will likely assist physicians with differential diagnosis of disease, treatment options suggestions, and recommendations, and, in the case of medical imaging, with cues in image interpretation. Mining and advanced analysis of "big data" in health care provide the potential not only to perform "in silico" research but also to provide "real time" diagnostic and (potentially) therapeutic recommendations based on empirical data. "On demand" access to high-performance computing and large health care databases will support and sustain our ability to achieve personalized medicine. The IBM Jeopardy! Challenge, which pitted the best all-time human players against the Watson computer, captured the imagination of millions of people across the world and demonstrated the potential to apply AI approaches to a wide variety of subject matter, including medicine. The combination of AI, big data, and massively parallel computing offers the potential to create a revolutionary way of practicing evidence-based, personalized medicine.

  4. Simultaneous capturing of RGB and additional band images using hybrid color filter array

    NASA Astrophysics Data System (ADS)

    Kiku, Daisuke; Monno, Yusuke; Tanaka, Masayuki; Okutomi, Masatoshi

    2014-03-01

    Extra band information in addition to the RGB, such as the near-infrared (NIR) and the ultra-violet, is valuable for many applications. In this paper, we propose a novel color filter array (CFA), which we call "hybrid CFA," and a demosaicking algorithm for the simultaneous capturing of the RGB and the additional band images. Our proposed hybrid CFA and demosaicking algorithm do not rely on any specific correlation between the RGB and the additional band. Therefore, the additional band can be arbitrarily decided by users. Experimental results demonstrate that our proposed demosaicking algorithm with the proposed hybrid CFA can provide the additional band image while keeping the RGB image almost the same quality as the image acquired by using the standard Bayer CFA.

  5. Breast Imaging in the Era of Big Data: Structured Reporting and Data Mining

    PubMed Central

    Margolies, Laurie R.; Pandey, Gaurav; Horowitz, Eliot R.; Mendelson, David S.

    2016-01-01

    OBJECTIVE The purpose of this article is to describe structured reporting and the development of large databases for use in data mining in breast imaging. CONCLUSION The results of millions of breast imaging examinations are reported with structured tools based on the BI-RADS lexicon. Much of these data are stored in accessible media. Robust computing power creates great opportunity for data scientists and breast imagers to collaborate to improve breast cancer detection and optimize screening algorithms. Data mining can create knowledge, but the questions asked and their complexity require extremely powerful and agile databases. New data technologies can facilitate outcomes research and precision medicine. PMID:26587797

  6. Mapping fetal brain development in utero using magnetic resonance imaging: the Big Bang of brain mapping.

    PubMed

    Studholme, Colin

    2011-08-15

    The development of tools to construct and investigate probabilistic maps of the adult human brain from magnetic resonance imaging (MRI) has led to advances in both basic neuroscience and clinical diagnosis. These tools are increasingly being applied to brain development in adolescence and childhood, and even to neonatal and premature neonatal imaging. Even earlier in development, parallel advances in clinical fetal MRI have led to its growing use as a tool in challenging medical conditions. This has motivated new engineering developments encompassing optimal fast MRI scans and techniques derived from computer vision, the combination of which allows full 3D imaging of the moving fetal brain in utero without sedation. These promise to provide a new and unprecedented window into early human brain growth. This article reviews the developments that have led us to this point, examines the current state of the art in the fields of fast fetal imaging and motion correction, and describes the tools to analyze dynamically changing fetal brain structure. New methods to deal with developmental tissue segmentation and the construction of spatiotemporal atlases are examined, together with techniques to map fetal brain growth patterns.

  7. Images of Paris: Big C Culture for the Nonspeaker of French.

    ERIC Educational Resources Information Center

    Spangler, May; York, Holly U.

    2002-01-01

    Discusses a course offered in both French and English at Emory University in Atlanta, Georgia that is based on the study of representations of Paris from the Middle Ages to the present. It uses architecture as a point of departure and explores the myth of Paris as expressed through a profusion of images in literature, painting, and film.…

  8. Big capabilities in small packages: hyperspectral imaging from a compact platform

    NASA Astrophysics Data System (ADS)

    Beasley, Matthew; Goldberg, Hannah; Voorhees, Christopher; Illsley, Peter

    2016-09-01

    We present the Compact Holographic Aberration-corrected Platform (CHAP) instrument, designed and developed at Planetary Resources Development Corporation. By combining a dispersive element with the secondary of a telescope, we are able to produce a relatively long focal length with moderate dispersion at the focal plane. This design enables us to build a capable hyperspectral imaging instrument within the size constraints of the CubeSat form factor. The advantages of our design revolve around its simplicity: there are only two optical elements, producing both a white-light and a diffracted image. With the use of a replicated grating, we can produce a long focal length hyperspectral imager at a price point far below other spaceflight instruments. The design is scalable for larger platforms and, since it has no transmitting optics and only two reflective surfaces, could be designed to function at any desired wavelength. Our system will be capable of spectral imaging across the 400 to 900 nm spectral range for use in small-body surveys.

  9. Unstructured medical image query using big data - An epilepsy case study.

    PubMed

    Istephan, Sarmad; Siadat, Mohammad-Reza

    2016-02-01

    Big data technologies are critical to the medical field, which requires new frameworks to leverage them. Such frameworks would help medical experts test hypotheses by querying huge volumes of unstructured medical data to provide better patient care. The objective of this work is to implement and examine the feasibility of such a framework to provide efficient querying of unstructured data in unlimited ways. The feasibility study was conducted specifically in the epilepsy field. The proposed framework evaluates a query in two phases. In phase 1, structured data is used to filter the clinical data warehouse. In phase 2, feature extraction modules are executed on the unstructured data in a distributed manner via Hadoop to complete the query. Three modules have been created: volume comparer, surface-to-volume conversion, and average intensity. The framework allows user-defined modules to be imported to provide unlimited ways to process the unstructured data, hence potentially extending the application of this framework beyond the epilepsy field. Two types of criteria were used to validate the feasibility of the proposed framework - the ability/accuracy of fulfilling an advanced medical query and the efficiency that Hadoop provides. For the first criterion, the framework executed an advanced medical query that spanned both structured and unstructured data with accurate results. For the second criterion, different architectures were explored to evaluate the performance of various Hadoop configurations and were compared to a traditional Single Server Architecture (SSA). The surface-to-volume conversion module performed up to 40 times faster than the SSA (using a 20-node Hadoop cluster) and the average intensity module performed up to 85 times faster than the SSA (using a 40-node Hadoop cluster). Furthermore, the 40-node Hadoop cluster executed the average intensity module on 10,000 models in 3 h, which was not even practical for the SSA. The current study is
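
    The two-phase evaluation described above can be illustrated with a very small, self-contained sketch (not the authors' code): phase 1 narrows the cohort using structured fields only, and phase 2 maps a user-defined feature-extraction function, here a toy "average intensity" module, over the unstructured image volumes. In the published framework that second phase runs as a distributed Hadoop job; the patient records, field names, and in-memory toy volumes below are purely hypothetical.

      import numpy as np

      rng = np.random.default_rng(0)
      # Toy cohort: structured fields plus an "unstructured" image volume per patient.
      patients = [
          {"id": "p01", "diagnosis": "epilepsy", "age": 34, "mri": rng.normal(100, 10, (8, 8, 8))},
          {"id": "p02", "diagnosis": "migraine", "age": 41, "mri": rng.normal(100, 10, (8, 8, 8))},
          {"id": "p03", "diagnosis": "epilepsy", "age": 29, "mri": rng.normal(100, 10, (8, 8, 8))},
      ]

      def phase1_filter(records, diagnosis):
          """Phase 1: narrow the cohort using structured fields only."""
          return [r for r in records if r["diagnosis"] == diagnosis]

      def average_intensity(volume):
          """Phase 2 module: a user-defined feature extractor applied to unstructured data."""
          return float(volume.mean())

      cohort = phase1_filter(patients, "epilepsy")
      results = {r["id"]: average_intensity(r["mri"]) for r in cohort}  # distributed via Hadoop in practice
      print(results)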

  10. Cell classification using big data analytics plus time stretch imaging (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Jalali, Bahram; Chen, Claire L.; Mahjoubfar, Ata

    2016-09-01

    We show that blood cells can be classified with high accuracy and high throughput by combining machine learning with time stretch quantitative phase imaging. Our diagnostic system captures quantitative phase images in a flow microscope at millions of frames per second and extracts multiple biophysical features from individual cells, including morphological characteristics, light absorption and scattering parameters, and protein concentration. These parameters form a hyperdimensional feature space in which supervised learning and cell classification are performed. We show binary classification of T-cells against colon cancer cells, as well as classification of algae cell strains with high and low lipid content. The label-free screening averts the negative impact of staining reagents on cellular viability or cell signaling. The combination of time stretch machine vision and learning offers unprecedented cell analysis capabilities for cancer diagnostics, drug development and liquid biopsy for personalized genomics.
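
    As a rough illustration of the supervised-learning step mentioned above (not the authors' pipeline), the sketch below trains a linear support vector machine on synthetic per-cell feature vectors standing in for the extracted biophysical features; the feature values, class labels, and the choice of a linear SVM are all assumptions made for demonstration.

      import numpy as np
      from sklearn.model_selection import train_test_split
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.svm import SVC

      rng = np.random.default_rng(0)
      # Synthetic stand-ins for per-cell features (e.g. size, optical loss, protein density).
      x_class0 = rng.normal(loc=[10.0, 0.2, 1.0], scale=0.5, size=(200, 3))
      x_class1 = rng.normal(loc=[14.0, 0.5, 1.6], scale=0.5, size=(200, 3))
      X = np.vstack([x_class0, x_class1])
      y = np.array([0] * 200 + [1] * 200)   # 0 / 1 labels are illustrative only

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
      clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
      clf.fit(X_tr, y_tr)
      print("held-out accuracy:", clf.score(X_te, y_te))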

  11. Biomass estimator for NIR image with a few additional spectral band images taken from light UAS

    NASA Astrophysics Data System (ADS)

    Pölönen, Ilkka; Salo, Heikki; Saari, Heikki; Kaivosoja, Jere; Pesonen, Liisa; Honkavaara, Eija

    2012-05-01

    A novel way of producing biomass estimates offers possibilities for precision farming. Fertilizer prediction maps can be made based on accurate biomass estimation generated by a novel biomass estimator. By using this knowledge, a variable rate of fertilizer can be applied during the growing season. The innovation consists of a light UAS, a high-spatial-resolution camera, and VTT's novel spectral camera. A few properly selected spectral wavelengths together with NIR images and point clouds extracted by automatic image matching have been used in the estimation. The spectral wavelengths were chosen from the green, red, and NIR channels.

  12. Three-dimensional oxygen isotope imaging of convective fluid flow around the Big Bonanza, Comstock lode mining district, Nevada

    USGS Publications Warehouse

    Criss, R.E.; Singleton, M.J.; Champion, D.E.

    2000-01-01

    Oxygen isotope analyses of propylitized andesites from the Con Virginia and California mines allow construction of a detailed, three-dimensional image of the isotopic surfaces produced by the convective fluid flows that deposited the famous Big Bonanza orebody. On a set of intersecting maps and sections, the δ18O isopleths clearly show the intricate and conformable relationship of the orebody to a deep, ~500 m gyre of meteoric-hydrothermal fluid that circulated along and above the Comstock fault, near the contact of the Davidson Granodiorite. The core of this gyre (δ18O = 0 to 3.8‰) encompasses the bonanza and is almost totally surrounded by rocks having much lower δ18O values (–1.0 to –4.4‰). This deep gyre may represent a convective longitudinal roll superimposed on a large unicellular meteoric-hydrothermal system, producing a complex flow field with both radial and longitudinal components that is consistent with experimentally observed patterns of fluid convection in permeable media.

  13. A Three-Step Approach with Adaptive Additive Magnitude Selection for the Sharpening of Images

    PubMed Central

    Lee, Tien-Lin

    2014-01-01

    Aiming to find the additive magnitude automatically and adaptively, we propose a three-step, model-based approach for the sharpening of images in this paper. In the first pass, a Grey prediction model is applied to find a global maximal additive magnitude so that oversharpening of the images to be sharpened can be avoided. During the second pass, edge pixels are picked out with our previously proposed edge detection mechanism. In this pass, a low-pass filter is also applied so that isolated pixels will not be regarded as lying near an edge. In the final pass, those pixels detected as near an edge are adjusted adaptively based on the local statistics, while nonedge pixels are kept unaltered. Extensive experiments on natural images as well as medical images, with subjective and objective evaluations, are given to demonstrate the usefulness of the proposed approach. PMID:25309951

  14. The BigBoss Experiment

    SciTech Connect

    Schlegel, D.; Abdalla, F.; Abraham, T.; Ahn, C.; Allende Prieto, C.; Annis, J.; Aubourg, E.; Azzaro, M.; Bailey, S.; Baltay, C.; Baugh, C.; Bebek, C.; Becerril, S.; Blanton, M.; Bolton, A.; Bromley, B.; Cahn, R.; Carton, P.-H.; Cervantes-Cota, J.L.; Chu, Y.; Cortes, M.

    2012-06-07

    BigBOSS is a Stage IV ground-based dark energy experiment to study baryon acoustic oscillations (BAO) and the growth of structure with a wide-area galaxy and quasar redshift survey over 14,000 square degrees. It has been conditionally accepted by NOAO in response to a call for major new instrumentation and a high-impact science program for the 4-m Mayall telescope at Kitt Peak. The BigBOSS instrument is a robotically-actuated, fiber-fed spectrograph capable of taking 5000 simultaneous spectra over a wavelength range from 340 nm to 1060 nm, with a resolution R = λ/Δλ = 3000-4800. Using data from imaging surveys that are already underway, spectroscopic targets are selected that trace the underlying dark matter distribution. In particular, targets include luminous red galaxies (LRGs) up to z = 1.0, extending the BOSS LRG survey in both redshift and survey area. To probe the universe out to even higher redshift, BigBOSS will target bright [OII] emission line galaxies (ELGs) up to z = 1.7. In total, 20 million galaxy redshifts are obtained to measure the BAO feature, trace the matter power spectrum at smaller scales, and detect redshift space distortions. BigBOSS will provide additional constraints on early dark energy and on the curvature of the universe by measuring the Ly-alpha forest in the spectra of over 600,000 2.2 < z < 3.5 quasars. BigBOSS galaxy BAO measurements combined with an analysis of the broadband power, including the Ly-alpha forest in BigBOSS quasar spectra, achieve a FOM of 395 with Planck plus Stage III priors. This FOM is based on conservative assumptions for the analysis of broadband power (k_max = 0.15), and could grow to over 600 if current work allows us to push the analysis to higher wave numbers (k_max = 0.3). BigBOSS will also place constraints on theories of modified gravity and inflation, and will measure the sum of neutrino masses to 0.024 eV accuracy.

  15. Image and compositional characteristics of the LDEF Big Guy impact crater

    NASA Technical Reports Server (NTRS)

    Bunch, T. E.; Paque, Julie M.; Zolensky, Michael

    1995-01-01

    A 5.2 mm crater in Al-metal represents the largest found on LDEF. We have examined this crater by field emission scanning electron microscopy (FESEM), energy dispersive spectroscopy (EDS) and time-of-flight/secondary ion mass spectroscopy (TOF-SIMS) in order to determine if there is any evidence of impactor residue. Droplet and dome-shaped columns, along with flow features, are evidence of melting. EDS from the crater cavity and rim show Mg, C, O and variable amounts of Si, in addition to Al. No evidence for a chondritic impactor was found, and it is hypothesized that the crater may be the result of impact with space debris.

  16. Incidental vesicocolic fistula on routine bone scintigraphy: Value of additional delayed images and direct radionuclide cystography.

    PubMed

    Sohn, Myung-Hee; Tae Lim, Seok; Jin Jeong, Young; Wook Kim, Dong; Jeong, Hwan-Jeong; Yim, Chang-Yeol

    2010-09-01

    An unexpected vesicocolic fistula can be detected incidentally on routine bone scintigraphy. A 55-year-old man who had undergone a radical colectomy for carcinoma of the sigmoid colon 1 year previously underwent bone scintigraphy to evaluate bone metastasis. Whole-body images showed an abnormal accumulation of radioactivity in the right lower quadrant of the abdomen, but the radioactivity did not precisely define a structure. Additional delayed images obtained 15 and 24 hours after the initial image localized a vesicocolic fistula. Subsequent radionuclide cystography confirmed leakage of the radioactivity from the bladder.

  17. Sample preparation for mass spectrometry imaging: small mistakes can lead to big consequences.

    PubMed

    Goodwin, Richard J A

    2012-08-30

    Mass spectrometry imaging (MSI) enables the direct analysis of molecules from the surface of a wide variety of samples, allowing the multiplexed measurement of both the abundance and the distribution of small molecules, lipids, peptides and proteins. As the technology has been refined, an increasing number of ionization methods and mass analyzers have been used, enabling measurements at increased spatial and spectral resolution and at increased speed. Alongside the instrumentation improvements there has been optimization of sample preparation procedures that allow the highest quality data to be obtained, reproducibly, from an ever-increasing diversity of samples. This review will consider the development and standardization of sample preparation methods applicable to MSI, describing the stages and procedures undertaken from the moment of sample collection, through storage and preparation, and on through final processing prior to analysis. Recent technical advancements will be highlighted and areas where further experimentation and optimization may well be required will be described. All aspects of the sample preparation pipeline will be considered in detail, with examples from the literature used to emphasize why rigorous sample preparation for MSI is vital to achieve the most accurate, reproducible and validated MSI data possible.

  18. Big Heart Data: Advancing Health Informatics through Data Sharing in Cardiovascular Imaging

    PubMed Central

    Suinesiaputra, Avan; Medrano-Gracia, Pau; Cowan, Brett R.; Young, Alistair A.

    2015-01-01

    The burden of heart disease is rapidly worsening due to increasing prevalence of obesity and diabetes. Data sharing and open database resources for heart health informatics are important for advancing our understanding of cardiovascular function, disease progression and therapeutics. Data sharing enables valuable information, often obtained at considerable expense and effort, to be re-used beyond the specific objectives of the original study. Many government funding agencies and journal publishers are requiring data re-use, and are providing mechanisms for data curation and archival. Tools and infrastructure are available to archive anonymous data from a wide range of studies, from descriptive epidemiological data to gigabytes of imaging data. Meta-analyses can be performed to combine raw data from disparate studies to obtain unique comparisons or to enhance statistical power. Open benchmark datasets are invaluable for validating data analysis algorithms and objectively comparing results. This review provides a rationale for increased data sharing and surveys recent progress in the cardiovascular domain. We also highlight the potential of recent large cardiovascular epidemiological studies enabling collaborative efforts to facilitate data sharing, algorithms benchmarking, disease modeling and statistical atlases. PMID:25415993

  19. Big heart data: advancing health informatics through data sharing in cardiovascular imaging.

    PubMed

    Suinesiaputra, Avan; Medrano-Gracia, Pau; Cowan, Brett R; Young, Alistair A

    2015-07-01

    The burden of heart disease is rapidly worsening due to the increasing prevalence of obesity and diabetes. Data sharing and open database resources for heart health informatics are important for advancing our understanding of cardiovascular function, disease progression and therapeutics. Data sharing enables valuable information, often obtained at considerable expense and effort, to be reused beyond the specific objectives of the original study. Many government funding agencies and journal publishers are requiring data reuse, and are providing mechanisms for data curation and archival. Tools and infrastructure are available to archive anonymous data from a wide range of studies, from descriptive epidemiological data to gigabytes of imaging data. Meta-analyses can be performed to combine raw data from disparate studies to obtain unique comparisons or to enhance statistical power. Open benchmark datasets are invaluable for validating data analysis algorithms and objectively comparing results. This review provides a rationale for increased data sharing and surveys recent progress in the cardiovascular domain. We also highlight the potential of recent large cardiovascular epidemiological studies enabling collaborative efforts to facilitate data sharing, algorithms benchmarking, disease modeling and statistical atlases.

  20. Additive global cerebral blood flow normalization in arterial spin labeling perfusion imaging.

    PubMed

    Stewart, Stephanie B; Koller, Jonathan M; Campbell, Meghan C; Perlmutter, Joel S; Black, Kevin J

    2015-01-01

    To determine how different methods of normalizing for global cerebral blood flow (gCBF) affect image quality and sensitivity to cortical activation, pulsed arterial spin labeling (pASL) scans obtained during a visual task were normalized by either additive or multiplicative normalization of modal gCBF. Normalization by either method increased the statistical significance of cortical activation by a visual stimulus. However, image quality was superior with additive normalization, whether judged by intensity histograms or by reduced variability within gray and white matter.
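
    A minimal sketch of the two normalization schemes being compared, under my own simplified reading of the abstract (not the authors' code): the modal CBF value is either shifted to a reference value (additive) or scaled to it (multiplicative). The reference value of 50 and the histogram-based mode estimate are assumptions for illustration.

      import numpy as np

      def modal_value(cbf, bins=256):
          """Crude mode estimate of the CBF values via a histogram peak."""
          hist, edges = np.histogram(cbf, bins=bins)
          i = np.argmax(hist)
          return 0.5 * (edges[i] + edges[i + 1])

      def normalize_additive(cbf, reference=50.0):
          return cbf + (reference - modal_value(cbf))

      def normalize_multiplicative(cbf, reference=50.0):
          return cbf * (reference / modal_value(cbf))

      cbf_map = np.random.default_rng(1).normal(45.0, 12.0, size=(64, 64))  # toy CBF image
      print(modal_value(normalize_additive(cbf_map)), modal_value(normalize_multiplicative(cbf_map)))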

  1. Image and compositional characteristics of the LDEF Big Guy impact crater

    SciTech Connect

    Bunch, T.E.; Paque, J.M.; Zolensky, M.

    1995-02-01

    A 5.2 mm crater in Al-metal represents the largest found on LDEF. The authors have examined this crater by field emission scanning electron microscopy (FESEM), energy dispersive spectroscopy (EDS) and time-of-flight/secondary ion mass spectroscopy (TOF-SIMS) in order to determine if there is any evidence of impactor residue. Droplet and dome-shaped columns, along with flow features, are evidence of melting. EDS from the crater cavity and rim show Mg, C, O and variable amounts of Si, in addition to Al. No evidence for a chondritic impactor was found, and it is hypothesized that the crater may be the result of impact with space debris.

  2. How Big Is Too Big?

    ERIC Educational Resources Information Center

    Cibes, Margaret; Greenwood, James

    2016-01-01

    Media Clips appears in every issue of Mathematics Teacher, offering readers contemporary, authentic applications of quantitative reasoning based on print or electronic media. This issue features "How Big is Too Big?" (Margaret Cibes and James Greenwood) in which students are asked to analyze the data and tables provided and answer a…

  3. Astronomy in the Cloud: Using MapReduce for Image Co-Addition

    NASA Astrophysics Data System (ADS)

    Wiley, K.; Connolly, A.; Gardner, J.; Krughoff, S.; Balazinska, M.; Howe, B.; Kwon, Y.; Bu, Y.

    2011-03-01

    In the coming decade, astronomical surveys of the sky will generate tens of terabytes of images and detect hundreds of millions of sources every night. The study of these sources will involve computational challenges such as anomaly detection, classification, and moving-object tracking. Since such studies benefit from the highest-quality data, methods such as image co-addition, i.e., astrometric registration followed by per-pixel summation, will be a critical preprocessing step prior to scientific investigation. With a requirement that these images be analyzed on a nightly basis to identify moving sources such as potentially hazardous asteroids or transient objects such as supernovae, these data streams present many computational challenges. Given the quantity of data involved, the computational load of these problems can only be addressed by distributing the workload over a large number of nodes. However, the high data throughput demanded by these applications may present scalability challenges for certain storage architectures. One scalable data-processing method that has emerged in recent years is MapReduce, and in this article we focus on its popular open-source implementation called Hadoop. In the Hadoop framework, the data are partitioned among storage attached directly to worker nodes, and the processing workload is scheduled in parallel on the nodes that contain the required input data. A further motivation for using Hadoop is that it allows us to exploit cloud computing resources: i.e., platforms where Hadoop is offered as a service. We report on our experience of implementing a scalable image-processing pipeline for the SDSS imaging database using Hadoop. This multiterabyte imaging data set provides a good testbed for algorithm development, since its scope and structure approximate future surveys. First, we describe MapReduce and how we adapted image co-addition to the MapReduce framework. Then we describe a number of optimizations to our basic approach
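
    To make the co-addition idea concrete, here is a deliberately tiny, non-distributed sketch phrased as map and reduce steps: the map step aligns one exposure to a common output frame and the reduce step combines the aligned exposures per pixel. Real pipelines run these steps on Hadoop over many nodes and use genuine astrometric registration; the integer shifts and the per-pixel mean below are stand-ins.

      import numpy as np

      def map_register(image, shift):
          """Map step: align one exposure to the common output frame (placeholder shift)."""
          return np.roll(image, shift, axis=(0, 1))

      def reduce_coadd(aligned_images):
          """Reduce step: combine the aligned exposures per pixel (mean used here)."""
          return np.stack(aligned_images).mean(axis=0)

      rng = np.random.default_rng(2)
      exposures = [rng.normal(100.0, 5.0, size=(32, 32)) for _ in range(4)]
      shifts = [(0, 0), (1, 0), (0, -1), (2, 1)]   # hypothetical per-exposure offsets
      coadd = reduce_coadd([map_register(img, s) for img, s in zip(exposures, shifts)])
      print(coadd.shape, round(float(coadd.mean()), 2))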

  4. Optomap ultrawide field imaging identifies additional retinal abnormalities in patients with diabetic retinopathy

    PubMed Central

    Price, Liam D; Au, Stephanie; Chong, N Victor

    2015-01-01

    Purpose To compare diabetic retinopathy (DR) severity grading between Optomap ultrawide field scanning laser ophthalmoscope (UWFSLO) 200° images and an Early Treatment Diabetic Retinopathy Study (ETDRS) seven-standard field view. Methods Optomap UWFSLO images (total: 266) were retrospectively selected for evidence of DR from a database of eye clinic attendees. The Optomap UWFSLO images were graded for DR severity by two masked assessors. An ETDRS seven-field mask was overlaid on the Optomap UWFSLO images, and the DR grade was assessed for the region inside the mask. Any interassessor discrepancies were adjudicated by a senior retinal specialist. Kappa agreement levels were used for statistical analysis. Results Fifty images (19%) (P<0.001) were assigned a higher DR level in the Optomap UWFSLO view compared to the ETDRS seven-field view, which resulted in 40 images (15%) (P<0.001) receiving a higher DR severity grade. DR severity grades in the ETDRS seven-field view compared with the Optomap UWFSLO view were identical in 85% (226) of the images and within one severity level in 100% (266) of the images. Agreement between the two views was substantial: unweighted κ was 0.74±0.04 (95% confidence interval: 0.67–0.81) and weighted κ was 0.80±0.03 (95% confidence interval: 0.74–0.86). Conclusion Compared to the ETDRS seven-field view, a significant minority of patients are diagnosed with more severe DR when using the Optomap UWFSLO view. The clinical significance of additional peripheral lesions requires evaluation in future prospective studies using large cohorts. PMID:25848202

  5. High resolution seismic-reflection imaging of shallow deformation beneath the northeast margin of the Manila high at Big Lake, Arkansas

    USGS Publications Warehouse

    Odum, J.K.; Stephenson, W.J.; Williams, R.A.; Worley, D.M.; Guccione, M.J.; Van Arsdale, R.B.

    2001-01-01

    The Manila high, an elliptical area 19 km long (N-S) by 6 km wide (E-W) located west-southwest of Big Lake, Arkansas, has less than 3 m of topographic relief. Geomorphic, stratigraphic and chronology data indicate that Big Lake formed during at least two periods of Holocene uplift and subsequent damming of the south-flowing Little River. Age data from an organic mat located at the base of an upper lacustrine deposit indicate an abrupt, possibly tectonic, formation of the present Big Lake between AD 1640 and 1950. We acquired 7 km of high-resolution seismic-reflection data across the northeastern margin of the Manila high to examine its near-surface bedrock structure and possible association with underlying structures such as the Blytheville arch. The sense of displacement and character of the imaged faults support interpretations of either a northwest-trending, 1.5-km-wide block of uplifted strata or a series of parallel northeast-trending faults that bound horst and graben structures. We interpret deformation of the Manila high to result from faulting generated by the reactivation of right-lateral strike-slip fault motion along this portion of the Blytheville arch. The most recent uplift of the Manila high may have occurred during the December 16, 1811, New Madrid earthquake. Published by Elsevier Science B.V.

  6. Terahertz imaging and tomography as efficient instruments for testing polymer additive manufacturing objects.

    PubMed

    Perraud, J B; Obaton, A F; Bou-Sleiman, J; Recur, B; Balacey, H; Darracq, F; Guillet, J P; Mounaix, P

    2016-05-01

    Additive manufacturing (AM) technology is used not only to make 3D objects but also for rapid prototyping. In industry and laboratories, quality control for these objects is necessary though difficult to implement compared with classical fabrication methods, because layer-by-layer printing allows the manufacture of very complex objects that are unachievable with standard tools. Furthermore, AM can induce unknown or unexpected defects. Consequently, we demonstrate terahertz (THz) imaging as an innovative method for 2D inspection of polymer materials. Moreover, THz tomography may be considered an alternative to x-ray tomography and a cheaper 3D imaging option for routine control. This paper proposes an experimental study of 3D polymer objects obtained by additive manufacturing techniques. This approach allows us to characterize defects and to control dimensions by volumetric measurements on 3D data reconstructed by tomography.

  7. Thermal imaging for assessment of electron-beam freeform fabrication (EBF3) additive manufacturing deposits

    NASA Astrophysics Data System (ADS)

    Zalameda, Joseph N.; Burke, Eric R.; Hafley, Robert A.; Taminger, Karen M.; Domack, Christopher S.; Brewer, Amy; Martin, Richard E.

    2013-05-01

    Additive manufacturing is a rapidly growing field where 3-dimensional parts can be produced layer by layer. NASA's electron beam freeform fabrication (EBF3) technology is being evaluated to manufacture metallic parts in a space environment. The benefits of EBF3 technology are weight savings to support space missions, rapid prototyping in a zero gravity environment, and improved vehicle readiness. The EBF3 system is composed of 3 main components: an electron beam gun, a multi-axis positioning system, and a metallic wire feeder. The electron beam is used to melt the wire and the multi-axis positioning system is used to build the part layer by layer. To ensure a quality deposit, a near infrared (NIR) camera is used to image the melt pool and solidification areas. This paper describes the calibration and application of a NIR camera for temperature measurement. In addition, image processing techniques are presented for deposit assessment metrics.

  8. Color reproductivity improvement with additional virtual color filters for WRGB image sensor

    NASA Astrophysics Data System (ADS)

    Kawada, Shun; Kuroda, Rihito; Sugawa, Shigetoshi

    2013-02-01

    We have developed a high-accuracy color reproduction method based on an estimated spectral reflectance of objects using additional virtual color filters for a wide dynamic range WRGB color filter CMOS image sensor. The four virtual color filters are created by multiplying the spectral sensitivity of the White pixel by Gaussian functions with different central wavelengths and standard deviations, and the virtual sensor outputs of those filters are estimated from the four real output signals of the WRGB image sensor. The accuracy of color reproduction was evaluated with a Macbeth Color Checker (MCC), and the average color difference ΔEab over the 24 colors was 1.88 with our approach.
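
    The construction of the virtual filters lends itself to a short sketch: multiply a White-pixel spectral sensitivity curve by Gaussian windows with different centers and widths. The sensitivity curve and the Gaussian parameters below are invented for illustration and are not the values used in the paper.

      import numpy as np

      wavelengths = np.arange(400, 701, 5)                                   # nm
      white_sensitivity = np.exp(-0.5 * ((wavelengths - 550) / 120.0) ** 2)  # toy sensitivity curve

      def virtual_filter(center_nm, sigma_nm):
          """Virtual color filter: White-pixel sensitivity times a Gaussian window."""
          gauss = np.exp(-0.5 * ((wavelengths - center_nm) / sigma_nm) ** 2)
          return white_sensitivity * gauss

      params = [(450, 30), (510, 35), (570, 35), (630, 30)]   # hypothetical centers/widths
      virtual_filters = np.array([virtual_filter(c, s) for c, s in params])
      print(virtual_filters.shape)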

  9. Target detection in active polarization images perturbed with additive noise and illumination nonuniformity.

    PubMed

    Bénière, Arnaud; Goudail, François; Dolfi, Daniel; Alouini, Mehdi

    2009-07-01

    Active imaging systems that illuminate a scene with polarized light and acquire two images in two orthogonal polarizations yield information about the intensity contrast and the orthogonal state contrast (OSC) in the scene. Both contrasts are relevant for target detection. However, in real systems the illumination is often spatially or temporally nonuniform. This creates artificial intensity contrasts that can lead to false alarms. We derive generalized likelihood ratio test (GLRT) detectors that either do or do not take intensity information into account, and we determine the relevant expressions of the contrast in these two situations. These results are used to determine in which cases considering intensity information in addition to polarimetric information is relevant.
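
    For reference, the two quantities named above can be computed from the pair of orthogonally polarized images as in the sketch below; the OSC expression (I_par - I_perp)/(I_par + I_perp) is the standard definition, while the toy images are synthetic and the GLRT detectors themselves are not reproduced.

      import numpy as np

      def intensity_and_osc(i_par, i_perp, eps=1e-12):
          """Total intensity and orthogonal state contrast from two polarization images."""
          intensity = i_par + i_perp
          osc = (i_par - i_perp) / (intensity + eps)   # eps guards against division by zero
          return intensity, osc

      rng = np.random.default_rng(3)
      i_par = rng.gamma(shape=4.0, scale=10.0, size=(16, 16))   # toy polarized images
      i_perp = rng.gamma(shape=4.0, scale=6.0, size=(16, 16))
      intensity, osc = intensity_and_osc(i_par, i_perp)
      print(round(float(osc.mean()), 3))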

  10. Variation in Additional Breast Imaging Orders and Impact on Surgical Wait Times at a Comprehensive Cancer Center

    PubMed Central

    Golshan, Mehra; Losk, Katya; Mallory, Melissa A.; Camuso, Kristen; Troyan, Susan; Lin, Nancy U.; Kadish, Sarah; Bunnell, Craig A.

    2015-01-01

    Background In the multidisciplinary care model, breast imagers frequently provide second-opinion reviews of imaging studies performed at outside institutions. However, the need for additional imaging and the timeliness of obtaining these studies have yet to be established. We sought to evaluate the frequency of additional imaging orders by breast surgeons and to evaluate the impact of this supplementary imaging on the timeliness of surgery. Methods We identified 2,489 consecutive women with breast cancer who underwent first definitive surgery (FDS) at our comprehensive cancer center between 2011 and 2013. The number of breast-specific imaging studies performed for each patient between initial consultation and FDS was obtained. Chi-squared tests were used to compare the proportion of patients undergoing additional imaging across surgeons. The interval between initial consultation and additional imaging and/or biopsy was calculated. The effect of additional imaging on time to FDS was assessed by t-test. Results Of 2,489 patients, 615 (24.7%) had at least one additional breast-specific imaging study performed between initial consultation and FDS, with 222 patients undergoing additional biopsies (8.9%). The proportion of patients receiving imaging tests by breast surgeon ranged from 15% to 39% (p<0.0001). Patients receiving additional imaging had significantly longer wait times to FDS for BCT (21.4 vs. 28.5 days, p<0.0001). Conclusions Substantial variability exists in the utilization of additional breast-specific imaging and in the timeliness of obtaining these tests among breast surgeons. Further research is warranted to assess the sources and impact of this variation on patient care, cost and outcomes. PMID:26307233

  11. Big data bioinformatics.

    PubMed

    Greene, Casey S; Tan, Jie; Ung, Matthew; Moore, Jason H; Cheng, Chao

    2014-12-01

    Recent technological advances allow for high-throughput profiling of biological systems in a cost-efficient manner. The low cost of data generation is leading us to the "big data" era. The availability of big data provides unprecedented opportunities but also raises new challenges for data mining and analysis. In this review, we introduce key concepts in the analysis of big data, including "machine learning" algorithms with both "unsupervised" and "supervised" examples. We note packages for the R programming language that are available to perform machine learning analyses. In addition to programming-based solutions, we review webservers that allow users with limited or no programming background to perform these analyses on large data compendia.

  12. Big Opportunities and Big Concerns of Big Data in Education

    ERIC Educational Resources Information Center

    Wang, Yinying

    2016-01-01

    Against the backdrop of the ever-increasing influx of big data, this article examines the opportunities and concerns over big data in education. Specifically, this article first introduces big data, followed by delineating the potential opportunities of using big data in education in two areas: learning analytics and educational policy. Then, the…

  13. Big Dreams

    ERIC Educational Resources Information Center

    Benson, Michael T.

    2015-01-01

    The Keen Johnson Building is symbolic of Eastern Kentucky University's historic role as a School of Opportunity. It is a place that has inspired generations of students, many from disadvantaged backgrounds, to dream big dreams. The construction of the Keen Johnson Building was inspired by a desire to create a student union facility that would not…

  14. Big bluestem

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Big Bluestem (Andropogon gerardii) is a warm season grass native to North America, accounting for 40% of the herbaceous biomass of the tall grass prairie, and a candidate for bioenergy feedstock production. The goal of this study was to measure among and within population genetic variation of natura...

  15. Additive controlled synthesis of gold nanorods (GNRs) for two-photon luminescence imaging of cancer cells.

    PubMed

    Zhu, Jing; Yong, Ken-Tye; Roy, Indrajit; Hu, Rui; Ding, Hong; Zhao, Lingling; Swihart, Mark T; He, Guang S; Cui, Yiping; Prasad, Paras N

    2010-07-16

    Gold nanorods (GNRs) with a longitudinal surface plasmon resonance peak that is tunable from 600 to 1100 nm have been fabricated in a cetyltrimethylammonium bromide (CTAB) micellar medium using hydrochloric acid and silver nitrate as additives to control their shape and size. By manipulating the concentrations of silver nitrate and hydrochloric acid, the aspect ratio of the GNRs was reliably and reproducibly tuned from 2.5 to 8. The GNRs were first coated with polyelectrolyte multilayers and then bioconjugated to transferrin (Tf) to target pancreatic cancer cells. Two-photon luminescence imaging of the bioconjugated GNRs demonstrated receptor-mediated uptake of the bioconjugates into Panc-1 cells, which overexpress the transferrin receptor (TfR). The bioconjugated GNR formulation exhibited very low toxicity, suggesting that it is biocompatible and potentially suitable for targeted two-photon bioimaging.

  16. Additive white Gaussian noise level estimation in SVD domain for images.

    PubMed

    Liu, Wei; Lin, Weisi

    2013-03-01

    Accurate estimation of the Gaussian noise level is of fundamental interest in a wide variety of vision and image processing applications, as it is critical to the processing techniques that follow. In this paper, a new effective noise level estimation method is proposed on the basis of the study of singular values of noise-corrupted images. Two novel aspects of this paper address the major challenges in noise estimation: 1) the use of the tail of the singular values for noise estimation, to alleviate the influence of the signal on the data basis for the noise estimation process, and 2) the addition of known noise to estimate the content-dependent parameter, so that the proposed scheme is adaptive to visual signals, thereby enabling a wider application scope. The analysis and experiment results demonstrate that the proposed algorithm can reliably infer noise levels and shows robust behavior over a wide range of visual content and noise conditions, and that it outperforms relevant existing methods.
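
    A very rough sketch of the idea, under simplifying assumptions that go beyond the abstract: the average of the smallest singular values grows with the noise level, so with a slope pre-calibrated on pure noise and one measurement taken after adding noise of a known level, the unknown level can be solved for. The linear model, tail fraction, and toy image are all assumptions; this is not the published algorithm.

      import numpy as np

      def tail_average(img, keep_frac=0.75):
          """Average of the smallest 25% of singular values (signal-dominated ones discarded)."""
          s = np.linalg.svd(img, compute_uv=False)
          return s[int(len(s) * keep_frac):].mean()

      rng = np.random.default_rng(4)
      n = 256
      sig_ref = 10.0
      a = tail_average(rng.normal(0.0, sig_ref, (n, n))) / sig_ref   # slope calibrated on pure noise

      clean = np.outer(np.linspace(0, 255, n), np.ones(n))           # toy low-rank image content
      sigma_true = 8.0
      noisy = clean + rng.normal(0.0, sigma_true, (n, n))

      sigma_add = 20.0                                               # known added noise level
      p1 = tail_average(noisy)
      p2 = tail_average(noisy + rng.normal(0.0, sigma_add, (n, n)))
      # With tail_avg ~ a*sigma_total + b and the same offset b in both measurements:
      # p2 - p1 ~ a*(sqrt(sigma^2 + sigma_add^2) - sigma)  ->  solve for sigma.
      delta = (p2 - p1) / a
      sigma_est = max((sigma_add**2 - delta**2) / (2.0 * delta), 0.0)
      print(sigma_true, round(float(sigma_est), 2))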

  17. Big Bend sees big environmental push

    SciTech Connect

    Blankinship, S.

    2007-10-15

    The 1800 MW Big Bend Power Station is a coal-fired facility in Tampa Bay, Florida, USA owned by Tampa Electric. It has four pulverized coal- fired steam units equipped with FGD scrubbers and electrostatic precipitators. Currently the addition of selective catalytic reduction (SCR) systems is under consideration. The Unit 4 SCR retrofit was completed in June 2007; the remaining three systems are scheduled for completion by 2010. Boiler draft systems will be modified to a balance draft design to accommodate the increased pressure drop of the new systems. 3-D computer models were developed to determine constructability due to the tight clearance at the site. 1 photo.

  18. Big data in multiple sclerosis: development of a web-based longitudinal study viewer in an imaging informatics-based eFolder system for complex data analysis and management

    NASA Astrophysics Data System (ADS)

    Ma, Kevin; Wang, Ximing; Lerner, Alex; Shiroishi, Mark; Amezcua, Lilyana; Liu, Brent

    2015-03-01

    In the past, we have developed and displayed a multiple sclerosis eFolder system for patient data storage, image viewing, and automatic lesion quantification results stored in DICOM-SR format. The web-based system aims to be integrated in DICOM-compliant clinical and research environments to aid clinicians in patient treatment and disease tracking. This year, we have further developed the eFolder system to handle big data analysis and data mining in today's medical imaging field. The database has been updated to allow data mining and data look-up from DICOM-SR lesion analysis contents. Longitudinal studies are tracked, and any changes in lesion volumes and brain parenchyma volumes are calculated and shown on the web-based user interface as graphical representations. Longitudinal lesion characteristic changes are compared with patients' disease history, including treatments, symptom progression, and any other changes in the disease profile. The image viewer is updated such that imaging studies can be viewed side by side to allow visual comparisons. We aim to use the web-based medical imaging informatics eFolder system to demonstrate big data analysis in medical imaging, and to use the analysis results to predict MS disease trends and patterns in Hispanic and Caucasian populations in our pilot study. The discovery of disease patterns among the two ethnicities is a big data analysis result that will help lead to personalized patient care and treatment planning.

  19. Correction of non-additive errors in variational and ensemble data assimilation using image registration

    NASA Astrophysics Data System (ADS)

    Landelius, Tomas; Bojarova, Jelena; Gustafsson, Nils; Lindskog, Magnus

    2013-04-01

    It is hard to forecast the position of localized weather phenomena such as clouds, precipitation, and fronts. Moreover, cloudy areas are important since this is where most of the active weather occurs. Position errors, also known as phase, alignment, or displacement errors, can have several causes: timing errors, deficient model physics, inadequate model resolution, etc. Furthermore, position errors have been shown to be non-additive and non-Gaussian, which violates the error model most data assimilation methods rely on. Remote sensing data contain coherent information on the weather development in time and space. By comparing structures in radar or satellite images with the forecast model state, it is possible to get information about position errors. We use an image registration (optical flow) method to find a transformation, in terms of a displacement field, that aligns the model state with the corresponding remote sensing data. In particular, we surmise that assimilation of radiances in cloudy areas will benefit from a better aligned first guess. Analysis perturbations should become smaller and be easier to handle by the linearizations in the observation operator. In the variational setting, the displacement field is used as a mapping function to obtain a new, better aligned first guess from the old one by means of interpolation (warping). To reduce the effect of imbalances, the aligned first guess is not used as is. Instead it is used for the generation of pseudo observations that are assimilated in a first step to get an aligned and balanced first guess. This step reduces the non-additive errors due to mis-alignment and is followed by a second step with a standard variational assimilation to compensate for the remaining additive errors. In ensemble data assimilation, a displacement field is estimated for each ensemble member and is used as a distance measure. In areas where a member has a smaller displacement (smaller position error) than the control it is given
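
    The warping step in the variational setting can be sketched with ordinary image interpolation: given a displacement field from the registration, the first-guess field is resampled onto the corrected positions. The synthetic field, the constant displacement, and the use of scipy's map_coordinates are illustrative choices, not the operational assimilation code.

      import numpy as np
      from scipy.ndimage import map_coordinates

      def warp(field, dx, dy):
          """Warp a 2D field by per-pixel displacements (dx, dy) using linear interpolation."""
          ny, nx = field.shape
          yy, xx = np.meshgrid(np.arange(ny), np.arange(nx), indexing="ij")
          coords = np.stack([yy - dy, xx - dx])   # pull-back sampling coordinates
          return map_coordinates(field, coords, order=1, mode="nearest")

      first_guess = np.add.outer(np.linspace(0, 1, 64), np.linspace(0, 1, 64))  # toy model field
      dx = np.full((64, 64), 3.0)   # hypothetical eastward shift of 3 grid points
      dy = np.zeros((64, 64))
      aligned_guess = warp(first_guess, dx, dy)
      print(aligned_guess.shape)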

  20. Solving the Big Data (BD) Problem in Advanced Manufacturing (Subcategory for work done at Georgia Tech. Study Process and Design Factors for Additive Manufacturing Improvement)

    SciTech Connect

    Clark, Brett W.; Diaz, Kimberly A.; Ochiobi, Chinaza Darlene; Paynabar, Kamran

    2015-09-01

    3D printing, also known as additive manufacturing, is a process of making 3-dimensional solid objects from a CAD file. This groundbreaking technology is widely used for industrial and biomedical purposes, such as building objects, tools, body parts and cosmetics. An important benefit of 3D printing is the cost reduction and manufacturing flexibility; complex parts are built at a fraction of the price. However, layer-by-layer printing of complex shapes adds error due to surface roughness. Any such error results in poor-quality products with inaccurate dimensions. The main purpose of this research is to measure the amount of printing error for parts with different geometric shapes and to analyze them to find optimal printing settings that minimize the error. We use a Design of Experiments framework, and focus on studying parts with cone and ellipsoid shapes. We found that the orientation and the shape of the parts have a significant effect on the printing error. From our analysis, we also determined the optimal orientation that gives the least printing error.

  1. Additional value of biplane transoesophageal imaging in assessment of mitral valve prostheses.

    PubMed Central

    Groundstroem, K; Rittoo, D; Hoffman, P; Bloomfield, P; Sutherland, G R

    1993-01-01

    OBJECTIVES--To determine whether biplane transoesophageal imaging offers advantages in the evaluation of mitral prostheses when compared with standard single transverse plane imaging or the precordial approach in suspected prosthetic dysfunction. DESIGN--Prospective study of patients with a mitral valve prosthesis in situ using precordial and biplane transoesophageal ultrasonography. SETTING--Tertiary cardiac referral centre. SUBJECTS--67 consecutive patients with suspected dysfunction of a mitral valve prosthesis (16 had bioprostheses and 51 mechanical prostheses) who underwent precordial, transverse plane, and biplane transoesophageal echocardiography. Correlative invasive confirmation from surgery or angiography, or both, was available in 44 patients. MAIN OUTCOME MEASURES--Number, type, and site of leak according to the three means of scanning. RESULTS--Transverse plane transoesophageal imaging alone identified all 31 medial/lateral paravalvar leaks but only 24/30 of the anterior/posterior leaks. Combining the information from both imaging planes confirmed that biplane scanning identified all paravalvar leaks. Five of the six patients with prosthetic valve endocarditis, all three with valvar thrombus or obstruction, and all three with mitral annulus rupture were diagnosed from transverse plane imaging alone. Longitudinal plane imaging alone enabled diagnosis of the remaining case of prosthetic endocarditis and a further case of subvalvar pannus formation. CONCLUSIONS--Transverse plane transoesophageal imaging was superior to longitudinal imaging in identifying medial and lateral lesions around the sewing ring of a mitral valve prosthesis. Longitudinal plane imaging was superior in identifying anterior and posterior lesions. Biplane imaging is therefore an important development in the study of mitral prosthesis function. PMID:8398497

  2. Big Sky Carbon Atlas

    DOE Data Explorer

    The Big Sky Carbon Atlas is an online geoportal designed for you to discover, interpret, and access geospatial data and maps relevant to decision support and education on carbon sequestration in the Big Sky Region. In serving as the public face of the Partnership's spatial Data Libraries, the Atlas provides a gateway to geographic information characterizing CO2 sources, potential geologic sinks, terrestrial carbon fluxes, civil and energy infrastructure, energy use, and related themes. In addition to directly serving the BSCSP and its stakeholders, the Atlas feeds regional data to the NatCarb Portal, contributing to a national perspective on carbon sequestration. Established components of the Atlas include a gallery of thematic maps and an interactive map that allows you to:
    • Navigate and explore regional characterization data through a user-friendly interface
    • Print your map views or publish them as PDFs
    • Identify technical references relevant to specific areas of interest
    • Calculate straight-line or pipeline-constrained distances from point sources of CO2 to potential geologic sink features
    • Download regional data layers (feature under development)
    (Acknowledgment to the Big Sky Carbon Sequestration Partnership (BSCSP); see home page at http://www.bigskyco2.org/)

  3. Satellite-based land use mapping: comparative analysis of Landsat-8, Advanced Land Imager, and big data Hyperion imagery

    NASA Astrophysics Data System (ADS)

    Pervez, Wasim; Uddin, Vali; Khan, Shoab Ahmad; Khan, Junaid Aziz

    2016-04-01

    Until recently, Landsat technology has suffered from low signal-to-noise ratio (SNR) and comparatively poor radiometric resolution, which resulted in limited application for inland water and land use/cover mapping. The new generation of Landsat, the Landsat Data Continuity Mission carrying the Operational Land Imager (OLI), has improved SNR and high radiometric resolution. This study evaluated the utility of orthoimagery from OLI in comparison with the Advanced Land Imager (ALI) and hyperspectral Hyperion (after preprocessing) with respect to spectral profiling of classes, land use/cover classification, classification accuracy assessment, classifier selection, study area selection, and other applications. For each data source, the support vector machine (SVM) model outperformed the spectral angle mapper (SAM) classifier in terms of class discrimination accuracy (i.e., water, built-up area, mixed forest, shrub, and bare soil). Using the SVM classifier, Hyperion hyperspectral orthoimagery achieved higher overall accuracy than OLI and ALI. However, OLI outperformed both hyperspectral Hyperion and multispectral ALI using the SAM classifier, and with the SVM classifier outperformed ALI in terms of overall accuracy and individual classes. The results show that the new generation of Landsat achieved higher accuracies in mapping compared with the previous Landsat multispectral satellite series.
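
    The SAM rule referred to above assigns each pixel spectrum to the reference spectrum with the smallest spectral angle arccos(<x, r>/(|x| |r|)); the sketch below applies that standard definition to synthetic spectra (the SVM comparison in the study is not reproduced here).

      import numpy as np

      def spectral_angles(pixels, references):
          """Angles between pixel spectra (n_pixels, n_bands) and references (n_classes, n_bands)."""
          p = pixels / np.linalg.norm(pixels, axis=1, keepdims=True)
          r = references / np.linalg.norm(references, axis=1, keepdims=True)
          return np.arccos(np.clip(p @ r.T, -1.0, 1.0))

      def sam_classify(pixels, references):
          return np.argmin(spectral_angles(pixels, references), axis=1)

      rng = np.random.default_rng(5)
      refs = rng.uniform(0.1, 0.9, size=(5, 8))                     # 5 classes, 8 spectral bands
      pix = refs[rng.integers(0, 5, size=100)] + rng.normal(0, 0.02, size=(100, 8))
      print(sam_classify(pix, refs)[:10])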

  4. 75 FR 73090 - Medicare Program; Listening Session on Development of Additional Imaging Efficiency Measures for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-29

    ..., cell phones, and palm pilots, are subject to physical inspection. We cannot assume responsibility for... measures that CMS could consider. Measure developers, hospitals, medical specialty societies, medical... medical technology costs. The imaging efficiency measures fill a significant gap in the availability...

  5. Big Data Analytics in Healthcare.

    PubMed

    Belle, Ashwin; Thiagarajan, Raghuram; Soroushmehr, S M Reza; Navidi, Fatemeh; Beard, Daniel A; Najarian, Kayvan

    2015-01-01

    The rapidly expanding field of big data analytics has started to play a pivotal role in the evolution of healthcare practices and research. It has provided tools to accumulate, manage, analyze, and assimilate large volumes of disparate, structured, and unstructured data produced by current healthcare systems. Big data analytics has been recently applied towards aiding the process of care delivery and disease exploration. However, the adoption rate and research development in this space is still hindered by some fundamental problems inherent within the big data paradigm. In this paper, we discuss some of these major challenges with a focus on three upcoming and promising areas of medical research: image, signal, and genomics based analytics. Recent research which targets utilization of large volumes of medical data while combining multimodal data from disparate sources is discussed. Potential areas of research within this field which have the ability to provide meaningful impact on healthcare delivery are also examined.

  6. Big Data Analytics in Healthcare

    PubMed Central

    Belle, Ashwin; Thiagarajan, Raghuram; Soroushmehr, S. M. Reza; Navidi, Fatemeh; Beard, Daniel A.; Najarian, Kayvan

    2015-01-01

    The rapidly expanding field of big data analytics has started to play a pivotal role in the evolution of healthcare practices and research. It has provided tools to accumulate, manage, analyze, and assimilate large volumes of disparate, structured, and unstructured data produced by current healthcare systems. Big data analytics has been recently applied towards aiding the process of care delivery and disease exploration. However, the adoption rate and research development in this space is still hindered by some fundamental problems inherent within the big data paradigm. In this paper, we discuss some of these major challenges with a focus on three upcoming and promising areas of medical research: image, signal, and genomics based analytics. Recent research which targets utilization of large volumes of medical data while combining multimodal data from disparate sources is discussed. Potential areas of research within this field which have the ability to provide meaningful impact on healthcare delivery are also examined. PMID:26229957

  7. Big data for bipolar disorder.

    PubMed

    Monteith, Scott; Glenn, Tasha; Geddes, John; Whybrow, Peter C; Bauer, Michael

    2016-12-01

    The delivery of psychiatric care is changing with a new emphasis on integrated care, preventative measures, population health, and the biological basis of disease. Fundamental to this transformation are big data and advances in the ability to analyze these data. The impact of big data on the routine treatment of bipolar disorder today and in the near future is discussed, with examples that relate to health policy, the discovery of new associations, and the study of rare events. The primary sources of big data today are electronic medical records (EMR), claims, and registry data from providers and payers. In the near future, data created by patients from active monitoring, passive monitoring of Internet and smartphone activities, and from sensors may be integrated with the EMR. Diverse data sources from outside of medicine, such as government financial data, will be linked for research. Over the long term, genetic and imaging data will be integrated with the EMR, and there will be more emphasis on predictive models. Many technical challenges remain when analyzing big data that relates to size, heterogeneity, complexity, and unstructured text data in the EMR. Human judgement and subject matter expertise are critical parts of big data analysis, and the active participation of psychiatrists is needed throughout the analytical process.

  8. SU-E-J-06: Additional Imaging Guidance Dose to Patient Organs Resulting From X-Ray Tubes Used in CyberKnife Image Guidance System

    SciTech Connect

    Sullivan, A; Ding, G

    2015-06-15

    Purpose: The use of image-guided radiation therapy (IGRT) has become increasingly common, but the additional radiation exposure resulting from repeated image guidance procedures raises concerns. Although there are many studies reporting imaging dose from different image guidance devices, imaging dose for the CyberKnife Robotic Radiosurgery System is not available. This study provides estimated organ doses resulting from image guidance procedures on the CyberKnife system. Methods: Commercially available Monte Carlo software, PCXMC, was used to calculate average organ doses resulting from the x-ray tubes used in the CyberKnife system. There are seven imaging protocols with tube potentials ranging from 60 to 120 kVp and 15 mAs for treatment sites in the cranium, head and neck, thorax, and abdomen. The output of each imaging protocol was measured at treatment isocenter. For each site and protocol, adult body sizes ranging from anorexic to extremely obese were simulated, since organ dose depends on patient size. Doses for all organs within the imaging field-of-view of each site were calculated for a single image acquisition from both of the orthogonal x-ray tubes. Results: Average organ doses were <1.0 mGy for every treatment site and imaging protocol. For a given organ, dose increases as kV increases or body size decreases. Higher doses are typically reported for skeletal components, such as the skull, ribs, or clavicles, than for soft-tissue organs. Typical organ doses due to a single exposure are estimated as 0.23 mGy to the brain, 0.29 mGy to the heart, 0.08 mGy to the kidneys, etc., depending on the imaging protocol and site. Conclusion: The organ doses vary with treatment site, imaging protocol and patient size. Although the organ dose from a single image acquisition resulting from two orthogonal beams is generally insignificant, the sum of repeated image acquisitions (>100) could reach 10–20 cGy for a typical treatment fraction.

  9. Enhancement of Glossiness Perception by Retinal-Image Motion: Additional Effect of Head-Yoked Motion Parallax

    PubMed Central

    Tani, Yusuke; Araki, Keisuke; Nagai, Takehiro; Koida, Kowa; Nakauchi, Shigeki; Kitazaki, Michiteru

    2013-01-01

    It has been argued that when an observer moves, a contingent retinal-image motion of a stimulus would strengthen the perceived glossiness. This would be attributed to the veridical perception of three-dimensional structure by motion parallax. However, it has not been investigated whether the effect of motion parallax is more than that of retinal-image motion of the stimulus. Using a magnitude estimation method, we examine in this paper whether cross-modal coordination of the stimulus change and the observer's motion (i.e., motion parallax) is essential or the retinal-image motion alone is sufficient for enhancing the perceived glossiness. Our data show that a retinal-image motion simulating motion parallax without head motion strengthened the perceived glossiness but that its effect was weaker than that of motion parallax with head motion. These results suggest the existence of an additional effect of the cross-modal coordination between vision and proprioception on glossiness perception. That is, motion parallax enhances the perception of glossiness, in addition to retinal-image motions of specular surfaces. PMID:23336006

  10. Testing a Gender Additive Model: The Role of Body Image in Adolescent Depression

    ERIC Educational Resources Information Center

    Bearman, Sarah Kate; Stice, Eric

    2008-01-01

    Despite consistent evidence that adolescent girls are at greater risk of developing depression than adolescent boys, risk factor models that account for this difference have been elusive. The objective of this research was to examine risk factors proposed by the "gender additive" model of depression that attempts to partially explain the increased…

  11. Thermal Imaging for Assessment of Electron-Beam Free Form Fabrication (EBF3) Additive Manufacturing Welds

    NASA Technical Reports Server (NTRS)

    Zalameda, Joseph N.; Burke, Eric R.; Hafley, Robert A.; Taminger, Karen M.; Domack, Christopher S.; Brewer, Amy R.; Martin, Richard E.

    2013-01-01

    Additive manufacturing is a rapidly growing field where 3-dimensional parts can be produced layer by layer. NASA's electron beam free-form fabrication (EBF3) technology is being evaluated to manufacture metallic parts in a space environment. The benefits of EBF3 technology are weight savings to support space missions, rapid prototyping in a zero gravity environment, and improved vehicle readiness. The EBF3 system is composed of 3 main components: an electron beam gun, a multi-axis positioning system, and a metallic wire feeder. The electron beam is used to melt the wire and the multi-axis positioning system is used to build the part layer by layer. To ensure a quality weld, a near infrared (NIR) camera is used to image the melt pool and solidification areas. This paper describes the calibration and application of a NIR camera for temperature measurement. In addition, image processing techniques are presented for weld assessment metrics.

  12. Complementing and adding to SEM performance with the addition of XRF, Raman, CL and PL spectroscopy and imaging

    NASA Astrophysics Data System (ADS)

    Leroy, E.; Mamedov, S.; Teboul, E.; Whitley, A.; Meyer, D.; Casson, L.

    2010-06-01

    Electron microscopy, along with many other surface science and analytical techniques, offers an array of complementary sub-techniques that provide additional information to enhance the primary analysis or imaging mode. Most electron microscopes are built with several additional ports for the installation of complementary analysis modules. One type of analysis which is particularly useful in geology and semiconductor analysis is cathodoluminescence (CL). A new technique has been developed to allow complementary optical measurements using the electron beam from the SEM, compatible with most standard commercial SEM systems. Among the optical measurements accessible using the Cathodoluminescence Universal Extension (CLUE) module are CL, Raman, PL and EDX spectroscopy and imaging. This paper shows the advantages of using these complementary techniques, and how they can be applied to the analysis of geological and semiconductor materials.

  13. Real Time Optical Interferometric Image Addition and Subtraction by Wave Polarization.

    DTIC Science & Technology

    1981-10-15

    anal’ vs is of thei r experimental res u Its w-,i th reg: od to th)e var-ious opti cal components such as the lens and 1Wol lastor nri sm,. havo been...lens C. It was found that the ;.O iznrs F. ind P,, ...ore no! required. In that case, each input irane I t’V,( .) 1Vll v seurated images at the output...ti u sd n t - r a 7C I Ia t -!On , h2 6nIf 32 .3 nmii; t he i (-k tie s s o ft the i-e 1as t o 1 1;1 1110 ~ mjlfe cf the Wol laston nri -mi

  14. Assessing the use of an infrared spectrum hyperpixel array imager to measure temperature during additive and subtractive manufacturing

    NASA Astrophysics Data System (ADS)

    Whitenton, Eric; Heigel, Jarred; Lane, Brandon; Moylan, Shawn

    2016-05-01

    Accurate non-contact temperature measurement is important to optimize manufacturing processes. This applies to both additive (3D printing) and subtractive (material removal by machining) manufacturing. Performing accurate single wavelength thermography suffers numerous challenges. A potential alternative is hyperpixel array hyperspectral imaging. Focusing on metals, this paper discusses issues involved such as unknown or changing emissivity, inaccurate greybody assumptions, motion blur, and size of source effects. The algorithm which converts measured thermal spectra to emissivity and temperature uses a customized multistep non-linear equation solver to determine the best-fit emission curve. Emissivity dependence on wavelength may be assumed uniform or have a relationship typical for metals. The custom software displays residuals for intensity, temperature, and emissivity to gauge the correctness of the greybody assumption. Initial results are shown from a laser powder-bed fusion additive process, as well as a machining process. In addition, the effects of motion blur are analyzed, which occurs in both additive and subtractive manufacturing processes. In a laser powder-bed fusion additive process, the scanning laser causes the melt pool to move rapidly, causing a motion blur-like effect. In machining, measuring temperature of the rapidly moving chip is a desirable goal to develop and validate simulations of the cutting process. A moving slit target is imaged to characterize how the measured temperature values are affected by motion of a measured target.
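
    As an illustration of the kind of best-fit emission-curve solver described above, the sketch below fits a greybody Planck curve to a single hyperpixel spectrum, treating emissivity as uniform over wavelength (one of the two options the paper mentions). The wavelength band, starting values, and bounds are assumptions chosen for the example, not parameters from the paper.

      import numpy as np
      from scipy.optimize import curve_fit

      H = 6.626e-34    # Planck constant (J s)
      C = 2.998e8      # speed of light (m/s)
      KB = 1.381e-23   # Boltzmann constant (J/K)

      def greybody_radiance(wavelength_m, temperature_k, emissivity):
          """Spectral radiance of a greybody, W / (sr m^3)."""
          planck = (2.0 * H * C**2 / wavelength_m**5) / (
              np.exp(H * C / (wavelength_m * KB * temperature_k)) - 1.0)
          return emissivity * planck

      def fit_temperature(wavelengths_m, measured_radiance):
          """Best-fit (temperature in K, emissivity) for one hyperpixel spectrum."""
          popt, _ = curve_fit(greybody_radiance, wavelengths_m, measured_radiance,
                              p0=(1500.0, 0.5), bounds=([300.0, 0.0], [4000.0, 1.0]))
          return popt

      # Synthetic check: an 1800 K target with emissivity 0.35 over a 1.0-2.5 micron band.
      wl = np.linspace(1.0e-6, 2.5e-6, 64)
      spectrum = greybody_radiance(wl, 1800.0, 0.35)
      print(fit_temperature(wl, spectrum))   # approximately (1800.0, 0.35)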

  15. Determination of detergent and dispersant additives in gasoline by ring-oven and near infrared hyperspectral imaging.

    PubMed

    Rodrigues e Brito, Lívia; da Silva, Michelle P F; Rohwedder, Jarbas J R; Pasquini, Celio; Honorato, Fernanda A; Pimentel, Maria Fernanda

    2015-03-10

    A method using the ring-oven technique for pre-concentration in filter paper discs and near infrared hyperspectral imaging is proposed to identify four detergent and dispersant additives, and to determine their concentration in gasoline. Different approaches were used to select the best image data processing in order to gather the relevant spectral information. This was attained by selecting the pixels of the region of interest (ROI), using a pre-calculated threshold value of the PCA scores arranged as histograms, to select the spectra set; summing up the selected spectra to achieve representativeness; and compensating for the superimposed filter paper spectral information, also supported by scores histograms for each individual sample. The best classification model was achieved using linear discriminant analysis and genetic algorithm (LDA/GA), whose correct classification rate in the external validation set was 92%. Previous classification of the type of additive present in the gasoline is necessary to define the PLS model required for its quantitative determination. Considering that two of the additives studied present high spectral similarity, a PLS regression model was constructed to predict their content in gasoline, while two additional models were used for the remaining additives. The results for the external validation of these regression models showed a mean percentage error of prediction varying from 5 to 15%.
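
    A compact sketch of that workflow is given below: ROI pixels are selected by a threshold on the first PCA score (a histogram cut), their spectra are summed for representativeness, and a PLS model is calibrated against known additive concentrations. The array shapes, the 95th-percentile cut, and the random calibration data are placeholders assumed for illustration, not details from the paper.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.cross_decomposition import PLSRegression

      def roi_summed_spectrum(cube, score_quantile=0.95):
          """cube: (rows, cols, wavelengths) NIR image of one filter-paper disc.
          Keep pixels whose first PCA score exceeds a histogram threshold and
          sum their spectra."""
          pixels = cube.reshape(-1, cube.shape[-1])
          scores = PCA(n_components=1).fit_transform(pixels).ravel()
          roi = scores > np.quantile(scores, score_quantile)
          return pixels[roi].sum(axis=0)

      # Hypothetical calibration set: one summed spectrum per sample plus the
      # additive concentration known from a reference method.
      rng = np.random.default_rng(1)
      cubes = rng.random((20, 32, 32, 256))                 # placeholder sample images
      X_cal = np.array([roi_summed_spectrum(c) for c in cubes])
      y_cal = rng.random(20) * 100.0                        # placeholder concentrations
      pls = PLSRegression(n_components=5).fit(X_cal, y_cal)
      print(pls.predict(X_cal[:3]).ravel())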

  16. Five Big Ideas

    ERIC Educational Resources Information Center

    Morgan, Debbie

    2012-01-01

    Designing quality continuing professional development (CPD) for those teaching mathematics in primary schools is a challenge. If the CPD is to be built on the scaffold of five big ideas in mathematics, what might be these five big ideas? Might it just be a case of, if you tell me your five big ideas, then I'll tell you mine? Here, there is…

  17. Big data are coming to psychiatry: a general introduction.

    PubMed

    Monteith, Scott; Glenn, Tasha; Geddes, John; Bauer, Michael

    2015-12-01

    Big data are coming to the study of bipolar disorder and all of psychiatry. Data are coming from providers and payers (including EMR, imaging, insurance claims and pharmacy data), from omics (genomic, proteomic, and metabolomic data), and from patients and non-providers (data from smart phone and Internet activities, sensors and monitoring tools). Analysis of the big data will provide unprecedented opportunities for exploration, descriptive observation, hypothesis generation, and prediction, and the results of big data studies will be incorporated into clinical practice. Technical challenges remain in the quality, analysis and management of big data. This paper discusses some of the fundamental opportunities and challenges of big data for psychiatry.

  18. BigNeuron dataset V.0.0

    DOE Data Explorer

    Ramanathan, Arvind

    2016-01-01

    The cleaned bench-testing reconstructions for the gold166 datasets have been put online at GitHub: https://github.com/BigNeuron/Events-and-News/wiki/BigNeuron-Events-and-News and https://github.com/BigNeuron/Data/releases/tag/gold166_bt_v1.0. The respective image datasets were released earlier from other sites (the main pointer is also available at GitHub: https://github.com/BigNeuron/Data/releases/tag/Gold166_v1), but since the files were big, the actual downloading was distributed across three continents.

  19. Big Data

    PubMed Central

    SOBEK, MATTHEW; CLEVELAND, LARA; FLOOD, SARAH; HALL, PATRICIA KELLY; KING, MIRIAM L.; RUGGLES, STEVEN; SCHROEDER, MATTHEW

    2011-01-01

    The Minnesota Population Center (MPC) provides aggregate data and microdata that have been integrated and harmonized to maximize crosstemporal and cross-spatial comparability. All MPC data products are distributed free of charge through an interactive Web interface that enables users to limit the data and metadata being analyzed to samples and variables of interest to their research. In this article, the authors describe the integrated databases available from the MPC, report on recent additions and enhancements to these data sets, and summarize new online tools and resources that help users to analyze the data over time. They conclude with a description of the MPC’s newest and largest infrastructure project to date: a global population and environment data network. PMID:21949459

  20. Keeping up with Big Data--Designing an Introductory Data Analytics Class

    ERIC Educational Resources Information Center

    Hijazi, Sam

    2016-01-01

    Universities need to keep up with the demands of the business world when it comes to Big Data. The exponential increase in data has put additional pressure on academia to close the big gap in education. Business demand for Big Data professionals surpassed 1.9 million positions in 2015. Big Data, Business Intelligence, Data Analytics, and Data Mining are the…

  1. Big Spherules near 'Victoria'

    NASA Technical Reports Server (NTRS)

    2006-01-01

    This frame from the microscopic imager on NASA's Mars Exploration Rover Opportunity shows spherules up to about 5 millimeters (one-fifth of an inch) in diameter. The camera took this image during the 924th Martian day, or sol, of Opportunity's Mars-surface mission (Aug. 30, 2006), when the rover was about 200 meters (650 feet) north of 'Victoria Crater.'

    Opportunity discovered spherules like these, nicknamed 'blueberries,' at its landing site in 'Eagle Crater,' and investigations determined them to be iron-rich concretions that formed inside deposits soaked with groundwater. However, such concretions were much smaller or absent at the ground surface along much of the rover's trek of more than 5 kilometers (3 miles) southward to Victoria. The big ones showed up again when Opportunity got to the ring, or annulus, of material excavated and thrown outward by the impact that created Victoria Crater. Researchers hypothesize that some layer beneath the surface in Victoria's vicinity was once soaked with water long enough to form the concretions, that the crater-forming impact dispersed some material from that layer, and that Opportunity might encounter that layer in place if the rover drives down into the crater.

  2. The big deal about big data.

    PubMed

    Moore, Keith D; Eyestone, Katherine; Coddington, Dean C

    2013-08-01

    Big data is a concept that is being widely applied in the retail industries as a means to understand customers' purchasing habits and preferences for followup promotional activity. It is characterized by vast amounts of diverse and rapidly multiplying data that are available at or near real-time. Conversations with executives of leading healthcare organizations provide a barometer for understanding where the industry stands in its adoption of big data as a means to meet the critical information requirements of value-based health care.

  3. Dual of big bang and big crunch

    SciTech Connect

    Bak, Dongsu

    2007-01-15

    Starting from the Janus solution and its gauge theory dual, we obtain the dual gauge theory description of the cosmological solution by the procedure of double analytic continuation. The coupling is driven either to zero or to infinity at the big-bang and big-crunch singularities, which are shown to be related by the S-duality symmetry. In the dual Yang-Mills theory description, these are nonsingular as the coupling goes to zero in the N=4 super Yang-Mills theory. The cosmological singularities simply signal the failure of the supergravity description of the full type IIB superstring theory.

  4. Implementing Big History.

    ERIC Educational Resources Information Center

    Welter, Mark

    2000-01-01

    Contends that world history should be taught as "Big History," a view that includes all space and time beginning with the Big Bang. Discusses five "Cardinal Questions" that serve as a course structure and address the following concepts: perspectives, diversity, change and continuity, interdependence, and causes. (CMK)

  5. Big data: the management revolution.

    PubMed

    McAfee, Andrew; Brynjolfsson, Erik

    2012-10-01

    Big data, the authors write, is far more powerful than the analytics of the past. Executives can measure and therefore manage more precisely than ever before. They can make better predictions and smarter decisions. They can target more-effective interventions in areas that so far have been dominated by gut and intuition rather than by data and rigor. The differences between big data and analytics are a matter of volume, velocity, and variety: More data now cross the internet every second than were stored in the entire internet 20 years ago. Nearly real-time information makes it possible for a company to be much more agile than its competitors. And that information can come from social networks, images, sensors, the web, or other unstructured sources. The managerial challenges, however, are very real. Senior decision makers have to learn to ask the right questions and embrace evidence-based decision making. Organizations must hire scientists who can find patterns in very large data sets and translate them into useful business information. IT departments have to work hard to integrate all the relevant internal and external sources of data. The authors offer two success stories to illustrate how companies are using big data: PASSUR Aerospace enables airlines to match their actual and estimated arrival times. Sears Holdings directly analyzes its incoming store data to make promotions much more precise and faster.

  6. Big data, big knowledge: big data for personalized healthcare.

    PubMed

    Viceconti, Marco; Hunter, Peter; Hose, Rod

    2015-07-01

    The idea that the purely phenomenological knowledge that we can extract by analyzing large amounts of data can be useful in healthcare seems to contradict the desire of VPH researchers to build detailed mechanistic models for individual patients. But in practice no model is ever entirely phenomenological or entirely mechanistic. We propose in this position paper that big data analytics can be successfully combined with VPH technologies to produce robust and effective in silico medicine solutions. In order to do this, big data technologies must be further developed to cope with some specific requirements that emerge from this application. Such requirements are: working with sensitive data; analytics of complex and heterogeneous data spaces, including nontextual information; distributed data management under security and performance constraints; specialized analytics to integrate bioinformatics and systems biology information with clinical observations at tissue, organ and organisms scales; and specialized analytics to define the "physiological envelope" during the daily life of each patient. These domain-specific requirements suggest a need for targeted funding, in which big data technologies for in silico medicine becomes the research priority.

  7. Big Data in industry

    NASA Astrophysics Data System (ADS)

    Latinović, T. S.; Preradović, D. M.; Barz, C. R.; Latinović, M. T.; Petrica, P. P.; Pop-Vadean, A.

    2016-08-01

    The amount of data at the global level has grown exponentially. Along with this phenomenon comes the need for new units of measure, such as the exabyte, zettabyte, and yottabyte, to express the amount of data. The growth of data creates a situation where classic systems for the collection, storage, processing, and visualization of data are losing the battle against the volume, velocity, and variety of data that is generated continuously. Much of this data is created by the Internet of Things (IoT): cameras, satellites, cars, GPS navigation, etc. It is our challenge to come up with new technologies and tools for the management and exploitation of these large amounts of data. Big Data has been a hot topic in recent years in IT circles. However, Big Data is also recognized in the business world, and increasingly in public administration. This paper proposes an ontology of big data analytics and examines how to enhance business intelligence through big data analytics as a service by presenting a big data analytics services-oriented architecture. This paper also discusses the interrelationship between business intelligence and big data analytics. The proposed approach in this paper might facilitate the research and development of business analytics, big data analytics, and business intelligence as well as intelligent agents.

  8. The Big Bang Theory

    ScienceCinema

    Lincoln, Don

    2016-07-12

    The Big Bang is the name of the most respected theory of the creation of the universe. Basically, the theory says that the universe was once smaller and denser and has been expanding for eons. One common misconception is that the Big Bang theory says something about the instant that set the expansion into motion; however, this isn't true. In this video, Fermilab's Dr. Don Lincoln tells about the Big Bang theory and sketches some speculative ideas about what caused the universe to come into existence.

  9. The Big Bang Theory

    SciTech Connect

    Lincoln, Don

    2014-09-30

    The Big Bang is the name of the most respected theory of the creation of the universe. Basically, the theory says that the universe was once smaller and denser and has been expanding for eons. One common misconception is that the Big Bang theory says something about the instant that set the expansion into motion; however, this isn't true. In this video, Fermilab's Dr. Don Lincoln tells about the Big Bang theory and sketches some speculative ideas about what caused the universe to come into existence.

  10. Facile preparation and biological imaging of luminescent polymeric nanoprobes with aggregation-induced emission characteristics through Michael addition reaction.

    PubMed

    Lv, Qiulan; Wang, Ke; Xu, Dazhuang; Liu, Meiying; Wan, Qing; Huang, Hongye; Liang, Shangdong; Zhang, Xiaoyong; Wei, Yen

    2016-09-01

    Water-dispersible aggregation-induced emission (AIE) dye-based nanomaterials have recently attracted increasing attention in the biomedical fields because of their unique optical properties and outstanding performance as imaging and therapeutic agents. Methods to conjugate hydrophilic polymers with AIE dyes, which overcome the hydrophobic nature of AIE dyes and make them widely usable in biomedicine, have been extensively explored with great effort previously. Although great advances have been made in the fabrication and biomedical applications of AIE-active polymeric nanoprobes, facile and efficient strategies for the fabrication of biodegradable AIE-active nanoprobes are still highly desirable. In this work, amphiphilic biodegradable fluorescent organic nanoparticles (PLL-TPE-O-E FONs) have been fabricated for the first time by conjugation of the AIE dye tetraphenylethene acrylate (TPE-O-E) with poly-L-lysine (PLL) through a facile one-step Michael addition reaction, which was carried out under rather mild conditions, including an air atmosphere, near room temperature, and the absence of metal catalysts or hazardous reagents. Due to their unique AIE properties, these amphiphilic copolymers tend to self-assemble into highly luminescent, water-dispersible nanoparticles with sizes ranging from 400 to 600 nm. Laser scanning microscopy and cytotoxicity results revealed that PLL-TPE-O-E FONs can be internalized into the cytoplasm with negligible cytotoxicity, which implies that PLL-TPE-O-E FONs are promising for biological applications.

  11. Genesis of the big bang

    NASA Astrophysics Data System (ADS)

    Alpher, Ralph A.; Herman, Robert

    The authors of this volume have been intimately connected with the conception of the big bang model since 1947. Following the late George Gamow's ideas in 1942 and more particularly in 1946 that the early universe was an appropriate site for the synthesis of the elements, they became deeply involved in the question of cosmic nucleosynthesis and particularly the synthesis of the light elements. In the course of this work they developed a general relativistic model of the expanding universe with physics folded in, which led in a progressive, logical sequence to our prediction of the existence of a present cosmic background radiation some seventeen years before the observation of such radiation was reported by Penzias and Wilson. In addition, they carried out with James W. Follin, Jr., a detailed study of the physics of what was then considered to be the very early universe, starting a few seconds after the big bang, which still provides a methodology for studies of light element nucleosynthesis. Because of their involvement, they bring a personal perspective to the subject. They present a picture of what is now believed to be the state of knowledge about the evolution of the expanding universe and delineate the story of the development of the big bang model as they have seen and lived it from their own unique vantage point.

  12. Thinking big thoughts

    NASA Astrophysics Data System (ADS)

    Vedral, Vlatko

    2016-08-01

    The short synopsis of The Big Picture by Sean Carroll is that it explores the question of whether science can explain everything in the world, and analyses the emerging reality that such an explanation entails.

  13. The Big Bang Singularity

    NASA Astrophysics Data System (ADS)

    Ling, Eric

    The big bang theory is a model of the universe which makes the striking prediction that the universe began a finite amount of time in the past at the so-called "Big Bang singularity." We explore the physical and mathematical justification of this surprising result. After laying down the framework of the universe as a spacetime manifold, we combine physical observations with global symmetry assumptions to deduce the FRW cosmological models which predict a big bang singularity. Next we prove a couple of theorems due to Stephen Hawking which show that the big bang singularity exists even if one removes the global symmetry assumptions. Lastly, we investigate the conditions one needs to impose on a spacetime if one wishes to avoid a singularity. The ideas and concepts used here to study spacetimes are similar to those used to study Riemannian manifolds; therefore we compare and contrast the two geometries throughout.

  14. Big data need big theory too.

    PubMed

    Coveney, Peter V; Dougherty, Edward R; Highfield, Roger R

    2016-11-13

    The current interest in big data, machine learning and data analytics has generated the widespread impression that such methods are capable of solving most problems without the need for conventional scientific methods of inquiry. Interest in these methods is intensifying, accelerated by the ease with which digitized data can be acquired in virtually all fields of endeavour, from science, healthcare and cybersecurity to economics, social sciences and the humanities. In multiscale modelling, machine learning appears to provide a shortcut to reveal correlations of arbitrary complexity between processes at the atomic, molecular, meso- and macroscales. Here, we point out the weaknesses of pure big data approaches with particular focus on biology and medicine, which fail to provide conceptual accounts for the processes to which they are applied. No matter their 'depth' and the sophistication of data-driven methods, such as artificial neural nets, in the end they merely fit curves to existing data. Not only do these methods invariably require far larger quantities of data than anticipated by big data aficionados in order to produce statistically reliable results, but they can also fail in circumstances beyond the range of the data used to train them because they are not designed to model the structural characteristics of the underlying system. We argue that it is vital to use theory as a guide to experimental design for maximal efficiency of data collection and to produce reliable predictive models and conceptual knowledge. Rather than continuing to fund, pursue and promote 'blind' big data projects with massive budgets, we call for more funding to be allocated to the elucidation of the multiscale and stochastic processes controlling the behaviour of complex systems, including those of life, medicine and healthcare. This article is part of the themed issue 'Multiscale modelling at the physics-chemistry-biology interface'.

  15. Big data need big theory too

    PubMed Central

    Dougherty, Edward R.; Highfield, Roger R.

    2016-01-01

    The current interest in big data, machine learning and data analytics has generated the widespread impression that such methods are capable of solving most problems without the need for conventional scientific methods of inquiry. Interest in these methods is intensifying, accelerated by the ease with which digitized data can be acquired in virtually all fields of endeavour, from science, healthcare and cybersecurity to economics, social sciences and the humanities. In multiscale modelling, machine learning appears to provide a shortcut to reveal correlations of arbitrary complexity between processes at the atomic, molecular, meso- and macroscales. Here, we point out the weaknesses of pure big data approaches with particular focus on biology and medicine, which fail to provide conceptual accounts for the processes to which they are applied. No matter their ‘depth’ and the sophistication of data-driven methods, such as artificial neural nets, in the end they merely fit curves to existing data. Not only do these methods invariably require far larger quantities of data than anticipated by big data aficionados in order to produce statistically reliable results, but they can also fail in circumstances beyond the range of the data used to train them because they are not designed to model the structural characteristics of the underlying system. We argue that it is vital to use theory as a guide to experimental design for maximal efficiency of data collection and to produce reliable predictive models and conceptual knowledge. Rather than continuing to fund, pursue and promote ‘blind’ big data projects with massive budgets, we call for more funding to be allocated to the elucidation of the multiscale and stochastic processes controlling the behaviour of complex systems, including those of life, medicine and healthcare. This article is part of the themed issue ‘Multiscale modelling at the physics–chemistry–biology interface’. PMID:27698035

  16. Functional connectomics from a "big data" perspective.

    PubMed

    Xia, Mingrui; He, Yong

    2017-02-14

    In the last decade, explosive growth regarding functional connectome studies has been observed. Accumulating knowledge has significantly contributed to our understanding of the brain's functional network architectures in health and disease. With the development of innovative neuroimaging techniques, the establishment of large brain datasets and the increasing accumulation of published findings, functional connectomic research has begun to move into the era of "big data", which generates unprecedented opportunities for discovery in brain science and simultaneously encounters various challenging issues, such as data acquisition, management and analyses. Big data on the functional connectome exhibits several critical features: high spatial and/or temporal precision, large sample sizes, long-term recording of brain activity, multidimensional biological variables (e.g., imaging, genetic, demographic, cognitive and clinic) and/or vast quantities of existing findings. We review studies regarding functional connectomics from a big data perspective, with a focus on recent methodological advances in state-of-the-art image acquisition (e.g., multiband imaging), analysis approaches and statistical strategies (e.g., graph theoretical analysis, dynamic network analysis, independent component analysis, multivariate pattern analysis and machine learning), as well as reliability and reproducibility validations. We highlight the novel findings in the application of functional connectomic big data to the exploration of the biological mechanisms of cognitive functions, normal development and aging and of neurological and psychiatric disorders. We advocate the urgent need to expand efforts directed at the methodological challenges and discuss the direction of applications in this field.

  17. Will Big Data Mean the End of Privacy?

    ERIC Educational Resources Information Center

    Pence, Harry E.

    2015-01-01

    Big Data is currently a hot topic in the field of technology, and many campuses are considering the addition of this topic into their undergraduate courses. Big Data tools are not just playing an increasingly important role in many commercial enterprises; they are also combining with new digital devices to dramatically change privacy. This article…

  18. Dual-source dual-energy CT with additional tin filtration: Dose and image quality evaluation in phantoms and in-vivo

    PubMed Central

    Primak, Andrew N.; Giraldo, Juan Carlos Ramirez; Eusemann, Christian D.; Schmidt, Bernhard; Kantor, B.; Fletcher, Joel G.; McCollough, Cynthia H.

    2010-01-01

    Purpose: To investigate the effect on radiation dose and image quality of the use of additional spectral filtration for dual-energy CT (DECT) imaging using dual-source CT (DSCT). Materials and Methods: A commercial DSCT scanner was modified by adding tin filtration to the high-kV tube, and radiation output and noise were measured in water phantoms. Dose values for equivalent image noise were compared among DE modes with and without tin filtration and single-energy (SE) mode. To evaluate DECT material discrimination, the material-specific DE ratio for calcium and iodine was determined using images of anthropomorphic phantoms. Data were additionally acquired in 38 and 87 kg pigs, and noise for the linearly mixed and virtual non-contrast (VNC) images was compared between DE modes. Finally, abdominal DECT images from two patients of similar sizes undergoing clinically indicated CT were compared. Results: Adding tin filtration to the high-kV tube improved the DE contrast between iodine and calcium by as much as 290%. Pig data showed that the tin filtration had no effect on noise in the DECT mixed images, but decreased noise by as much as 30% in the VNC images. Patient VNC images acquired using 100/140 kV with added tin filtration had improved image quality compared to those generated with 80/140 kV without tin filtration. Conclusion: Tin filtration of the high-kV tube of a DSCT scanner increases the ability of DECT to discriminate between calcium and iodine, without increasing dose relative to SECT. Furthermore, use of 100/140 kV tube potentials allows improved DECT imaging of large patients. PMID:20966323
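
    To make two of the image-domain quantities used above concrete, the sketch below computes a material-specific DE ratio (mean ROI attenuation at the low-kV setting divided by the mean at the high-kV setting) and a linearly mixed image that blends the two acquisitions. This is my own construction with placeholder arrays, not the authors' processing chain.

      import numpy as np

      def dual_energy_ratio(low_kv_img, high_kv_img, roi_mask):
          """DE ratio of a material from an ROI drawn over it (e.g. iodine or calcium)."""
          return low_kv_img[roi_mask].mean() / high_kv_img[roi_mask].mean()

      def mixed_image(low_kv_img, high_kv_img, weight=0.5):
          """Linearly mixed image; weight is the contribution of the low-kV data."""
          return weight * low_kv_img + (1.0 - weight) * high_kv_img

      # Toy example with hypothetical HU arrays and a circular ROI.
      rng = np.random.default_rng(0)
      low = rng.normal(300.0, 20.0, (128, 128))    # iodine-enhanced region, low kV
      high = rng.normal(150.0, 15.0, (128, 128))   # same region, high kV (Sn-filtered)
      yy, xx = np.mgrid[:128, :128]
      roi = (yy - 64) ** 2 + (xx - 64) ** 2 < 20 ** 2
      print(dual_energy_ratio(low, high, roi), mixed_image(low, high).shape)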

  19. 'Big Crater' in 360-degree panorama

    NASA Technical Reports Server (NTRS)

    1997-01-01

    The crater dubbed 'Big Crater', approximately 2200 meters (7200 feet) away, was imaged by the Imager for Mars Pathfinder (IMP) as part of a 360-degree color panorama, taken over sols 8, 9 and 10. 'Big Crater' is actually a relatively small Martian crater to the southeast of the Mars Pathfinder landing site. It is 1500 meters (4900 feet) in diameter, or about the same size as Meteor Crater in Arizona.

    Mars Pathfinder is the second in NASA's Discovery program of low-cost spacecraft with highly focused science goals. The Jet Propulsion Laboratory, Pasadena, CA, developed and manages the Mars Pathfinder mission for NASA's Office of Space Science, Washington, D.C. JPL is an operating division of the California Institute of Technology (Caltech). The Imager for Mars Pathfinder (IMP) was developed by the University of Arizona Lunar and Planetary Laboratory under contract to JPL. Peter Smith is the Principal Investigator.

  20. Comparative validity of brief to medium-length Big Five and Big Six Personality Questionnaires.

    PubMed

    Thalmayer, Amber Gayle; Saucier, Gerard; Eigenhuis, Annemarie

    2011-12-01

    A general consensus on the Big Five model of personality attributes has been highly generative for the field of personality psychology. Many important psychological and life outcome correlates with Big Five trait dimensions have been established. But researchers must choose between multiple Big Five inventories when conducting a study and are faced with a variety of options as to inventory length. Furthermore, a 6-factor model has been proposed to extend and update the Big Five model, in part by adding a dimension of Honesty/Humility or Honesty/Propriety. In this study, 3 popular brief to medium-length Big Five measures (NEO Five Factor Inventory, Big Five Inventory [BFI], and International Personality Item Pool), and 3 six-factor measures (HEXACO Personality Inventory, Questionnaire Big Six Scales, and a 6-factor version of the BFI) were placed in competition to best predict important student life outcomes. The effect of test length was investigated by comparing brief versions of most measures (subsets of items) with original versions. Personality questionnaires were administered to undergraduate students (N = 227). Participants' college transcripts and student conduct records were obtained 6-9 months after data was collected. Six-factor inventories demonstrated better predictive ability for life outcomes than did some Big Five inventories. Additional behavioral observations made on participants, including their Facebook profiles and cell-phone text usage, were predicted similarly by Big Five and 6-factor measures. A brief version of the BFI performed surprisingly well; across inventory platforms, increasing test length had little effect on predictive validity. Comparative validity of the models and measures in terms of outcome prediction and parsimony is discussed.

  1. Big data in biomedicine.

    PubMed

    Costa, Fabricio F

    2014-04-01

    The increasing availability and growth rate of biomedical information, also known as 'big data', provides an opportunity for future personalized medicine programs that will significantly improve patient care. Recent advances in information technology (IT) applied to biomedicine are changing the landscape of privacy and personal information, with patients getting more control of their health information. Conceivably, big data analytics is already impacting health decisions and patient care; however, specific challenges need to be addressed to integrate current discoveries into medical practice. In this article, I will discuss the major breakthroughs achieved in combining omics and clinical health data in terms of their application to personalized medicine. I will also review the challenges associated with using big data in biomedicine and translational science.

  2. Big Questions: Missing Antimatter

    ScienceCinema

    Lincoln, Don

    2016-07-12

    Einstein's equation E = mc^2 is often said to mean that energy can be converted into matter. More accurately, energy can be converted to matter and antimatter. During the first moments of the Big Bang, the universe was smaller, hotter and energy was everywhere. As the universe expanded and cooled, the energy converted into matter and antimatter. According to our best understanding, these two substances should have been created in equal quantities. However, when we look out into the cosmos we see only matter and no antimatter. The absence of antimatter is one of the Big Mysteries of modern physics. In this video, Fermilab's Dr. Don Lincoln explains the problem, although he doesn't answer it. The answer, as in all Big Mysteries, is still unknown and one of the leading research topics of contemporary science.

  3. Big Questions: Missing Antimatter

    SciTech Connect

    Lincoln, Don

    2013-08-27

    Einstein's equation E = mc^2 is often said to mean that energy can be converted into matter. More accurately, energy can be converted to matter and antimatter. During the first moments of the Big Bang, the universe was smaller, hotter and energy was everywhere. As the universe expanded and cooled, the energy converted into matter and antimatter. According to our best understanding, these two substances should have been created in equal quantities. However, when we look out into the cosmos we see only matter and no antimatter. The absence of antimatter is one of the Big Mysteries of modern physics. In this video, Fermilab's Dr. Don Lincoln explains the problem, although he doesn't answer it. The answer, as in all Big Mysteries, is still unknown and one of the leading research topics of contemporary science.

  4. Astronomical surveys and big data

    NASA Astrophysics Data System (ADS)

    Mickaelian, Areg M.

    Recent all-sky and large-area astronomical surveys and their catalogued data over the whole range of the electromagnetic spectrum, from γ-rays to radio waves, are reviewed, including Fermi-GLAST and INTEGRAL in the γ-ray band, ROSAT, XMM and Chandra in X-ray, GALEX in UV, SDSS and several POSS I and POSS II-based catalogues (APM, MAPS, USNO, GSC) in the optical range, 2MASS in NIR, WISE and AKARI IRC in MIR, IRAS and AKARI FIS in FIR, NVSS and FIRST in the radio range, and many others, as well as the most important surveys giving optical images (DSS I and II, SDSS, etc.), proper motions (Tycho, USNO, Gaia), variability (GCVS, NSVS, ASAS, Catalina, Pan-STARRS), and spectroscopic data (FBS, SBS, Case, HQS, HES, SDSS, CALIFA, GAMA). An overall understanding of the coverage along the whole wavelength range and comparisons between various surveys are given: galaxy redshift surveys, QSO/AGN, radio, Galactic structure, and Dark Energy surveys. Astronomy has entered the Big Data era, with Astrophysical Virtual Observatories and Computational Astrophysics playing an important role in using and analyzing big data for new discoveries.

  5. Big Data Analytics for Genomic Medicine.

    PubMed

    He, Karen Y; Ge, Dongliang; He, Max M

    2017-02-15

    Genomic medicine attempts to build individualized strategies for diagnostic or therapeutic decision-making by utilizing patients' genomic information. Big Data analytics uncovers hidden patterns, unknown correlations, and other insights through examining various large-scale data sets. While integration and manipulation of diverse genomic data and comprehensive electronic health records (EHRs) on a Big Data infrastructure present challenges, they also provide a feasible opportunity to develop an efficient and effective approach to identify clinically actionable genetic variants for individualized diagnosis and therapy. In this paper, we review the challenges of manipulating large-scale next-generation sequencing (NGS) data and diverse clinical data derived from the EHRs for genomic medicine. We introduce possible solutions for different challenges in manipulating, managing, and analyzing genomic and clinical data to implement genomic medicine. Additionally, we present a practical Big Data toolset for identifying clinically actionable genetic variants using high-throughput NGS data and EHRs.

  6. Unsupervised Tensor Mining for Big Data Practitioners.

    PubMed

    Papalexakis, Evangelos E; Faloutsos, Christos

    2016-09-01

    Multiaspect data are ubiquitous in modern Big Data applications. For instance, different aspects of a social network are the different types of communication between people, the time stamp of each interaction, and the location associated with each individual. How can we jointly model all those aspects and leverage the additional information that they introduce to our analysis? Tensors, which are multidimensional extensions of matrices, are a principled and mathematically sound way of modeling such multiaspect data. In this article, our goal is to popularize tensors and tensor decompositions to Big Data practitioners by demonstrating their effectiveness, outlining challenges that pertain to their application in Big Data scenarios, and presenting our recent work that tackles those challenges. We view this work as a step toward a fully automated, unsupervised tensor mining tool that can be easily and broadly adopted by practitioners in academia and industry.

  7. Big Data Analytics for Genomic Medicine

    PubMed Central

    He, Karen Y.; Ge, Dongliang; He, Max M.

    2017-01-01

    Genomic medicine attempts to build individualized strategies for diagnostic or therapeutic decision-making by utilizing patients’ genomic information. Big Data analytics uncovers hidden patterns, unknown correlations, and other insights through examining various large-scale data sets. While integration and manipulation of diverse genomic data and comprehensive electronic health records (EHRs) on a Big Data infrastructure present challenges, they also provide a feasible opportunity to develop an efficient and effective approach to identify clinically actionable genetic variants for individualized diagnosis and therapy. In this paper, we review the challenges of manipulating large-scale next-generation sequencing (NGS) data and diverse clinical data derived from the EHRs for genomic medicine. We introduce possible solutions for different challenges in manipulating, managing, and analyzing genomic and clinical data to implement genomic medicine. Additionally, we present a practical Big Data toolset for identifying clinically actionable genetic variants using high-throughput NGS data and EHRs. PMID:28212287

  8. Changing the personality of a face: Perceived Big Two and Big Five personality factors modeled in real photographs.

    PubMed

    Walker, Mirella; Vetter, Thomas

    2016-04-01

    General, spontaneous evaluations of strangers based on their faces have been shown to reflect judgments of these persons' intention and ability to harm. These evaluations can be mapped onto a 2D space defined by the dimensions trustworthiness (intention) and dominance (ability). Here we go beyond general evaluations and focus on more specific personality judgments derived from the Big Two and Big Five personality concepts. In particular, we investigate whether Big Two/Big Five personality judgments can be mapped onto the 2D space defined by the dimensions trustworthiness and dominance. Results indicate that judgments of the Big Two personality dimensions almost perfectly map onto the 2D space. In contrast, at least 3 of the Big Five dimensions (i.e., neuroticism, extraversion, and conscientiousness) go beyond the 2D space, indicating that additional dimensions are necessary to describe more specific face-based personality judgments accurately. Building on this evidence, we model the Big Two/Big Five personality dimensions in real facial photographs. Results from 2 validation studies show that the Big Two/Big Five are perceived reliably across different samples of faces and participants. Moreover, results reveal that participants differentiate reliably between the different Big Two/Big Five dimensions. Importantly, this high level of agreement and differentiation in personality judgments from faces likely creates a subjective reality which may have serious consequences for those being perceived; notably, these consequences ensue because the subjective reality is socially shared, irrespective of the judgments' validity. The methodological approach introduced here might prove useful in various psychological disciplines.

  9. A Big Bang Lab

    ERIC Educational Resources Information Center

    Scheider, Walter

    2005-01-01

    The February 2005 issue of The Science Teacher (TST) reminded everyone that by learning how scientists study stars, students gain an understanding of how science measures things that cannot be set up in a lab, either because they are too big, too far away, or happened in the very distant past. The authors of "How Far are the Stars?" show how the…

  10. The Big Sky inside

    ERIC Educational Resources Information Center

    Adams, Earle; Ward, Tony J.; Vanek, Diana; Marra, Nancy; Hester, Carolyn; Knuth, Randy; Spangler, Todd; Jones, David; Henthorn, Melissa; Hammill, Brock; Smith, Paul; Salisbury, Rob; Reckin, Gene; Boulafentis, Johna

    2009-01-01

    The University of Montana (UM)-Missoula has implemented a problem-based program in which students perform scientific research focused on indoor air pollution. The Air Toxics Under the Big Sky program (Jones et al. 2007; Adams et al. 2008; Ward et al. 2008) provides a community-based framework for understanding the complex relationship between poor…

  11. Big Enough for Everyone?

    ERIC Educational Resources Information Center

    Coote, Anna

    2010-01-01

    The UK's coalition government wants to build a "Big Society." The Prime Minister says "we are all in this together" and building it is the responsibility of every citizen as well as every government department. The broad vision is welcome, but everything depends on how the vision is translated into policy and practice. The…

  12. The big bang

    NASA Astrophysics Data System (ADS)

    Silk, Joseph

    Our universe was born billions of years ago in a hot, violent explosion of elementary particles and radiation - the big bang. What do we know about this ultimate moment of creation, and how do we know it? Drawing upon the latest theories and technology, this new edition of The big bang, is a sweeping, lucid account of the event that set the universe in motion. Joseph Silk begins his story with the first microseconds of the big bang, on through the evolution of stars, galaxies, clusters of galaxies, quasars, and into the distant future of our universe. He also explores the fascinating evidence for the big bang model and recounts the history of cosmological speculation. Revised and updated, this new edition features all the most recent astronomical advances, including: Photos and measurements from the Hubble Space Telescope, Cosmic Background Explorer Satellite (COBE), and Infrared Space Observatory; the latest estimates of the age of the universe; new ideas in string and superstring theory; recent experiments on neutrino detection; new theories about the presence of dark matter in galaxies; new developments in the theory of the formation and evolution of galaxies; the latest ideas about black holes, worm holes, quantum foam, and multiple universes.

  13. A Sobering Big Idea

    ERIC Educational Resources Information Center

    Wineburg, Sam

    2006-01-01

    Since Susan Adler, Alberta Dougan, and Jesus Garcia like "big ideas," the author offers one to ponder: young people in this country cannot read with comprehension. The saddest thing about this crisis is that it is no secret. The 2001 results of the National Assessment of Educational Progress (NAEP) for reading, published in every major…

  14. The Big Fish

    ERIC Educational Resources Information Center

    DeLisle, Rebecca; Hargis, Jace

    2005-01-01

    The Killer Whale, Shamu jumps through hoops and splashes tourists in hopes for the big fish, not because of passion, desire or simply the enjoyment of doing so. What would happen if those fish were obsolete? Would this killer whale be able to find the passion to continue to entertain people? Or would Shamu find other exciting activities to do…

  15. Big Data and Chemical Education

    ERIC Educational Resources Information Center

    Pence, Harry E.; Williams, Antony J.

    2016-01-01

    The amount of computerized information that organizations collect and process is growing so large that the term Big Data is commonly being used to describe the situation. Accordingly, Big Data is defined by a combination of the Volume, Variety, Velocity, and Veracity of the data being processed. Big Data tools are already having an impact in…

  16. Business and Science - Big Data, Big Picture

    NASA Astrophysics Data System (ADS)

    Rosati, A.

    2013-12-01

    Data Science is more than the creation, manipulation, and transformation of data. It is more than Big Data. The business world seems to have a hold on the term 'data science' and, for now, they define what it means. But business is very different than science. In this talk, I address how large datasets, Big Data, and data science are conceptually different in business and science worlds. I focus on the types of questions each realm asks, the data needed, and the consequences of findings. Gone are the days of datasets being created or collected to serve only one purpose or project. The trick with data reuse is to become familiar enough with a dataset to be able to combine it with other data and extract accurate results. As a Data Curator for the Advanced Cooperative Arctic Data and Information Service (ACADIS), my specialty is communication. Our team enables Arctic sciences by ensuring datasets are well documented and can be understood by reusers. Previously, I served as a data community liaison for the North American Regional Climate Change Assessment Program (NARCCAP). Again, my specialty was communicating complex instructions and ideas to a broad audience of data users. Before entering the science world, I was an entrepreneur. I have a bachelor's degree in economics and a master's degree in environmental social science. I am currently pursuing a Ph.D. in Geography. Because my background has embraced both the business and science worlds, I would like to share my perspectives on data, data reuse, data documentation, and the presentation or communication of findings. My experiences show that each can inform and support the other.

  17. Differentiation between Glioblastoma Multiforme and Primary Cerebral Lymphoma: Additional Benefits of Quantitative Diffusion-Weighted MR Imaging

    PubMed Central

    Li, Chien Feng; Chen, Tai Yuan; Shu, Ginger; Kuo, Yu Ting; Lee, Yu Chang

    2016-01-01

    The differentiation between glioblastoma multiforme (GBM) and primary cerebral lymphoma (PCL) is important because the treatments are substantially different. The purpose of this article is to describe the MR imaging characteristics of GBM and PCL with emphasis on quantitative ADC analysis in the tumor necrosis, the most strongly enhanced tumor area, and the peritumoral edema. This retrospective cohort study included 104 GBM (WHO grade IV) patients and 22 immune-competent PCL (diffuse large B cell lymphoma) patients. All these patients had pretreatment brain MR DWI and ADC imaging. Analysis of conventional MR imaging and quantitative ADC measurements, including the tumor necrosis (ADCn), the most strongly enhanced tumor area (ADCt), and the peritumoral edema (ADCe), was performed. ROC analysis with optimal cut-off values and area under the ROC curve (AUC) was performed. For conventional MR imaging, there are statistical differences in tumor size, tumor location, tumor margin, and the presence of tumor necrosis between GBM and PCL. Quantitative ADC analysis shows that GBM tended to have significantly (P<0.05) higher ADC in the most strongly enhanced area (ADCt) and lower ADC in the peritumoral edema (ADCe) as compared with PCL. An excellent AUC (0.94), with an optimal sensitivity of 90% and specificity of 86% for differentiating between GBM and PCL, was obtained by combining the ADC values in the tumor necrosis (ADCn), the most strongly enhanced tumor area (ADCt), and the peritumoral edema (ADCe). Besides, there are positive ADC gradients in the peritumoral edema in a subset of GBMs but not in the PCLs. Quantitative ADC analysis in these three areas can thus be implemented to improve diagnostic accuracy for these two brain tumor types. The histological correlation of the ADC difference deserves further investigation. PMID:27631626
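
    The abstract does not spell out how the three ADC measures were combined; the sketch below assumes a logistic-regression combination purely for illustration, then derives the ROC curve, AUC, and a Youden-index optimal cut-off. All data here are random placeholders, not patient values.

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import roc_curve, auc

      rng = np.random.default_rng(0)

      # Hypothetical per-patient features: columns are ADCn, ADCt, ADCe
      # (in units of 1e-6 mm^2/s); labels: 1 = GBM, 0 = PCL.
      X = np.column_stack([
          rng.normal(1800.0, 300.0, 120),   # ADCn (placeholder distribution)
          rng.normal(1000.0, 150.0, 120),   # ADCt
          rng.normal(1400.0, 200.0, 120),   # ADCe
      ])
      y = rng.integers(0, 2, 120)

      model = LogisticRegression(max_iter=1000).fit(X, y)
      scores = model.predict_proba(X)[:, 1]
      fpr, tpr, thresholds = roc_curve(y, scores)
      best = np.argmax(tpr - fpr)                     # Youden index
      print("AUC:", auc(fpr, tpr),
            "optimal cutoff:", thresholds[best],
            "sensitivity:", tpr[best],
            "specificity:", 1.0 - fpr[best])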

  18. How Big Are "Martin's Big Words"? Thinking Big about the Future.

    ERIC Educational Resources Information Center

    Gardner, Traci

    "Martin's Big Words: The Life of Dr. Martin Luther King, Jr." tells of King's childhood determination to use "big words" through biographical information and quotations. In this lesson, students in grades 3 to 5 explore information on Dr. King to think about his "big" words, then they write about their own…

  19. Big Data and the Future of Radiology Informatics.

    PubMed

    Kansagra, Akash P; Yu, John-Paul J; Chatterjee, Arindam R; Lenchik, Leon; Chow, Daniel S; Prater, Adam B; Yeh, Jean; Doshi, Ankur M; Hawkins, C Matthew; Heilbrun, Marta E; Smith, Stacy E; Oselkin, Martin; Gupta, Pushpender; Ali, Sayed

    2016-01-01

    Rapid growth in the amount of data that is electronically recorded as part of routine clinical operations has generated great interest in the use of Big Data methodologies to address clinical and research questions. These methods can efficiently analyze and deliver insights from high-volume, high-variety, and high-growth rate datasets generated across the continuum of care, thereby forgoing the time, cost, and effort of more focused and controlled hypothesis-driven research. By virtue of an existing robust information technology infrastructure and years of archived digital data, radiology departments are particularly well positioned to take advantage of emerging Big Data techniques. In this review, we describe four areas in which Big Data is poised to have an immediate impact on radiology practice, research, and operations. In addition, we provide an overview of the Big Data adoption cycle and describe how academic radiology departments can promote Big Data development.

  20. The addition of computer simulated noise to investigate radiation dose and image quality in images with spatial correlation of statistical noise: an example application to X-ray CT of the brain.

    PubMed

    Britten, A J; Crotty, M; Kiremidjian, H; Grundy, A; Adam, E J

    2004-04-01

    This study validates a method to add spatially correlated statistical noise to an image, applied to transaxial X-ray CT images of the head to simulate exposure reduction by up to 50%. 23 patients undergoing routine head CT had three additional slices acquired for validation purposes, two at the same clinical 420 mAs exposure and one at 300 mAs. Images at the level of the cerebrospinal fluid filled ventricles gave readings of noise from a single image, with subtraction of image pairs to obtain noise readings from non-uniform tissue regions. The spatial correlation of the noise was determined and added to the acquired 420 mAs image to simulate images at 340 mAs, 300 mAs, 260 mAs and 210 mAs. Two radiologists assessed the images, finding little difference between the 300 mAs simulated and acquired images. The presence of periventricular low density lesions (PVLD) was used as an example of the effect of simulated dose reduction on diagnostic accuracy, and visualization of the internal capsule was used as a measure of image quality. Diagnostic accuracy for the diagnosis of PVLD did not fall significantly even down to 210 mAs, though visualization of the internal capsule was poorer at lower exposure. Further work is needed to investigate means of measuring statistical noise without the need for uniform tissue areas, or image pairs. This technique has been shown to allow sufficiently accurate simulation of dose reduction and image quality degradation, even when the statistical noise is spatially correlated.
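
    The sketch below illustrates one way such a simulation can be set up, under two assumptions stated here rather than taken from the paper: noise variance scales inversely with mAs, and spatial correlation is imposed by shaping white noise with the square root of a measured noise power spectrum (NPS).

      import numpy as np

      def simulate_lower_dose(image, sigma_orig, nps, mas_orig=420.0, mas_sim=210.0):
          """image: 2D CT slice; sigma_orig: noise SD at mas_orig; nps: 2D noise power spectrum."""
          # Extra noise SD needed so the total variance matches the target exposure.
          sigma_add = sigma_orig * np.sqrt(mas_orig / mas_sim - 1.0)
          white = np.random.randn(*image.shape)
          # Correlate the noise by shaping its spectrum with sqrt(NPS).
          correlated = np.fft.ifft2(np.fft.fft2(white) * np.sqrt(nps)).real
          correlated *= sigma_add / correlated.std()
          return image + correlated

      # Usage with a hypothetical 420 mAs slice and, for simplicity, a flat (white) NPS.
      slice_420 = np.random.randn(512, 512) * 5.0 + 30.0
      flat_nps = np.ones((512, 512))
      slice_210 = simulate_lower_dose(slice_420, sigma_orig=5.0, nps=flat_nps)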

  1. DARPA's Big Mechanism program

    NASA Astrophysics Data System (ADS)

    Cohen, Paul R.

    2015-07-01

    Reductionist science produces causal models of small fragments of complicated systems. Causal models of entire systems can be hard to construct because what is known of them is distributed across a vast amount of literature. The Big Mechanism program aims to have machines read the literature and assemble the causal fragments found in individual papers into huge causal models, automatically. The current domain of the program is cell signalling associated with Ras-driven cancers.

  2. A holographic big bang?

    NASA Astrophysics Data System (ADS)

    Afshordi, N.; Mann, R. B.; Pourhasan, R.

    2015-11-01

    We present a cosmological model in which the Universe emerges out of the collapse of a five-dimensional (5D) star as a spherical three-brane. The initial singularity of the big bang becomes hidden behind a causal horizon. Near scale-invariant primordial curvature perturbations can be induced on the brane via a thermal atmosphere that is in equilibrium with the brane, circumventing the need for a separate inflationary process and providing an important test of the model.

  3. The Next Big Idea

    PubMed Central

    2013-01-01

    Abstract George S. Eisenbarth will remain in our memories as a brilliant scientist and great collaborator. His quest to discover the cause and prevention of type 1 (autoimmune) diabetes started from building predictive models based on immunogenetic markers. Despite his tremendous contributions to our understanding of the natural history of pre-type 1 diabetes and potential mechanisms, George left us with several big questions to answer before his quest is completed. PMID:23786296

  4. DARPA's Big Mechanism program.

    PubMed

    Cohen, Paul R

    2015-07-16

    Reductionist science produces causal models of small fragments of complicated systems. Causal models of entire systems can be hard to construct because what is known of them is distributed across a vast amount of literature. The Big Mechanism program aims to have machines read the literature and assemble the causal fragments found in individual papers into huge causal models, automatically. The current domain of the program is cell signalling associated with Ras-driven cancers.

  5. Big3. Editorial

    PubMed Central

    Lehmann, Christoph U.; Séroussi, Brigitte; Jaulent, Marie-Christine

    2014-01-01

    Summary Objectives: To provide an editorial introduction to the 2014 IMIA Yearbook of Medical Informatics with an overview of the content, the new publishing scheme, and the upcoming 25th anniversary. Methods: A brief overview of the 2014 special topic, Big Data - Smart Health Strategies, and an outline of the novel publishing model is provided in conjunction with a call for proposals to celebrate the 25th anniversary of the Yearbook. Results: ‘Big Data’ has become the latest buzzword in informatics and promises new approaches and interventions that can improve health, well-being, and quality of life. This edition of the Yearbook acknowledges the fact that we have just started to explore the opportunities that ‘Big Data’ will bring. However, it will become apparent to the reader that its pervasive nature has invaded all aspects of biomedical informatics – some to a higher degree than others. It was our goal to provide a comprehensive view of the state of ‘Big Data’ today, explore its strengths and weaknesses, as well as its risks, discuss emerging trends, tools, and applications, and stimulate the development of the field through the aggregation of excellent survey papers and working group contributions to the topic. Conclusions: For the first time in its history, the IMIA Yearbook will be published in an open access online format, allowing a broader readership especially in resource-poor countries. For the first time, thanks to the online format, the IMIA Yearbook will be published twice in the year, with two different tracks of papers. We anticipate that the important role of the IMIA Yearbook will further increase with these changes, just in time for its 25th anniversary in 2016. PMID:24853037

  6. Optical image hiding based on computational ghost imaging

    NASA Astrophysics Data System (ADS)

    Wang, Le; Zhao, Shengmei; Cheng, Weiwen; Gong, Longyan; Chen, Hanwu

    2016-05-01

    Image hiding schemes play an important role in the big data era, providing copyright protection for digital images. In this paper, we propose a novel image hiding scheme based on computational ghost imaging that offers strong robustness and high security. The watermark is encrypted with the configuration of a computational ghost imaging system, and the random speckle patterns compose a secret key. A least significant bit (LSB) algorithm is adopted to embed the watermark, and both a second-order correlation algorithm and a compressed sensing (CS) algorithm are used to extract it. The experimental and simulation results show that authorized users can recover the watermark with the secret key. The watermark image could not be retrieved when the eavesdropping ratio was less than 45% with the second-order correlation algorithm, or less than 20% with the TVAL3 CS reconstruction algorithm. In addition, the proposed scheme is robust against 'salt and pepper' noise and image cropping degradations.
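
    The scheme described above combines ghost-imaging encryption with ordinary least-significant-bit (LSB) embedding. The Python sketch below illustrates only the generic LSB embed/extract step; it is not the authors' implementation, and the cover image, watermark bits, and function names are hypothetical stand-ins.

      import numpy as np

      def embed_lsb(cover, watermark_bits):
          # Write 0/1 watermark bits into the least significant bits of the
          # first len(watermark_bits) pixels of an 8-bit cover image.
          stego = cover.copy().ravel()
          n = watermark_bits.size
          stego[:n] = (stego[:n] & 0xFE) | watermark_bits.astype(np.uint8)
          return stego.reshape(cover.shape)

      def extract_lsb(stego, n_bits):
          # Read the embedded bits back out of the stego image.
          return stego.ravel()[:n_bits] & 0x01

      # Toy usage: random data stands in for a real cover image and watermark.
      rng = np.random.default_rng(0)
      cover = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
      bits = rng.integers(0, 2, size=256, dtype=np.uint8)
      stego = embed_lsb(cover, bits)
      assert np.array_equal(extract_lsb(stego, bits.size), bits)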

  7. Disaggregating asthma: Big investigation versus big data.

    PubMed

    Belgrave, Danielle; Henderson, John; Simpson, Angela; Buchan, Iain; Bishop, Christopher; Custovic, Adnan

    2017-02-01

    We are facing a major challenge in bridging the gap between identifying subtypes of asthma to understand causal mechanisms and translating this knowledge into personalized prevention and management strategies. In recent years, "big data" has been sold as a panacea for generating hypotheses and driving new frontiers of health care; the idea that the data must and will speak for themselves is fast becoming a new dogma. One of the dangers of ready accessibility of health care data and computational tools for data analysis is that the process of data mining can become uncoupled from the scientific process of clinical interpretation, understanding the provenance of the data, and external validation. Although advances in computational methods can be valuable for using unexpected structure in data to generate hypotheses, there remains a need for testing hypotheses and interpreting results with scientific rigor. We argue for combining data- and hypothesis-driven methods in a careful synergy, and the importance of carefully characterized birth and patient cohorts with genetic, phenotypic, biological, and molecular data in this process cannot be overemphasized. The main challenge on the road ahead is to harness bigger health care data in ways that produce meaningful clinical interpretation and to translate this into better diagnoses and properly personalized prevention and treatment plans. There is a pressing need for cross-disciplinary research with an integrative approach to data science, whereby basic scientists, clinicians, data analysts, and epidemiologists work together to understand the heterogeneity of asthma.

  8. Big Crater as Viewed by Pathfinder Lander

    NASA Technical Reports Server (NTRS)

    1997-01-01

    The 'Big Crater' is actually a relatively small Martian crater to the southeast of the Mars Pathfinder landing site. It is 1500 meters (4900 feet) in diameter, or about the same size as Meteor Crater in Arizona. Superimposed on the rim of Big Crater (the central part of the rim as seen here) is a smaller crater nicknamed 'Rimshot Crater.' The distance to this smaller crater, and the nearest portion of the rim of Big Crater, is 2200 meters (7200 feet). To the right of Big Crater, south from the spacecraft, almost lost in the atmospheric dust 'haze,' is the large streamlined mountain nicknamed 'Far Knob.' This mountain is over 450 meters (1480 feet) tall, and is over 30 kilometers (19 miles) from the spacecraft. Another, smaller and closer knob, nicknamed 'Southeast Knob' can be seen as a triangular peak to the left of the flanks of the Big Crater rim. This knob is 21 kilometers (13 miles) southeast from the spacecraft.

    The larger features visible in this scene - Big Crater, Far Knob, and Southeast Knob - were discovered on the first panoramas taken by the IMP camera on the 4th of July, 1997, and subsequently identified in Viking Orbiter images taken over 20 years ago. The scene includes rocky ridges and swales or 'hummocks' of flood debris that range from a few tens of meters away from the lander to the distance of South Twin Peak. The largest rock in the nearfield, just left of center in the foreground, nicknamed 'Otter', is about 1.5 meters (4.9 feet) long and 10 meters (33 feet) from the spacecraft.

    This view of Big Crater was produced by combining 6 individual 'Superpan' scenes from the left and right eyes of the IMP camera. Each frame consists of 8 individual frames (left eye) and 7 frames (right eye) taken with different color filters that were enlarged by 500% and then co-added using Adobe Photoshop to produce, in effect, a super-resolution panchromatic frame that is sharper than an individual frame would be.

    Mars Pathfinder is the second in NASA

  9. iPAINT: a general approach tailored to image the topology of interfaces with nanometer resolution (Electronic supplementary information (ESI) available: Fig. S1–S8; see DOI: 10.1039/c6nr00445h)

    PubMed Central

    Aloi, A.; Vilanova, N.

    2016-01-01

    Understanding interfacial phenomena in soft materials such as wetting, colloidal stability, coalescence, and friction warrants non-invasive imaging with nanometer resolution. Super-resolution microscopy has emerged as an attractive method to visualize nanostructures labeled covalently with fluorescent tags, but this is not amenable to all interfaces. Inspired by PAINT we developed a simple and general strategy to overcome this limitation, which we coin ‘iPAINT: interface Point Accumulation for Imaging in Nanoscale Topography’. It enables three-dimensional, sub-diffraction imaging of interfaces irrespective of their nature via reversible adsorption of polymer chains end-functionalized with photo-activatable moieties. We visualized model dispersions, emulsions, and foams with ∼20 nm and ∼3° accuracy demonstrating the general applicability of iPAINT to study solid/liquid, liquid/liquid and liquid/air interfaces. iPAINT thus broadens the scope of super-resolution microscopy paving the way for non-invasive, high-resolution imaging of complex soft materials. PMID:27055489

  10. How Big is Earth?

    NASA Astrophysics Data System (ADS)

    Thurber, Bonnie B.

    2015-08-01

    How Big is Earth celebrates the Year of Light. Using only the sunlight striking the Earth and a wooden dowel, students meet each other and then measure the circumference of the Earth. Eratosthenes did it over 2,000 years ago. In Cosmos, Carl Sagan shared the process by which Eratosthenes measured the angle of the shadow cast at local noon when sunlight strikes a stick positioned perpendicular to the ground. By comparing his measurement to another made a distance away, Eratosthenes was able to calculate the circumference of the Earth. How Big is Earth provides an online learning environment where students do science the same way Eratosthenes did. A notable project in which this was done was The Eratosthenes Project, conducted in 2005 as part of the World Year of Physics; in fact, we will be drawing on the teacher's guide developed by that project. How Big Is Earth? expands on the Eratosthenes Project through an online learning environment provided by the iCollaboratory, www.icollaboratory.org, where teachers and students from Sweden, China, Nepal, Russia, Morocco, and the United States collaborate, share data, and reflect on their learning of science and astronomy. They share their information and discuss and brainstorm solutions in a discussion forum. There is an ongoing database of student measurements and another database to collect data on both teacher and student learning from surveys, discussions, and self-reflection done online. We will share our research about the kinds of learning that take place only in global collaborations. The entrance address for the iCollaboratory is http://www.icollaboratory.org.
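
    The measurement described above reduces to one proportion: the difference in local-noon shadow angles between two sites is the same fraction of 360 degrees as their north-south separation is of the full circumference. A minimal sketch of that arithmetic follows, with hypothetical stick, shadow, and distance values; it is not part of the iCollaboratory materials.

      import math

      def shadow_angle(stick_length_cm, shadow_length_cm):
          # Angle of the Sun from the vertical, from a stick and its noon shadow.
          return math.degrees(math.atan2(shadow_length_cm, stick_length_cm))

      def circumference_km(angle_a_deg, angle_b_deg, north_south_distance_km):
          # The angle difference spans the same fraction of 360 degrees as the
          # site separation spans of the full circumference.
          return 360.0 / abs(angle_a_deg - angle_b_deg) * north_south_distance_km

      # Hypothetical data: a 100 cm dowel casts a 12.6 cm shadow at one site
      # and no shadow at a second site about 800 km due south.
      a = shadow_angle(100.0, 12.6)   # roughly 7.2 degrees
      b = shadow_angle(100.0, 0.0)    # Sun directly overhead
      print(round(circumference_km(a, b, 800.0)))  # roughly 40,000 km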

  11. Dark radiation emerging after big bang nucleosynthesis?

    SciTech Connect

    Fischler, Willy; Meyers, Joel

    2011-03-15

    We show how recent data from observations of the cosmic microwave background may suggest the presence of additional radiation density which appeared after big bang nucleosynthesis. We propose a general scheme by which this radiation could be produced from the decay of nonrelativistic matter, we place constraints on the properties of such matter, and we give specific examples of scenarios in which this general scheme may be realized.

  12. "Big Events" and Networks.

    PubMed

    Friedman, Samuel; Rossi, Diana; Flom, Peter L

    2006-01-01

    Some, but not all, "big events" such as wars, revolutions, socioeconomic transitions, economic collapses, and ecological disasters in recent years seem to lead to large-scale HIV outbreaks (Friedman et al, in press; Hankins et al 2002). This was true of transitions in the USSR, South Africa and Indonesia, for example, but not those in the Philippines or (so far) in Argentina. It has been hypothesized that whether or not HIV outbreaks occur is shaped in part by the nature and extent of changes in the numbers of voluntary or involuntary risk-takers, which itself may be related to the growth of roles such as sex-sellers or drug sellers; the riskiness of the behaviors engaged in by risk-takers; and changes in sexual and injection networks and other "mixing patterns" variables. Each of these potential causal processes, in turn, is shaped by the nature of pre-existing social networks and the patterns and content of normative regulation and communication that happen within these social networks-and on how these social networks and their characteristics are changed by the "big event" in question. We will present ideas about what research is needed to help understand these events and to help guide both indigenous community-based efforts to prevent HIV outbreaks and also to guide those who organize external intervention efforts and aid.

  13. Formulation of radiographically detectable gastrointestinal contrast agents for magnetic resonance imaging: effects of a barium sulfate additive on MR contrast agent effectiveness.

    PubMed

    Rubin, D L; Muller, H H; Young, S W

    1992-01-01

    Complete and homogeneous distribution of gastrointestinal (GI) contrast media is an important factor for their effective use in computed tomography as well as in magnetic resonance (MR) imaging. A radiographic method (using fluoroscopy or spot films) could be effective for monitoring intestinal filling with GI contrast agents for MR imaging (GICMR), but it would require the addition of a radiopaque agent to most GICMR. This study was conducted to determine the minimum amount of barium additive necessary to be radiographically visible and to evaluate whether this additive influences the signal characteristics of the GICMR. A variety of barium sulfate preparations (3-12% wt/vol) were tested in dogs to determine the minimum quantity needed to make the administered agent visible during fluoroscopy and on abdominal radiographs. Solutions of 10 different potential GI contrast agents (Gd-DTPA, ferric ammonium citrate, Mn-DPDP, chromium-EDTA, gadolinium-oxalate, ferrite particles, water, mineral oil, lipid emulsion, and methylcellulose) were prepared without ("nondoped") and with ("doped") the barium sulfate additive. MR images of the solutions in tubes were obtained at 0.38 T using 10 different spin-echo pulse sequences. Region of interest (ROI) measurements of contrast agent signal intensity (SI) were made. In addition, for the paramagnetic contrast media, the longitudinal and transverse relaxivities (R1 and R2) were measured. A 6% wt/vol suspension of barium was the smallest concentration yielding adequate radiopacity in the GI tract. Except for gadolinium-oxalate, there was no statistically significant difference in SI for doped and nondoped solutions with most pulse sequences used. In addition, the doped and nondoped solutions yielded R1 and R2 values which were comparable. We conclude that barium sulfate 6% wt/vol added to MR contrast agents produces a suspension with sufficient radiodensity to be viewed radiographically, and it does not cause significant alteration in

  14. The Big Read: Case Studies

    ERIC Educational Resources Information Center

    National Endowment for the Arts, 2009

    2009-01-01

    The Big Read evaluation included a series of 35 case studies designed to gather more in-depth information on the program's implementation and impact. The case studies gave readers a valuable first-hand look at The Big Read in context. Both formal and informal interviews, focus groups, attendance at a wide range of events--all showed how…

  15. The Rise of Big Data in Neurorehabilitation.

    PubMed

    Faroqi-Shah, Yasmeen

    2016-02-01

    In some fields, Big Data has been instrumental in analyzing, predicting, and influencing human behavior. However, Big Data approaches have so far been less central in speech-language pathology. This article introduces the concept of Big Data and provides examples of Big Data initiatives pertaining to adult neurorehabilitation. It also discusses the potential theoretical and clinical contributions that Big Data can make. The article also recognizes some impediments in building and using Big Data for scientific and clinical inquiry.

  16. Envisioning the Future of 'Big Data' Biomedicine.

    PubMed

    Bui, Alex A T; Darrell Van Horn, John

    2017-03-30

    In our era of digital biomedicine, data take many forms, from "omics" to imaging, mobile health (mHealth), and electronic health records (EHRs). With the availability of more efficient digital collection methods, scientists in many domains now find themselves confronting ever larger sets of data and trying to make sense of it all (1-4). Indeed, data which used to be considered large now seems small as the amount of data now being collected in a single day by an investigator can surpass what might have been generated over his/her career even a decade ago (e.g., (5)). This deluge of biomedical information requires new thinking about how data are generated, managed, and ultimately leveraged to further scientific understanding and for improving healthcare. Responding to this challenge, the National Institutes of Health (NIH) has spearheaded the "Big Data to Knowledge" (BD2K) program (6). Data scientists are being engaged through BD2K to guide biomedical researchers through the thickets of data they are producing. NIH Director, Francis Collins, has noted, "Indeed, we are at a point in history where Big Data should not intimidate, but inspire us. We are in the midst of a revolution that is transforming the way we do biomedical research…we just have to devise creative ways to sift through this mountain of data and make sense of it" (7). The NIH is now taking its first major steps toward realizing biomedical science as an interdisciplinary "big data" science.

  17. Human neuroimaging as a "Big Data" science.

    PubMed

    Van Horn, John Darrell; Toga, Arthur W

    2014-06-01

    The maturation of in vivo neuroimaging has led to incredible quantities of digital information about the human brain. While much is made of the data deluge in science, neuroimaging represents the leading edge of this onslaught of "big data". A range of neuroimaging databasing approaches has streamlined the transmission, storage, and dissemination of data from such brain imaging studies. Yet few, if any, common solutions exist to support the science of neuroimaging. In this article, we discuss how modern neuroimaging research represents a multifactorial and broad ranging data challenge, involving the growing size of the data being acquired; sociological and logistical sharing issues; infrastructural challenges for multi-site, multi-datatype archiving; and the means by which to explore and mine these data. As neuroimaging advances further, e.g. aging, genetics, and age-related disease, new vision is needed to manage and process this information while marshalling of these resources into novel results. Thus, "big data" can become "big" brain science.

  18. Solid-phase synthesis of graphene quantum dots from the food additive citric acid under microwave irradiation and their use in live-cell imaging.

    PubMed

    Zhuang, Qianfen; Wang, Yong; Ni, Yongnian

    2016-05-01

    The work demonstrated that solid citric acid, one of the most common food additives, can be converted to graphene quantum dots (GQDs) under microwave heating. The as-prepared GQDs were further characterized by various analytical techniques such as transmission electron microscopy, atomic force microscopy, X-ray diffraction, X-ray photoelectron spectroscopy, Fourier transform infrared spectroscopy, fluorescence and UV-visible spectroscopy. Cytotoxicity of the GQDs was evaluated using HeLa cells. The results showed that the GQDs exhibited almost no cytotoxicity at concentrations as high as 1000 µg mL⁻¹. In addition, it was found that the GQDs showed good solubility, excellent photostability, and excitation-dependent multicolor photoluminescence. Subsequently, the multicolor GQDs were successfully used as a fluorescence light-up probe for live-cell imaging.

  19. BIG DATA AND STATISTICS

    PubMed Central

    Rossell, David

    2016-01-01

    Big Data brings unprecedented power to address scientific, economic and societal issues, but also amplifies the possibility of certain pitfalls. These include using purely data-driven approaches that disregard understanding the phenomenon under study, aiming at a dynamically moving target, ignoring critical data collection issues, summarizing or preprocessing the data inadequately and mistaking noise for signal. We review some success stories and illustrate how statistical principles can help obtain more reliable information from data. We also touch upon current challenges that require active methodological research, such as strategies for efficient computation, integration of heterogeneous data, extending the underlying theory to increasingly complex questions and, perhaps most importantly, training a new generation of scientists to develop and deploy these strategies. PMID:27722040

  20. Big cat genomics.

    PubMed

    O'Brien, Stephen J; Johnson, Warren E

    2005-01-01

    Advances in population and quantitative genomics, aided by the computational algorithms that employ genetic theory and practice, are now being applied to biological questions that surround free-ranging species not traditionally suitable for genetic enquiry. Here we review how applications of molecular genetic tools have been used to describe the natural history, present status, and future disposition of wild cat species. Insight into phylogenetic hierarchy, demographic contractions, geographic population substructure, behavioral ecology, and infectious diseases have revealed strategies for survival and adaptation of these fascinating predators. Conservation, stabilization, and management of the big cats are important areas that derive benefit from the genome resources expanded and applied to highly successful species, imperiled by an expanding human population.

  1. The Last Big Bang

    SciTech Connect

    McGuire, Austin D.; Meade, Roger Allen

    2016-09-13

    As one of the very few people in the world to give the “go/no go” decision to detonate a nuclear device, Austin “Mac” McGuire holds a very special place in the history of both the Los Alamos National Laboratory and the world. As Commander of Joint Task Force Unit 8.1.1, on Christmas Island in the spring and summer of 1962, Mac directed the Los Alamos data collection efforts for twelve of the last atmospheric nuclear detonations conducted by the United States. Since data collection was at the heart of nuclear weapon testing, it fell to Mac to make the ultimate decision to detonate each test device. He calls his experience THE LAST BIG BANG, since these tests, part of Operation Dominic, were characterized by the dramatic displays of the heat, light, and sounds unique to atmospheric nuclear detonations – never, perhaps, to be witnessed again.

  2. HARNESSING BIG DATA FOR PRECISION MEDICINE: INFRASTRUCTURES AND APPLICATIONS.

    PubMed

    Yu, Kun-Hsing; Hart, Steven N; Goldfeder, Rachel; Zhang, Qiangfeng Cliff; Parker, Stephen C J; Snyder, Michael

    2016-01-01

    Precision medicine is a health management approach that accounts for individual differences in genetic backgrounds and environmental exposures. With the recent advancements in high-throughput omics profiling technologies, the collection of large study cohorts, and the development of data mining algorithms, big data in biomedicine is expected to provide novel insights into health and disease states, which can be translated into personalized disease prevention and treatment plans. However, petabytes of biomedical data generated by multiple measurement modalities pose a significant challenge for data analysis, integration, storage, and result interpretation. In addition, patient privacy preservation, coordination between participating medical centers and data analysis working groups, as well as discrepancies in data sharing policies remain important topics of discussion. In this workshop, we invite experts in omics integration, biobank research, and data management to share their perspectives on leveraging big data to enable precision medicine. Workshop website: http://tinyurl.com/PSB17BigData; HashTag: #PSB17BigData.

  3. Big bang and big crunch in matrix string theory

    SciTech Connect

    Bedford, J.; Ward, J.; Papageorgakis, C.; Rodriguez-Gomez, D.

    2007-04-15

    Following the holographic description of linear dilaton null cosmologies with a big bang in terms of matrix string theory put forward by Craps, Sethi, and Verlinde, we propose an extended background describing a universe including both big bang and big crunch singularities. This belongs to a class of exact string backgrounds and is perturbative in the string coupling far away from the singularities, both of which can be resolved using matrix string theory. We provide a simple theory capable of describing the complete evolution of this closed universe.

  4. Big questions, big science: meeting the challenges of global ecology.

    PubMed

    Schimel, David; Keller, Michael

    2015-04-01

    Ecologists are increasingly tackling questions that require significant infrastructure, large experiments, networks of observations, and complex data and computation. Key hypotheses in ecology increasingly require more investment, and larger data sets to be tested, than can be collected by a single investigator's lab or a group of investigators' labs, sustained for longer than a typical grant. Large-scale projects are expensive, so their scientific return on the investment has to justify the opportunity cost: the science foregone because resources were expended on a large project rather than supporting a number of individual projects. In addition, their management must be accountable and efficient in the use of significant resources, requiring the use of formal systems engineering and project management to mitigate risk of failure. Mapping the scientific method into formal project management requires both scientists able to work in that context and a project implementation team sensitive to the unique requirements of ecology. Sponsoring agencies, under pressure from external and internal forces, experience many pressures that push them toward counterproductive project management, but a scientific community that is aware of and experienced in large-project science can mitigate these tendencies. For big ecology to result in great science, ecologists must become informed, aware, and engaged in the advocacy and governance of large ecological projects.

  5. MISR Views the Big Island of Hawaii

    NASA Technical Reports Server (NTRS)

    2000-01-01

    MISR images of the Big Island of Hawaii. The images have been rotated so that north is at the left.

    Upper left: April 2, 2000 (Terra orbit 1551) Upper right: May 4, 2000 (Terra orbit 2017) Lower left: June 5, 2000 (Terra orbit 2483) Lower right: June 21, 2000 (Terra orbit 2716)

    The first three images are color views acquired by the vertical (nadir) camera. The last image is a stereo anaglyph generated from the aftward cameras viewing at 60.0 and 70.5 degree look angles. It requires red/blue glasses with the red filter over the left eye.

    The color images show the greater prevalence of vegetation on the eastern side of the island due to moisture brought in by the prevailing Pacific trade winds. The western (lee) side of the island is drier. In the center of the island, and poking through the clouds in the stereo image are the Mauna Kea and Mauna Loa volcanoes, each peaking at about 4.2 km above sea level. The southern face of a line of cumulus clouds off the north coast of Hawaii is also visible in the stereo image.

    MISR was built and is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Earth Science, Washington, DC. The Terra satellite is managed by NASA's Goddard Space Flight Center, Greenbelt, MD. JPL is a division of the California Institute of Technology.

  6. Big Bang of Massenergy and Negative Big Bang of Spacetime

    NASA Astrophysics Data System (ADS)

    Cao, Dayong

    2017-01-01

    There is a balance between the Big Bang of massenergy and a negative Big Bang of spacetime in the universe. Some scientists have also considered an anti-Big Bang that could produce antimatter. The paper supposes a structural balance between the Einstein field equation and a negative Einstein field equation, a balance between massenergy structure and spacetime structure, a balance between the energy of the nucleus of stellar matter and the dark energy of the nucleus of dark matter-dark energy, and a balance between the particle and the wave, i.e., a balance system between massenergy (particle) and spacetime (wave). This should explain the problems of the Big Bang. http://meetings.aps.org/Meeting/APR16/Session/M13.8

  7. Acquisition Reform: Three Big Ideas

    DTIC Science & Technology

    2015-05-19

    [Slide-deck extraction residue: report documentation header, slide numbers, and repeated titles. Recoverable content: the briefing presents three acquisition reform ideas, the first being competing capability needs among the Services, and notes that the provocative views expressed are the author's and not those of the Department of Defense or DAU.]

  8. Big Data and Ambulatory Care

    PubMed Central

    Thorpe, Jane Hyatt; Gray, Elizabeth Alexandra

    2015-01-01

    Big data is heralded as having the potential to revolutionize health care by making large amounts of data available to support care delivery, population health, and patient engagement. Critics argue that big data's transformative potential is inhibited by privacy requirements that restrict health information exchange. However, there are a variety of permissible activities involving use and disclosure of patient information that support care delivery and management. This article presents an overview of the legal framework governing health information, dispels misconceptions about privacy regulations, and highlights how ambulatory care providers in particular can maximize the utility of big data to improve care. PMID:25401945

  9. The challenges of big data

    PubMed Central

    2016-01-01

    ABSTRACT The largely untapped potential of big data analytics is a feeding frenzy that has been fueled by the production of many next-generation-sequencing-based data sets that are seeking to answer long-held questions about the biology of human diseases. Although these approaches are likely to be a powerful means of revealing new biological insights, there are a number of substantial challenges that currently hamper efforts to harness the power of big data. This Editorial outlines several such challenges as a means of illustrating that the path to big data revelations is paved with perils that the scientific community must overcome to pursue this important quest. PMID:27147249

  10. Homogeneous and isotropic big rips?

    SciTech Connect

    Giovannini, Massimo

    2005-10-15

    We investigate the way big rips are approached in a fully inhomogeneous description of the space-time geometry. If the pressure and energy densities are connected by a (supernegative) barotropic index, the spatial gradients and the anisotropic expansion decay as the big rip is approached. This behavior is contrasted with the usual big-bang singularities. A similar analysis is performed in the case of sudden (quiescent) singularities and it is argued that the spatial gradients may well be non-negligible in the vicinity of pressure singularities.

  11. Is There an Additional Value of ¹¹C-Choline PET-CT to T2-weighted MRI Images in the Localization of Intraprostatic Tumor Nodules?

    SciTech Connect

    Van den Bergh, Laura; Koole, Michel; Isebaert, Sofie; Joniau, Steven; Deroose, Christophe M.; Oyen, Raymond; Lerut, Evelyne; Budiharto, Tom; Mottaghy, Felix; Bormans, Guy; Van Poppel, Hendrik; Haustermans, Karin

    2012-08-01

    Purpose: To investigate the additional value of ¹¹C-choline positron emission tomography (PET)-computed tomography (CT) to T2-weighted (T2w) magnetic resonance imaging (MRI) for localization of intraprostatic tumor nodules. Methods and Materials: Forty-nine prostate cancer patients underwent T2w MRI and ¹¹C-choline PET-CT before radical prostatectomy and extended lymphadenectomy. Tumor regions were outlined on the whole-mount histopathology sections and on the T2w MR images. Tumor localization was recorded in the basal, middle, and apical part of the prostate by means of an octant grid. To analyze ¹¹C-choline PET-CT images, the same grid was used to calculate the standardized uptake values (SUV) per octant, after rigid registration with the T2w MR images for anatomic reference. Results: In total, 1,176 octants were analyzed. Sensitivity, specificity, and accuracy of T2w MRI were 33.5%, 94.6%, and 70.2%, respectively. For ¹¹C-choline PET-CT, the mean SUVmax of malignant octants was significantly higher than the mean SUVmax of benign octants (3.69 ± 1.29 vs. 3.06 ± 0.97, p < 0.0001), which was also true for mean SUVmean values (2.39 ± 0.77 vs. 1.94 ± 0.61, p < 0.0001). A positive correlation was observed between SUVmean and absolute tumor volume (Spearman r = 0.3003, p = 0.0362). No correlation was found between SUVs and prostate-specific antigen, T-stage or Gleason score. The highest accuracy (61.1%) was obtained with a SUVmax cutoff of 2.70, resulting in a sensitivity of 77.4% and a specificity of 44.9%. When both modalities were combined (PET-CT or MRI positive), sensitivity levels increased as a function of SUVmax but at the cost of specificity. When only considering suspect octants on ¹¹C-choline PET-CT (SUVmax ≥ 2.70) and T2w MRI, 84.7% of these segments were in agreement with the gold standard, compared with 80.5% for T2w MRI alone. Conclusions: The additional value of
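
    The per-octant figures reported above (sensitivity, specificity, and accuracy at a given SUVmax cutoff) come from a simple confusion-matrix calculation against the histopathology ground truth. A minimal sketch of that calculation follows; the simulated SUV values, prevalence, and sample size are illustrative stand-ins, not the study data.

      import numpy as np

      def octant_metrics(suv_max, has_tumor, cutoff):
          # Call an octant suspect if its SUVmax meets the cutoff, then compare
          # against the histopathology ground truth.
          predicted = suv_max >= cutoff
          tp = np.sum(predicted & has_tumor)
          tn = np.sum(~predicted & ~has_tumor)
          fp = np.sum(predicted & ~has_tumor)
          fn = np.sum(~predicted & has_tumor)
          sensitivity = tp / (tp + fn)
          specificity = tn / (tn + fp)
          accuracy = (tp + tn) / (tp + tn + fp + fn)
          return sensitivity, specificity, accuracy

      # Toy data standing in for the 1,176 analyzed octants.
      rng = np.random.default_rng(1)
      truth = rng.random(1176) < 0.35
      suv = np.where(truth, rng.normal(3.7, 1.3, 1176), rng.normal(3.1, 1.0, 1176))
      print(octant_metrics(suv, truth, cutoff=2.70))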

  12. The NOAA Big Data Project

    NASA Astrophysics Data System (ADS)

    de la Beaujardiere, J.

    2015-12-01

    The US National Oceanic and Atmospheric Administration (NOAA) is a Big Data producer, generating tens of terabytes per day from hundreds of sensors on satellites, radars, aircraft, ships, and buoys, and from numerical models. These data are of critical importance and value for NOAA's mission to understand and predict changes in climate, weather, oceans, and coasts. In order to facilitate extracting additional value from this information, NOAA has established Cooperative Research and Development Agreements (CRADAs) with five Infrastructure-as-a-Service (IaaS) providers — Amazon, Google, IBM, Microsoft, Open Cloud Consortium — to determine whether hosting NOAA data in publicly-accessible Clouds alongside on-demand computational capability stimulates the creation of new value-added products and services and lines of business based on the data, and if the revenue generated by these new applications can support the costs of data transmission and hosting. Each IaaS provider is the anchor of a "Data Alliance" which organizations or entrepreneurs can join to develop and test new business or research avenues. This presentation will report on progress and lessons learned during the first 6 months of the 3-year CRADAs.

  13. Partnership between small biotech and big pharma.

    PubMed

    Wiederrecht, Gregory J; Hill, Raymond G; Beer, Margaret S

    2006-08-01

    The process involved in the identification and development of novel breakthrough medicines at big pharma has recently undergone significant changes, in part because of the extraordinary complexity that is associated with tackling diseases of high unmet need, and also because of the increasingly demanding requirements that have been placed on the pharmaceutical industry by investors and regulatory authorities. In addition, big pharma no longer have a monopoly on the tools and enabling technologies that are required to identify and discover new drugs, as many biotech companies now also have these capabilities. As a result, researchers at biotech companies are able to identify credible drug leads, as well as compounds that have the potential to become marketed medicinal products. This diversification of companies that are involved in drug discovery and development has in turn led to increased partnering interactions between the biotech sector and big pharma. This article examines how Merck and Co Inc, which has historically relied on a combination of internal scientific research and licensed products, has poised itself to become further engaged in partnering with biotech companies, as well as academic institutions, to increase the probability of success associated with identifying novel medicines to treat unmet medical needs--particularly in areas such as central nervous system disorders, obesity/metabolic diseases, atheroma and cancer, and also to cultivate its cardiovascular, respiratory, arthritis, bone, ophthalmology and infectious disease franchises.

  14. Navigating a Sea of Big Data

    NASA Astrophysics Data System (ADS)

    Kinkade, D.; Chandler, C. L.; Groman, R. C.; Shepherd, A.; Allison, M. D.; Rauch, S.; Wiebe, P. H.; Glover, D. M.

    2014-12-01

    Oceanographic research is evolving rapidly. New technologies, strategies, and related infrastructures have catalyzed a change in the nature of oceanographic data. Heterogeneous and complex data types can be produced and transferred at great speeds. This shift in volume, variety, and velocity of data produced has led to increased challenges in managing these Big Data. In addition, distributed research communities have greater needs for data quality control, discovery and public accessibility, and seamless integration for interdisciplinary study. Organizations charged with curating oceanographic data must also evolve to meet these needs and challenges, by employing new technologies and strategies. The Biological and Chemical Oceanography Data Management Office (BCO-DMO) was created in 2006, to fulfill the data management needs of investigators funded by the NSF Ocean Sciences Biological and Chemical Sections and Polar Programs Antarctic Organisms and Ecosystems Program. Since its inception, the Office has had to modify internal systems and operations to address Big Data challenges to meet the needs of the ever-evolving oceanographic research community. Some enhancements include automated procedures replacing labor-intensive manual tasks, adoption of metadata standards facilitating machine client access, a geospatial interface and the use of Semantic Web technologies to increase data discovery and interoperability. This presentation will highlight some of the BCO-DMO advances that enable us to successfully fulfill our mission in a Big Data world.

  15. Big Data and Perioperative Nursing.

    PubMed

    Westra, Bonnie L; Peterson, Jessica J

    2016-10-01

    Big data are large volumes of digital data that can be collected from disparate sources and are challenging to analyze. These data are often described with the five "Vs": volume, velocity, variety, veracity, and value. Perioperative nurses contribute to big data through documentation in the electronic health record during routine surgical care, and these data have implications for clinical decision making, administrative decisions, quality improvement, and big data science. This article explores methods to improve the quality of perioperative nursing data and provides examples of how these data can be combined with broader nursing data for quality improvement. We also discuss a national action plan for nursing knowledge and big data science and how perioperative nurses can engage in collaborative actions to transform health care. Standardized perioperative nursing data has the potential to affect care far beyond the original patient.

  16. The BigBOSS Experiment

    SciTech Connect

    Schelgel, D.; Abdalla, F.; Abraham, T.; Ahn, C.; Allende Prieto, C.; Annis, J.; Aubourg, E.; Azzaro, M.; Bailey, S.; Baltay, C.; Baugh, C.

    2011-01-01

    BigBOSS will obtain observational constraints that will bear on three of the four 'science frontier' questions identified by the Astro2010 Cosmology and Fundamental Physics Panel of the Decadal Survey: Why is the universe accelerating? What is dark matter, and what are the properties of neutrinos? Indeed, the BigBOSS project was recommended for substantial immediate R and D support in the PASAG report. The second highest ground-based priority from the Astro2010 Decadal Survey was the creation of a funding line within the NSF to support a 'Mid-Scale Innovations' program, and it used BigBOSS as a 'compelling' example for support. This choice was the result of the Decadal Survey's Program Prioritization panels reviewing 29 mid-scale projects and recommending BigBOSS 'very highly'.

  17. Failed heart rate control with oral metoprolol prior to coronary CT angiography: effect of additional intravenous metoprolol on heart rate, image quality and radiation dose.

    PubMed

    Jiménez-Juan, Laura; Nguyen, Elsie T; Wintersperger, Bernd J; Moshonov, Hadas; Crean, Andrew M; Deva, Djeven P; Paul, Narinder S; Torres, Felipe S

    2013-01-01

    The purpose of this study was to evaluate the effect of intravenous (i.v.) metoprolol after a suboptimal heart rate (HR) response to oral metoprolol (75-150 mg) on HR control, image quality (IQ) and radiation dose during coronary CTA using 320-MDCT. Fifty-three consecutive patients who failed to achieve a target HR of < 60 bpm after an oral dose of metoprolol and required supplementary i.v. metoprolol (5-20 mg) prior to coronary CTA were evaluated. Patients with HR < 60 bpm during image acquisition were defined as responders (R) and those with HR ≥ 60 bpm as non-responders (NR). Two observers assessed IQ using a 3-point scale (1-2, diagnostic; 3, non-diagnostic). Effective dose (ED) was estimated using the dose-length product and a 0.014 mSv/mGy·cm conversion factor. Baseline characteristics and HR on arrival were similar in the two groups. 58% of patients did not achieve the target HR after receiving i.v. metoprolol (NR). R had a significantly higher HR reduction after oral (mean HR 63.9 ± 4.5 bpm vs. 69.6 ± 5.6 bpm) (p < 0.005) and i.v. (mean HR 55.4 ± 3.9 bpm vs. 67.4 ± 5.3 bpm) (p < 0.005) doses of metoprolol. Studies from NR showed a significantly higher ED in comparison to R (8.0 ± 2.9 vs. 6.1 ± 2.2 mSv) (p = 0.016) and a significantly higher proportion of non-diagnostic coronary segments (9.2 vs. 2.5%) (p < 0.001). 58% of patients who do not achieve an HR of <60 bpm prior to coronary CTA with oral metoprolol fail to respond to additional i.v. metoprolol and have studies with a higher radiation dose and worse image quality.
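
    The effective dose quoted in this study is simply the scanner-reported dose-length product multiplied by the 0.014 mSv/mGy·cm conversion coefficient. A one-function sketch of that arithmetic is below; the DLP value used is hypothetical, not taken from the study.

      def effective_dose_msv(dose_length_product_mgy_cm, k=0.014):
          # Effective dose (mSv) = dose-length product (mGy*cm) x conversion
          # coefficient (mSv per mGy*cm), as used in the abstract above.
          return dose_length_product_mgy_cm * k

      # Hypothetical dose-length product of 435 mGy*cm gives about 6.1 mSv.
      print(round(effective_dose_msv(435.0), 1))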

  18. Complex optimization for big computational and experimental neutron datasets

    NASA Astrophysics Data System (ADS)

    Bao, Feng; Archibald, Richard; Niedziela, Jennifer; Bansal, Dipanshu; Delaire, Olivier

    2016-12-01

    We present a framework to use high performance computing to determine accurate solutions to the inverse optimization problem of big experimental data against computational models. We demonstrate how image processing, mathematical regularization, and hierarchical modeling can be used to solve complex optimization problems on big data. We also demonstrate how both model and data information can be used to further increase solution accuracy of optimization by providing confidence regions for the processing and regularization algorithms. We use the framework in conjunction with the software package SIMPHONIES to analyze results from neutron scattering experiments on silicon single crystals, and refine first principles calculations to better describe the experimental data.

  19. Challenges of Big Data Analysis.

    PubMed

    Fan, Jianqing; Han, Fang; Liu, Han

    2014-06-01

    Big Data bring new opportunities to modern society and challenges to data scientists. On the one hand, Big Data hold great promise for discovering subtle population patterns and heterogeneities that are not possible with small-scale data. On the other hand, the massive sample size and high dimensionality of Big Data introduce unique computational and statistical challenges, including scalability and storage bottlenecks, noise accumulation, spurious correlation, incidental endogeneity, and measurement errors. These challenges are distinctive and require a new computational and statistical paradigm. This article gives an overview of the salient features of Big Data and how these features drive a paradigm change in statistical and computational methods as well as computing architectures. We also provide various new perspectives on Big Data analysis and computation. In particular, we emphasize the viability of the sparsest solution in a high-confidence set and point out that exogenous assumptions in most statistical methods for Big Data cannot be validated due to incidental endogeneity; they can lead to wrong statistical inferences and consequently wrong scientific conclusions.

  20. Challenges of Big Data Analysis

    PubMed Central

    Fan, Jianqing; Han, Fang; Liu, Han

    2014-01-01

    Big Data bring new opportunities to modern society and challenges to data scientists. On the one hand, Big Data hold great promise for discovering subtle population patterns and heterogeneities that are not possible with small-scale data. On the other hand, the massive sample size and high dimensionality of Big Data introduce unique computational and statistical challenges, including scalability and storage bottlenecks, noise accumulation, spurious correlation, incidental endogeneity, and measurement errors. These challenges are distinctive and require a new computational and statistical paradigm. This article gives an overview of the salient features of Big Data and how these features drive a paradigm change in statistical and computational methods as well as computing architectures. We also provide various new perspectives on Big Data analysis and computation. In particular, we emphasize the viability of the sparsest solution in a high-confidence set and point out that exogenous assumptions in most statistical methods for Big Data cannot be validated due to incidental endogeneity; they can lead to wrong statistical inferences and consequently wrong scientific conclusions. PMID:25419469

  1. Powering Big Data for Nursing Through Partnership.

    PubMed

    Harper, Ellen M; Parkerson, Sara

    2015-01-01

    The Big Data Principles Workgroup (Workgroup) was established with support of the Healthcare Information and Management Systems Society. Building on the Triple Aim challenge, the Workgroup sought to identify Big Data principles, barriers, and challenges to nurse-sensitive data inclusion into Big Data sets. The product of this pioneering partnership Workgroup was the "Guiding Principles for Big Data in Nursing-Using Big Data to Improve the Quality of Care and Outcomes."

  2. The Economics of Big Area Additive Manufacturing

    SciTech Connect

    Post, Brian; Lloyd, Peter D; Lindahl, John; Lind, Randall F; Love, Lonnie J; Kunc, Vlastimil

    2016-01-01

    Case studies on the economics of Additive Manufacturing (AM) suggest that processing time is the dominant cost in manufacturing. Most additive processes have similar performance metrics: small part sizes, low production rates and expensive feedstocks. Big Area Additive Manufacturing is based on transitioning polymer extrusion technology from a wire to a pellet feedstock. Utilizing pellets significantly increases deposition speed and lowers material cost by utilizing low cost injection molding feedstock. The use of carbon fiber reinforced polymers eliminates the need for a heated chamber, significantly reducing machine power requirements and size constraints. We hypothesize that the increase in productivity coupled with decrease in feedstock and energy costs will enable AM to become more competitive with conventional manufacturing processes for many applications. As a test case, we compare the cost of using traditional fused deposition modeling (FDM) with BAAM for additively manufacturing composite tooling.

  3. The Big Bang Theory: What It Is, Where It Came From, and Why It Works

    NASA Astrophysics Data System (ADS)

    Fox, Karen C.

    2002-02-01

    A lively, accessible look at the Big Bang theory. This compelling book describes how the Big Bang theory arose, how it has evolved, and why it is the best theory so far to explain the current state of the universe. In addition to understanding the birth of the cosmos, readers will learn how the theory stands up to challenges and what it fails to explain. Karen Fox provides clear answers to some of the hardest questions, including: Why was the Big Bang theory accepted to begin with? Will the Big Bang theory last into the next century or even the next decade? Is the theory at odds with new scientific findings? One of the most well-known theories in modern science, the Big Bang is the most accurate model yet devised in humanity's tireless search for the ultimate moment of creation. The Big Bang Theory is the first title in a planned series on the major theories of modern science.

  4. Landing the Big One.

    ERIC Educational Resources Information Center

    Negroni, Peter J.

    1992-01-01

    A veteran school leader advises aspiring superintendents to focus their job search according to desired geographical area, type of community, and salary range and to prepare thoroughly for interviews. On-the-job success strategies include watching professional image, finding out about predecessors, communicating educational goals, honing public…

  5. Food additives

    PubMed Central

    Spencer, Michael

    1974-01-01

    Food additives are discussed from the food technology point of view. The reasons for their use are summarized: (1) to protect food from chemical and microbiological attack; (2) to even out seasonal supplies; (3) to improve their eating quality; (4) to improve their nutritional value. The various types of food additives are considered, e.g. colours, flavours, emulsifiers, bread and flour additives, preservatives, and nutritional additives. The paper concludes with consideration of those circumstances in which the use of additives is (a) justified and (b) unjustified. PMID:4467857

  6. BIG SKY CARBON SEQUESTRATION PARTNERSHIP

    SciTech Connect

    Susan M. Capalbo

    2004-06-01

    soil C in the partnership region, and to design a risk/cost effectiveness framework to make comparative assessments of each viable sink, taking into account economic costs, offsetting benefits, scale of sequestration opportunities, spatial and time dimensions, environmental risks, and long term viability. Scientifically sound information on MMV is critical for public acceptance of these technologies. Two key deliverables were completed this quarter: a literature review/database to assess the soil carbon on rangelands, and draft protocols and contracting options for soil carbon trading. To date, there has been little research on soil carbon on rangelands, and since rangeland constitutes a major land use in the Big Sky region, this is important in achieving a better understanding of terrestrial sinks. The protocols developed for soil carbon trading are unique and provide a key component of the mechanisms that might be used to efficiently sequester GHG and reduce CO2 concentrations. Progress on other deliverables is noted in the PowerPoint presentations. A series of meetings held during the second quarter have laid the foundations for assessing the issues surrounding the implementation of a market-based setting for soil C credits. These meetings provide a connection to stakeholders in the region and a basis on which to draw for the DOE PEIS hearings. Finally, the education and outreach efforts have resulted in a comprehensive plan and process which serves as a guide for implementing the outreach activities under Phase I. While we are still working on the public website, we have made many presentations to stakeholders and policy makers, connections to other federal and state agencies concerned with GHG emissions, climate change, and efficient and environmentally-friendly energy production. In addition, we have laid plans for integration of our outreach efforts with the students, especially at the tribal colleges and at the universities involved in our partnership

  7. Big Sky Carbon Sequestration Partnership

    SciTech Connect

    Susan Capalbo

    2005-12-31

    has significant potential to sequester large amounts of CO2. Simulations conducted to evaluate mineral trapping potential of mafic volcanic rock formations located in the Idaho province suggest that supercritical CO2 is converted to solid carbonate mineral within a few hundred years and permanently entombs the carbon. Although MMV for this rock type may be challenging, a carefully chosen combination of geophysical and geochemical techniques should allow assessment of the fate of CO2 in deep basalt hosted aquifers. Terrestrial carbon sequestration relies on land management practices and technologies to remove atmospheric CO2, which is then stored in trees, plants, and soil. This indirect sequestration can be implemented today and is on the front line of voluntary, market-based approaches to reduce CO2 emissions. Initial estimates of terrestrial sinks indicate a vast potential for increasing and maintaining soil carbon (C) on rangelands, and forested, agricultural, and reclaimed lands. Rangelands can store up to an additional 0.05 mt C/ha/yr, while the croplands are on average four times that amount. Estimates of technical potential for soil sequestration within the region in cropland are in the range of 2.0 M mt C/yr over a 20-year time horizon. This is equivalent to approximately 7.0 M mt CO2e/yr. The forestry sinks are well documented, and the potential in the Big Sky region ranges from 9-15 M mt CO2 equivalent per year. Value-added benefits include enhanced yields, reduced erosion, and increased wildlife habitat. Thus the terrestrial sinks provide a viable, environmentally beneficial, and relatively low cost sink that is available to sequester C in the current time frame. The Partnership recognizes the critical importance of measurement, monitoring, and verification technologies to support not only carbon trading but all policies and programs that DOE and other agencies may want to pursue in support of GHG mitigation. The efforts

  8. Considerations on Geospatial Big Data

    NASA Astrophysics Data System (ADS)

    LIU, Zhen; GUO, Huadong; WANG, Changlin

    2016-11-01

    Geospatial data, as a significant portion of big data, has recently gained the full attention of researchers. However, few researchers focus on the evolution of geospatial data and its scientific research methodologies. When entering into the big data era, fully understanding the changing research paradigm associated with geospatial data will definitely benefit future research on big data. In this paper, we look deep into these issues by examining the components and features of geospatial big data, reviewing relevant scientific research methodologies, and examining the evolving pattern of geospatial data in the scope of the four ‘science paradigms’. This paper proposes that geospatial big data has significantly shifted the scientific research methodology from ‘hypothesis to data’ to ‘data to questions’ and it is important to explore the generality of growing geospatial data ‘from bottom to top’. Particularly, four research areas that mostly reflect data-driven geospatial research are proposed: spatial correlation, spatial analytics, spatial visualization, and scientific knowledge discovery. It is also pointed out that privacy and quality issues of geospatial data may require more attention in the future. Also, some challenges and thoughts are raised for future discussion.

  9. GEOSS: Addressing Big Data Challenges

    NASA Astrophysics Data System (ADS)

    Nativi, S.; Craglia, M.; Ochiai, O.

    2014-12-01

    In the sector of Earth Observation, the explosion of data is due to many factors including: new satellite constellations, the increased capabilities of sensor technologies, social media, crowdsourcing, and the need for multidisciplinary and collaborative research to face Global Changes. In this area, there are many expectations and concerns about Big Data. Vendors have attempted to use this term for their commercial purposes. It is necessary to understand whether Big Data is a radical shift or an incremental change for the existing digital infrastructures. This presentation tries to explore and discuss the impact of Big Data challenges and new capabilities on the Global Earth Observation System of Systems (GEOSS) and particularly on its common digital infrastructure called GCI. GEOSS is a global and flexible network of content providers allowing decision makers to access an extraordinary range of data and information at their desk. The impact of the Big Data dimensionalities (commonly known as 'V' axes: volume, variety, velocity, veracity, visualization) on GEOSS is discussed. The main solutions and experimentation developed by GEOSS along these axes are introduced and analyzed. GEOSS is a pioneering framework for global and multidisciplinary data sharing in the Earth Observation realm; its experience on Big Data is valuable for the many lessons learned.

  10. [Big data in official statistics].

    PubMed

    Zwick, Markus

    2015-08-01

    The concept of "big data" stands to change the face of official statistics over the coming years, having an impact on almost all aspects of data production. The tasks of future statisticians will not necessarily be to produce new data, but rather to identify and make use of existing data to adequately describe social and economic phenomena. Until big data can be used correctly in official statistics, a lot of questions need to be answered and problems solved: the quality of data, data protection, privacy, and the sustainable availability are some of the more pressing issues to be addressed. The essential skills of official statisticians will undoubtedly change, and this implies a number of challenges to be faced by statistical education systems, in universities, and inside the statistical offices. The national statistical offices of the European Union have concluded a concrete strategy for exploring the possibilities of big data for official statistics, by means of the Big Data Roadmap and Action Plan 1.0. This is an important first step and will have a significant influence on implementing the concept of big data inside the statistical offices of Germany.

  11. Big Data: Astronomical or Genomical?

    PubMed

    Stephens, Zachary D; Lee, Skylar Y; Faghri, Faraz; Campbell, Roy H; Zhai, Chengxiang; Efron, Miles J; Iyer, Ravishankar; Schatz, Michael C; Sinha, Saurabh; Robinson, Gene E

    2015-07-01

    Genomics is a Big Data science and is going to get much bigger, very soon, but it is not known whether the needs of genomics will exceed other Big Data domains. Projecting to the year 2025, we compared genomics with three other major generators of Big Data: astronomy, YouTube, and Twitter. Our estimates show that genomics is a "four-headed beast"--it is either on par with or the most demanding of the domains analyzed here in terms of data acquisition, storage, distribution, and analysis. We discuss aspects of new technologies that will need to be developed to rise up and meet the computational challenges that genomics poses for the near future. Now is the time for concerted, community-wide planning for the "genomical" challenges of the next decade.

  12. Multiwavelength astronomy and big data

    NASA Astrophysics Data System (ADS)

    Mickaelian, A. M.

    2016-09-01

    Two major characteristics of modern astronomy are multiwavelength (MW) studies (from γ-ray to radio) and big data (data acquisition, storage and analysis). Present astronomical databases and archives contain billions of objects observed at various wavelengths, both galactic and extragalactic, and the vast amount of data on them allows new studies and discoveries. Astronomers deal with big numbers. Surveys are the main source for discovery of astronomical objects and accumulation of observational data for further analysis, interpretation, and achieving scientific results. We review the main characteristics of astronomical surveys, compare photographic and digital eras of astronomical studies (including the development of wide-field observations), describe the present state of MW surveys, and discuss Big Data in astronomy and the related topics of Virtual Observatories and Computational Astrophysics. The review includes many numbers and data that can be compared to obtain an overall understanding of the Universe, cosmic numbers, and their relationship to modern computational facilities.

  13. Big Data: Astronomical or Genomical?

    PubMed Central

    Stephens, Zachary D.; Lee, Skylar Y.; Faghri, Faraz; Campbell, Roy H.; Zhai, Chengxiang; Efron, Miles J.; Iyer, Ravishankar; Schatz, Michael C.; Sinha, Saurabh; Robinson, Gene E.

    2015-01-01

    Genomics is a Big Data science and is going to get much bigger, very soon, but it is not known whether the needs of genomics will exceed other Big Data domains. Projecting to the year 2025, we compared genomics with three other major generators of Big Data: astronomy, YouTube, and Twitter. Our estimates show that genomics is a “four-headed beast”—it is either on par with or the most demanding of the domains analyzed here in terms of data acquisition, storage, distribution, and analysis. We discuss aspects of new technologies that will need to be developed to rise up and meet the computational challenges that genomics poses for the near future. Now is the time for concerted, community-wide planning for the “genomical” challenges of the next decade. PMID:26151137

  14. Seismic reflection data imaging and interpretation from Braniewo2014 experiment using additional wide-angle refraction and reflection and well-logs data

    NASA Astrophysics Data System (ADS)

    Trzeciak, Maciej; Majdański, Mariusz; Białas, Sebastian; Gaczyński, Edward; Maksym, Andrzej

    2015-04-01

    The Braniewo2014 reflection and refraction experiment was carried out in cooperation between the Polish Oil and Gas Company (PGNiG) and the Institute of Geophysics (IGF), Polish Academy of Sciences, near the locality of Braniewo in northern Poland. PGNiG acquired a 20-km-long reflection profile using vibroseis and dynamite shooting; the aim of the reflection survey was to characterise a Silurian shale gas reservoir. IGF deployed 59 seismic stations along this profile and recorded additional full-spread wide-angle refraction and reflection data with offsets up to 12 km; the maximum offset in the seismic reflection survey was 3 km. To improve the velocity information, two velocity logs from nearby deep boreholes were used. The main goal of the joint reflection-refraction interpretation was to find relations between the velocity fields obtained from reflection velocity analysis and refraction tomography, and to build a velocity model consistent with both the reflection and refraction datasets. In this paper we present the imaging results and velocity models from the Braniewo2014 experiment and the methodology we used.

  15. Rasdaman for Big Spatial Raster Data

    NASA Astrophysics Data System (ADS)

    Hu, F.; Huang, Q.; Scheele, C. J.; Yang, C. P.; Yu, M.; Liu, K.

    2015-12-01

    Spatial raster data have grown exponentially over the past decade. Recent advancements in data acquisition technology, such as remote sensing, have allowed us to collect massive observation data of varying spatial resolution and domain coverage. The volume, velocity, and variety of such spatial data, along with the computationally intensive nature of spatial queries, pose grand challenges to the storage technologies needed for effective big data management. While high performance computing platforms (e.g., cloud computing) can be used to solve the computing-intensive issues in big data analysis, data has to be managed in a way that is suitable for distributed parallel processing. Recently, rasdaman (raster data manager) has emerged as a scalable and cost-effective database solution to store and retrieve massive multi-dimensional arrays, such as sensor, image, and statistics data. Within this paper, the pros and cons of using rasdaman to manage and query spatial raster data will be examined and compared with other common approaches, including file-based systems, relational databases (e.g., PostgreSQL/PostGIS), and NoSQL databases (e.g., MongoDB and Hive). Earth Observing System (EOS) data collected from NASA's Atmospheric Science Data Center (ASDC) will be used and stored in these selected database systems, and a set of spatial and non-spatial queries will be designed to benchmark their performance on retrieving large-scale, multi-dimensional arrays of EOS data. Lessons learnt from using rasdaman will be discussed as well.
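
    As a rough illustration of the kind of array-subsetting query such a benchmark might issue against rasdaman, the sketch below sends a WCPS request to a petascope endpoint from Python; the host name, the coverage name "CERES_NetFlux", and the subsetting ranges are placeholders rather than details from this record, and a standard WCS 2.0 processing-extension deployment is assumed.

        import time
        import requests  # third-party HTTP client

        WCPS_ENDPOINT = "http://example.org/rasdaman/ows"  # hypothetical petascope endpoint

        # WCPS query: subset a multi-dimensional EOS-style coverage by latitude/longitude
        # and return the slice encoded as CSV.
        query = (
            'for $c in (CERES_NetFlux) '
            'return encode($c[Lat(30:40), Long(-100:-90)], "csv")'
        )

        start = time.perf_counter()
        response = requests.get(
            WCPS_ENDPOINT,
            params={
                "service": "WCS",
                "version": "2.0.1",
                "request": "ProcessCoverages",
                "query": query,
            },
            timeout=120,
        )
        response.raise_for_status()
        elapsed = time.perf_counter() - start
        print(f"retrieved {len(response.content)} bytes in {elapsed:.2f} s")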

  16. Preliminary Geologic Map of the Big Pine Mountain Quadrangle, California

    USGS Publications Warehouse

    Vedder, J.G.; McLean, Hugh; Stanley, R.G.

    1995-01-01

    Reconnaissance geologic mapping of the San Rafael Primitive Area (now the San Rafael Wilderness) by Gower and others (1966) and Vedder and others (1967) showed a number of stratigraphic and structural ambiguities. To help resolve some of those problems, additional field work was done on parts of the Big Pine Mountain quadrangle during short intervals in 1981, 1984, and 1990-1994.

  17. Big Data Goes Personal: Privacy and Social Challenges

    ERIC Educational Resources Information Center

    Bonomi, Luca

    2015-01-01

    The Big Data phenomenon is posing new challenges in our modern society. In addition to requiring information systems to effectively manage high-dimensional and complex data, the privacy and social implications associated with the data collection, data analytics, and service requirements create new important research problems. First, the high…

  18. Partnering with Big Pharma-What Academics Need to Know.

    PubMed

    Lipton, Stuart A; Nordstedt, Christer

    2016-04-21

    Knowledge of the parameters of drug development can greatly aid academic scientists hoping to partner with pharmaceutical companies. Here, we discuss the three major pillars of drug development-pharmacodynamics, pharmacokinetics, and toxicity studies-which, in addition to pre-clinical efficacy, are critical for partnering with Big Pharma to produce novel therapeutics.

  19. Big Sky Carbon Sequestration Partnership

    SciTech Connect

    Susan M. Capalbo

    2005-11-01

    state agencies concerned with GHG emissions, climate change, and efficient and environmentally-friendly energy production. In addition, the Partnership has plans for integration of our outreach efforts with students, especially at the tribal colleges and at the universities involved in our Partnership. This includes collaboration with MSU and with the U.S.-Norway Summer School, extended outreach efforts at LANL and INEEL, and with the student section of the ASME. Finally, the Big Sky Partnership was involved in key meetings and symposium in the 7th quarter including the USDOE Wye Institute Conference on Carbon Sequestration and Capture (April, 2005); the DOE/NETL Fourth Annual Conference on Carbon Capture and Sequestration (May 2005); Coal Power Development Conference (Denver, June 2005) and meetings with our Phase II industry partners and Governor's staff.

  20. [Big Data- challenges and risks].

    PubMed

    Krauß, Manuela; Tóth, Tamás; Hanika, Heinrich; Kozlovszky, Miklós; Dinya, Elek

    2015-12-06

    The term "Big Data" is commonly used to describe the growing mass of information being created recently. New conclusions can be drawn and new services can be developed by connecting, processing, and analyzing this information. This affects all aspects of life, including health and medicine. The authors review the application areas of Big Data and present examples from health and other areas. However, there are several preconditions for the effective use of these opportunities: a proper infrastructure and a well-defined regulatory environment, with particular emphasis on data protection and privacy. These issues, and the current actions toward their solution, are also presented.

  1. Mind the Scales: Harnessing Spatial Big Data for Infectious Disease Surveillance and Inference

    PubMed Central

    Lee, Elizabeth C.; Asher, Jason M.; Goldlust, Sandra; Kraemer, John D.; Lawson, Andrew B.; Bansal, Shweta

    2016-01-01

    Spatial big data have the velocity, volume, and variety of big data sources and contain additional geographic information. Digital data sources, such as medical claims, mobile phone call data records, and geographically tagged tweets, have entered infectious diseases epidemiology as novel sources of data to complement traditional infectious disease surveillance. In this work, we provide examples of how spatial big data have been used thus far in epidemiological analyses and describe opportunities for these sources to improve disease-mitigation strategies and public health coordination. In addition, we consider the technical, practical, and ethical challenges with the use of spatial big data in infectious disease surveillance and inference. Finally, we discuss the implications of the rising use of spatial big data in epidemiology to health risk communication, and public health policy recommendations and coordination across scales.

  2. Baryon symmetric big-bang cosmology. [matter-antimatter symmetry

    NASA Technical Reports Server (NTRS)

    Stecker, F. W.

    1978-01-01

    The framework of baryon-symmetric big-bang cosmology offers the greatest potential for deducing the evolution of the universe as a consequence of physical laws and processes with the minimum number of arbitrary assumptions as to initial conditions in the big-bang. In addition, it offers the possibility of explaining the photon-baryon ratio in the universe and how galaxies and galaxy clusters are formed, and also provides the only acceptable explanation at present for the origin of the cosmic gamma ray background radiation.

  3. Food additives

    MedlinePlus

    ... or natural. Natural food additives include: Herbs or spices to add flavor to foods Vinegar for pickling ... Certain colors improve the appearance of foods. Many spices, as well as natural and man-made flavors, ...

  4. True Randomness from Big Data

    NASA Astrophysics Data System (ADS)

    Papakonstantinou, Periklis A.; Woodruff, David P.; Yang, Guang

    2016-09-01

    Generating random bits is a difficult task, which is important for physical systems simulation, cryptography, and many applications that rely on high-quality random bits. Our contribution is to show how to generate provably random bits from uncertain events whose outcomes are routinely recorded in the form of massive data sets. These include scientific data sets, such as in astronomics, genomics, as well as data produced by individuals, such as internet search logs, sensor networks, and social network feeds. We view the generation of such data as the sampling process from a big source, which is a random variable of size at least a few gigabytes. Our view initiates the study of big sources in the randomness extraction literature. Previous approaches for big sources rely on statistical assumptions about the samples. We introduce a general method that provably extracts almost-uniform random bits from big sources and extensively validate it empirically on real data sets. The experimental findings indicate that our method is efficient enough to handle large enough sources, while previous extractor constructions are not efficient enough to be practical. Quality-wise, our method at least matches quantum randomness expanders and classical world empirical extractors as measured by standardized tests.
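
    As a toy illustration of the general idea of a seeded randomness extractor applied to a large data source (here a polynomial-evaluation universal hash in the spirit of the leftover hash lemma, a standard textbook construction rather than the method proposed in this record), the sketch below reads a placeholder file in blocks and emits a few near-uniform bits per block; the file name, block size, and output width are assumptions.

        import secrets

        P = (1 << 61) - 1      # Mersenne prime; arithmetic mod P defines the hash family
        OUT_BITS = 32          # extract far fewer bits than the entropy assumed per block
        BLOCK_BYTES = 1 << 20  # read the "big source" in 1 MiB blocks (placeholder size)

        def extract(block: bytes, key: int) -> int:
            """Polynomial-evaluation hash of the block over GF(P), truncated to OUT_BITS."""
            acc = 0
            for i in range(0, len(block), 7):  # 7 bytes = 56 bits, so every chunk is < P
                chunk = int.from_bytes(block[i:i + 7], "little")
                acc = (acc * key + chunk) % P
            return acc & ((1 << OUT_BITS) - 1)

        # The seed must be chosen independently of the source; here it comes from the OS RNG.
        key = secrets.randbelow(P - 1) + 1

        with open("big_source.bin", "rb") as src:  # placeholder file standing in for a big source
            while block := src.read(BLOCK_BYTES):
                print(format(extract(block, key), f"0{OUT_BITS}b"))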

  5. Big Opportunities in Small Science

    ERIC Educational Resources Information Center

    Dewey, T. Gregory

    2007-01-01

    A transformation is occurring that will have a major impact on how academic science is done and how scientists are trained. That transformation--driven by declining federal funds, as well as by the rising cost of technology and the need for costly, labor-intensive interdisciplinary approaches--is from small science to big science. It is…

  6. Big6 Turbotools and Synthesis

    ERIC Educational Resources Information Center

    Tooley, Melinda

    2005-01-01

    The different tools that are helpful during the Synthesis stage, their role in boosting students' abilities in Synthesis, and the way in which they can be customized to meet the needs of each group of students are discussed. Big6 TurboTools offers several tools to help complete the task. In the Synthesis stage, these same tools along with Turbo Report and…

  7. Big Explosives Experimental Facility - BEEF

    SciTech Connect

    2014-10-31

    The Big Explosives Experimental Facility or BEEF is a ten acre fenced high explosive testing facility that provides data to support stockpile stewardship and other national security programs. At BEEF conventional high explosives experiments are safely conducted providing sophisticated diagnostics such as high speed optics and x-ray radiography.

  8. Big Explosives Experimental Facility - BEEF

    ScienceCinema

    None

    2016-07-12

    The Big Explosives Experimental Facility or BEEF is a ten acre fenced high explosive testing facility that provides data to support stockpile stewardship and other national security programs. At BEEF conventional high explosives experiments are safely conducted providing sophisticated diagnostics such as high speed optics and x-ray radiography.

  9. Chinchilla "big" and "little" gastrins.

    PubMed

    Shinomura, Y; Eng, J; Yalow, R S

    1987-02-27

    Gastrin heptadecapeptides (gastrins I and II which differ in the presence of sulfate on the tyrosine of the latter) have been purified and sequenced from several mammalian species including pig, dog, cat, sheep, cow, human and rat. A 34 amino acid precursor ("big" gastrin), generally accounting for only 5% of total gastrin immunoreactivity, has been purified and sequenced only from the pig, human, dog and goat. Recently we have demonstrated that guinea pig (GP) "little" gastrin is a hexadecapeptide due to a deletion of a glutamic acid in the region 6-9 from its NH2-terminus and that GP "big" gastrin is a 33 amino acid peptide. The chinchilla, like the GP, is a New World hystricomorph. This report describes the extraction and purification of "little" and "big" gastrins from 31 chinchilla antra. Chinchilla "little" gastrin is a hexadecapeptide with a sequence identical to that of the GP and its "big" gastrin is a 33 amino acid peptide with the following sequence: (See text)

  10. 1976 Big Thompson flood, Colorado

    USGS Publications Warehouse

    Jarrett, R. D.; Vandas, S.J.

    2006-01-01

    In the early evening of July 31, 1976, a large stationary thunderstorm released as much as 7.5 inches of rainfall in about an hour (about 12 inches in a few hours) in the upper reaches of the Big Thompson River drainage. This large amount of rainfall in such a short period of time produced a flash flood that caught residents and tourists by surprise. The immense volume of water that churned down the narrow Big Thompson Canyon scoured the river channel and destroyed everything in its path, including 418 homes, 52 businesses, numerous bridges, paved and unpaved roads, power and telephone lines, and many other structures. The tragedy claimed the lives of 144 people. Scores of other people narrowly escaped with their lives. The Big Thompson flood ranks among the deadliest of Colorado's recorded floods. It is one of several destructive floods in the United States that has shown the necessity of conducting research to determine the causes and effects of floods. The U.S. Geological Survey (USGS) conducts research and operates a Nationwide streamgage network to help understand and predict the magnitude and likelihood of large streamflow events such as the Big Thompson Flood. Such research and streamgage information are part of an ongoing USGS effort to reduce flood hazards and to increase public awareness.

  11. The Case for "Big History."

    ERIC Educational Resources Information Center

    Christian, David

    1991-01-01

    Urges an approach to the teaching of history that takes the largest possible perspective, crossing time as well as space. Discusses the problems and advantages of such an approach. Describes a course on "big" history that begins with time, creation myths, and astronomy, and moves on to paleontology and evolution. (DK)

  12. The International Big History Association

    ERIC Educational Resources Information Center

    Duffy, Michael; Duffy, D'Neil

    2013-01-01

    IBHA, the International Big History Association, was organized in 2010 and "promotes the unified, interdisciplinary study and teaching of history of the Cosmos, Earth, Life, and Humanity." This is the vision that Montessori embraced long before the discoveries of modern science fleshed out the story of the evolving universe. "Big…

  13. China: Big Changes Coming Soon

    ERIC Educational Resources Information Center

    Rowen, Henry S.

    2011-01-01

    Big changes are ahead for China, probably abrupt ones. The economy has grown so rapidly for many years, over 30 years at an average of nine percent a year, that its size makes it a major player in trade and finance and increasingly in political and military matters. This growth is not only of great importance internationally, it is already having…

  14. True Randomness from Big Data

    PubMed Central

    Papakonstantinou, Periklis A.; Woodruff, David P.; Yang, Guang

    2016-01-01

    Generating random bits is a difficult task, which is important for physical systems simulation, cryptography, and many applications that rely on high-quality random bits. Our contribution is to show how to generate provably random bits from uncertain events whose outcomes are routinely recorded in the form of massive data sets. These include scientific data sets, such as in astronomics, genomics, as well as data produced by individuals, such as internet search logs, sensor networks, and social network feeds. We view the generation of such data as the sampling process from a big source, which is a random variable of size at least a few gigabytes. Our view initiates the study of big sources in the randomness extraction literature. Previous approaches for big sources rely on statistical assumptions about the samples. We introduce a general method that provably extracts almost-uniform random bits from big sources and extensively validate it empirically on real data sets. The experimental findings indicate that our method is efficient enough to handle large enough sources, while previous extractor constructions are not efficient enough to be practical. Quality-wise, our method at least matches quantum randomness expanders and classical world empirical extractors as measured by standardized tests. PMID:27666514

  15. True Randomness from Big Data.

    PubMed

    Papakonstantinou, Periklis A; Woodruff, David P; Yang, Guang

    2016-09-26

    Generating random bits is a difficult task, which is important for physical systems simulation, cryptography, and many applications that rely on high-quality random bits. Our contribution is to show how to generate provably random bits from uncertain events whose outcomes are routinely recorded in the form of massive data sets. These include scientific data sets, such as in astronomics, genomics, as well as data produced by individuals, such as internet search logs, sensor networks, and social network feeds. We view the generation of such data as the sampling process from a big source, which is a random variable of size at least a few gigabytes. Our view initiates the study of big sources in the randomness extraction literature. Previous approaches for big sources rely on statistical assumptions about the samples. We introduce a general method that provably extracts almost-uniform random bits from big sources and extensively validate it empirically on real data sets. The experimental findings indicate that our method is efficient enough to handle large enough sources, while previous extractor constructions are not efficient enough to be practical. Quality-wise, our method at least matches quantum randomness expanders and classical world empirical extractors as measured by standardized tests.

  16. How do we identify big rivers? And how big is big?

    NASA Astrophysics Data System (ADS)

    Miall, Andrew D.

    2006-04-01

    "Big rivers" are the trunk rivers that carry the water and sediment load from major orogens, or that drain large areas of a continent. Identifying such rivers in the ancient record is a challenge. Some guidance may be provided by tectonic setting and sedimentological evidence, including the scale of architectural elements, and clues from provenance studies, but such data are not infallible guides to river magnitude. The scale of depositional elements is the most obvious clue to channel size, but evidence is typically sparse and inadequate, and may be misleading. For example, thick fining-upward successions may be tectonic cyclothems. Two examples of the analysis of large ancient river systems are discussed here in order to highlight problems of methodology and interpretation. The Hawkesbury Sandstone (Triassic) of the Sydney Basin, Australia, is commonly cited as the deposit of a large river, on the basis of abundant very large-scale crossbedding. An examination of very large outcrops of this unit, including a coastal cliff section 6 km long near Sydney, showed that even with 100% exposure there are ambiguities in the determination of channel scale. It was concluded in this case that the channel dimensions of the Hawkesbury rivers were about half the size of the modern Brahmaputra River. The tectonic setting of a major ancient fluvial system is commonly not a useful clue to river scale. The Hawkesbury Sandstone is a system draining transversely from a cratonic source into a foreland basin, whereas most large rivers in foreland basins flow axially and are derived mainly from the orogenic uplifts (e.g., the large tidally influenced rivers of the Athabasca Oil Sands, Alberta). Epeirogenic tilting of a continent by the dynamic topography process may generate drainages in unexpected directions. For example, analyses of detrital zircons in Upper Paleozoic-Mesozoic nonmarine successions in the SW United States suggests significant derivation from the Appalachian orogen

  17. A SWOT Analysis of Big Data

    ERIC Educational Resources Information Center

    Ahmadi, Mohammad; Dileepan, Parthasarati; Wheatley, Kathleen K.

    2016-01-01

    This is the decade of data analytics and big data, but not everyone agrees with the definition of big data. Some researchers see it as the future of data analysis, while others consider it as hype and foresee its demise in the near future. No matter how it is defined, big data for the time being is having its glory moment. The most important…

  18. Big Data: Implications for Health System Pharmacy.

    PubMed

    Stokes, Laura B; Rogers, Joseph W; Hertig, John B; Weber, Robert J

    2016-07-01

    Big Data refers to datasets that are so large and complex that traditional methods and hardware for collecting, sharing, and analyzing them are not possible. Big Data that is accurate leads to more confident decision making, improved operational efficiency, and reduced costs. The rapid growth of health care information results in Big Data around health services, treatments, and outcomes, and Big Data can be used to analyze the benefit of health system pharmacy services. The goal of this article is to provide a perspective on how Big Data can be applied to health system pharmacy. It will define Big Data, describe the impact of Big Data on population health, review specific implications of Big Data in health system pharmacy, and describe an approach for pharmacy leaders to effectively use Big Data. A few strategies involved in managing Big Data in health system pharmacy include identifying potential opportunities for Big Data, prioritizing those opportunities, protecting privacy concerns, promoting data transparency, and communicating outcomes. As health care information expands in its content and becomes more integrated, Big Data can enhance the development of patient-centered pharmacy services.

  19. Big sagebrush seed bank densities following wildfires

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Big sagebrush (Artemisia spp.) is a critical shrub to many wildlife species including sage grouse (Centrocercus urophasianus), mule deer (Odocoileus hemionus), and pygmy rabbit (Brachylagus idahoensis). Big sagebrush is killed by wildfires and big sagebrush seed is generally short-lived and does not s...

  20. Big Sagebrush Seed Bank Densities Following Wildfires

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Big sagebrush (Artemisia sp.) is a critical shrub to such sagebrush obligate species as sage grouse (Centrocercus urophasianus), mule deer (Odocoileus hemionus), and pygmy rabbit (Brachylagus idahoensis). Big sagebrush does not sprout after wildfires and big sagebrush seed is generally sho...

  1. Judging Big Deals: Challenges, Outcomes, and Advice

    ERIC Educational Resources Information Center

    Glasser, Sarah

    2013-01-01

    This article reports the results of an analysis of five Big Deal electronic journal packages to which Hofstra University's Axinn Library subscribes. COUNTER usage reports were used to judge the value of each Big Deal. Limitations of usage statistics are also discussed. In the end, the author concludes that four of the five Big Deals are good deals…

  2. A survey of big data research

    PubMed Central

    Fang, Hua; Zhang, Zhaoyang; Wang, Chanpaul Jin; Daneshmand, Mahmoud; Wang, Chonggang; Wang, Honggang

    2015-01-01

    Big data create values for business and research, but pose significant challenges in terms of networking, storage, management, analytics and ethics. Multidisciplinary collaborations from engineers, computer scientists, statisticians and social scientists are needed to tackle, discover and understand big data. This survey presents an overview of big data initiatives, technologies and research in industries and academia, and discusses challenges and potential solutions. PMID:26504265

  3. A Spectrograph for BigBOSS

    NASA Astrophysics Data System (ADS)

    CARTON, Pierre-Henri; Bebek, C.; Cazaux, S.; Ealet, A.; Eppelle, D.; Kneib, J.; Karst, P.; levi, M.; magneville, C.; Palanque-Delabrouille, N.; Ruhlmann-Kleider, V.; Schlegel, D.; Yeche, C.

    2012-01-01

    The BigBOSS spectrograph assembly will handle the light from the fiber output to the detector, including the optics, gratings, mechanics, and cryostats. The 5000 fibers are split into 10 bundles of 500 fibers, and each bundle feeds one spectrograph. The full bandwidth from 0.36 µm to 1.05 µm is split into 3 bands. Each channel is composed of a collimator (doublet lenses), a VPH grating, and a 6-lens camera. The 500 fiber spectra are imaged onto a 4k x 4k detector by the F/2 camera, with each fiber core imaged onto 4 pixels. Each channel of the BigBOSS spectrograph will be equipped with a single-CCD camera, resulting in 30 cryostats in total for the instrument. Based on its experience with CCD cameras for projects like EROS and MegaCam, CEA/Saclay has designed small and autonomous cryogenic vessels which integrate cryo-cooling, CCD positioning, and slow-control interfacing capabilities. The use of a linear pulse tube with its own control unit, both developed by Thales Cryogenics BV, will ensure versatility, reliability, and operational flexibility. CCDs will be cooled down to 140 K, with stability better than 1 K, and will be positioned within 15 µm along the optical axis and 50 µm in the XY plane. Slow-control machines will be directly interfaced to an Ethernet network, which will allow them to be operated remotely. The spectrograph design is very robust, with no moving mechanics except the shutters, and the 30 channels are impressively compact within a 3 m3 volume. Developing this number of channels will drive a quasi mass-production philosophy.

  4. Big Fat Wand: A Pointing Device for Open Space Edutainment

    NASA Astrophysics Data System (ADS)

    Takahashi, Toru; Namatame, Miki; Kusunoki, Fusako; Terano, Takao

    This paper presents the principles, functions, and experiments of a new edutainment tool: the Big Fat Wand (BFW). BFW is developed from a conventional laser show device; however, it is made small enough to be used in an open space. BFW is connected to a laptop PC, which provides character and symbol images, and/or animations. From the experimental results, we conclude that BFW is a good tool for a facilitator to educate hearing-impaired students.

  5. Big Bang Cosmic Titanic: Cause for Concern?

    NASA Astrophysics Data System (ADS)

    Gentry, Robert

    2013-04-01

    This abstract alerts physicists to a situation that, unless soon addressed, may yet affect PRL integrity. I refer to Stanley Brown's and DAE Robert Caldwell's rejection of PRL submission LJ12135, A Cosmic Titanic: Big Bang Cosmology Unravels Upon Discovery of Serious Flaws in Its Foundational Expansion Redshift Assumption, by their claim that BB is an established theory while ignoring our paper's Titanic, namely, that BB's foundational spacetime expansion redshifts assumption has now been proven to be irrefutably false because it is contradicted by our seminal discovery that GPS operation unequivocally proves that GR effects do not produce in-flight photon wavelength changes demanded by this central assumption. This discovery causes the big bang to collapse as quickly as did Ptolemaic cosmology when Copernicus discovered its foundational assumption was heliocentric, not geocentric. Additional evidence that something is amiss in PRL's treatment of LJ12135 comes from both Brown and EiC Gene Sprouse agreeing to meet at my exhibit during last year's Atlanta APS to discuss this cover-up issue. Sprouse kept his commitment; Brown didn't. Question: If Brown could have refuted my claim of a cover-up, why didn't he come to present it before Gene Sprouse? I am appealing LJ12135's rejection.

  6. Big Bang Day : The Great Big Particle Adventure - 3. Origins

    ScienceCinema

    None

    2016-07-12

    In this series, comedian and physicist Ben Miller asks the CERN scientists what they hope to find. If the LHC is successful, it will explain the nature of the Universe around us in terms of a few simple ingredients and a few simple rules. But the Universe now was forged in a Big Bang where conditions were very different, and the rules were very different, and those early moments were crucial to determining how things turned out later. At the LHC they can recreate conditions as they were billionths of a second after the Big Bang, before atoms and nuclei existed. They can find out why matter and antimatter didn't mutually annihilate each other to leave behind a Universe of pure, brilliant light. And they can look into the very structure of space and time - the fabric of the Universe

  7. Antigravity and the big crunch/big bang transition

    NASA Astrophysics Data System (ADS)

    Bars, Itzhak; Chen, Shih-Hung; Steinhardt, Paul J.; Turok, Neil

    2012-08-01

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  8. Big Bang Day : The Great Big Particle Adventure - 3. Origins

    SciTech Connect

    2009-10-13

    In this series, comedian and physicist Ben Miller asks the CERN scientists what they hope to find. If the LHC is successful, it will explain the nature of the Universe around us in terms of a few simple ingredients and a few simple rules. But the Universe now was forged in a Big Bang where conditions were very different, and the rules were very different, and those early moments were crucial to determining how things turned out later. At the LHC they can recreate conditions as they were billionths of a second after the Big Bang, before atoms and nuclei existed. They can find out why matter and antimatter didn't mutually annihilate each other to leave behind a Universe of pure, brilliant light. And they can look into the very structure of space and time - the fabric of the Universe

  9. Solution of a braneworld big crunch/big bang cosmology

    SciTech Connect

    McFadden, Paul L.; Turok, Neil; Steinhardt, Paul J.

    2007-11-15

    We solve for the cosmological perturbations in a five-dimensional background consisting of two separating or colliding boundary branes, as an expansion in the collision speed V divided by the speed of light c. Our solution permits a detailed check of the validity of four-dimensional effective theory in the vicinity of the event corresponding to the big crunch/big bang singularity. We show that the four-dimensional description fails at the first nontrivial order in (V/c){sup 2}. At this order, there is nontrivial mixing of the two relevant four-dimensional perturbation modes (the growing and decaying modes) as the boundary branes move from the narrowly separated limit described by Kaluza-Klein theory to the well-separated limit where gravity is confined to the positive-tension brane. We comment on the cosmological significance of the result and compute other quantities of interest in five-dimensional cosmological scenarios.

  10. Potlining Additives

    SciTech Connect

    Rudolf Keller

    2004-08-10

    In this project, a concept to improve the performance of aluminum production cells by introducing potlining additives was examined and tested. Boron oxide was added to cathode blocks, and titanium was dissolved in the metal pool; this resulted in the formation of titanium diboride and caused the molten aluminum to wet the carbonaceous cathode surface. Such wetting reportedly leads to operational improvements and extended cell life. In addition, boron oxide suppresses cyanide formation. This final report presents and discusses the results of this project. Substantial economic benefits for the practical implementation of the technology are projected, especially for modern cells with graphitized blocks. For example, with an energy savings of about 5% and an increase in pot life from 1500 to 2500 days, a cost savings of $ 0.023 per pound of aluminum produced is projected for a 200 kA pot.

  11. Phosphazene additives

    DOEpatents

    Harrup, Mason K; Rollins, Harry W

    2013-11-26

    An additive comprising a phosphazene compound that has at least two reactive functional groups and at least one capping functional group bonded to phosphorus atoms of the phosphazene compound. One of the at least two reactive functional groups is configured to react with cellulose and the other of the at least two reactive functional groups is configured to react with a resin, such as an amine resin or a polycarboxylic acid resin. The at least one capping functional group is selected from the group consisting of a short chain ether group, an alkoxy group, or an aryloxy group. Also disclosed are an additive-resin admixture, a method of treating a wood product, and a wood product.

  12. Better big data.

    PubMed

    Al Kazzi, Elie S; Hutfless, Susan

    2015-01-01

    By 2018, Medicare payments will be tied to quality of care. The Centers for Medicare and Medicaid Services currently use quality-based metrics for some reimbursements through their different programs. Existing and future quality metrics will rely on risk adjustment to avoid unfairly punishing those who see the sickest, highest-risk patients. Despite the limitations of the data used for risk adjustment, there are potential solutions to improve the accuracy of these codes, such as calibrating data by merging databases and compiling information collected for multiple reporting programs. In addition, healthcare staff should be informed about the importance of risk adjustment for quality of care assessment and reimbursement. As the number of encounters tied to value-based reimbursements increases in inpatient and outpatient care, coupled with accurate data collection and utilization, the methods used for risk adjustment could be expanded to better account for differences in the care delivered in diverse settings.

  13. Big Questions: Dark Matter

    ScienceCinema

    Lincoln, Don

    2016-07-12

    Carl Sagan's oft-quoted statement that there are "billions and billions" of stars in the cosmos gives an idea of just how much "stuff" is in the universe. However scientists now think that in addition to the type of matter with which we are familiar, there is another kind of matter out there. This new kind of matter is called "dark matter" and there seems to be five times as much as ordinary matter. Dark matter interacts only with gravity, thus light simply zips right by it. Scientists are searching through their data, trying to prove that the dark matter idea is real. Fermilab's Dr. Don Lincoln tells us why we think this seemingly-crazy idea might not be so crazy after all.

  14. Big data and ophthalmic research.

    PubMed

    Clark, Antony; Ng, Jonathon Q; Morlet, Nigel; Semmens, James B

    2016-01-01

    Large population-based health administrative databases, clinical registries, and data linkage systems are a rapidly expanding resource for health research. Ophthalmic research has benefited from the use of these databases in expanding the breadth of knowledge in areas such as disease surveillance, disease etiology, health services utilization, and health outcomes. Furthermore, the quantity of data available for research has increased exponentially in recent times, particularly as e-health initiatives come online in health systems across the globe. We review some big data concepts, the databases and data linkage systems used in eye research, including their advantages and limitations, the types of studies previously undertaken, and the future direction for big data in eye research.

  15. Image Data Mining for Pattern Classification and Visualization of Morphological Changes in Brain MR Images.

    PubMed

    Murakawa, Saki; Ikuta, Rie; Uchiyama, Yoshikazu; Shiraishi, Junji

    2016-02-01

    Hospital information systems (HISs) and picture archiving and communication systems (PACSs) are archiving large amounts of data (i.e., "big data") that are not being used. Therefore, many research projects in progress are trying to use "big data" for the development of early diagnosis, prediction of disease onset, and personalized therapies. In this study, we propose a new method for image data mining to identify regularities and abnormalities in large image data sets. We used 70 archived magnetic resonance (MR) images that were acquired using three-dimensional magnetization-prepared rapid acquisition with gradient echo (3D MP-RAGE). These images were obtained from the Alzheimer's Disease Neuroimaging Initiative (ADNI) database. For anatomical standardization of the data, we used the statistical parametric mapping (SPM) software. Using a similarity matrix based on cross-correlation coefficients (CCs) calculated from an anatomical region and a hierarchical clustering technique, we classified all the abnormal cases into five groups. The Z score map identified the difference between a standard normal brain and each of those from the Alzheimer's groups. In addition, the scatter plot obtained from two similarity matrices visualized the regularities and abnormalities in the image data sets. Image features identified using our method could be useful for understanding image findings associated with Alzheimer's disease.
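
    A minimal sketch of the clustering step as described (a cross-correlation similarity matrix over an anatomical region, hierarchical clustering into five groups, and a simple Z-score map against a normal reference) is given below; it assumes the images are already anatomically standardized to a common shape, and all file, array, and variable names are placeholders rather than details from the study.

        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster
        from scipy.spatial.distance import squareform

        # abnormal_roi: (n_cases, n_voxels) ROI intensities from anatomically standardized MR images
        # normal_roi:   (n_controls, n_voxels) the same ROI from a normal reference group
        abnormal_roi = np.load("abnormal_roi.npy")   # placeholder file names
        normal_roi = np.load("normal_roi.npy")

        # Similarity matrix of cross-correlation coefficients between every pair of abnormal cases.
        cc = np.corrcoef(abnormal_roi)               # shape (n_cases, n_cases), values in [-1, 1]

        # Convert similarity to a distance, then cluster hierarchically into five groups.
        dist = 1.0 - cc
        np.fill_diagonal(dist, 0.0)
        condensed = squareform(dist, checks=False)
        labels = fcluster(linkage(condensed, method="average"), t=5, criterion="maxclust")

        # Voxel-wise Z-score map of each group mean against the normal reference.
        mu = normal_roi.mean(axis=0)
        sigma = normal_roi.std(axis=0, ddof=1)
        sigma[sigma == 0] = np.finfo(float).eps      # guard against division by zero
        for g in range(1, 6):
            z_map = (abnormal_roi[labels == g].mean(axis=0) - mu) / sigma
            print(f"group {g}: {np.sum(labels == g)} cases, max |Z| = {np.abs(z_map).max():.2f}")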

  16. Statistical Inference: The Big Picture.

    PubMed

    Kass, Robert E

    2011-02-01

    Statistics has moved beyond the frequentist-Bayesian controversies of the past. Where does this leave our ability to interpret results? I suggest that a philosophy compatible with statistical practice, labelled here statistical pragmatism, serves as a foundation for inference. Statistical pragmatism is inclusive and emphasizes the assumptions that connect statistical models with observed data. I argue that introductory courses often mis-characterize the process of statistical inference and I propose an alternative "big picture" depiction.

  17. District Bets Big on Standards

    ERIC Educational Resources Information Center

    Gewertz, Catherine

    2013-01-01

    The big clock in Dowan McNair-Lee's 8th grade classroom in the Stuart-Hobson Middle School is silent, but she can hear the minutes ticking away nonetheless. On this day, like any other, the clock is a constant reminder of how little time she has to prepare her students--for spring tests, and for high school and all that lies beyond it. The…

  18. EHR Big Data Deep Phenotyping

    PubMed Central

    Lenert, L.; Lopez-Campos, G.

    2014-01-01

    Summary. Objectives: Given the quickening speed of discovery of variant disease drivers from combined patient genotype and phenotype data, the objective is to provide methodology using big data technology to support the definition of deep phenotypes in medical records. Methods: As the vast stores of genomic information increase with next generation sequencing, the importance of deep phenotyping increases. The growth of genomic data and adoption of Electronic Health Records (EHR) in medicine provides a unique opportunity to integrate phenotype and genotype data into medical records. The method by which collections of clinical findings and other health related data are leveraged to form meaningful phenotypes is an active area of research. Longitudinal data stored in EHRs provide a wealth of information that can be used to construct phenotypes of patients. We focus on a practical problem around data integration for deep phenotype identification within EHR data. Big data approaches are described that enable scalable markup of EHR events, which can be used for semantic and temporal similarity analysis to support the identification of phenotype and genotype relationships. Conclusions: Stead and colleagues’ 2005 concept of using light standards to increase the productivity of software systems by riding on the wave of hardware/processing power is described as a harbinger for designing future healthcare systems. The big data solution, using flexible markup, provides a route to improved utilization of processing power for organizing patient records in genotype and phenotype research. PMID:25123744

  19. Empowering Personalized Medicine with Big Data and Semantic Web Technology: Promises, Challenges, and Use Cases.

    PubMed

    Panahiazar, Maryam; Taslimitehrani, Vahid; Jadhav, Ashutosh; Pathak, Jyotishman

    2014-10-01

    In healthcare, big data tools and technologies have the potential to create significant value by improving outcomes while lowering costs for each individual patient. Diagnostic images, genetic test results, and biometric information are increasingly generated and stored in electronic health records, presenting us with challenges from data that are by nature high in volume, variety, and velocity, thereby necessitating novel ways to store, manage, and process big data. This presents an urgent need to develop new, scalable and expandable big data infrastructure and analytical methods that can enable healthcare providers to access knowledge for the individual patient, yielding better decisions and outcomes. In this paper, we briefly discuss the nature of big data and the role of the semantic web and data analysis for generating "smart data", which offer actionable information that supports better decisions for personalized medicine. In our view, the biggest challenge is to create a system that makes big data robust and smart for healthcare providers and patients, one that can lead to more effective clinical decision-making, improved health outcomes, and, ultimately, managed healthcare costs. We highlight some of the challenges in using big data and propose the need for a semantic data-driven environment to address them. We illustrate our vision with practical use cases, and discuss a path for empowering personalized medicine using big data and semantic web technology.

  20. Turning big bang into big bounce. I. Classical dynamics

    SciTech Connect

    Dzierzak, Piotr; Malkiewicz, Przemyslaw; Piechocki, Wlodzimierz

    2009-11-15

    The big bounce (BB) transition within a flat Friedmann-Robertson-Walker model is analyzed in the setting of loop geometry underlying the loop cosmology. We solve the constraint of the theory at the classical level to identify physical phase space and find the Lie algebra of the Dirac observables. We express energy density of matter and geometrical functions in terms of the observables. It is the modification of classical theory by the loop geometry that is responsible for BB. The classical energy scale specific to BB depends on a parameter that should be fixed either by cosmological data or determined theoretically at quantum level, otherwise the energy scale stays unknown.

  1. A survey on platforms for big data analytics.

    PubMed

    Singh, Dilpreet; Reddy, Chandan K

    The primary purpose of this paper is to provide an in-depth analysis of different platforms available for performing big data analytics. This paper surveys different hardware platforms available for big data analytics and assesses the advantages and drawbacks of each of these platforms based on various metrics such as scalability, data I/O rate, fault tolerance, real-time processing, data size supported and iterative task support. In addition to the hardware, a detailed description of the software frameworks used within each of these platforms is also discussed along with their strengths and drawbacks. Some of the critical characteristics described here can potentially aid the readers in making an informed decision about the right choice of platforms depending on their computational needs. Using a star ratings table, a rigorous qualitative comparison between different platforms is also discussed for each of the six characteristics that are critical for the algorithms of big data analytics. In order to provide more insights into the effectiveness of each of the platforms in the context of big data analytics, specific implementation level details of the widely used k-means clustering algorithm on various platforms are also described in the form of pseudocode.
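
    As a point of reference for the kind of algorithm the survey maps onto each platform, a minimal single-node k-means sketch (Lloyd's iterations) in Python/NumPy is shown below; the data set and parameter values are placeholders, and the survey's own platform-specific pseudocode is not reproduced here.

        import numpy as np

        def kmeans(points: np.ndarray, k: int, iters: int = 100, seed: int = 0):
            """Plain Lloyd's algorithm: alternate assignment and centroid-update steps."""
            rng = np.random.default_rng(seed)
            centroids = points[rng.choice(len(points), size=k, replace=False)]
            for _ in range(iters):
                # Assignment step: nearest centroid for every point (squared Euclidean distance).
                dists = ((points[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
                labels = dists.argmin(axis=1)
                # Update step: each centroid moves to the mean of its assigned points.
                new_centroids = np.array([
                    points[labels == j].mean(axis=0) if np.any(labels == j) else centroids[j]
                    for j in range(k)
                ])
                if np.allclose(new_centroids, centroids):
                    break
                centroids = new_centroids
            return centroids, labels

        # Example usage with a placeholder dataset of 100,000 points in 8 dimensions.
        data = np.random.default_rng(1).normal(size=(100_000, 8))
        centers, assignments = kmeans(data, k=10)
        print(centers.shape, np.bincount(assignments))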

  2. Some experiences and opportunities for big data in translational research

    PubMed Central

    Chute, Christopher G.; Ullman-Cullere, Mollie; Wood, Grant M.; Lin, Simon M.; He, Min; Pathak, Jyotishman

    2014-01-01

    Health care has become increasingly information intensive. The advent of genomic data, integrated into patient care, significantly accelerates the complexity and amount of clinical data. Translational research in the present day increasingly embraces new biomedical discovery in this data-intensive world, thus entering the domain of “big data.” The Electronic Medical Records and Genomics consortium has taught us many lessons, while simultaneously advances in commodity computing methods enable the academic community to affordably manage and process big data. Although great promise can emerge from the adoption of big data methods and philosophy, the heterogeneity and complexity of clinical data, in particular, pose additional challenges for big data inferencing and clinical application. However, the ultimate comparability and consistency of heterogeneous clinical information sources can be enhanced by existing and emerging data standards, which promise to bring order to clinical data chaos. Meaningful Use data standards in particular have already simplified the task of identifying clinical phenotyping patterns in electronic health records. PMID:24008998

  3. “Big Data” and the Electronic Health Record

    PubMed Central

    Ross, M. K.; Wei, Wei

    2014-01-01

    Summary. Objectives: Implementation of Electronic Health Record (EHR) systems continues to expand. The massive number of patient encounters results in high amounts of stored data. Transforming clinical data into knowledge to improve patient care has been the goal of biomedical informatics professionals for many decades, and this work is now increasingly recognized outside our field. In reviewing the literature for the past three years, we focus on “big data” in the context of EHR systems and we report on some examples of how secondary use of data has been put into practice. Methods: We searched the PubMed database for articles from January 1, 2011 to November 1, 2013. We initiated the search with keywords related to “big data” and EHR. We identified relevant articles, and additional keywords from the retrieved articles were added. Based on the new keywords, more articles were retrieved and we manually narrowed down the set utilizing predefined inclusion and exclusion criteria. Results: Our final review includes articles categorized into the themes of data mining (pharmacovigilance, phenotyping, natural language processing), data application and integration (clinical decision support, personal monitoring, social media), and privacy and security. Conclusion: The increasing adoption of EHR systems worldwide makes it possible to capture large amounts of clinical data. There is an increasing number of articles addressing the theme of “big data”, and the concepts associated with these articles vary. The next step is to transform healthcare big data into actionable knowledge. PMID:25123728

  4. A Solution to ``Too Big to Fail''

    NASA Astrophysics Data System (ADS)

    Kohler, Susanna

    2016-10-01

    It's a tricky business to reconcile simulations of our galaxy's formation with our current observations of the Milky Way and its satellites. In a recent study, scientists have addressed one discrepancy between simulations and observations: the so-called "too big to fail" problem. From Missing Satellites to Too Big to Fail: The favored model of the universe is the lambda-cold-dark-matter (ΛCDM) cosmological model. This model does a great job of correctly predicting the large-scale structure of the universe, but there are still a few problems with it on smaller scales. [Image caption: Hubble image of UGC 5497, a dwarf galaxy associated with Messier 81. In the missing satellite problem, simulations of galaxy formation predict that there should be more such satellite galaxies than we observe. ESA/NASA] The first is the missing satellites problem: ΛCDM cosmology predicts that galaxies like the Milky Way should have significantly more satellite galaxies than we observe. A proposed solution to this problem is the argument that there may exist many more satellites than we've observed, but these dwarf galaxies have had their stars stripped from them during tidal interactions, which prevents us from being able to see them. This solution creates a new problem, though: the too-big-to-fail problem. This problem states that many of the satellites predicted by ΛCDM cosmology are simply so massive that there's no way they couldn't have visible stars. Another way of looking at it: the observed satellites of the Milky Way are not massive enough to be consistent with predictions from ΛCDM. [Image caption: Artist's illustration of a supernova, a type of stellar feedback that can modify the dark-matter distribution of a satellite galaxy. NASA/CXC/M. Weiss] Density Profiles and Tidal Stirring: Led by Mihai Tomozeiu (University of Zurich), a team of scientists has published a study in which they propose a solution to the too-big-to-fail problem. By running detailed cosmological zoom simulations of our galaxy's formation, Tomozeiu and

  5. Big data and clinical research: focusing on the area of critical care medicine in mainland China.

    PubMed

    Zhang, Zhongheng

    2014-10-01

    Big data has long since found its way into clinical practice with the advent of the information technology era. Medical records and follow-up data can be stored and extracted more efficiently with information technology. Immediately after admission, a patient produces a large amount of data including laboratory findings, medications, fluid balance, progress notes, and imaging findings. Clinicians and clinical investigators should make every effort to make full use of the big data that is being continuously generated by electronic medical record (EMR) systems and other healthcare databases. At this stage, more training courses on data management and statistical analysis are required before clinicians and clinical investigators can handle big data and translate it into advances in medical science. China is a large country with a population of 1.3 billion and can contribute greatly to clinical research by providing reliable and high-quality big data.

  6. Big data and clinical research: focusing on the area of critical care medicine in mainland China

    PubMed Central

    2014-01-01

    Big data has long since found its way into clinical practice with the advent of the information technology era. Medical records and follow-up data can be stored and extracted more efficiently with information technology. Immediately after admission, a patient produces a large amount of data including laboratory findings, medications, fluid balance, progress notes, and imaging findings. Clinicians and clinical investigators should make every effort to make full use of the big data that is being continuously generated by electronic medical record (EMR) systems and other healthcare databases. At this stage, more training courses on data management and statistical analysis are required before clinicians and clinical investigators can handle big data and translate it into advances in medical science. China is a large country with a population of 1.3 billion and can contribute greatly to clinical research by providing reliable and high-quality big data. PMID:25392827

  7. Big Machines and Big Science: 80 Years of Accelerators at Stanford

    SciTech Connect

    Loew, Gregory

    2008-12-16

    Longtime SLAC physicist Greg Loew will present a trip through SLAC's origins, highlighting its scientific achievements, and provide a glimpse of the lab's future in 'Big Machines and Big Science: 80 Years of Accelerators at Stanford.'

  8. Big data: an introduction for librarians.

    PubMed

    Hoy, Matthew B

    2014-01-01

    Modern life produces data at an astounding rate and shows no signs of slowing. This has led to new advances in data storage and analysis and the concept of "big data," that is, massive data sets that can yield surprising insights when analyzed. This column will briefly describe what big data is and why it is important. It will also briefly explore the possibilities and problems of big data and the implications it has for librarians. A list of big data projects and resources is also included.

  9. Medical big data: promise and challenges.

    PubMed

    Lee, Choong Ho; Yoon, Hyung-Jin

    2017-03-01

    The concept of big data, commonly characterized by volume, variety, velocity, and veracity, goes far beyond the data type and includes aspects of data analysis, such as hypothesis generating rather than hypothesis testing. Big data analysis focuses on the temporal stability of associations rather than on causal relationships, and underlying probability distribution assumptions are frequently not required. Medical big data as material to be analyzed has various features that are not only distinct from big data of other disciplines, but also distinct from traditional clinical epidemiology. Big data technology has many areas of application in healthcare, such as predictive modeling and clinical decision support, disease or safety surveillance, public health, and research. Big data analytics frequently exploits analytic methods developed in data mining, including classification, clustering, and regression. Medical big data analyses are complicated by many technical issues, such as missing values, the curse of dimensionality, and bias control, and they share the inherent limitations of observational studies, namely the inability to test causality because of residual confounding and reverse causation. Recently, propensity score analysis and instrumental variable analysis have been introduced to overcome these limitations, and they have accomplished a great deal. Many challenges, such as the absence of evidence of the practical benefits of big data, methodological issues including legal and ethical issues, and clinical integration and utility issues, must be overcome to realize the promise of medical big data as the fuel of a continuous learning healthcare system that will improve patient outcomes and reduce waste in areas including nephrology.

  10. Traffic information computing platform for big data

    SciTech Connect

    Duan, Zongtao Li, Ying Zheng, Xibin Liu, Yan Dai, Jiting Kang, Jun

    2014-10-06

    The big data environment creates the data conditions needed to improve the quality of traffic information services. The goal of this article is to construct a traffic information computing platform for the big data environment. Through an in-depth analysis of the characteristics of big data and traffic information services, a distributed traffic atomic information computing platform architecture is proposed. Under the big data environment, this type of traffic atomic information computing architecture helps guarantee traffic safety and efficient operation, and more intelligent and personalized traffic information services can be offered to traffic information users.

  11. Medical big data: promise and challenges

    PubMed Central

    Lee, Choong Ho; Yoon, Hyung-Jin

    2017-01-01

    The concept of big data, commonly characterized by volume, variety, velocity, and veracity, goes far beyond the data type and includes aspects of data analysis, such as hypothesis generating rather than hypothesis testing. Big data analysis focuses on the temporal stability of associations rather than on causal relationships, and underlying probability distribution assumptions are frequently not required. Medical big data as material to be analyzed has various features that are not only distinct from big data of other disciplines, but also distinct from traditional clinical epidemiology. Big data technology has many areas of application in healthcare, such as predictive modeling and clinical decision support, disease or safety surveillance, public health, and research. Big data analytics frequently exploits analytic methods developed in data mining, including classification, clustering, and regression. Medical big data analyses are complicated by many technical issues, such as missing values, the curse of dimensionality, and bias control, and they share the inherent limitations of observational studies, namely the inability to test causality because of residual confounding and reverse causation. Recently, propensity score analysis and instrumental variable analysis have been introduced to overcome these limitations, and they have accomplished a great deal. Many challenges, such as the absence of evidence of the practical benefits of big data, methodological issues including legal and ethical issues, and clinical integration and utility issues, must be overcome to realize the promise of medical big data as the fuel of a continuous learning healthcare system that will improve patient outcomes and reduce waste in areas including nephrology. PMID:28392994

  12. The LHC's Next Big Mystery

    NASA Astrophysics Data System (ADS)

    Lincoln, Don

    2015-03-01

    When the sun rose over America on July 4, 2012, the world of science had radically changed. The Higgs boson had been discovered. Mind you, the press releases were more cautious than that, with "a new particle consistent with being the Higgs boson" being the carefully constructed phrase of the day. But, make no mistake, champagne corks were popped and backs were slapped. The data had spoken and a party was in order. Even if the observation turned out to be something other than the Higgs boson, the first big discovery from data taken at the Large Hadron Collider had been made.

  13. The faces of Big Science.

    PubMed

    Schatz, Gottfried

    2014-06-01

    Fifty years ago, academic science was a calling with few regulations or financial rewards. Today, it is a huge enterprise confronted by a plethora of bureaucratic and political controls. This change was not triggered by specific events or decisions but reflects the explosive 'knee' in the exponential growth that science has sustained during the past three-and-a-half centuries. Coming to terms with the demands and benefits of 'Big Science' is a major challenge for today's scientific generation. Since its foundation 50 years ago, the European Molecular Biology Organization (EMBO) has been of invaluable help in meeting this challenge.

  14. Big bang nucleosynthesis: An update

    SciTech Connect

    Olive, Keith A.

    2013-07-23

    An update on the standard model of big bang nucleosynthesis (BBN) is presented. With the value of the baryon-to-photon ratio determined to high precision by WMAP, standard BBN is a parameter-free theory. In this context, the theoretical prediction for the abundances of D, {sup 4}He, and {sup 7}Li is discussed and compared to their observational determination. While concordance for D and {sup 4}He is satisfactory, the prediction for {sup 7}Li exceeds the observational determination by a factor of about four. Possible solutions to this problem are discussed.

  15. Comparative Validity of Brief to Medium-Length Big Five and Big Six Personality Questionnaires

    ERIC Educational Resources Information Center

    Thalmayer, Amber Gayle; Saucier, Gerard; Eigenhuis, Annemarie

    2011-01-01

    A general consensus on the Big Five model of personality attributes has been highly generative for the field of personality psychology. Many important psychological and life outcome correlates with Big Five trait dimensions have been established. But researchers must choose between multiple Big Five inventories when conducting a study and are…

  16. Fires Burning near Big Sur, California

    NASA Technical Reports Server (NTRS)

    2008-01-01

    Fires near Big Sur, Calif., continued to burn unchecked when the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) instrument on NASA's Terra satellite captured this image on Sunday, June 29. In Northern California alone, fires have consumed more than 346,000 acres. At least 18,000 people have deployed to attempt to extinguish or control the flames. Air quality as far away as San Francisco has been adversely impacted by the dense clouds of smoke and ash blowing towards the northwest. The satellite image combines a natural color portrayal of the landscape with thermal infrared data showing the active burning areas in red. The dark area in the lower right is a previous forest fire.

    ASTER is one of five Earth-observing instruments launched December 18, 1999, on NASA's Terra satellite. The instrument was built by Japan's Ministry of Economy, Trade and Industry. A joint U.S./Japan science team is responsible for validation and calibration of the instrument and the data products.

    The broad spectral coverage and high spectral resolution of ASTER provide scientists in numerous disciplines with critical information for surface mapping and monitoring of dynamic conditions and temporal change. Example applications are: monitoring glacial advances and retreats; monitoring potentially active volcanoes; identifying crop stress; determining cloud morphology and physical properties; wetlands evaluation; thermal pollution monitoring; coral reef degradation; surface temperature mapping of soils and geology; and measuring surface heat balance.

    The U.S. science team is located at NASA's Jet Propulsion Laboratory, Pasadena, Calif. The Terra mission is part of NASA's Science Mission Directorate.

    Size: 35.4 by 57 kilometers (21.9 by 34.2 miles) Location: 36.1 degrees North latitude, 121.6 degrees West longitude Orientation: North at top Image Data: ASTER bands 3, 2, and 1 Original Data Resolution: 15 meters (49 feet) Dates Acquired: June 29

  17. Progress on the Big Optical Array (BOA)

    NASA Astrophysics Data System (ADS)

    Armstrong, John T.

    1994-06-01

    The Navy Prototype Optical Interferometer (NPOI) is nearing the completion of the first phase of construction at the Lowell Observatory on Anderson Mesa, AZ. The NPOI comprises two sub-arrays, the Big Optical Array (BOA) and the USNO Astrometric Interferometer (AI), which share delay lines, the optics laboratory, the control system, and parts of the feed optics. We describe the design of and progress on the BOA, the imaging component of the NPOI. The AI is described elsewhere (Hutter, these proceedings). As of the date of this symposium, most of the civil engineering is complete, including the control and laboratory buildings and the concrete piers for the initial array. Three AI siderostats and associated feed pipes, three delay lines, the initial three-way beam combiner, and much of the control system are in place. First fringes are anticipated in April. By the end of 1994, four AI and two BOA siderostats, as well as three more delay lines, will be installed, making imaging with all six siderostats possible. The complete BOA will consist of six 50 cm siderostats and 30 siderostat stations in a Y with 251 m arms, with baseline lengths from 4 m to 437 m. Nearly redundant baseline lengths will allow fringe tracking on long baselines on which the visibilities are too low for detection in real time. A six-way beam combiner (Mozurkewich, these proceedings) will allow simultaneous measurements of 15 visibilities and nine of 10 independent closure phases. The output beams will feed 32-channel spectrometers covering the range from 450 to 900 nm. We anticipate tracking fringes on stars brighter than magnitude 10, imaging surfaces of stars brighter than magnitude 4, measuring stellar diameters to 0.18 milliarcsec (mas), and measuring binary orbits with major axes as small as 0.4 mas.

  18. Three dimensional simulation for Big Hill Strategic Petroleum Reserve (SPR).

    SciTech Connect

    Ehgartner, Brian L.; Park, Byoung Yoon; Sobolik, Steven Ronald; Lee, Moo Yul

    2005-07-01

    3-D finite element analyses were performed to evaluate the structural integrity of caverns located at the Strategic Petroleum Reserve's Big Hill site. State-of-the-art analyses simulated the current site configuration and considered additional caverns. Both the addition of 5 caverns, to account for a full site, and a full dome containing 31 caverns were modeled. Operations including both normal and cavern workover pressures and cavern enlargement due to leaching were modeled to account for as many as 5 future oil drawdowns. Under the modeled conditions, caverns were placed very close to the edge of the salt dome. The web of salt separating the caverns and the web of salt between the caverns and the edge of the salt dome were reduced due to leaching. The impacts on cavern stability, underground creep closure, surface subsidence and infrastructure, and well integrity were quantified. The analyses included a recently derived damage criterion obtained from testing of Big Hill salt cores. The results show that, from a structural viewpoint, many additional caverns can be safely added to Big Hill.

  19. Untapped Potential: Fulfilling the Promise of Big Brothers Big Sisters and the Bigs and Littles They Represent

    ERIC Educational Resources Information Center

    Bridgeland, John M.; Moore, Laura A.

    2010-01-01

    American children represent a great untapped potential in our country. For many young people, choices are limited and the goal of a productive adulthood is a remote one. This report paints a picture of who these children are, shares their insights and reflections about the barriers they face, and offers ways forward for Big Brothers Big Sisters as…

  20. Big Challenges and Big Opportunities: The Power of "Big Ideas" to Change Curriculum and the Culture of Teacher Planning

    ERIC Educational Resources Information Center

    Hurst, Chris

    2014-01-01

    Mathematical knowledge of pre-service teachers is currently "under the microscope" and the subject of research. This paper proposes a different approach to teacher content knowledge based on the "big ideas" of mathematics and the connections that exist within and between them. It is suggested that these "big ideas"…

  1. Baryon symmetric big bang cosmology

    NASA Technical Reports Server (NTRS)

    Stecker, F. W.

    1978-01-01

    Both the quantum theory and Einstein's theory of special relativity lead to the supposition that matter and antimatter were produced in equal quantities during the big bang. It is noted that local matter/antimatter asymmetries may be reconciled with universal symmetry by assuming (1) a slight imbalance of matter over antimatter in the early universe, annihilation, and a subsequent remainder of matter; (2) localized regions of excess for one or the other type of matter as an initial condition; and (3) an extremely dense, high temperature state with zero net baryon number; i.e., matter/antimatter symmetry. Attention is given to the third assumption, which is the simplest and the most in keeping with current knowledge of the cosmos, especially as pertains to the universality of the 3 K background radiation. Mechanisms of galaxy formation are discussed, whereby matter and antimatter might have collided and annihilated each other, or have coexisted (and continue to coexist) at vast distances. It is pointed out that baryon symmetric big bang cosmology could probably be proved if an antinucleus could be detected in cosmic radiation.

  2. The BigBOSS spectrograph

    NASA Astrophysics Data System (ADS)

    Jelinsky, Patrick; Bebek, Chris; Besuner, Robert; Carton, Pierre-Henri; Edelstein, Jerry; Lampton, Michael; Levi, Michael E.; Poppett, Claire; Prieto, Eric; Schlegel, David; Sholl, Michael

    2012-09-01

    BigBOSS is a proposed ground-based dark energy experiment to study baryon acoustic oscillations (BAO) and the growth of structure with a 14,000 square degree galaxy and quasi-stellar object redshift survey. It consists of a 5,000-fiber-positioner focal plane feeding the spectrographs. The optical fibers are separated into ten 500-fiber slit heads at the entrance of ten identical spectrographs in a thermally insulated room. Each of the ten spectrographs has a spectral resolution (λ/Δλ) between 1500 and 4000 over a wavelength range from 360 - 980 nm. Each spectrograph uses two dichroic beam splitters to separate the spectrograph into three arms. It uses volume phase holographic (VPH) gratings for high efficiency and compactness. Each arm uses a 4096x4096 15 μm pixel charge coupled device (CCD) for the detector. We describe the requirements and current design of the BigBOSS spectrograph. Design trades (e.g. refractive versus reflective) and manufacturability are also discussed.

  3. Theoretical and Experimental Investigation of Thermodynamics and Kinetics of Thiol-Michael Addition Reactions: A Case Study of Reversible Fluorescent Probes for Glutathione Imaging in Single Cells.

    PubMed

    Chen, Jianwei; Jiang, Xiqian; Carroll, Shaina L; Huang, Jia; Wang, Jin

    2015-12-18

    Density functional theory (DFT) was applied to study the thermodynamics and kinetics of reversible thiol-Michael addition reactions. M06-2X/6-31G(d) with the SMD solvation model can reliably predict the Gibbs free energy changes (ΔG) of thiol-Michael addition reactions with an error of less than 1 kcal·mol(-1) compared with the experimental benchmarks. Taking advantage of this computational model, the first reversible reaction-based fluorescent probe was developed that can monitor the changes in glutathione levels in single living cells.

  4. Big Data Application in Biomedical Research and Health Care: A Literature Review.

    PubMed

    Luo, Jake; Wu, Min; Gopukumar, Deepika; Zhao, Yiqing

    2016-01-01

    Big data technologies are increasingly used for biomedical and health-care informatics research. Large amounts of biological and clinical data have been generated and collected at an unprecedented speed and scale. For example, the new generation of sequencing technologies enables the processing of billions of DNA sequence data per day, and the application of electronic health records (EHRs) is documenting large amounts of patient data. The cost of acquiring and analyzing biomedical data is expected to decrease dramatically with the help of technology upgrades, such as the emergence of new sequencing machines, the development of novel hardware and software for parallel computing, and the extensive expansion of EHRs. Big data applications present new opportunities to discover new knowledge and create novel methods to improve the quality of health care. The application of big data in health care is a fast-growing field, with many new discoveries and methodologies published in the last five years. In this paper, we review and discuss big data application in four major biomedical subdisciplines: (1) bioinformatics, (2) clinical informatics, (3) imaging informatics, and (4) public health informatics. Specifically, in bioinformatics, high-throughput experiments facilitate the research of new genome-wide association studies of diseases, and with clinical informatics, the clinical field benefits from the vast amount of collected patient data for making intelligent decisions. Imaging informatics is now more rapidly integrated with cloud platforms to share medical image data and workflows, and public health informatics leverages big data techniques for predicting and monitoring infectious disease outbreaks, such as Ebola. In this paper, we review the recent progress and breakthroughs of big data applications in these health-care domains and summarize the challenges, gaps, and opportunities to improve and advance big data applications in health care.

  5. Big Data Application in Biomedical Research and Health Care: A Literature Review

    PubMed Central

    Luo, Jake; Wu, Min; Gopukumar, Deepika; Zhao, Yiqing

    2016-01-01

    Big data technologies are increasingly used for biomedical and health-care informatics research. Large amounts of biological and clinical data have been generated and collected at an unprecedented speed and scale. For example, the new generation of sequencing technologies enables the processing of billions of DNA sequence data per day, and the application of electronic health records (EHRs) is documenting large amounts of patient data. The cost of acquiring and analyzing biomedical data is expected to decrease dramatically with the help of technology upgrades, such as the emergence of new sequencing machines, the development of novel hardware and software for parallel computing, and the extensive expansion of EHRs. Big data applications present new opportunities to discover new knowledge and create novel methods to improve the quality of health care. The application of big data in health care is a fast-growing field, with many new discoveries and methodologies published in the last five years. In this paper, we review and discuss big data application in four major biomedical subdisciplines: (1) bioinformatics, (2) clinical informatics, (3) imaging informatics, and (4) public health informatics. Specifically, in bioinformatics, high-throughput experiments facilitate the research of new genome-wide association studies of diseases, and with clinical informatics, the clinical field benefits from the vast amount of collected patient data for making intelligent decisions. Imaging informatics is now more rapidly integrated with cloud platforms to share medical image data and workflows, and public health informatics leverages big data techniques for predicting and monitoring infectious disease outbreaks, such as Ebola. In this paper, we review the recent progress and breakthroughs of big data applications in these health-care domains and summarize the challenges, gaps, and opportunities to improve and advance big data applications in health care. PMID:26843812

  6. 2. Big Creek Road, worm fence and road at trailhead. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    2. Big Creek Road, worm fence and road at trailhead. - Great Smoky Mountains National Park Roads & Bridges, Big Creek Road, Between State Route 284 & Big Creek Campground, Gatlinburg, Sevier County, TN

  7. 5. Big Creek Road, old bridge on Walnut Bottom Road, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    5. Big Creek Road, old bridge on Walnut Bottom Road, deck view. - Great Smoky Mountains National Park Roads & Bridges, Big Creek Road, Between State Route 284 & Big Creek Campground, Gatlinburg, Sevier County, TN

  8. 4. Big Creek Road, old bridge on Walnut Bottom Road, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    4. Big Creek Road, old bridge on Walnut Bottom Road, elevation view. - Great Smoky Mountains National Park Roads & Bridges, Big Creek Road, Between State Route 284 & Big Creek Campground, Gatlinburg, Sevier County, TN

  9. Big sagebrush transplanting success in crested wheatgrass stands

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The conversion of formerly big sagebrush (Artemisia tridentata ssp. wyomingensis)/bunchgrass communities to annual grass dominance, primarily cheatgrass (Bromus tectorum), in Wyoming big sagebrush ecosystems has sparked the increasing demand to establish big sagebrush on disturbed rangelands. The e...

  10. Old Big Oak Flat Road at intersection with New Tioga ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Old Big Oak Flat Road at intersection with New Tioga Road. Note gate for road to Tamarack Campground - Big Oak Flat Road, Between Big Oak Flat Entrance & Merced River, Yosemite Village, Mariposa County, CA

  11. View of Old Big Oak Flat Road in Talus Slope. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    View of Old Big Oak Flat Road in Talus Slope. Bridal Veil Falls at center distance. Looking east - Big Oak Flat Road, Between Big Oak Flat Entrance & Merced River, Yosemite Village, Mariposa County, CA

  12. An embedding for the big bang

    NASA Technical Reports Server (NTRS)

    Wesson, Paul S.

    1994-01-01

    A cosmological model is given that has good physical properties for the early and late universe but is a hypersurface in a flat five-dimensional manifold. The big bang can therefore be regarded as an effect of a choice of coordinates in a truncated higher-dimensional geometry. Thus the big bang is in some sense a geometrical illusion.

  13. In Search of the Big Bubble

    ERIC Educational Resources Information Center

    Simoson, Andrew; Wentzky, Bethany

    2011-01-01

    Freely rising air bubbles in water sometimes assume the shape of a spherical cap, a shape also known as the "big bubble". Is it possible to find some objective function involving a combination of a bubble's attributes for which the big bubble is the optimal shape? Following the basic idea of the definite integral, we define a bubble's surface as…

  14. Structuring the Curriculum around Big Ideas

    ERIC Educational Resources Information Center

    Alleman, Janet; Knighton, Barbara; Brophy, Jere

    2010-01-01

    This article provides an inside look at Barbara Knighton's classroom teaching. She uses big ideas to guide her planning and instruction and gives other teachers suggestions for adopting the big idea approach and ways for making the approach easier. This article also represents a "small slice" of a dozen years of collaborative research,…

  15. Hom-Big Brackets: Theory and Applications

    NASA Astrophysics Data System (ADS)

    Cai, Liqiang; Sheng, Yunhe

    2016-02-01

    In this paper, we introduce the notion of hom-big brackets, which is a generalization of Kosmann-Schwarzbach's big brackets. We show that it gives rise to a graded hom-Lie algebra. Thus, it is a useful tool to study hom-structures. In particular, we use it to describe hom-Lie bialgebras and hom-Nijenhuis operators.

  16. Big system: Interactive graphics for the engineer

    NASA Technical Reports Server (NTRS)

    Quenneville, C. E.

    1975-01-01

    The BCS Interactive Graphics System (BIG System) approach to graphics was presented, along with several significant engineering applications. The BIG System precompiler, the graphics support library, and the function requirements of graphics applications are discussed. It was concluded that graphics standardization and a device independent code can be developed to assure maximum graphic terminal transferability.

  17. A New Look at Big History

    ERIC Educational Resources Information Center

    Hawkey, Kate

    2014-01-01

    The article sets out a "big history" which resonates with the priorities of our own time. A globalizing world calls for new spacial scales to underpin what the history curriculum addresses, "big history" calls for new temporal scales, while concern over climate change calls for a new look at subject boundaries. The article…

  18. Bioimage Informatics for Big Data.

    PubMed

    Peng, Hanchuan; Zhou, Jie; Zhou, Zhi; Bria, Alessandro; Li, Yujie; Kleissas, Dean Mark; Drenkow, Nathan G; Long, Brian; Liu, Xiaoxiao; Chen, Hanbo

    2016-01-01

    Bioimage informatics is a field wherein high-throughput image informatics methods are used to solve challenging scientific problems related to biology and medicine. When the image datasets become larger and more complicated, many conventional image analysis approaches are no longer applicable. Here, we discuss two critical challenges of large-scale bioimage informatics applications, namely, data accessibility and adaptive data analysis. We highlight case studies to show that these challenges can be tackled based on distributed image computing as well as machine learning of image examples in a multidimensional environment.

  19. Ethics and Epistemology in Big Data Research.

    PubMed

    Lipworth, Wendy; Mason, Paul H; Kerridge, Ian; Ioannidis, John P A

    2017-03-20

    Biomedical innovation and translation are increasingly emphasizing research using "big data." The hope is that big data methods will both speed up research and make its results more applicable to "real-world" patients and health services. While big data research has been embraced by scientists, politicians, industry, and the public, numerous ethical, organizational, and technical/methodological concerns have also been raised. With respect to technical and methodological concerns, there is a view that these will be resolved through sophisticated information technologies, predictive algorithms, and data analysis techniques. While such advances will likely go some way towards resolving technical and methodological issues, we believe that the epistemological issues raised by big data research have important ethical implications and raise questions about the very possibility of big data research achieving its goals.

  20. On Establishing Big Data Wave Breakwaters with Analytics (Invited)

    NASA Astrophysics Data System (ADS)

    Riedel, M.

    2013-12-01

    The Research Data Alliance Big Data Analytics (RDA-BDA) Interest Group seeks to develop community-based recommendations on feasible data analytics approaches to address scientific community needs for utilizing large quantities of data. RDA-BDA seeks to analyze different scientific domain applications and their potential use of various big data analytics techniques. A systematic classification of feasible combinations of analysis algorithms, analytical tools, data and resource characteristics and scientific queries will be covered in these recommendations. These combinations are complex, since a wide variety of different data analysis algorithms exist (e.g. specific algorithms using GPUs for analyzing brain images) that need to work together with multiple analytical tools, ranging from simple (iterative) map-reduce methods (e.g. with Apache Hadoop or Twister) to sophisticated higher-level frameworks that leverage machine learning algorithms (e.g. Apache Mahout). These computational analysis techniques are often augmented with visual analytics techniques (e.g. computational steering on large-scale high performance computing platforms) to put the human judgement into the analysis loop, or with new approaches to databases that are designed to support new forms of unstructured or semi-structured data, as opposed to the rather traditional structured databases (e.g. relational databases). More recently, data analysis and the underpinning analytics frameworks also have to consider the energy footprints of underlying resources. To sum up, the aim of this talk is to provide pieces of information to understand big data analytics in the context of science and engineering, using the aforementioned classification as the lighthouse and as the frame of reference for a systematic approach. This talk will provide insights about big data analytics methods in the context of science within various communities and offers different views of how approaches of correlation and causality offer complementary methods

  1. A comparison of 3D poly(ε-caprolactone) tissue engineering scaffolds produced with conventional and additive manufacturing techniques by means of quantitative analysis of SR μ-CT images

    NASA Astrophysics Data System (ADS)

    Brun, F.; Intranuovo, F.; Mohammadi, S.; Domingos, M.; Favia, P.; Tromba, G.

    2013-07-01

    The technique used to produce a 3D tissue engineering (TE) scaffold is of fundamental importance in order to guarantee its proper morphological characteristics. An accurate assessment of the resulting structural properties is therefore crucial in order to evaluate the effectiveness of the produced scaffold. Synchrotron radiation (SR) computed microtomography (μ-CT) combined with further image analysis seems to be one of the most effective techniques to this aim. However, a quantitative assessment of the morphological parameters directly from the reconstructed images is a non-trivial task. This study considers two different poly(ε-caprolactone) (PCL) scaffolds fabricated with a conventional technique (Solvent Casting Particulate Leaching, SCPL) and an additive manufacturing (AM) technique (BioCell Printing), respectively. With the first technique it is possible to produce scaffolds with random, non-regular, rounded pore geometry. The AM technique instead is able to produce scaffolds with square-shaped interconnected pores of regular dimension. Therefore, the final morphology of the AM scaffolds can be predicted and the resulting model can be used for the validation of the applied imaging and image analysis protocols. An SR μ-CT image analysis approach is reported here that is able to effectively and accurately reveal the differences in the pore- and throat-size distributions as well as the connectivity of both AM and SCPL scaffolds.
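
    As a rough sketch of the kind of quantitative morphology extraction discussed above, the snippet below estimates pore equivalent diameters from a single, already-segmented slice. It assumes a binary pore mask and a known voxel size (both hypothetical here), and it is far simpler than the study's 3-D protocol, which also covers throat sizes and connectivity.

```python
# Minimal sketch: pore "size" per connected region in a segmented 2-D slice.
# Assumes pore voxels are 1 and material voxels are 0; voxel size is illustrative.
import numpy as np
from scipy import ndimage

def pore_equivalent_diameters(binary_slice, voxel_size_um=1.0):
    """Label connected pore regions and return their equivalent circular diameters (um)."""
    labels, n = ndimage.label(binary_slice)
    areas = ndimage.sum(binary_slice, labels, index=range(1, n + 1))  # pixels per pore
    return 2.0 * np.sqrt(np.asarray(areas) / np.pi) * voxel_size_um

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    toy_mask = (rng.random((128, 128)) > 0.7).astype(np.uint8)   # stand-in pore mask
    diameters = pore_equivalent_diameters(toy_mask, voxel_size_um=9.0)
    print(len(diameters), float(np.median(diameters)))
```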

  2. Was the Big Bang hot?

    NASA Technical Reports Server (NTRS)

    Wright, E. L.

    1983-01-01

    Techniques for verifying the spectrum defined by Woody and Richards (WR, 1981), which serves as a base for dust-distorted models of the 3 K background, are discussed. WR detected a sharp deviation from the Planck curve in the 3 K background. The absolute intensity of the background may be determined by the frequency dependence of the dipole anisotropy of the background or the frequency dependence effect in galactic clusters. Both methods involve the Doppler shift; analytical formulae are defined for characterization of the dipole anisotropy. The measurement of the 30-300 GHz spectra of cold galactic dust may reveal the presence of significant amounts of needle-shaped grains, which would in turn support a theory of a cold Big Bang.

  3. Big Mysteries: The Higgs Mass

    ScienceCinema

    Lincoln, Don

    2016-07-12

    With the discovery of what looks to be the Higgs boson, LHC researchers are turning their attention to the next big question, which is the predicted mass of the newly discovered particle. When the effects of quantum mechanics are taken into account, the mass of the Higgs boson should be incredibly high...perhaps upwards of a quadrillion times higher than what was observed. In this video, Fermilab's Dr. Don Lincoln explains how it is that the theory predicts that the mass is so large and gives at least one possible theoretical idea that might solve the problem. Whether the proposed idea is the answer or not, this question must be answered by experiments at the LHC or today's entire theoretical paradigm could be in jeopardy.

  4. Big Mysteries: The Higgs Mass

    SciTech Connect

    Lincoln, Don

    2014-04-28

    With the discovery of what looks to be the Higgs boson, LHC researchers are turning their attention to the next big question, which is the predicted mass of the newly discovered particle. When the effects of quantum mechanics are taken into account, the mass of the Higgs boson should be incredibly high...perhaps upwards of a quadrillion times higher than what was observed. In this video, Fermilab's Dr. Don Lincoln explains how it is that the theory predicts that the mass is so large and gives at least one possible theoretical idea that might solve the problem. Whether the proposed idea is the answer or not, this question must be answered by experiments at the LHC or today's entire theoretical paradigm could be in jeopardy.

  5. Evidence of the big fix

    NASA Astrophysics Data System (ADS)

    Hamada, Yuta; Kawai, Hikaru; Kawana, Kiyoharu

    2014-06-01

    We give evidence for the Big Fix. The theory of wormholes and the multiverse suggests that the parameters of the Standard Model are fixed in such a way that the total entropy at the late stage of the universe is maximized, which we call the maximum entropy principle. In this paper, we discuss how it can be confirmed by experimental data, and we show that it is indeed true for the Higgs vacuum expectation value v_h. We assume that the baryon number is produced by the sphaleron process, and that the current quark masses, the gauge couplings and the Higgs self-coupling are fixed when we vary v_h. It turns out that the existence of atomic nuclei plays a crucial role in maximizing the entropy. This is reminiscent of the anthropic principle; however, in our case it is required by the fundamental law.

  6. Pathfinder Landing Site Observed by Mars Orbiter Camera - 'Big Crater' in Stereo View

    NASA Technical Reports Server (NTRS)

    1998-01-01

    On its 256th orbit of Mars, the camera on-board the Mars Global Surveyor spacecraft successfully observed the vicinity of the Pathfinder landing site. The images shown are a stereoscopic image pair in anaglyph format, made from the overlapping area of MOC 25603 and 23703. This image is reproduced at a scale of 5 m (16.4 feet) per pixel. Image 23703 was acquired on 13 April at 7:50 AM PDT; Image 25603 was acquired on 22 April at 1:11 PM PDT. The P237 observation was made from a distance of 675 km while the P256 measurement was made from 800 km. The viewing angle for 23703 was 21.2o, for 25603, 30.67o, giving an angular difference of about 9.5o. Owing to the relief on 'Big Crater,' this relatively small angular difference was in this case sufficient to show good stereo parallax.

    The resolution of the MOC image that covered the Pathfinder landing site (MOC 25603) was about 3.3 m or 11 feet per pixel. The Pathfinder lander and airbags form a roughly equilateral triangle 5 m on a side. Noting that the camera has not yet been focussed (it needs to be in the stable temperature conditions of the low altitude, circular mapping orbit in order to achieve best focus) and the hazy atmospheric conditions, the effective scale of the image is probably closer to 5 m (16.4 feet). Thus, the scale of the image was insufficient to resolve the lander (more than one pixel is needed to resolve a feature). In addition, the relatively high sun angle of the image (the sun was 40o above the horizon) reduced the length of shadows (for example, only a few boulders are seen), also decreasing the ability to discriminate small features. Work continues to locate intermediate-scale features in the lander and orbiter images in the hope of identifying the precise landing site based on these comparisons.

    Malin Space Science Systems and the California Institute of Technology built the MOC using spare hardware from the Mars Observer mission. MSSS operates the camera from its facilities in San Diego

  7. Anger and hostility from the perspective of the Big Five personality model.

    PubMed

    Sanz, Jesús; García-Vera, María Paz; Magán, Inés

    2010-06-01

    This study was aimed at examining the relationships of the personality dimensions of the five-factor model or Big Five with trait anger and with two specific traits of hostility (mistrust and confrontational attitude), and identifying the similarities and differences between trait anger and hostility in the framework of the Big Five. In a sample of 353 male and female adults, the Big Five explained a significant percentage of individual differences in trait anger and hostility after controlling the effects due to the relationship between both constructs and content overlapping across scales. In addition, trait anger was primarily associated with neuroticism, whereas mistrust and confrontational attitude were principally related to low agreeableness. These findings are discussed in the context of the anger-hostility-aggression syndrome and the capability of the Big Five for organizing and clarifying related personality constructs.

  8. Microsystems - The next big thing

    SciTech Connect

    STINNETT,REGAN W.

    2000-05-11

    Micro-Electro-Mechanical Systems (MEMS) is a big name for tiny devices that will soon make big changes in everyday life and the workplace. These and other types of Microsystems range in size from a few millimeters to a few microns, much smaller than a human hair. These Microsystems have the capability to enable new ways to solve problems in commercial applications ranging from automotive, aerospace, telecommunications, manufacturing equipment, medical diagnostics to robotics, and in national security applications such as nuclear weapons safety and security, battlefield intelligence, and protection against chemical and biological weapons. This broad range of applications of Microsystems reflects the broad capabilities of future Microsystems to provide the ability to sense, think, act, and communicate, all in a single integrated package. Microsystems have been called the next silicon revolution, but like many revolutions, they incorporate more elements than their predecessors. Microsystems do include MEMS components fabricated from polycrystalline silicon processed using techniques similar to those used in the manufacture of integrated electrical circuits. They also include optoelectronic components made from gallium arsenide and other semiconducting compounds from the III-V groups of the periodic table. Microsystems components are also being made from pure metals and metal alloys using the LIGA process, which utilizes lithography, etching, and casting at the micron scale. Generically, Microsystems are micron scale, integrated systems that have the potential to combine the ability to sense light, heat, pressure, acceleration, vibration, and chemicals with the ability to process the collected data using CMOS circuitry, execute an electrical, mechanical, or photonic response, and communicate either optically or with microwaves.

  9. Big, Dark Dunes Northeast of Syrtis Major

    NASA Technical Reports Server (NTRS)

    1999-01-01

    Big sand dunes! Mars is home to some very large, windblown dunes. The dunes shown here rise to almost 100 meters (275 feet) at their crests. Unlike dunes on Earth, the larger dunes of Mars are composed of dark, rather than light grains. This is probably related to the composition of the sand, since different materials will have different brightnesses. For example, beaches on the island of Oahu in Hawaii are light colored because they consist of ground-up particles of seashells, while beaches in the southern shores of the island of Hawaii (the 'Big Island' in the Hawaiian island chain) are dark because they consist of sand derived from dark lava rock.

    The dunes in this picture taken by the Mars Orbiter Camera (MOC) are located on the floor of an old, 72 km (45 mi) diameter crater northeast of Syrtis Major. The sand is being blown from the upper right toward the lower left. The surface that the dunes have been travelling across is pitted and cratered. The substrate is also hard and bright--i.e., it is composed of a material of different composition than the sand in the dunes. The dark streaks on the dune surfaces are a puzzle...at first glance one might conclude they are the result of holiday visitors with off-road vehicles. However, the streaks more likely result from passing dust devils or wind gusts that disturb the sand surface just enough to leave a streak. The image shown here covers an area approximately 2.6 km (1.6 mi) wide, and is illuminated from the lower right.

    Malin Space Science Systems and the California Institute of Technology built the MOC using spare hardware from the Mars Observer mission. MSSS operates the camera from its facilities in San Diego, CA. The Jet Propulsion Laboratory's Mars Surveyor Operations Project operates the Mars Global Surveyor spacecraft with its industrial partner, Lockheed Martin Astronautics, from facilities in Pasadena, CA and Denver, CO.

  10. Perspectives on making big data analytics work for oncology.

    PubMed

    El Naqa, Issam

    2016-12-01

    Oncology, with its unique combination of clinical, physical, technological, and biological data, provides an ideal case study for applying big data analytics to improve cancer treatment safety and outcomes. An oncology treatment course such as chemoradiotherapy can generate a large pool of information carrying the 5Vs hallmarks of big data. These data comprise a heterogeneous mixture of patient demographics, radiation/chemo dosimetry, multimodality imaging features, and biological markers generated over a treatment period that can span a few days to several weeks. Efforts using commercial and in-house tools are underway to facilitate data aggregation, ontology creation, sharing, visualization and varying analytics in a secure environment. However, open questions related to proper data structure representation and effective analytics tools to support oncology decision-making need to be addressed. It is recognized that oncology data constitute a mix of structured (tabulated) and unstructured (electronic documents) data that need to be processed to facilitate searching and subsequent knowledge discovery from relational or NoSQL databases. In this context, methods based on advanced analytics and image feature extraction for oncology applications will be discussed. On the other hand, the classical p (variables) ≫ n (samples) inference problem of statistical learning is challenged in the big data realm, and this is particularly true for oncology applications where p-omics is witnessing exponential growth while the number of cancer incidences has generally plateaued over the past 5 years, leading to a quasi-linear growth in samples per patient. Within the big data paradigm, this kind of phenomenon may yield undesirable effects such as echo chamber anomalies, Yule-Simpson reversal paradox, or misleading ghost analytics. In this work, we will present these effects as they pertain to oncology and engage small thinking methodologies to counter these effects ranging from

  11. Big Earth observation data analytics for land use and land cover change information

    NASA Astrophysics Data System (ADS)

    Câmara, Gilberto

    2015-04-01

    Current scientific methods for extracting information from Earth observation data lag far behind our capacity to build complex satellites. In response to this challenge, our work explores a new type of knowledge platform to improve the extraction of land use and land cover change information from big Earth observation data sets. We take a space-time perspective of Earth observation data, considering that each sensor revisits the same place at regular intervals. Sensor data can, in principle, be calibrated so that observations of the same place at different times are comparable, and each measurement from a sensor is mapped into a three-dimensional array in space-time. To fully enable the use of space-time arrays for working with Earth observation data, we use the SciDB array database. Arrays naturally fit the data structure of Earth observation images, breaking the image-as-a-snapshot paradigm. Thus, entire collections of images can be stored as multidimensional arrays. However, array databases do not understand the specific nature of geographical data, and do not capture the meaning of and the differences between spatial and temporal dimensions. In our work, we have extended SciDB to include additional information about satellite image metadata, cartographic projections, and time. We are currently developing methods to extract land use and land cover information based on space-time analysis on array databases. Our experiments show these space-time methods give us significant improvements over current space-only remote sensing image processing methods. We have been able to capture tropical forest degradation and forest regrowth and also to distinguish between single-cropping and double-cropping practices in tropical agriculture.
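
    The space-time array idea described above can be illustrated with a small, self-contained sketch. The snippet below is not the authors' SciDB implementation; it is a minimal NumPy stand-in (the function names and synthetic scenes are invented for illustration) showing how co-registered images stack into a (time, y, x) cube so that each pixel becomes a time series, here summarized by a simple per-pixel linear trend.

```python
# Minimal sketch of the space-time array view of Earth observation data:
# stack co-registered scenes along a time axis and analyze each pixel's series.
import numpy as np

def stack_space_time(images):
    """Stack a list of co-registered 2-D images into a (time, y, x) array."""
    return np.stack(images, axis=0)

def per_pixel_trend(cube, times):
    """Least-squares slope of each pixel's time series (a crude change signal)."""
    t = np.asarray(times, dtype=float)
    t = t - t.mean()
    flat = cube.reshape(cube.shape[0], -1)                 # (time, pixels)
    slope = (t @ (flat - flat.mean(axis=0))) / (t @ t)     # OLS slope per pixel
    return slope.reshape(cube.shape[1:])                   # back to (y, x)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    scenes = [rng.normal(size=(64, 64)) + 0.1 * k for k in range(12)]  # fake monthly scenes
    cube = stack_space_time(scenes)
    print(per_pixel_trend(cube, range(12)).mean())   # close to 0.1 per time step
```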

  12. The Big Apple's Core: Exploring Manhattan

    ERIC Educational Resources Information Center

    Groce, Eric C.; Groce, Robin D.; Colby, Susan

    2005-01-01

    Children are exposed to a wide variety of images related to New York City through various media outlets. They may have seen glimpses of Manhattan by watching movies such as Spiderman or Stuart Little or by taking in annual television events such as the Macy's Thanksgiving Day Parade or the Times Square New Year's Eve celebration. Additionally,…

  13. Impact of Surgical Evaluation of Additional Cine Magnetic Resonance Imaging for Advanced Thymoma with Infiltration of Adjacent Structures: The Thoracic Surgeon's View.

    PubMed

    Ried, Michael; Hnevkovsky, Stefanie; Neu, Reiner; von Süßkind-Schwendi, Marietta; Götz, Andrea; Hamer, Okka W; Schalke, Berthold; Hofmann, Hans-Stefan

    2017-04-01

    Background Preoperative radiological assessment is important for clarification of surgical operability for advanced thymic tumors. The objective was to determine the feasibility of magnetic resonance imaging (MRI) with cine sequences for evaluation of cardiovascular tumor invasion. Patients and Methods This prospective study included patients with advanced thymoma who underwent surgical resection. All patients received a preoperative computed tomography (CT) scan and cine MRI. Results Tumor infiltration was surgically confirmed in the pericardium (n = 12), myocardium (n = 1), superior caval vein (SCV; n = 3), and aorta (n = 2). A macroscopic complete resection was possible in 10 patients, whereas 2 patients with aortic or myocardial tumor invasion had R2 resection. The positive predictive value (PPV) was 50% for cine MRI compared with 0% for CT scan regarding myocardial tumor infiltration. The PPV for tumor infiltration of the aorta was 50%, with a higher sensitivity for the CT scan (100 vs. 50%). Infiltration of the SCV could be detected slightly better with cine MRI (PPV 75 vs. 66.7%). Conclusion Cine MRI seems to improve the accuracy of preoperative staging of advanced thymoma regarding infiltration of cardiovascular structures and supports the surgical approach.

  14. [Big data in medicine and healthcare].

    PubMed

    Rüping, Stefan

    2015-08-01

    Healthcare is one of the business fields with the highest Big Data potential. According to the prevailing definition, Big Data refers to the fact that data today is often too large and heterogeneous and changes too quickly to be stored, processed, and transformed into value by previous technologies. Technological trends drive Big Data: business processes are more and more executed electronically, consumers produce more and more data themselves - e.g. in social networks - and finally there is ever-increasing digitalization. Currently, several new trends towards new data sources and innovative data analysis appear in medicine and healthcare. From the research perspective, omics research is one clear Big Data topic. In practice, electronic health records, free open data and the "quantified self" offer new perspectives for data analytics. Regarding analytics, significant advances have been made in information extraction from text data, which unlocks a lot of data from clinical documentation for analytics purposes. At the same time, medicine and healthcare are lagging behind in the adoption of Big Data approaches. This can be traced to particular problems regarding data complexity and organizational, legal, and ethical challenges. The growing uptake of Big Data in general, and first best-practice examples in medicine and healthcare in particular, indicate that innovative solutions will be coming. This paper gives an overview of the potentials of Big Data in medicine and healthcare.

  15. Processing Solutions for Big Data in Astronomy

    NASA Astrophysics Data System (ADS)

    Fillatre, L.; Lepiller, D.

    2016-09-01

    This paper gives a simple introduction to processing solutions applied to massive amounts of data. It proposes a general presentation of the Big Data paradigm. The Hadoop framework, which is considered as the pioneering processing solution for Big Data, is described together with YARN, the integrated Hadoop tool for resource allocation. This paper also presents the main tools for the management of both the storage (NoSQL solutions) and computing capacities (MapReduce parallel processing schema) of a cluster of machines. Finally, more recent processing solutions like Spark are discussed. Big Data frameworks are now able to run complex applications while keeping the programming simple and greatly improving the computing speed.
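
    As a rough illustration of the MapReduce schema mentioned above, the sketch below runs the canonical word-count example in a single Python process; the map, shuffle, and reduce functions are hypothetical stand-ins for the stages that a Hadoop or Spark job would distribute across a cluster.

```python
# Minimal, single-process sketch of the map / shuffle / reduce stages.
from collections import defaultdict

def map_phase(record):
    # Emit (key, value) pairs: one (word, 1) per word in the input line.
    for word in record.lower().split():
        yield word, 1

def shuffle(pairs):
    # Group all values by key, as the framework's shuffle stage would.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(key, values):
    # Aggregate the values for one key.
    return key, sum(values)

if __name__ == "__main__":
    lines = ["big data needs big machines", "big machines need big data"]
    pairs = (pair for line in lines for pair in map_phase(line))
    counts = dict(reduce_phase(k, v) for k, v in shuffle(pairs).items())
    print(counts)   # e.g. {'big': 4, 'data': 2, 'machines': 2, ...}
```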

  16. BIG SKY CARBON SEQUESTRATION PARTNERSHIP

    SciTech Connect

    Susan M. Capalbo

    2004-01-04

    The Big Sky Partnership, led by Montana State University, is comprised of research institutions, public entities and private sectors organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts during the first performance period fall into four areas: evaluation of sources and carbon sequestration sinks; development of GIS-based reporting framework; designing an integrated suite of monitoring, measuring, and verification technologies; and initiating a comprehensive education and outreach program. At the first Partnership meeting the groundwork was put in place to provide an assessment of capture and storage capabilities for CO{sub 2} utilizing the resources found in the Partnership region (both geological and terrestrial sinks), that would complement the ongoing DOE research. The region has a diverse array of geological formations that could provide storage options for carbon in one or more of its three states. Likewise, initial estimates of terrestrial sinks indicate a vast potential for increasing and maintaining soil C on forested, agricultural, and reclaimed lands. Both options include the potential for offsetting economic benefits to industry and society. Complementary to the efforts on evaluation of sources and sinks is the development of the Big Sky Partnership Carbon Cyberinfrastructure (BSP-CC) and a GIS Road Map for the Partnership. These efforts will put in place a map-based integrated information management system for our Partnership, with transferability to the national carbon sequestration effort. The Partnership recognizes the critical importance of measurement, monitoring, and verification technologies to support not only carbon trading but other policies and programs that DOE and other agencies may want to pursue in support of GHG mitigation. The efforts begun in developing and implementing MMV technologies for geological sequestration reflect this concern. Research is also underway to identify and validate best

  17. ATLAS: Big Data in a Small Package

    NASA Astrophysics Data System (ADS)

    Denneau, Larry; Tonry, John

    2015-08-01

    For even small telescope projects, the petabyte scale is now upon us. The Asteroid Terrestrial-impact Last Alert System (ATLAS; Tonry 2011) will robotically survey the entire visible sky from Hawaii multiple times per night to search for near-Earth asteroids (NEAs) on impact trajectories. While the ATLAS optical system is modest by modern astronomical standards -- two 0.5 m F/2.0 telescopes -- each year the ATLAS system will obtain ~10^3 measurements of 10^9 astronomical sources to a photometric accuracy of <5%. This ever-growing dataset must be searched in real-time for moving objects then archived for further analysis, and alerts for newly discovered NEAs disseminated within tens of minutes from detection. ATLAS's all-sky coverage ensures it will discover many ``rifle shot'' near-misses moving rapidly on the sky as they shoot past the Earth, so the system will need software to automatically detect highly-trailed sources and discriminate them from the thousands of satellites and pieces of space junk that ATLAS will see each night. Additional interrogation will identify interesting phenomena from beyond the solar system occurring over millions of transient sources per night. The data processing and storage requirements for ATLAS demand a ``big data'' approach typical of commercial Internet enterprises. We describe our approach to deploying a nimble, scalable and reliable data processing infrastructure, and promote ATLAS as a stepping stone to eventual processing scales in the era of LSST.

  18. Big Bang Nucleosynthesis in the New Cosmology

    SciTech Connect

    Fields, Brian D.

    2008-01-24

    Big bang nucleosynthesis (BBN) describes the production of the lightest elements in the first minutes of cosmic time. We review the physics of cosmological element production, and the observations of the primordial element abundances. The comparison between theory and observation has heretofore provided our earliest probe of the universe, and given the best measure of the cosmic baryon content. However, BBN has now taken a new role in cosmology, in light of new precision measurements of the cosmic microwave background (CMB). Recent CMB anisotropy data yield a wealth of cosmological parameters; in particular, the baryon-to-photon ratio {eta} = n{sub B}/n{sub {gamma}} is measured to high precision. The confrontation between the BBN and CMB ''baryometers'' poses a new and stringent test of the standard cosmology; the status of this test is discussed. Moreover, it is now possible to recast the role of BBN by using the CMB to fix the baryon density and even some light element abundances. This strategy sharpens BBN into a more powerful probe of early universe physics, and of galactic nucleosynthesis processes. The impact of the CMB results on particle physics beyond the Standard Model, and on non-standard cosmology, is illustrated. Prospects for improvement of these bounds via additional astronomical observations and nuclear experiments are discussed, as is the lingering ''lithium problem''.

  19. Big Crater as Viewed by Pathfinder Lander - Anaglyph

    NASA Technical Reports Server (NTRS)

    1997-01-01

    The 'Big Crater' is actually a relatively small Martian crater to the southeast of the Mars Pathfinder landing site. It is 1500 meters (4900 feet) in diameter, or about the same size as Meteor Crater in Arizona. Superimposed on the rim of Big Crater (the central part of the rim as seen here) is a smaller crater nicknamed 'Rimshot Crater.' The distance to this smaller crater, and the nearest portion of the rim of Big Crater, is 2200 meters (7200 feet). To the right of Big Crater, south from the spacecraft, almost lost in the atmospheric dust 'haze,' is the large streamlined mountain nicknamed 'Far Knob.' This mountain is over 450 meters (1480 feet) tall, and is over 30 kilometers (19 miles) from the spacecraft. Another, smaller and closer knob, nicknamed 'Southeast Knob' can be seen as a triangular peak to the left of the flanks of the Big Crater rim. This knob is 21 kilometers (13 miles) southeast from the spacecraft.

    The larger features visible in this scene - Big Crater, Far Knob, and Southeast Knob - were discovered on the first panoramas taken by the IMP camera on the 4th of July, 1997, and subsequently identified in Viking Orbiter images taken over 20 years ago. The scene includes rocky ridges and swales or 'hummocks' of flood debris that range from a few tens of meters away from the lander to the distance of South Twin Peak. The largest rock in the nearfield, just left of center in the foreground, nicknamed 'Otter', is about 1.5 meters (4.9 feet) long and 10 meters (33 feet) from the spacecraft.

    This view of Big Crater was produced by combining 6 individual 'Superpan' scenes from the left and right eyes of the IMP camera. Each scene consists of 8 individual frames (left eye) and 7 frames (right eye) taken with different color filters that were enlarged by 500% and then co-added using Adobe Photoshop to produce, in effect, a super-resolution panchromatic frame that is sharper than an individual frame would be.

    The anaglyph view of Big Crater was

  20. Big-Data RHEED analysis for understanding epitaxial film growth processes

    SciTech Connect

    Vasudevan, Rama K; Tselev, Alexander; Baddorf, Arthur P; Kalinin, Sergei V

    2014-10-28

    Reflection high energy electron diffraction (RHEED) has by now become a standard tool for in-situ monitoring of film growth by pulsed laser deposition and molecular beam epitaxy. Yet despite the widespread adoption and wealth of information in RHEED images, most applications are limited to observing intensity oscillations of the specular spot, and much additional information on growth is discarded. With ease of data acquisition and increased computation speeds, statistical methods to rapidly mine the dataset are now feasible. Here, we develop such an approach to the analysis of the fundamental growth processes through multivariate statistical analysis of a RHEED image sequence. This approach is illustrated for growth of La{sub x}Ca{sub 1-x}MnO{sub 3} films grown on etched (001) SrTiO{sub 3} substrates, but is universal. The multivariate methods including principal component analysis and k-means clustering provide insight into the relevant behaviors, the timing and nature of a disordered to ordered growth change, and highlight statistically significant patterns. Fourier analysis yields the harmonic components of the signal and allows separation of the relevant components and baselines, isolating the asymmetric nature of the step density function and the transmission spots from the imperfect layer-by-layer (LBL) growth. These studies show the promise of big data approaches to obtaining more insight into film properties during and after epitaxial film growth. Furthermore, these studies open the pathway to use forward prediction methods to potentially allow significantly more control over the growth process and hence final film quality.
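
    As an illustration of the kind of multivariate analysis described above, the following minimal Python sketch applies principal component analysis and k-means clustering to a stack of RHEED frames. The `frames` array is a randomly generated stand-in, and the snippet is not the authors' pipeline, only a sketch of the general approach under those assumptions.

      # Minimal sketch of PCA + k-means on a RHEED image sequence.
      # `frames` is a hypothetical (n_frames, height, width) array standing in
      # for acquired RHEED frames; this is illustrative, not the authors' code.
      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(0)
      frames = rng.random((500, 64, 64))            # stand-in for the image sequence

      X = frames.reshape(len(frames), -1)           # one row per frame
      scores = PCA(n_components=5).fit_transform(X) # dominant temporal/spatial modes
      labels = KMeans(n_clusters=2, n_init=10).fit_predict(scores)

      # A change in cluster label along the sequence would flag, for example,
      # a disordered-to-ordered growth change for closer inspection.
      print(labels[:20])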

  1. Big-data reflection high energy electron diffraction analysis for understanding epitaxial film growth processes.

    PubMed

    Vasudevan, Rama K; Tselev, Alexander; Baddorf, Arthur P; Kalinin, Sergei V

    2014-10-28

    Reflection high energy electron diffraction (RHEED) has by now become a standard tool for in situ monitoring of film growth by pulsed laser deposition and molecular beam epitaxy. Yet despite the widespread adoption and wealth of information in RHEED images, most applications are limited to observing intensity oscillations of the specular spot, and much additional information on growth is discarded. With ease of data acquisition and increased computation speeds, statistical methods to rapidly mine the data set are now feasible. Here, we develop such an approach to the analysis of the fundamental growth processes through multivariate statistical analysis of a RHEED image sequence. This approach is illustrated for growth of La(x)Ca(1-x)MnO(3) films grown on etched (001) SrTiO(3) substrates, but is universal. The multivariate methods including principal component analysis and k-means clustering provide insight into the relevant behaviors, the timing and nature of a disordered to ordered growth change, and highlight statistically significant patterns. Fourier analysis yields the harmonic components of the signal and allows separation of the relevant components and baselines, isolating the asymmetric nature of the step density function and the transmission spots from the imperfect layer-by-layer (LBL) growth. These studies show the promise of big data approaches to obtaining more insight into film properties during and after epitaxial film growth. Furthermore, these studies open the pathway to use forward prediction methods to potentially allow significantly more control over growth process and hence final film quality.

  2. NOAA Big Data Partnership RFI

    NASA Astrophysics Data System (ADS)

    de la Beaujardiere, J.

    2014-12-01

    In February 2014, the US National Oceanic and Atmospheric Administration (NOAA) issued a Big Data Request for Information (RFI) from industry and other organizations (e.g., non-profits, research laboratories, and universities) to assess capability and interest in establishing partnerships to position a copy of NOAA's vast data holdings in the Cloud, co-located with easy and affordable access to analytical capabilities. This RFI was motivated by a number of concerns. First, NOAA's data facilities do not necessarily have sufficient network infrastructure to transmit all available observations and numerical model outputs to all potential users, or sufficient infrastructure to support simultaneous computation by many users. Second, the available data are distributed across multiple services and data facilities, making it difficult to find and integrate data for cross-domain analysis and decision-making. Third, large datasets require users to have substantial network, storage, and computing capabilities of their own in order to fully interact with and exploit the latent value of the data. Finally, there may be commercial opportunities for value-added products and services derived from our data. Putting a working copy of data in the Cloud outside of NOAA's internal networks and infrastructures should reduce demands and risks on our systems, and should enable users to interact with multiple datasets and create new lines of business (much like the industries built on government-furnished weather or GPS data). The NOAA Big Data RFI therefore solicited information on technical and business approaches regarding possible partnership(s) that -- at no net cost to the government and minimum impact on existing data facilities -- would unleash the commercial potential of its environmental observations and model outputs. NOAA would retain the master archival copy of its data. Commercial partners would not be permitted to charge fees for access to the NOAA data they receive, but

  3. BIG SKY CARBON SEQUESTRATION PARTNERSHIP

    SciTech Connect

    Susan M. Capalbo

    2005-01-31

    The Big Sky Carbon Sequestration Partnership, led by Montana State University, is comprised of research institutions, public entities and private sector organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts under this Partnership in Phase I fall into four areas: evaluation of sources and carbon sequestration sinks that will be used to determine the location of pilot demonstrations in Phase II; development of a GIS-based reporting framework that links with national networks; designing an integrated suite of monitoring, measuring, and verification technologies and assessment frameworks; and initiating a comprehensive education and outreach program. The groundwork is in place to provide an assessment of storage capabilities for CO{sub 2} utilizing the resources found in the Partnership region (both geological and terrestrial sinks) that would complement the ongoing DOE research. Efforts are underway to showcase the architecture of the GIS framework and initial results for sources and sinks. The region has a diverse array of geological formations that could provide storage options for carbon in one or more of its three states. Likewise, initial estimates of terrestrial sinks indicate a vast potential for increasing and maintaining soil C on forested, agricultural, and reclaimed lands. Both options include the potential for offsetting economic benefits to industry and society. Steps have been taken to assure that the GIS-based framework is consistent among types of sinks within the Big Sky Partnership area and with the efforts of other western DOE partnerships. The Partnership recognizes the critical importance of measurement, monitoring, and verification technologies to support not only carbon trading but all policies and programs that DOE and other agencies may want to pursue in support of GHG mitigation. The efforts in developing and implementing MMV technologies for geological sequestration reflect this concern. Research is

  4. Interoperability Outlook in the Big Data Future

    NASA Astrophysics Data System (ADS)

    Kuo, K. S.; Ramachandran, R.

    2015-12-01

    The establishment of distributed active archive centers (DAACs) as data warehouses and the standardization of file format by NASA's Earth Observing System Data Information System (EOSDIS) had doubtlessly propelled interoperability of NASA Earth science data to unprecedented heights in the 1990s. However, we obviously still feel wanting two decades later. We believe the inadequate interoperability we experience is a result of the current practice that data are first packaged into files before distribution and only the metadata of these files are cataloged into databases and become searchable. Data therefore cannot be efficiently filtered. Any extensive study thus requires downloading large volumes of data files to a local system for processing and analysis. The need to download data not only creates duplication and inefficiency but also further impedes interoperability, because the analysis has to be performed locally by individual researchers in individual institutions. Each institution or researcher often has its/his/her own preference in the choice of data management practice as well as programming languages. Analysis results (derived data) so produced are thus subject to the differences of these practices, which later form formidable barriers to interoperability. A number of Big Data technologies are currently being examined and tested to address Big Earth Data issues. These technologies share one common characteristic: exploiting compute and storage affinity to more efficiently analyze large volumes and great varieties of data. Distributed active "archive" centers are likely to evolve into distributed active "analysis" centers, which not only archive data but also provide analysis service right where the data reside. "Analysis" will become the more visible function of these centers. It is thus reasonable to expect interoperability to improve because analysis, in addition to data, becomes more centralized. Within a "distributed active analysis center

  5. BigFoot: Bayesian alignment and phylogenetic footprinting with MCMC

    PubMed Central

    Satija, Rahul; Novák, Ádám; Miklós, István; Lyngsø, Rune; Hein, Jotun

    2009-01-01

    Background: We have previously combined statistical alignment and phylogenetic footprinting to detect conserved functional elements without assuming a fixed alignment. Considering a probability-weighted distribution of alignments removes sensitivity to alignment errors, properly accommodates regions of alignment uncertainty, and increases the accuracy of functional element prediction. Our method utilized standard dynamic programming hidden Markov model algorithms to analyze up to four sequences. Results: We present a novel approach, implemented in the software package BigFoot, for performing phylogenetic footprinting on greater numbers of sequences. We have developed a Markov chain Monte Carlo (MCMC) approach which samples both sequence alignments and locations of slowly evolving regions. We implement our method as an extension of the existing StatAlign software package and test it on well-annotated regions controlling the expression of the even-skipped gene in Drosophila and the α-globin gene in vertebrates. The results exhibit how adding additional sequences to the analysis has the potential to improve the accuracy of functional predictions, and demonstrate how BigFoot outperforms existing alignment-based phylogenetic footprinting techniques. Conclusion: BigFoot extends a combined alignment and phylogenetic footprinting approach to analyze larger amounts of sequence data using MCMC. Our approach is robust to alignment error and uncertainty and can be applied to a variety of biological datasets. The source code and documentation are publicly available for download from PMID:19715598

  6. The big five personality traits: psychological entities or statistical constructs?

    PubMed

    Franić, Sanja; Borsboom, Denny; Dolan, Conor V; Boomsma, Dorret I

    2014-11-01

    The present study employed multivariate genetic item-level analyses to examine the ontology and the genetic and environmental etiology of the Big Five personality dimensions, as measured by the NEO Five Factor Inventory (NEO-FFI) [Costa and McCrae, Revised NEO personality inventory (NEO PI-R) and NEO five-factor inventory (NEO-FFI) professional manual, 1992; Hoekstra et al., NEO personality questionnaires NEO-PI-R, NEO-FFI: manual, 1996]. Common and independent pathway model comparison was used to test whether the five personality dimensions fully mediate the genetic and environmental effects on the items, as would be expected under the realist interpretation of the Big Five. In addition, the dimensionalities of the latent genetic and environmental structures were examined. Item scores of a population-based sample of 7,900 adult twins (including 2,805 complete twin pairs; 1,528 MZ and 1,277 DZ) on the Dutch version of the NEO-FFI were analyzed. Although both the genetic and the environmental covariance components display a 5-factor structure, applications of common and independent pathway modeling showed that they do not comply with the collinearity constraints entailed in the common pathway model. Implications for the substantive interpretation of the Big Five are discussed.

  7. Additive Manufacturing Infrared Inspection

    NASA Technical Reports Server (NTRS)

    Gaddy, Darrell

    2014-01-01

    Additive manufacturing is a rapid prototyping technology that allows parts to be built in a series of thin layers from plastic, ceramics, and metallics. Metallic additive manufacturing is an emerging form of rapid prototyping that allows complex structures to be built using various metallic powders. Significant time and cost savings have also been observed using metallic additive manufacturing compared with traditional techniques. Development of the metallic additive manufacturing technology has advanced significantly over the last decade, although many of the techniques to inspect parts made from these processes have not advanced significantly or have limitations. Several external geometry inspection techniques exist such as Coordinate Measurement Machines (CMM), Laser Scanners, Structured Light Scanning Systems, or even traditional calipers and gages. All of the aforementioned techniques are limited to external geometry and contours or must use a contact probe to inspect limited internal dimensions. This presentation will document the development of a real-time dimensional inspection technique and digital quality record for the additive manufacturing process using infrared camera imaging and processing techniques.

  8. Quantum nature of the big bang.

    PubMed

    Ashtekar, Abhay; Pawlowski, Tomasz; Singh, Parampreet

    2006-04-14

    Some long-standing issues concerning the quantum nature of the big bang are resolved in the context of homogeneous isotropic models with a scalar field. Specifically, the known results on the resolution of the big-bang singularity in loop quantum cosmology are significantly extended as follows: (i) the scalar field is shown to serve as an internal clock, thereby providing a detailed realization of the "emergent time" idea; (ii) the physical Hilbert space, Dirac observables, and semiclassical states are constructed rigorously; (iii) the Hamiltonian constraint is solved numerically to show that the big bang is replaced by a big bounce. Thanks to the nonperturbative, background independent methods, unlike in other approaches the quantum evolution is deterministic across the deep Planck regime.

  9. Cosmic relics from the big bang

    SciTech Connect

    Hall, L.J.

    1988-12-01

    A brief introduction to the big bang picture of the early universe is given. Dark matter is discussed, particularly its implications for elementary particle physics. A classification scheme for dark matter relics is given. 21 refs., 11 figs., 1 tab.

  10. Big Data and Analytics in Healthcare.

    PubMed

    Tan, S S-L; Gao, G; Koch, S

    2015-01-01

    This editorial is part of the Focus Theme of Methods of Information in Medicine on "Big Data and Analytics in Healthcare". The amount of data being generated in the healthcare industry is growing at a rapid rate. This has generated immense interest in leveraging the availability of healthcare data (and "big data") to improve health outcomes and reduce costs. However, the nature of healthcare data, and especially big data, presents unique challenges for its processing and analysis. This Focus Theme aims to disseminate some novel approaches to address these challenges. More specifically, approaches ranging from efficient methods of processing large clinical data to predictive models that could generate better predictions from healthcare data are presented.

  11. Big Bang 6Li nucleosynthesis studied deep underground (LUNA collaboration)

    NASA Astrophysics Data System (ADS)

    Trezzi, D.; Anders, M.; Aliotta, M.; Bellini, A.; Bemmerer, D.; Boeltzig, A.; Broggini, C.; Bruno, C. G.; Caciolli, A.; Cavanna, F.; Corvisiero, P.; Costantini, H.; Davinson, T.; Depalo, R.; Elekes, Z.; Erhard, M.; Ferraro, F.; Formicola, A.; Fülop, Zs.; Gervino, G.; Guglielmetti, A.; Gustavino, C.; Gyürky, Gy.; Junker, M.; Lemut, A.; Marta, M.; Mazzocchi, C.; Menegazzo, R.; Mossa, V.; Pantaleo, F.; Prati, P.; Rossi Alvarez, C.; Scott, D. A.; Somorjai, E.; Straniero, O.; Szücs, T.; Takacs, M.

    2017-03-01

    The correct prediction of the abundances of the light nuclides produced during the epoch of Big Bang Nucleosynthesis (BBN) is one of the main topics of modern cosmology. For many of the nuclear reactions that are relevant for this epoch, direct experimental cross section data are available, ushering in the so-called "age of precision". The present work addresses an exception to this current status: the 2H(α,γ)6Li reaction that controls 6Li production in the Big Bang. Recent controversial observations of 6Li in metal-poor stars have heightened the interest in understanding primordial 6Li production. If confirmed, these observations would lead to a second cosmological lithium problem, in addition to the well-known 7Li problem. In the present work, the direct experimental cross section data on 2H(α,γ)6Li in the BBN energy range are reported. The measurement has been performed deep underground at the LUNA (Laboratory for Underground Nuclear Astrophysics) 400 kV accelerator in the Laboratori Nazionali del Gran Sasso, Italy. The cross section has been directly measured at the energies of interest for Big Bang Nucleosynthesis for the first time, at Ecm = 80, 93, 120, and 133 keV. Based on the new data, the 2H(α,γ)6Li thermonuclear reaction rate has been derived. Our rate is even lower than previously reported, thus increasing the discrepancy between predicted Big Bang 6Li abundance and the amount of primordial 6Li inferred from observations.

  12. Big-bang nucleosynthesis revisited

    NASA Technical Reports Server (NTRS)

    Olive, Keith A.; Schramm, David N.; Steigman, Gary; Walker, Terry P.

    1989-01-01

    The homogeneous big-bang nucleosynthesis yields of D, He-3, He-4, and Li-7 are computed taking into account recent measurements of the neutron mean-life as well as updates of several nuclear reaction rates which primarily affect the production of Li-7. The extraction of primordial abundances from observation and the likelihood that the primordial mass fraction of He-4, Y{sub p}, is ≤ 0.24 are discussed. Using the primordial abundances of D + He-3 and Li-7 we limit the baryon-to-photon ratio to 2.6 ≤ eta{sub 10} ≤ 4.3 (where eta{sub 10} is eta in units of 10{sup -10}), which we use to argue that baryons contribute between 0.02 and 0.11 to the critical energy density of the universe. An upper limit to Y{sub p} of 0.24 constrains the number of light neutrinos to N{sub nu} ≤ 3.4, in excellent agreement with the LEP and SLC collider results. We turn this argument around to show that the collider limit of 3 neutrino species can be used to bound the primordial abundance of He-4: 0.235 ≤ Y{sub p} ≤ 0.245.

  13. COBE looks back to the Big Bang

    NASA Technical Reports Server (NTRS)

    Mather, John C.

    1993-01-01

    An overview is presented of NASA-Goddard's Cosmic Background Explorer (COBE), the first NASA satellite designed to observe the primeval explosion of the universe. The spacecraft carries three extremely sensitive IR and microwave instruments designed to measure the faint residual radiation from the Big Bang and to search for the formation of the first galaxies. COBE's far IR absolute spectrophotometer has shown that the Big Bang radiation has a blackbody spectrum, proving that there was no large energy release after the explosion.

  14. Data Confidentiality Challenges in Big Data Applications

    SciTech Connect

    Yin, Jian; Zhao, Dongfang

    2015-12-15

    In this paper, we address the problem of data confidentiality in big data analytics. In many fields, many useful patterns can be extracted by applying machine learning techniques to big data. However, data confidentiality must be protected. In many scenarios, data confidentiality could well be a prerequisite for data to be shared. We present a scheme to provide provably secure data confidentiality and discuss various techniques to optimize performance of such a system.

  15. Quality of Big Data in Healthcare

    SciTech Connect

    Sukumar, Sreenivas R.; Ramachandran, Natarajan; Ferrell, Regina Kay

    2015-01-01

    The current trend in Big Data Analytics and in particular Health information technology is towards building sophisticated models, methods and tools for business, operational and clinical intelligence, but the critical issue of data quality required for these models is not getting the attention it deserves. The objective of the paper is to highlight the issues of data quality in the context of Big Data Healthcare Analytics.

  16. Dark energy, wormholes, and the big rip

    SciTech Connect

    Faraoni, V.; Israel, W.

    2005-03-15

    The time evolution of a wormhole in a Friedmann universe approaching the big rip is studied. The wormhole is modeled by a thin spherical shell accreting the superquintessence fluid--two different models are presented. Contrary to recent claims that the wormhole overtakes the expansion of the universe and engulfs it before the big rip is reached, it is found that the wormhole becomes asymptotically comoving with the cosmic fluid and the future evolution of the universe is fully causal.

  17. Big data: survey, technologies, opportunities, and challenges.

    PubMed

    Khan, Nawsher; Yaqoob, Ibrar; Hashem, Ibrahim Abaker Targio; Inayat, Zakira; Ali, Waleed Kamaleldin Mahmoud; Alam, Muhammad; Shiraz, Muhammad; Gani, Abdullah

    2014-01-01

    Big Data has gained much attention from academia and the IT industry. In the digital and computing world, information is generated and collected at a rate that rapidly exceeds the boundary range. Currently, over 2 billion people worldwide are connected to the Internet, and over 5 billion individuals own mobile phones. By 2020, 50 billion devices are expected to be connected to the Internet. At this point, predicted data production will be 44 times greater than that in 2009. As information is transferred and shared at light speed on optic fiber and wireless networks, the volume of data and the speed of market growth increase. However, the fast growth rate of such large data generates numerous challenges, such as the rapid growth of data, transfer speed, diverse data, and security. Nonetheless, Big Data is still in its infancy, and the domain has not been reviewed in general. Hence, this study comprehensively surveys and classifies the various attributes of Big Data, including its nature, definitions, rapid growth rate, volume, management, analysis, and security. This study also proposes a data life cycle that uses the technologies and terminologies of Big Data. Future research directions in this field are determined based on opportunities and several open issues in Big Data domination. These research directions facilitate the exploration of the domain and the development of optimal techniques to address Big Data.

  18. Big Data: Survey, Technologies, Opportunities, and Challenges

    PubMed Central

    Khan, Nawsher; Yaqoob, Ibrar; Hashem, Ibrahim Abaker Targio; Inayat, Zakira; Mahmoud Ali, Waleed Kamaleldin; Alam, Muhammad; Shiraz, Muhammad; Gani, Abdullah

    2014-01-01

    Big Data has gained much attention from academia and the IT industry. In the digital and computing world, information is generated and collected at a rate that rapidly exceeds the boundary range. Currently, over 2 billion people worldwide are connected to the Internet, and over 5 billion individuals own mobile phones. By 2020, 50 billion devices are expected to be connected to the Internet. At this point, predicted data production will be 44 times greater than that in 2009. As information is transferred and shared at light speed on optic fiber and wireless networks, the volume of data and the speed of market growth increase. However, the fast growth rate of such large data generates numerous challenges, such as the rapid growth of data, transfer speed, diverse data, and security. Nonetheless, Big Data is still in its infancy, and the domain has not been reviewed in general. Hence, this study comprehensively surveys and classifies the various attributes of Big Data, including its nature, definitions, rapid growth rate, volume, management, analysis, and security. This study also proposes a data life cycle that uses the technologies and terminologies of Big Data. Future research directions in this field are determined based on opportunities and several open issues in Big Data domination. These research directions facilitate the exploration of the domain and the development of optimal techniques to address Big Data. PMID:25136682

  19. 78 FR 3911 - Big Stone National Wildlife Refuge, Big Stone and Lac Qui Parle Counties, MN; Final Comprehensive...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-17

    ... Fish and Wildlife Service Big Stone National Wildlife Refuge, Big Stone and Lac Qui Parle Counties, MN... (CCP) and finding of no significant impact (FONSI) for the environmental assessment (EA) for Big Stone.../FONSI on the planning Web site at http://www.fws.gov/midwest/planning/BigStoneNWR/index.html . A...

  20. Big biomedical data as the key resource for discovery science.

    PubMed

    Toga, Arthur W; Foster, Ian; Kesselman, Carl; Madduri, Ravi; Chard, Kyle; Deutsch, Eric W; Price, Nathan D; Glusman, Gustavo; Heavner, Benjamin D; Dinov, Ivo D; Ames, Joseph; Van Horn, John; Kramer, Roger; Hood, Leroy

    2015-11-01

    Modern biomedical data collection is generating exponentially more data in a multitude of formats. This flood of complex data poses significant opportunities to discover and understand the critical interplay among such diverse domains as genomics, proteomics, metabolomics, and phenomics, including imaging, biometrics, and clinical data. The Big Data for Discovery Science Center is taking an "-ome to home" approach to discover linkages between these disparate data sources by mining existing databases of proteomic and genomic data, brain images, and clinical assessments. In support of this work, the authors developed new technological capabilities that make it easy for researchers to manage, aggregate, manipulate, integrate, and model large amounts of distributed data. Guided by biological domain expertise, the Center's computational resources and software will reveal relationships and patterns, aiding researchers in identifying biomarkers for the most confounding conditions and diseases, such as Parkinson's and Alzheimer's.

  1. Big biomedical data as the key resource for discovery science

    PubMed Central

    Toga, Arthur W; Foster, Ian; Kesselman, Carl; Madduri, Ravi; Chard, Kyle; Deutsch, Eric W; Price, Nathan D; Glusman, Gustavo; Heavner, Benjamin D; Dinov, Ivo D; Ames, Joseph; Van Horn, John; Kramer, Roger; Hood, Leroy

    2015-01-01

    Modern biomedical data collection is generating exponentially more data in a multitude of formats. This flood of complex data poses significant opportunities to discover and understand the critical interplay among such diverse domains as genomics, proteomics, metabolomics, and phenomics, including imaging, biometrics, and clinical data. The Big Data for Discovery Science Center is taking an “-ome to home” approach to discover linkages between these disparate data sources by mining existing databases of proteomic and genomic data, brain images, and clinical assessments. In support of this work, the authors developed new technological capabilities that make it easy for researchers to manage, aggregate, manipulate, integrate, and model large amounts of distributed data. Guided by biological domain expertise, the Center’s computational resources and software will reveal relationships and patterns, aiding researchers in identifying biomarkers for the most confounding conditions and diseases, such as Parkinson’s and Alzheimer’s. PMID:26198305

  2. Native perennial forb variation between mountain big sagebrush and Wyoming big sagebrush plant communities.

    PubMed

    Davies, Kirk W; Bates, Jon D

    2010-09-01

    Big sagebrush (Artemisia tridentata Nutt.) occupies large portions of the western United States and provides valuable wildlife habitat. However, information is lacking quantifying differences in native perennial forb characteristics between mountain big sagebrush [A. tridentata spp. vaseyana (Rydb.) Beetle] and Wyoming big sagebrush [A. tridentata spp. wyomingensis (Beetle & A. Young) S.L. Welsh] plant communities. This information is critical to accurately evaluate the quality of habitat and forage that these communities can produce because many wildlife species consume large quantities of native perennial forbs and depend on them for hiding cover. To compare native perennial forb characteristics on sites dominated by these two subspecies of big sagebrush, we sampled 106 intact big sagebrush plant communities. Mountain big sagebrush plant communities produced almost 4.5-fold more native perennial forb biomass and had greater native perennial forb species richness and diversity compared to Wyoming big sagebrush plant communities (P < 0.001). Nonmetric multidimensional scaling (NMS) and the multiple-response permutation procedure (MRPP) demonstrated that native perennial forb composition varied between these plant communities (P < 0.001). Native perennial forb composition was more similar within plant communities grouped by big sagebrush subspecies than expected by chance (A = 0.112) and composition varied between community groups (P < 0.001). Indicator analysis did not identify any perennial forbs that were completely exclusive and faithful, but did identify several perennial forbs that were relatively good indicators of either mountain big sagebrush or Wyoming big sagebrush plant communities. Our results suggest that management plans and habitat guidelines should recognize differences in native perennial forb characteristics between mountain and Wyoming big sagebrush plant communities.

  3. Mosaicking Mexico - the Big Picture of Big Data

    NASA Astrophysics Data System (ADS)

    Hruby, F.; Melamed, S.; Ressl, R.; Stanley, D.

    2016-06-01

    The project presented in this article is to create a completely seamless and cloud-free mosaic of Mexico at a resolution of 5m, using approximately 4,500 RapidEye images. To complete this project in a timely manner and with limited operators, a number of processing architectures were required to handle a data volume of 12 terabytes. This paper will discuss the different operations realized to complete this project, which include preprocessing, mosaic generation and post-mosaic editing. Prior to mosaic generation, it was necessary to filter the 50,000 RapidEye images captured over Mexico between 2011 and 2014 to identify the top candidate images, based on season and cloud cover. Upon selecting the top candidate images, PCI Geomatics' GXL system was used to reproject, color balance and generate seamlines for the output 1TB+ mosaic. This paper will also discuss innovative techniques used by the GXL for color balancing large volumes of imagery with substantial radiometric differences. Furthermore, post-mosaicking steps, such as exposure correction and cloud and cloud shadow elimination, will be presented.
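
    The candidate-selection step described above (filtering scenes by season and cloud cover before mosaicking) can be sketched as a simple catalog filter. The column names, dates and thresholds in the Python sketch below are hypothetical and are not taken from the project.

      # Illustrative candidate-image filter in the spirit of the season/cloud-cover
      # selection described above. The catalog columns ('tile_id', 'acquired',
      # 'cloud_cover') and the thresholds are assumptions for illustration only.
      import pandas as pd

      catalog = pd.DataFrame({
          "tile_id":     ["T1", "T1", "T2", "T2"],
          "acquired":    pd.to_datetime(["2012-02-10", "2013-07-01",
                                         "2012-03-05", "2014-01-20"]),
          "cloud_cover": [0.05, 0.40, 0.10, 0.02],
      })

      dry_season = catalog["acquired"].dt.month.isin([11, 12, 1, 2, 3, 4])
      candidates = catalog[dry_season & (catalog["cloud_cover"] < 0.1)]

      # Keep the least cloudy scene per tile as the top candidate.
      best = candidates.sort_values("cloud_cover").groupby("tile_id").head(1)
      print(best)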

  4. Boosting Big National Lab Data

    SciTech Connect

    Kleese van Dam, Kerstin

    2013-02-21

    Introduction: Big data. Love it or hate it, solving the world’s most intractable problems requires the ability to make sense of huge and complex sets of data and do it quickly. Speeding up the process – from hours to minutes or from weeks to days – is key to our success. One major source of such big data is physical experiments. As many will know, these physical experiments are commonly used to solve challenges in fields such as energy security, manufacturing, medicine, pharmacology, environmental protection and national security. Experiments use different instruments and sensor types to research, for example, the validity of new drugs, the base causes of diseases, more efficient energy sources, new materials for everyday goods, effective methods for environmental cleanup, the optimal ingredient composition for chocolate, or how to preserve valuable antiques. This is done by experimentally determining the structure, properties and processes that govern biological systems, chemical processes and materials. The speed and quality at which we can acquire new insights from experiments directly influences the rate of scientific progress, industrial innovation and competitiveness. And gaining new groundbreaking insights, faster, is key to the economic success of our nations. Recent years have seen incredible advances in sensor technologies, from house-sized detector systems in large experiments such as the Large Hadron Collider and the ‘Eye of Gaia’ billion-pixel camera detector to high-throughput genome sequencing. These developments have led to an exponential increase in data volumes, rates and variety produced by instruments used for experimental work. This increase is coinciding with a need to analyze the experimental results at the time they are collected. This speed is required to optimize the data taking and quality, and also to enable new adaptive experiments, where the sample is manipulated as it is observed, e.g. a substance is injected into a

  5. BIG SKY CARBON SEQUESTRATION PARTNERSHIP

    SciTech Connect

    Susan M. Capalbo

    2004-06-30

    The Big Sky Carbon Sequestration Partnership, led by Montana State University, is comprised of research institutions, public entities and private sector organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts under this Partnership fall into four areas: evaluation of sources and carbon sequestration sinks; development of a GIS-based reporting framework; designing an integrated suite of monitoring, measuring, and verification technologies; and initiating a comprehensive education and outreach program. At the first two Partnership meetings the groundwork was put in place to provide an assessment of capture and storage capabilities for CO{sub 2} utilizing the resources found in the Partnership region (both geological and terrestrial sinks) that would complement the ongoing DOE research. During the third quarter, planning efforts are underway for the next Partnership meeting, which will showcase the architecture of the GIS framework and initial results for sources and sinks, and discuss the methods and analysis underway for assessing geological and terrestrial sequestration potentials. The meeting will conclude with an ASME workshop (see attached agenda). The region has a diverse array of geological formations that could provide storage options for carbon in one or more of its three states. Likewise, initial estimates of terrestrial sinks indicate a vast potential for increasing and maintaining soil C on forested, agricultural, and reclaimed lands. Both options include the potential for offsetting economic benefits to industry and society. Steps have been taken to assure that the GIS-based framework is consistent among types of sinks within the Big Sky Partnership area and with the efforts of other western DOE partnerships. Efforts are also being made to find funding to include Wyoming in the coverage areas for both geological and terrestrial sinks and sources. The Partnership recognizes the critical importance of measurement

  6. Small Molecules-Big Data.

    PubMed

    Császár, Attila G; Furtenbacher, Tibor; Árendás, Péter

    2016-11-17

    Quantum mechanics builds large-scale graphs (networks): the vertices are the discrete energy levels the quantum system possesses, and the edges are the (quantum-mechanically allowed) transitions. Parts of the complete quantum mechanical networks can be probed experimentally via high-resolution, energy-resolved spectroscopic techniques. The complete rovibronic line list information for a given molecule can only be obtained through sophisticated quantum-chemical computations. Experiments as well as computations yield what we call spectroscopic networks (SN). First-principles SNs of even small, three to five atomic molecules can be huge, qualifying for the big data description. Besides helping to interpret high-resolution spectra, the network-theoretical view offers several ideas for improving the accuracy and robustness of the increasingly important information systems containing line-by-line spectroscopic data. For example, the smallest number of measurements needed to obtain the complete list of energy levels is given by the minimum-weight spanning tree of the SN, and network clustering studies may call attention to "weakest links" of a spectroscopic database. A present-day application of spectroscopic networks is within the MARVEL (Measured Active Rotational-Vibrational Energy Levels) approach, whereby the transition information on a measured SN is turned into experimental energy levels via a weighted linear least-squares refinement. MARVEL has been used successfully for 15 molecules, validating most of the measured transitions and yielding energy levels with well-defined and realistic uncertainties. Accurate knowledge of the energy levels with computed transition intensities allows the realistic prediction of spectra under many different circumstances, e.g., for widely different temperatures. Detailed knowledge of the energy level structure of a molecule coming from a MARVEL analysis is important for a considerable number of modeling
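
    The minimum-weight spanning tree idea mentioned above can be illustrated on a toy spectroscopic network in Python. The energy-level labels and transition uncertainties below are invented; this is a sketch of the concept only, not the MARVEL implementation.

      # Toy illustration of the minimum-weight spanning-tree idea: nodes are energy
      # levels, edges are measured transitions weighted by their uncertainty.
      # The levels and weights here are invented for illustration.
      import networkx as nx

      sn = nx.Graph()
      transitions = [                      # (lower level, upper level, uncertainty)
          ("v=0,J=0", "v=0,J=1", 1e-4),
          ("v=0,J=1", "v=0,J=2", 2e-4),
          ("v=0,J=0", "v=0,J=2", 5e-4),    # redundant path that closes a cycle
          ("v=0,J=1", "v=1,J=0", 3e-4),
      ]
      sn.add_weighted_edges_from(transitions)

      mst = nx.minimum_spanning_tree(sn)
      # The MST edges are the smallest set of transitions that still connects
      # (i.e., determines) every energy level in this toy network.
      print(sorted(mst.edges(data="weight")))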

  7. Pockmarks off Big Sur, California

    USGS Publications Warehouse

    Paull, C.; Ussler, W.; Maher, N.; Greene, H. Gary; Rehder, G.; Lorenson, T.; Lee, H.

    2002-01-01

    A pockmark field was discovered during EM-300 multi-beam bathymetric surveys on the lower continental slope off the Big Sur coast of California. The field contains ~1500 pockmarks which are between 130 and 260 m in diameter, and typically are 8-12 m deep, located within a 560 km² area. To investigate the origin of these features, piston cores were collected from both the interior and the flanks of the pockmarks, and remotely operated vehicle (ROV) video observation and sampling transects were conducted which passed through 19 of the pockmarks. The water column within and above the pockmarks was sampled for methane concentration. Piston cores and ROV collected push cores show that the pockmark field is composed of monotonous fine silts and clays and the cores within the pockmarks are indistinguishable from those outside the pockmarks. No evidence for either sediment winnowing or diagenetic alteration suggestive of fluid venting was obtained. 14C measurements of the organic carbon in the sediments indicate continuous sedimentation throughout the time resolution of the radiocarbon technique (~45000 yr BP), with a sedimentation rate of ~10 cm per 1000 yr both within and between the pockmarks. Concentrations of methane, dissolved inorganic carbon, sulfate, chloride, and ammonium in pore water extracted from within the cores are generally similar in composition to seawater and show little change with depth, suggesting low biogeochemical activity. These pore water chemical gradients indicate that neither significant accumulations of gas are likely to exist in the shallow subsurface (~100 m) nor is active fluid advection occurring within the sampled sediments. Taken together the data indicate that these pockmarks are more than 45000 yr old, are presently inactive, and contain no indications of earlier fluid or gas venting events. © 2002 Elsevier Science B.V. All rights reserved.

  8. Big bang nucleosynthesis: Present status

    NASA Astrophysics Data System (ADS)

    Cyburt, Richard H.; Fields, Brian D.; Olive, Keith A.; Yeh, Tsung-Han

    2016-01-01

    Big bang nucleosynthesis (BBN) describes the production of the lightest nuclides via a dynamic interplay among the four fundamental forces during the first seconds of cosmic time. A brief overview of the essentials of this physics is given, and new calculations presented of light-element abundances through 6Li and 7Li, with updated nuclear reactions and uncertainties including those in the neutron lifetime. Fits are provided for these results as a function of baryon density and of the number of neutrino flavors Nν. Recent developments are reviewed in BBN, particularly new, precision Planck cosmic microwave background (CMB) measurements that now probe the baryon density, helium content, and the effective number of degrees of freedom Neff. These measurements allow for a tight test of BBN and cosmology using CMB data alone. Our likelihood analysis convolves the 2015 Planck data chains with our BBN output and observational data. Adding astronomical measurements of light elements strengthens the power of BBN. A new determination of the primordial helium abundance is included in our likelihood analysis. New D/H observations are now more precise than the corresponding theoretical predictions and are consistent with the standard model and the Planck baryon density. Moreover, D/H now provides a tight measurement of Nν when combined with the CMB baryon density and provides a 2 σ upper limit Nν<3.2 . The new precision of the CMB and D/H observations together leaves D/H predictions as the largest source of uncertainties. Future improvement in BBN calculations will therefore rely on improved nuclear cross-section data. In contrast with D/H and 4He, 7Li predictions continue to disagree with observations, perhaps pointing to new physics. This paper concludes with a look at future directions including key nuclear reactions, astronomical observations, and theoretical issues.

  9. Benchmarking Big Data Systems and the BigData Top100 List.

    PubMed

    Baru, Chaitanya; Bhandarkar, Milind; Nambiar, Raghunath; Poess, Meikel; Rabl, Tilmann

    2013-03-01

    "Big data" has become a major force of innovation across enterprises of all sizes. New platforms with increasingly more features for managing big datasets are being announced almost on a weekly basis. Yet, there is currently a lack of any means of comparability among such platforms. While the performance of traditional database systems is well understood and measured by long-established institutions such as the Transaction Processing Performance Council (TCP), there is neither a clear definition of the performance of big data systems nor a generally agreed upon metric for comparing these systems. In this article, we describe a community-based effort for defining a big data benchmark. Over the past year, a Big Data Benchmarking Community has become established in order to fill this void. The effort focuses on defining an end-to-end application-layer benchmark for measuring the performance of big data applications, with the ability to easily adapt the benchmark specification to evolving challenges in the big data space. This article describes the efforts that have been undertaken thus far toward the definition of a BigData Top100 List. While highlighting the major technical as well as organizational challenges, through this article, we also solicit community input into this process.

  10. Mentoring in Schools: An Impact Study of Big Brothers Big Sisters School-Based Mentoring

    ERIC Educational Resources Information Center

    Herrera, Carla; Grossman, Jean Baldwin; Kauh, Tina J.; McMaken, Jennifer

    2011-01-01

    This random assignment impact study of Big Brothers Big Sisters School-Based Mentoring involved 1,139 9- to 16-year-old students in 10 cities nationwide. Youth were randomly assigned to either a treatment group (receiving mentoring) or a control group (receiving no mentoring) and were followed for 1.5 school years. At the end of the first school…

  11. Enhancement of β-catenin activity by BIG1 plus BIG2 via Arf activation and cAMP signals

    PubMed Central

    Li, Chun-Chun; Le, Kang; Kato, Jiro; Moss, Joel; Vaughan, Martha

    2016-01-01

    Multifunctional β-catenin, with critical roles in both cell–cell adhesion and Wnt-signaling pathways, was among HeLa cell proteins coimmunoprecipitated by antibodies against brefeldin A-inhibited guanine nucleotide-exchange factors 1 and 2 (BIG1 or BIG2) that activate ADP-ribosylation factors (Arfs) by accelerating the replacement of bound GDP with GTP. BIG proteins also contain A-kinase anchoring protein (AKAP) sequences that can act as scaffolds for multimolecular assemblies that facilitate and limit cAMP signaling temporally and spatially. Direct interaction of BIG1 N-terminal sequence with β-catenin was confirmed using yeast two-hybrid assays and in vitro synthesized proteins. Depletion of BIG1 and/or BIG2 or overexpression of guanine nucleotide-exchange factor inactive mutant, but not wild-type, proteins interfered with β-catenin trafficking, leading to accumulation at perinuclear Golgi structures. Both phospholipase D activity and vesicular trafficking were required for effects of BIG1 and BIG2 on β-catenin activation. Levels of PKA-phosphorylated β-catenin S675 and β-catenin association with PKA, BIG1, and BIG2 were also diminished after BIG1/BIG2 depletion. Inferring a requirement for BIG1 and/or BIG2 AKAP sequence in PKA modification of β-catenin and its effect on transcription activation, we confirmed dependence of S675 phosphorylation and transcription coactivator function on BIG2 AKAP-C sequence. PMID:27162341

  12. Big Data in Caenorhabditis elegans: quo vadis?

    PubMed

    Hutter, Harald; Moerman, Donald

    2015-11-05

    A clear definition of what constitutes "Big Data" is difficult to identify, but we find it most useful to define Big Data as a data collection that is complete. By this criterion, researchers on Caenorhabditis elegans have a long history of collecting Big Data, since the organism was selected with the idea of obtaining a complete biological description and understanding of development. The complete wiring diagram of the nervous system, the complete cell lineage, and the complete genome sequence provide a framework to phrase and test hypotheses. Given this history, it might be surprising that the number of "complete" data sets for this organism is actually rather small--not because of lack of effort, but because most types of biological experiments are not currently amenable to complete large-scale data collection. Many are also not inherently limited, so that it becomes difficult to even define completeness. At present, we only have partial data on mutated genes and their phenotypes, gene expression, and protein-protein interaction--important data for many biological questions. Big Data can point toward unexpected correlations, and these unexpected correlations can lead to novel investigations; however, Big Data cannot establish causation. As a result, there is much excitement about Big Data, but there is also a discussion on just what Big Data contributes to solving a biological problem. Because of its relative simplicity, C. elegans is an ideal test bed to explore this issue and at the same time determine what is necessary to build a multicellular organism from a single cell.

  13. Big Data in Caenorhabditis elegans: quo vadis?

    PubMed Central

    Hutter, Harald; Moerman, Donald

    2015-01-01

    A clear definition of what constitutes “Big Data” is difficult to identify, but we find it most useful to define Big Data as a data collection that is complete. By this criterion, researchers on Caenorhabditis elegans have a long history of collecting Big Data, since the organism was selected with the idea of obtaining a complete biological description and understanding of development. The complete wiring diagram of the nervous system, the complete cell lineage, and the complete genome sequence provide a framework to phrase and test hypotheses. Given this history, it might be surprising that the number of “complete” data sets for this organism is actually rather small—not because of lack of effort, but because most types of biological experiments are not currently amenable to complete large-scale data collection. Many are also not inherently limited, so that it becomes difficult to even define completeness. At present, we only have partial data on mutated genes and their phenotypes, gene expression, and protein–protein interaction—important data for many biological questions. Big Data can point toward unexpected correlations, and these unexpected correlations can lead to novel investigations; however, Big Data cannot establish causation. As a result, there is much excitement about Big Data, but there is also a discussion on just what Big Data contributes to solving a biological problem. Because of its relative simplicity, C. elegans is an ideal test bed to explore this issue and at the same time determine what is necessary to build a multicellular organism from a single cell. PMID:26543198

  14. Breaking Barriers in Polymer Additive Manufacturing

    SciTech Connect

    Love, Lonnie J; Duty, Chad E; Post, Brian K; Lind, Randall F; Lloyd, Peter D; Kunc, Vlastimil; Peter, William H; Blue, Craig A

    2015-01-01

    Additive Manufacturing (AM) enables the creation of complex structures directly from a computer-aided design (CAD). There are limitations that prevent the technology from realizing its full potential. AM has been criticized for being slow and expensive with limited build size. Oak Ridge National Laboratory (ORNL) has developed a large scale AM system that improves upon each of these areas by more than an order of magnitude. The Big Area Additive Manufacturing (BAAM) system directly converts low cost pellets into a large, three-dimensional part at a rate exceeding 25 kg/h. By breaking these traditional barriers, it is possible for polymer AM to penetrate new manufacturing markets.

  15. Big Data access and infrastructure for modern biology: case studies in data repository utility.

    PubMed

    Boles, Nathan C; Stone, Tyler; Bergeron, Charles; Kiehl, Thomas R

    2017-01-01

    Big Data is no longer solely the purview of big organizations with big resources. Today's routine tools and experimental methods can generate large slices of data. For example, high-throughput sequencing can quickly interrogate biological systems for the expression levels of thousands of different RNAs, examine epigenetic marks throughout the genome, and detect differences in the genomes of individuals. Multichannel electrophysiology platforms produce gigabytes of data in just a few minutes of recording. Imaging systems generate videos capturing biological behaviors over the course of days. Thus, any researcher now has access to a veritable wealth of data. However, the ability of any given researcher to utilize that data is limited by her/his own resources and skills for downloading, storing, and analyzing the data. In this paper, we examine the necessary resources required to engage Big Data, survey the state of modern data analysis pipelines, present a few data repository case studies, and touch on current institutions and programs supporting the work that relies on Big Data.

  16. Heat Exchange, Additive Manufacturing, and Neutron Imaging

    SciTech Connect

    Geoghegan, Patrick

    2015-02-23

    Researchers at the Oak Ridge National Laboratory have captured undistorted snapshots of refrigerants flowing through small heat exchangers, helping them to better understand heat transfer in heating, cooling and ventilation systems.

  17. Heat Exchange, Additive Manufacturing, and Neutron Imaging

    ScienceCinema

    Geoghegan, Patrick

    2016-07-12

    Researchers at the Oak Ridge National Laboratory have captured undistorted snapshots of refrigerants flowing through small heat exchangers, helping them to better understand heat transfer in heating, cooling and ventilation systems.

  18. Small Things Draw Big Interest

    ERIC Educational Resources Information Center

    Green, Susan; Smith III, Julian

    2005-01-01

    Although the microscope is a basic tool in both physical and biological sciences, it is notably absent from most elementary school science programs. One reason teachers find it challenging to introduce microscopy at the elementary level is because children can have a hard time connecting the image of an object seen through a microscope with what…

  19. Functional Generalized Additive Models.

    PubMed

    McLean, Mathew W; Hooker, Giles; Staicu, Ana-Maria; Scheipl, Fabian; Ruppert, David

    2014-01-01

    We introduce the functional generalized additive model (FGAM), a novel regression model for association studies between a scalar response and a functional predictor. We model the link-transformed mean response as the integral with respect to t of F{X(t), t}, where F(·,·) is an unknown regression function and X(t) is a functional covariate. Rather than having an additive model in a finite number of principal components as in Müller and Yao (2008), our model incorporates the functional predictor directly and thus our model can be viewed as the natural functional extension of generalized additive models. We estimate F(·,·) using tensor-product B-splines with roughness penalties. A pointwise quantile transformation of the functional predictor is also considered to ensure each tensor-product B-spline has observed data on its support. The methods are evaluated using simulated data and their predictive performance is compared with other competing scalar-on-function regression alternatives. We illustrate the usefulness of our approach through an application to brain tractography, where X(t) is a signal from diffusion tensor imaging at position t along a tract in the brain. In one example, the response is disease-status (case or control) and in a second example, it is the score on a cognitive test. R code for performing the simulations and fitting the FGAM can be found in supplemental materials available online.
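
    Written out as an equation, the model described verbally above takes roughly the following form; the notation is ours and is only a sketch consistent with the abstract, with F expanded in penalized tensor-product B-splines.

      % Sketch of the FGAM model form implied by the abstract (notation ours):
      % link-transformed mean response as an integral over t of a bivariate surface F.
      \[
        g\!\left(\mathbb{E}[Y_i \mid X_i]\right)
          \;=\; \theta_0 + \int_{\mathcal{T}} F\big(X_i(t),\, t\big)\, dt,
        \qquad
        F(x, t) \;\approx\; \sum_{j=1}^{K_x}\sum_{k=1}^{K_t}
          \theta_{jk}\, B_j^{x}(x)\, B_k^{t}(t),
      \]
      % where the B's are tensor-product B-spline basis functions as mentioned in
      % the abstract and the theta_jk are coefficients estimated under roughness penalties.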

  20. Transcriptome marker diagnostics using big data.

    PubMed

    Han, Henry; Liu, Ying

    2016-02-01

    Big omics data are challenging translational bioinformatics in an unprecedented way because of their complexity and volume. How to employ big omics data to achieve a rivalling-clinical, reproducible disease diagnosis from a systems approach is an urgent problem to be solved in translational bioinformatics and machine learning. In this study, the authors propose a novel transcriptome marker diagnosis to tackle this problem using big RNA-seq data, systematically viewing the whole transcriptome as a profile marker. The systems diagnosis not only avoids the reproducibility issue of the existing gene-/network-marker-based diagnostic methods, but also achieves rivalling-clinical diagnostic results by extracting true signals from big RNA-seq data. Their method demonstrates a better fit for personalised diagnostics than competing methods by attaining exceptional diagnostic performance through the use of systems information, making it a good candidate for clinical usage. To the best of their knowledge, it is the first study on this topic and will inspire more investigations into big omics data diagnostics.

  1. A Review of Big Graph Mining Research

    NASA Astrophysics Data System (ADS)

    Atastina, I.; Sitohang, B.; Saptawati, G. A. P.; Moertini, V. S.

    2017-03-01

    "Big Graph Mining" is a continuously developing research area that began in 2009. In the seven years since, many studies have taken this topic as their main concern, yet there is no mapping or summary of the important issues and solutions in the field. This paper summarizes the research conducted since 2009, grouping the results by the algorithms, the systems built, and the preprocessing techniques that have been developed. Based on the survey, 11 algorithms and 6 distributed systems for analysing Big Graphs have been improved, while improved preprocessing covers only sampling and compression techniques. Most of the improved algorithms aim at frequent subgraph discovery, only a few aim at clustering Big Graphs, and none addresses Big Graph classification. As a conclusion of this survey, more research is needed to build a comprehensive Graph Mining System, especially for very big graphs.

  2. Volume and Value of Big Healthcare Data.

    PubMed

    Dinov, Ivo D

    Modern scientific inquiries require significant data-driven evidence and trans-disciplinary expertise to extract valuable information and gain actionable knowledge about natural processes. Effective evidence-based decisions require collection, processing and interpretation of vast amounts of complex data. Moore's and Kryder's laws of exponential increase of computational power and information storage, respectively, dictate the need for rapid trans-disciplinary advances, technological innovation and effective mechanisms for managing and interrogating Big Healthcare Data. In this article, we review important aspects of Big Data analytics and discuss important questions like: What are the challenges and opportunities associated with this biomedical, social, and healthcare data avalanche? Are there innovative statistical computing strategies to represent, model, analyze and interpret Big heterogeneous data? We present the foundation of a new compressive big data analytics (CBDA) framework for representation, modeling and inference of large, complex and heterogeneous datasets. Finally, we consider specific directions likely to impact the process of extracting information from Big healthcare data, translating that information to knowledge, and deriving appropriate actions.

  3. Volume and Value of Big Healthcare Data

    PubMed Central

    Dinov, Ivo D.

    2016-01-01

    Modern scientific inquiries require significant data-driven evidence and trans-disciplinary expertise to extract valuable information and gain actionable knowledge about natural processes. Effective evidence-based decisions require collection, processing and interpretation of vast amounts of complex data. Moore's and Kryder's laws of exponential increase of computational power and information storage, respectively, dictate the need for rapid trans-disciplinary advances, technological innovation and effective mechanisms for managing and interrogating Big Healthcare Data. In this article, we review important aspects of Big Data analytics and discuss important questions like: What are the challenges and opportunities associated with this biomedical, social, and healthcare data avalanche? Are there innovative statistical computing strategies to represent, model, analyze and interpret Big heterogeneous data? We present the foundation of a new compressive big data analytics (CBDA) framework for representation, modeling and inference of large, complex and heterogeneous datasets. Finally, we consider specific directions likely to impact the process of extracting information from Big healthcare data, translating that information to knowledge, and deriving appropriate actions. PMID:26998309

  4. ATLAS: Big Data in a Small Package?

    NASA Astrophysics Data System (ADS)

    Denneau, Larry

    2016-01-01

    For even small astronomy projects, the petabyte scale is now upon us. The Asteroid Terrestrial-impact Last Alert System (Tonry 2011) will survey the entire visible sky from Hawaii multiple times per night to search for near-Earth asteroids on impact trajectories. While the ATLAS optical system is modest by modern astronomical standards - two 0.5 m F/2.0 telescopes - each night the ATLAS system will measure nearly 10^9 astronomical sources to a photometric accuracy of <5%, totaling 10^12 individual observations over its initial 3-year mission. This ever-growing dataset must be searched in real-time for moving objects and transients, then archived for further analysis, and alerts for newly discovered near-Earth asteroids (NEAs) disseminated within tens of minutes from detection. ATLAS's all-sky coverage ensures it will discover many `rifle shot' near-misses moving rapidly on the sky as they shoot past the Earth, so the system will need software to automatically detect highly-trailed sources and discriminate them from the thousands of low-Earth orbit (LEO) and geosynchronous orbit (GEO) satellites ATLAS will see each night. Additional interrogation will identify interesting phenomena from millions of transient sources per night beyond the solar system. The data processing and storage requirements for ATLAS demand a `big data' approach typical of commercial internet enterprises. We describe our experience in deploying a nimble, scalable and reliable data processing infrastructure, and suggest ATLAS as a stepping stone to the data processing capability needed as we enter the era of LSST.

  5. BIG SKY CARBON SEQUESTRATION PARTNERSHIP

    SciTech Connect

    Susan M. Capalbo

    2004-10-31

    The Big Sky Carbon Sequestration Partnership, led by Montana State University, is comprised of research institutions, public entities and private sector organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts under this Partnership fall into four areas: evaluation of sources and carbon sequestration sinks; development of a GIS-based reporting framework; designing an integrated suite of monitoring, measuring, and verification technologies; and initiating a comprehensive education and outreach program. At the first two Partnership meetings the groundwork was put in place to provide an assessment of capture and storage capabilities for CO2 utilizing the resources found in the Partnership region (both geological and terrestrial sinks) that would complement the ongoing DOE research. During the third quarter, planning efforts are underway for the next Partnership meeting, which will showcase the architecture of the GIS framework and initial results for sources and sinks, and discuss the methods and analysis underway for assessing geological and terrestrial sequestration potentials. The meeting will conclude with an ASME workshop. The region has a diverse array of geological formations that could provide storage options for carbon in one or more of its three states. Likewise, initial estimates of terrestrial sinks indicate a vast potential for increasing and maintaining soil C on forested, agricultural, and reclaimed lands. Both options include the potential for offsetting economic benefits to industry and society. Steps have been taken to assure that the GIS-based framework is consistent among types of sinks within the Big Sky Partnership area and with the efforts of other western DOE partnerships. Efforts are also being made to find funding to include Wyoming in the coverage areas for both geological and terrestrial sinks and sources. The Partnership recognizes the critical importance of measurement, monitoring, and verification

  6. Big Science and the Large Hadron Collider

    NASA Astrophysics Data System (ADS)

    Giudice, Gian Francesco

    2012-03-01

    The Large Hadron Collider (LHC), the particle accelerator operating at CERN, is probably the most complex and ambitious scientific project ever accomplished by humanity. The sheer size of the enterprise, in terms of financial and human resources, naturally raises the question whether society should support such costly basic-research programs. I address this question by first reviewing the process that led to the emergence of Big Science and the role of large projects in the development of science and technology. I then compare the methodologies of Small and Big Science, emphasizing their mutual linkage. Finally, after examining the cost of Big Science projects, I highlight several general aspects of their beneficial implications for society.

  7. Little Big Horn River Water Quality Project

    SciTech Connect

    Bad Bear, D.J.; Hooker, D.

    1995-10-01

    This report summarizes the accomplishments of the Water Quality Project on the Little Big Horn River during the summer of 1995. The majority of the summer was spent collecting data on the Little Big Horn River and then running a number of different tests on the water samples at Little Big Horn College in Crow Agency, Montana. The intention of this study is to perform stream quality analysis to gain an understanding of the quality of a selected portion of the river, to assess any impact that existing developments may be causing to the environment, and to gather baseline data that will provide information concerning the proposed development. Citizens of the reservation have expressed concern about the quality of water on the reservation: surface waters, ground water, and well waters.

  8. Big data in food safety; an overview.

    PubMed

    Marvin, Hans J P; Janssen, Esmée M; Bouzembrak, Yamine; Hendriksen, Peter J M; Staats, Martijn

    2016-11-07

    Technology is now being developed that is able to handle vast amounts of structured and unstructured data from diverse sources and origins. These technologies, often referred to as big data, open new areas of research and applications that will have an increasing impact in all sectors of our society. In this paper we assessed to what extent big data is being applied in the food safety domain and identified several promising trends. In several parts of the world, governments stimulate the publication on the internet of all data generated in publicly funded research projects. This policy opens new opportunities for stakeholders dealing with food safety to address issues which were not possible to address before. The application of mobile phones as detection devices for food safety and the use of social media as an early warning of food safety problems are a few examples of the new developments that are possible due to big data.

  9. The dominance of big pharma: power.

    PubMed

    Edgar, Andrew

    2013-05-01

    The purpose of this paper is to provide a normative model for the assessment of the exercise of power by Big Pharma. By drawing on the work of Steven Lukes, it will be argued that while Big Pharma is overtly highly regulated, so that its power is indeed restricted in the interests of patients and the general public, the industry is still able to exercise what Lukes describes as a third dimension of power. This entails concealing the conflicts of interest and grievances that Big Pharma may have with the health care system, physicians and patients, crucially through rhetorical engagements with Patient Advocacy Groups that seek to shape public opinion, and also by marginalising certain groups, excluding them from debates over health care resource allocation. Three issues will be examined: the construction of a conception of the patient as expert patient or consumer; the phenomenon of disease mongering; the suppression or distortion of debates over resource allocation.

  10. Granular computing with multiple granular layers for brain big data processing.

    PubMed

    Wang, Guoyin; Xu, Ji

    2014-12-01

    Big data is the term for a collection of datasets so huge and complex that they become difficult to process using on-hand theoretical models and tools. Brain big data are among the most typical and important big data, collected using powerful equipment for functional magnetic resonance imaging, multichannel electroencephalography, magnetoencephalography, positron emission tomography, and near infrared spectroscopic imaging, as well as various other devices. Granular computing with multiple granular layers, referred to as multi-granular computing (MGrC) for short hereafter, is an emerging computing paradigm of information processing, which simulates the multi-granular intelligent thinking model of the human brain. It concerns the processing of complex information entities called information granules, which arise in the process of data abstraction and the derivation of information and even knowledge from data. This paper analyzes three basic mechanisms of MGrC, namely granularity optimization, granularity conversion, and multi-granularity joint computation, and discusses the potential of introducing MGrC into the intelligent processing of brain big data.

  11. Design and development of a medical big data processing system based on Hadoop.

    PubMed

    Yao, Qin; Tian, Yu; Li, Peng-Fei; Tian, Li-Li; Qian, Yang-Ming; Li, Jing-Song

    2015-03-01

    Secondary use of medical big data is increasingly popular in healthcare services and clinical research. Understanding the logic behind medical big data demonstrates tendencies in hospital information technology and shows great significance for hospital information systems that are designing and expanding their services. Big data has four characteristics--Volume, Variety, Velocity and Value (the 4 Vs)--that make traditional systems incapable of processing these data on standalone machines. Apache Hadoop MapReduce is a promising software framework for developing applications that process vast amounts of data in parallel on large clusters of commodity hardware in a reliable, fault-tolerant manner. With the Hadoop framework and MapReduce application program interface (API), we can more easily develop our own MapReduce applications to run on a Hadoop framework that can scale up from a single node to thousands of machines. This paper investigates a practical case of a Hadoop-based medical big data processing system. We developed this system to intelligently process medical big data and uncover some features of hospital information system user behaviors. This paper studies user behaviors regarding various data produced by different hospital information systems for daily work. In this paper, we also built a five-node Hadoop cluster to execute distributed MapReduce algorithms. Our distributed algorithms show promise in facilitating efficient data processing with medical big data in healthcare services and clinical research compared with single nodes. Additionally, with medical big data analytics, we can design our hospital information systems to be much more intelligent and easier to use by making personalized recommendations.
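
    The paper's own MapReduce jobs are not reproduced in this abstract, so the following is only a hedged sketch of the kind of job described: a Hadoop Streaming program, written in Python, that counts log entries per user of a hospital information system. The tab-separated log layout (user_id, timestamp, action) and the script name job.py are hypothetical.

        #!/usr/bin/env python3
        """Sketch of a Hadoop Streaming job: count log lines per user.

        The log format is an assumed example, not the format used in the paper.
        """
        import sys


        def mapper(stream):
            # Emit "user_id<TAB>1" for every well-formed log line.
            for line in stream:
                fields = line.rstrip("\n").split("\t")
                if len(fields) >= 3:
                    print(f"{fields[0]}\t1")


        def reducer(stream):
            # Hadoop sorts mapper output by key, so equal keys arrive contiguously.
            current, count = None, 0
            for line in stream:
                key, _, value = line.rstrip("\n").partition("\t")
                if key != current:
                    if current is not None:
                        print(f"{current}\t{count}")
                    current, count = key, 0
                count += int(value)
            if current is not None:
                print(f"{current}\t{count}")


        if __name__ == "__main__":
            # Invoked as "job.py map" by -mapper and "job.py reduce" by -reducer.
            mapper(sys.stdin) if sys.argv[1:] == ["map"] else reducer(sys.stdin)

    Such a script would typically be submitted with the hadoop-streaming jar, shipped to the cluster via -files and named in the -mapper and -reducer options; on a multi-node cluster like the five-node one described, the framework handles input splitting, shuffling, and fault tolerance.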

  12. The Reliability and Validity of Big Five Inventory Scores with African American College Students

    ERIC Educational Resources Information Center

    Worrell, Frank C.; Cross, William E., Jr.

    2004-01-01

    This article describes a study that examined the reliability and validity of scores on the Big Five Inventory (BFI; O. P. John, E. M. Donahue, & R. L. Kentle, 1991) in a sample of 336 African American college students. Results from the study indicated moderate reliability and structural validity for BFI scores. Additionally, BFI subscales had few…

  13. How quantum is the big bang?

    PubMed

    Bojowald, Martin

    2008-06-06

    When quantum gravity is used to discuss the big bang singularity, the most important, though rarely addressed, question is what role genuine quantum degrees of freedom play. Here, complete effective equations are derived for isotropic models with an interacting scalar to all orders in the expansions involved. The resulting coupling terms show that quantum fluctuations do not affect the bounce much. Quantum correlations, however, do have an important role and could even eliminate the bounce. How quantum gravity regularizes the big bang depends crucially on properties of the quantum state.

  14. Livermore Big Trees Park: 1998 Results

    SciTech Connect

    Mac Queen, D; Gallegos, G; Surano, K

    2002-04-18

    This report is an in-depth study of results from environmental sampling conducted in 1998 by the Lawrence Livermore National Laboratory (LLNL) at Big Trees Park in the city of Livermore. The purpose of the sampling was to determine the extent and origin of plutonium found in soil at concentrations above fallout-background levels in the park. This report describes the sampling that was conducted, the chemical and radio-chemical analyses of the samples, the quality control assessments and statistical analyses of the analytical results, and LLNL's interpretations of the results. It includes a number of data analyses not presented in LLNL's previous reports on Big Trees Park.

  15. Harnessing the Heart of Big Data

    PubMed Central

    Scruggs, Sarah B.; Watson, Karol; Su, Andrew I.; Hermjakob, Henning; Yates, John R.; Lindsey, Merry L.; Ping, Peipei

    2015-01-01

    The exponential increase in Big Data generation combined with limited capitalization on the wealth of information embedded within Big Data has prompted us to revisit our scientific discovery paradigms. A successful transition into this digital era of medicine holds great promise for advancing fundamental knowledge in biology, innovating human health and driving personalized medicine; however, this will require a drastic shift of research culture in how we conceptualize science and use data. An e-transformation will require global adoption and synergism among computational science, biomedical research and clinical domains. PMID:25814682

  16. The origin of the big-bang

    NASA Astrophysics Data System (ADS)

    Thakur, R. K.

    1992-04-01

    A singularity-free model of the universe is developed within the framework of the Friedmann-Lemaitre-Robertson-Walker cosmology, which gives a physical explanation for the origin of the big bang and for the preponderance of matter over antimatter. It is shown that the model retains all the useful features of the standard-cosmology (Weinberg, 1972; Sandage, 1988) hot big-bang (HBB) model and resolves, in a very natural way, all the difficulties of the HBB model, such as the occurrence of the space-time singularity. The new model also resolves the problem of the baryon asymmetry and can account for the currently observed value of nu.

  17. Effective dynamics of the matrix big bang

    SciTech Connect

    Craps, Ben; Rajaraman, Arvind; Sethi, Savdeep

    2006-05-15

    We study the leading quantum effects in the recently introduced matrix big bang model. This amounts to a study of supersymmetric Yang-Mills theory compactified on the Milne orbifold. We find a one-loop potential that is attractive near the big bang. Surprisingly, the potential decays very rapidly at late times where it appears to be generated by D-brane effects. Usually, general covariance constrains the form of any effective action generated by renormalization group flow. However, the form of our one-loop potential seems to violate these constraints in a manner that suggests a connection between the cosmological singularity and long wavelength, late time physics.

  18. [Research applications in digital radiology. Big data and co].

    PubMed

    Müller, H; Hanbury, A

    2016-02-01

    Medical imaging produces increasingly complex images (e.g. thinner slices and higher resolution) with more protocols, so that image reading has also become much more complex. More information needs to be processed and usually the number of radiologists available for these tasks has not increased to the same extent. The objective of this article is to present current research results from projects on the use of image data for clinical decision support. An infrastructure that can allow large volumes of data to be accessed is presented. In this way the best performing tools can be identified without the medical data having to leave secure servers. The text presents the results of the VISCERAL and Khresmoi EU-funded projects, which allow the analysis of previous cases from institutional archives to support decision-making and for process automation. The results also represent a secure evaluation environment for medical image analysis. This allows the use of data extracted from past cases to solve information needs occurring when diagnosing new cases. The presented research prototypes allow direct extraction of knowledge from the visual data of the images and to use this for decision support or process automation. Real clinical use has not been tested but several subjective user tests showed the effectiveness and efficiency of the process. The future in radiology will clearly depend on better use of the important knowledge in clinical image archives to automate processes and aid decision-making via big data analysis. This can help concentrate the work of radiologists towards the most important parts of diagnostics.

  19. Big Events in Greece and HIV Infection Among People Who Inject Drugs

    PubMed Central

    Nikolopoulos, Georgios K.; Sypsa, Vana; Bonovas, Stefanos; Paraskevis, Dimitrios; Malliori-Minerva, Melpomeni; Hatzakis, Angelos; Friedman, Samuel R.

    2015-01-01

    Big Events are processes like macroeconomic transitions that have lowered social well-being in various settings in the past. Greece has been hit by the global crisis and experienced an HIV outbreak among people who inject drugs. Since the crisis began (2008), Greece has seen population displacement, inter-communal violence, cuts in governmental expenditures, and social movements. These may have affected normative regulation, networks, and behaviors. However, most pathways to risk remain unknown or unmeasured. We use what is known and unknown about the Greek HIV outbreak to suggest modifications in Big Events models and the need for additional research. PMID:25723309

  20. Device Data Ingestion for Industrial Big Data Platforms with a Case Study.

    PubMed

    Ji, Cun; Shao, Qingshi; Sun, Jiao; Liu, Shijun; Pan, Li; Wu, Lei; Yang, Chenglei

    2016-02-26

    Despite having played a significant role in the Industry 4.0 era, the Internet of Things is currently faced with the challenge of how to ingest large-scale heterogeneous and multi-type device data. In response to this problem we present a heterogeneous device data ingestion model for an industrial big data platform. The model includes device templates and four strategies for data synchronization, data slicing, data splitting and data indexing, respectively. We can ingest device data from multiple sources with this heterogeneous device data ingestion model, which has been verified on our industrial big data platform. In addition, we present a case study on device data-based scenario analysis of industrial big data.
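
    The four ingestion strategies are only named in this abstract, so the following Python sketch is merely an illustration of what time-based data slicing could look like: readings are grouped into fixed-width time slices before indexing. The reading tuple layout and the 60-second slice width are assumptions, not details from the paper.

        from collections import defaultdict


        def slice_readings(readings, slice_seconds=60):
            """Group (timestamp, device_id, value) readings into fixed time slices.

            Both the tuple layout and the slice width are illustrative assumptions;
            the paper's device templates are not reproduced here.
            """
            slices = defaultdict(list)
            for timestamp, device_id, value in readings:
                slice_key = int(timestamp // slice_seconds)  # index of the slice
                slices[slice_key].append((timestamp, device_id, value))
            return dict(slices)


        # Three readings from two devices fall into two one-minute slices.
        readings = [(12.0, "dev-a", 0.7), (47.5, "dev-b", 1.2), (75.1, "dev-a", 0.9)]
        print(slice_readings(readings))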

  1. Device Data Ingestion for Industrial Big Data Platforms with a Case Study †

    PubMed Central

    Ji, Cun; Shao, Qingshi; Sun, Jiao; Liu, Shijun; Pan, Li; Wu, Lei; Yang, Chenglei

    2016-01-01

    Despite having played a significant role in the Industry 4.0 era, the Internet of Things is currently faced with the challenge of how to ingest large-scale heterogeneous and multi-type device data. In response to this problem we present a heterogeneous device data ingestion model for an industrial big data platform. The model includes device templates and four strategies for data synchronization, data slicing, data splitting and data indexing, respectively. We can ingest device data from multiple sources with this heterogeneous device data ingestion model, which has been verified on our industrial big data platform. In addition, we present a case study on device data-based scenario analysis of industrial big data. PMID:26927121

  2. Higher-order factors of the Big Five in a multi-informant sample.

    PubMed

    DeYoung, Colin G

    2006-12-01

    In a large community sample (N=490), the Big Five were not orthogonal when modeled as latent variables representing the shared variance of reports from 4 different informants. Additionally, the standard higher-order factor structure was present in latent space: Neuroticism (reversed), Agreeableness, and Conscientiousness formed one factor, labeled Stability, and Extraversion and Openness/Intellect formed a second factor, labeled Plasticity. Comparison of two instruments, the Big Five Inventory and the Mini-Markers, supported the hypotheses that single-adjective rating instruments are likely to yield lower interrater agreement than phrase rating instruments and that lower interrater agreement is associated with weaker correlations among the Big Five and a less coherent higher-order factor structure. In conclusion, an interpretation of the higher-order factors is discussed, including possible neurobiological substrates.

  3. Big Creek Hydroelectric System, East & West Transmission Line, 241mile ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Big Creek Hydroelectric System, East & West Transmission Line, 241-mile transmission corridor extending between the Big Creek Hydroelectric System in the Sierra National Forest in Fresno County and the Eagle Rock Substation in Los Angeles, California, Visalia, Tulare County, CA

  4. Geologic map of Big Bend National Park, Texas

    USGS Publications Warehouse

    Turner, Kenzie J.; Berry, Margaret E.; Page, William R.; Lehman, Thomas M.; Bohannon, Robert G.; Scott, Robert B.; Miggins, Daniel P.; Budahn, James R.; Cooper, Roger W.; Drenth, Benjamin J.; Anderson, Eric D.; Williams, Van S.

    2011-01-01

    The purpose of this map is to provide the National Park Service and the public with an updated digital geologic map of Big Bend National Park (BBNP). The geologic map report of Maxwell and others (1967) provides a fully comprehensive account of the important volcanic, structural, geomorphological, and paleontological features that define BBNP. However, the map is on a geographically distorted planimetric base and lacks topography, which has caused difficulty in conducting GIS-based data analyses and georeferencing the many geologic features investigated and depicted on the map. In addition, the map is outdated, excluding significant data from numerous studies that have been carried out since its publication more than 40 years ago. This report includes a modern digital geologic map that can be utilized with standard GIS applications to aid BBNP researchers in geologic data analysis, natural resource and ecosystem management, monitoring, assessment, inventory activities, and educational and recreational uses. The digital map incorporates new data, many revisions, and greater detail than the original map. Although some geologic issues remain unresolved for BBNP, the updated map serves as a foundation for addressing those issues. Funding for the Big Bend National Park geologic map was provided by the United States Geological Survey (USGS) National Cooperative Geologic Mapping Program and the National Park Service. The Big Bend mapping project was administered by staff in the USGS Geology and Environmental Change Science Center, Denver, Colo. Members of the USGS Mineral and Environmental Resources Science Center completed investigations in parallel with the geologic mapping project. Results of these investigations addressed some significant current issues in BBNP and the U.S.-Mexico border region, including contaminants and human health, ecosystems, and water resources. Funding for the high-resolution aeromagnetic survey in BBNP, and associated data analyses and

  5. Mining of Solar Big Data: Challenges and Opportunities

    NASA Astrophysics Data System (ADS)

    Angryk, R.

    2014-12-01

    The focus of this talk is on the challenges and opportunities linked to the big data analytics of the Solar Dynamics Observatory (www.nasa.gov/sdo/), which is a flagship of NASA's current "Living with a Star" program. The audience will first learn about the importance of solar data analysis, then about the complexity of data maintained on the servers in our Data Mining Lab (dmlab.cs.montana.edu/). After that, our three ongoing research projects will be discussed: (1) the Content-based Image Retrieval (CBIR) system for solar data, (2) the development of machine learning techniques for automated validation and expansion of FFT's software modules, and (3) the search for spatio-temporal co-occurrence patterns among different types of solar activity. Finally, we will briefly talk about the future of our solar databases and data mining projects.

  6. Early experiences with big data at an academic medical center.

    PubMed

    Halamka, John D

    2014-07-01

    Beth Israel Deaconess Medical Center (BIDMC), an academic health care institution affiliated with Harvard University, has been an early adopter of electronic applications since the 1970s. Various departments of the medical center and the physician practice groups affiliated with it have implemented electronic health records, filmless imaging, and networked medical devices to such an extent that data storage at BIDMC now amounts to three petabytes and continues to grow at a rate of 25 percent a year. Initially, the greatest technical challenge was the cost and complexity of data storage. However, today the major focus is on transforming raw data into information, knowledge, and wisdom. This article discusses the data growth, increasing importance of analytics, and changing user requirements that have shaped the management of big data at BIDMC.

  7. Treatment Integrity: Revisiting Some Big Ideas

    ERIC Educational Resources Information Center

    Greenwood, Charles R.

    2009-01-01

    The contributors to this special issue have helped everyone consider the next steps in building a research and practice agenda regarding the use of treatment integrity. Such an agenda must converge with the big ideas that link treatment integrity to the effectiveness of evidence-based practices (EBPs), and ultimately that of the profession. In…

  8. The Big Ideas behind Whole System Reform

    ERIC Educational Resources Information Center

    Fullan, Michael

    2010-01-01

    Whole system reform means that every vital part of the system--school, community, district, and government--contributes individually and in concert to forward movement and success, using practice, not research, as the driver of reform. With this in mind, several "big ideas", based on successful implementation, informed Ontario's reform…

  9. Marketing Your Library with the Big Read

    ERIC Educational Resources Information Center

    Johnson, Wendell G.

    2012-01-01

    The Big Read was developed by the National Endowment for the Arts to revitalize the role of culture in American society and encourage the reading of landmark literature. Each year since 2007, the DeKalb Public Library, Northern Illinois University, and Kishwaukee Community College have partnered to foster literacy in the community. This article…

  10. Big-Time Fundraising for Today's Schools

    ERIC Educational Resources Information Center

    Levenson, Stanley

    2006-01-01

    In this enlightening book, nationally recognized author and fundraising consultant Stanley Levenson shows school leaders how to move away from labor-intensive, nickel-and-dime bake sales and car washes, and into the world of big-time fundraising. Following the model used by colleges and universities, the author presents a wealth of practical…

  11. Challenges of Big Data in Educational Assessment

    ERIC Educational Resources Information Center

    Gibson, David C.; Webb, Mary; Ifenthaler, Dirk

    2015-01-01

    This paper briefly discusses four measurement challenges of data science or "big data" in educational assessments that are enabled by technology: 1. Dealing with change over time via time-based data. 2. How a digital performance space's relationships interact with learner actions, communications and products. 3. How layers of…

  12. Big-Time Sports in American Universities

    ERIC Educational Resources Information Center

    Clotfelter, Charles T.

    2011-01-01

    For almost a century, big-time college sports has been a wildly popular but consistently problematic part of American higher education. The challenges it poses to traditional academic values have been recognized from the start, but they have grown more ominous in recent decades, as cable television has become ubiquitous, commercial opportunities…

  13. Big Data Cognition for City Emergency Rescue

    NASA Astrophysics Data System (ADS)

    Zhang, Xin; Chen, Yongxin; Wang, Weisheng

    2016-11-01

    Many kinds of data are produced in daily city life, operating as an elementary component of the citizen life-support system. Unexpected incidents in the city occur in seemingly unpredictable patterns, but with Big Data analysis emergency rescue can be carried out efficiently. In this paper, Big Data cognition for city emergency rescue is studied from four perspectives. From the data volume perspective, spatial data analysis technology is divided into two parts, indoor data and outdoor data. From the data velocity perspective, big data are collected from eye-in-the-sky and on-the-ground object networks, together with demographic data. From the data variety perspective, population distribution data, socio-economic data and model estimates are included. From the data value mining perspective, crime model estimates are studied. Finally, an application to emergency rescue at a big public venue in Urumqi, Xinjiang, China is introduced.

  14. Big Gods: Extended prosociality or group binding?

    PubMed

    Galen, Luke W

    2016-01-01

    Big Gods are described as having a "prosocial" effect. However, this conflates parochialism (group cohesion) with cooperation extended to strangers or out-group members. An examination of the cited experimental studies indicates that religion is actually associated with increased within-group parochialism, rather than extended or universal prosociality, and that the same general mechanisms underlie both religious and secular effects.

  15. The Lure of the Big Time.

    ERIC Educational Resources Information Center

    Krinsky, Ira W.; Rudiger, Charles W.

    1991-01-01

    Despite all the horror stories about big-city politics, diminishing resources, and pressure-cooker workloads, urban superintendencies continue to attract a certain breed of men and women. Frequently cited reasons include the challenge, sophistication, complexity, resources, diversity, people, visibility, and compensation associated with the job.…

  16. Big Bubbles in Boiling Liquids: Students' Views

    ERIC Educational Resources Information Center

    Costu, Bayram

    2008-01-01

    The aim of this study was to elicit students' conceptions about big bubbles in boiling liquids (water, ethanol and aqueous CuSO[subscript 4] solution). The study is based on twenty-four students at different ages and grades. The clinical interviews technique was conducted to solicit students' conceptions and the interviews were analyzed to…

  17. A Big Problem for Magellan: Food Preservation

    ERIC Educational Resources Information Center

    Galvao, Cecilia; Reis, Pedro; Freire, Sofia

    2008-01-01

    In this paper, we present data related to how a Portuguese teacher developed the module "A big problem for Magellan: Food preservation." Students were asked to plan an investigation in order to identify which were the best food preservation methods in the XV and XVI centuries of Portuguese overseas navigation, and then establish a…

  18. Big Broadband Connectivity in the United States

    ERIC Educational Resources Information Center

    Windhausen, John, Jr.

    2008-01-01

    The economic and social future of the United States depends on answering the growing demand for very high-speed broadband connectivity, a capability termed "big broadband." Failure to take on the challenge could lead to a decline in global competitiveness and an inability to educate students. (Contains 20 notes.)

  19. Integrating "big data" into surgical practice.

    PubMed

    Mathias, Brittany; Lipori, Gigi; Moldawer, Lyle L; Efron, Philip A

    2016-02-01

    'Big data' is the next frontier of medicine. We now have the ability to generate and analyze large quantities of healthcare data. Although interpreting and integrating this information into clinical practice poses many challenges, the potential benefits of personalized medicine are seemingly without limit.

  20. Financing Big City Schools: Some Possible Breakthroughs.

    ERIC Educational Resources Information Center

    Marland, S.P., Jr.

    Among the many factors contributing to the crisis in big-city school finance are (1) the in-migration of the poor to the cities accompanied by the out-migration of the higher-income people; (2) higher teacher salaries; (3) the new mandates placed on schools such as cradle-to-grave accommodation in educational opportunities, manpower retraining,…

  1. Big physics quartet win government backing

    NASA Astrophysics Data System (ADS)

    Banks, Michael

    2014-09-01

    Four major physics-based projects are among 10 to have been selected by Japan’s Ministry of Education, Culture, Sports, Science and Technology for funding in the coming decade as part of its “roadmap” of big-science projects.

  2. Black Hole Blows Big Bubble

    NASA Astrophysics Data System (ADS)

    2010-07-01

    astronomers understand the similarity between small black holes formed from exploded stars and the supermassive black holes at the centres of galaxies. Very powerful jets have been seen from supermassive black holes, but are thought to be less frequent in the smaller microquasar variety. The new discovery suggests that many of them may simply have gone unnoticed so far. The gas-blowing black hole is located 12 million light-years away, in the outskirts of the spiral galaxy NGC 7793 (eso0914b). From the size and expansion velocity of the bubble the astronomers have found that the jet activity must have been ongoing for at least 200 000 years. Notes [1] Astronomers do not yet have any means of measuring the size of the black hole itself. The smallest stellar black hole discovered so far has a radius of about 15 km. An average stellar black hole of about 10 solar masses has a radius of about 30 km, while a "big" stellar black hole may have a radius of up to 300 km. This is still much smaller than the jets, which extend out to several hundred light-years on each side of the black hole, or about several thousand million million km! More information This result appears in a paper published in this week's issue of the journal Nature (A 300 parsec long jet-inflated bubble around a powerful microquasar in the galaxy NGC 7793, by Manfred W. Pakull, Roberto Soria and Christian Motch). ESO, the European Southern Observatory, is the foremost intergovernmental astronomy organisation in Europe and the world's most productive astronomical observatory. It is supported by 14 countries: Austria, Belgium, the Czech Republic, Denmark, France, Finland, Germany, Italy, the Netherlands, Portugal, Spain, Sweden, Switzerland and the United Kingdom. ESO carries out an ambitious programme focused on the design, construction and operation of powerful ground-based observing facilities enabling astronomers to make important scientific discoveries. ESO also plays a leading role in promoting and organising

  3. Black Hole Blows Big Bubble

    NASA Astrophysics Data System (ADS)

    2010-07-01

    astronomers understand the similarity between small black holes formed from exploded stars and the supermassive black holes at the centres of galaxies. Very powerful jets have been seen from supermassive black holes, but are thought to be less frequent in the smaller microquasar variety. The new discovery suggests that many of them may simply have gone unnoticed so far. The gas-blowing black hole is located 12 million light-years away, in the outskirts of the spiral galaxy NGC 7793 (eso0914b). From the size and expansion velocity of the bubble the astronomers have found that the jet activity must have been ongoing for at least 200 000 years. Note: [1] Astronomers do not yet have any means of measuring the size of the black hole itself. The smallest stellar black hole discovered so far has a radius of about 15 km. An average stellar black hole of about 10 solar masses has a radius of about 30 km, while a "big" stellar black hole may have a radius of up to 300 km. This is still much smaller than the jets, which extend out to 1000 light-years, or about 9000 million million km! More Information: This result appears in a paper published in this week's issue of the journal Nature (A 300 parsec long jet-inflated bubble around a powerful microquasar in the galaxy NGC 7793, by Manfred W. Pakull, Roberto Soria and Christian Motch). ESO, the European Southern Observatory, is the foremost intergovernmental astronomy organisation in Europe and the world's most productive astronomical observatory. It is supported by 14 countries: Austria, Belgium, the Czech Republic, Denmark, France, Finland, Germany, Italy, the Netherlands, Portugal, Spain, Sweden, Switzerland and the United Kingdom. ESO carries out an ambitious programme focused on the design, construction and operation of powerful ground-based observing facilities enabling astronomers to make important scientific discoveries. ESO also plays a leading role in promoting and organising cooperation in astronomical research. ESO operates

  4. 75 FR 71069 - Big Horn County Resource Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-22

    ... Forest Service Big Horn County Resource Advisory Committee AGENCY: Forest Service, USDA. ACTION: Notice of meeting. SUMMARY: The Big Horn County Resource Advisory Committee will meet in Greybull, Wyoming... December 1, 2010, and will begin at 10 a.m. ADDRESSES: The meeting will be held at the Big Horn County...

  5. 76 FR 59394 - Big Eddy-Knight Transmission Project

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-26

    ... Bonneville Power Administration Big Eddy-Knight Transmission Project AGENCY: Bonneville Power Administration...: This notice announces the availability of the ROD to implement the Big Eddy-Knight Transmission Project in Wasco County, Oregon and Klickitat County, Washington. Construction of the Big...

  6. 7. SOUTHEAST VIEW OF BIG DALTON DAM SHOWING THE MULTIPLE ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    7. SOUTHEAST VIEW OF BIG DALTON DAM SHOWING THE MULTIPLE ARCHES, AN UPSTREAM VIEW OF THE PARAPET WALL ALONG THE CREST OF THE DAM, AND THE SHELTER HOUSE AT THE EAST END OF THE DAM. - Big Dalton Dam, 2600 Big Dalton Canyon Road, Glendora, Los Angeles County, CA

  7. Big Ideas in Primary Mathematics: Issues and Directions

    ERIC Educational Resources Information Center

    Askew, Mike

    2013-01-01

    This article is located within the literature arguing for attention to Big Ideas in teaching and learning mathematics for understanding. The focus is on surveying the literature of Big Ideas and clarifying what might constitute Big Ideas in the primary Mathematics Curriculum based on both theoretical and pragmatic considerations. This is…

  8. ["Big data" - large data, a lot of knowledge?].

    PubMed

    Hothorn, Torsten

    2015-01-28

    For several years now, the term Big Data has described technologies for extracting knowledge from data. Applications of Big Data and their consequences are also increasingly discussed in the mass media. Because medicine is an empirical science, we discuss the meaning of Big Data and its potential for future medical research.

  9. New Evidence on the Development of the Word "Big."

    ERIC Educational Resources Information Center

    Sena, Rhonda; Smith, Linda B.

    1990-01-01

    Results indicate that the curvilinear trend in children's understanding of the word "big" is not obtained in all stimulus contexts. This suggests that the meaning and use of "big" are complex and may not refer simply to larger objects in a set. Proposes that the meaning of "big" constitutes a dynamic system driven by many perceptual,…

  10. 9. View from middle adit Wawona Tunnel of Big Oak ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    9. View from middle adit Wawona Tunnel of Big Oak Flat Road with retaining walls at lower left and center left with east portal of tunnel #1. - Big Oak Flat Road, Between Big Oak Flat Entrance & Merced River, Yosemite Village, Mariposa County, CA

  11. 16. AERIAL VIEW OF BIG DALTON DAM TAKEN ON 2161962 ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    16. AERIAL VIEW OF BIG DALTON DAM TAKEN ON 2-16-1962 BY L.A. COUNTY PUBLIC WORKS PHOTOGRAPHER SINGER. PHOTO SHOWS THE RESERVOIR NEAR FULL CAPACITY AND WATER BEING RELEASED ON THE DOWNSTREAM SIDE. - Big Dalton Dam, 2600 Big Dalton Canyon Road, Glendora, Los Angeles County, CA

  12. 15. UPSTREAM VIEW (PHOTOGRAPHER UNKNOWN) SHOWING BIG DALTON DAM NEAR ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    15. UPSTREAM VIEW (PHOTOGRAPHER UNKNOWN) SHOWING BIG DALTON DAM NEAR FULL CAPACITY AFTER CONSTRUCTION. PICTURE WAS DEVELOPED FROM COPY NEGATIVES WHICH WERE TAKEN ON 2-15-1973 BY PHOTOGRAPHER D. MEIER OF L.A. COUNTY PUBLIC WORKS. - Big Dalton Dam, 2600 Big Dalton Canyon Road, Glendora, Los Angeles County, CA

  13. 15. AERIAL VIEW OF BIG TUJUNGA DAM TAKEN ON FEBRUARY ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    15. AERIAL VIEW OF BIG TUJUNGA DAM TAKEN ON FEBRUARY 17, 1962, BY L.A. COUNTY PUBLIC WORKS PHOTOGRAPHER WEBB. PHOTO SHOWS THE RESERVOIR NEAR FULL CAPACITY AND WATER BEING RELEASED ON THE DOWNSTREAM SIDE. - Big Tujunga Dam, 809 West Big Tujunga Road, Sunland, Los Angeles County, CA

  14. View of New Big Oak Flat Road seen from Old ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    View of New Big Oak Flat Road seen from Old Wawona Road near location of photograph HAER CA-148-17. Note road cuts, alignment, and tunnels. Devils Dance Floor at left distance. Looking northwest - Big Oak Flat Road, Between Big Oak Flat Entrance & Merced River, Yosemite Village, Mariposa County, CA

  15. 76 FR 7837 - Big Rivers Electric Corporation; Notice of Filing

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-11

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Big Rivers Electric Corporation; Notice of Filing Take notice that on February 4, 2011, Big Rivers Electric Corporation (Big Rivers) filed a notice of cancellation of its...

  16. Big data - a 21st century science Maginot Line? No-boundary thinking: shifting from the big data paradigm.

    PubMed

    Huang, Xiuzhen; Jennings, Steven F; Bruce, Barry; Buchan, Alison; Cai, Liming; Chen, Pengyin; Cramer, Carole L; Guan, Weihua; Hilgert, Uwe Kk; Jiang, Hongmei; Li, Zenglu; McClure, Gail; McMullen, Donald F; Nanduri, Bindu; Perkins, Andy; Rekepalli, Bhanu; Salem, Saeed; Specker, Jennifer; Walker, Karl; Wunsch, Donald; Xiong, Donghai; Zhang, Shuzhong; Zhang, Yu; Zhao, Zhongming; Moore, Jason H

    2015-01-01

    Whether your interests lie in scientific arenas, the corporate world, or in government, you have certainly heard the praises of big data: Big data will give you new insights, allow you to become more efficient, and/or solve your problems. While big data has had some outstanding successes, many are now beginning to see that it is not the Silver Bullet it has been touted to be. Here our main concern is the overall impact of big data; its current manifestation is constructing a Maginot Line in science in the 21st century. Big data is no longer just "lots of data" as a phenomenon; the big data paradigm is putting the spirit of the Maginot Line into lots of data. Overall, big data is disconnecting researchers from science challenges. We propose No-Boundary Thinking (NBT), applying no-boundary thinking to problem definition in order to address science challenges.

  17. The Big Island of Hawaii

    NASA Technical Reports Server (NTRS)

    2002-01-01

    Boasting snow-covered mountain peaks and tropical forest, the Island of Hawaii, the largest of the Hawaiian Islands, is stunning at any altitude. This false-color composite (processed to simulate true color) image of Hawaii was constructed from data gathered between 1999 and 2001 by the Enhanced Thematic Mapper plus (ETM+) instrument, flying aboard the Landsat 7 satellite. The Landsat data were processed by the National Oceanographic and Atmospheric Administration (NOAA) to develop a landcover map. This map will be used as a baseline to chart changes in land use on the islands. Types of change include the construction of resorts along the coastal areas, and the conversion of sugar plantations to other crop types. Hawaii was created by a 'hotspot' beneath the ocean floor. Hotspots form in areas where superheated magma in the Earth's mantle breaks through the Earth's crust. Over the course of millions of years, the Pacific Tectonic Plate has slowly moved over this hotspot to form the entire Hawaiian Island archipelago. The black areas on the island (in this scene) that resemble a pair of sun-baked palm fronds are hardened lava flows formed by the active Mauna Loa Volcano. Just to the north of Mauna Loa is the dormant grayish Mauna Kea Volcano, which hasn't erupted in an estimated 3,500 years. A thin greyish plume of smoke is visible near the island's southeastern shore, rising from Kilauea-the most active volcano on Earth. Heavy rainfall and fertile volcanic soil have given rise to Hawaii's lush tropical forests, which appear as solid dark green areas in the image. The light green, patchy areas near the coasts are likely sugar cane plantations, pineapple farms, and human settlements. Courtesy of the NOAA Coastal Services Center Hawaii Land Cover Analysis project

  18. The Ethics of Big Data: Current and Foreseeable Issues in Biomedical Contexts.

    PubMed

    Mittelstadt, Brent Daniel; Floridi, Luciano

    2016-04-01

    The capacity to collect and analyse data is growing exponentially. Referred to as 'Big Data', this scientific, social and technological trend has helped create destabilising amounts of information, which can challenge accepted social and ethical norms. Big Data remains a fuzzy idea, emerging across social, scientific, and business contexts sometimes seemingly related only by the gigantic size of the datasets being considered. As is often the case with the cutting edge of scientific and technological progress, understanding of the ethical implications of Big Data lags behind. In order to bridge such a gap, this article systematically and comprehensively analyses academic literature concerning the ethical implications of Big Data, providing a watershed for future ethical investigations and regulations. Particular attention is paid to biomedical Big Data due to the inherent sensitivity of medical information. By means of a meta-analysis of the literature, a thematic narrative is provided to guide ethicists, data scientists, regulators and other stakeholders through what is already known or hypothesised about the ethical risks of this emerging and innovative phenomenon. Five key areas of concern are identified: (1) informed consent, (2) privacy (including anonymisation and data protection), (3) ownership, (4) epistemology and objectivity, and (5) 'Big Data Divides' created between those who have or lack the necessary resources to analyse increasingly large datasets. Critical gaps in the treatment of these themes are identified with suggestions for future research. Six additional areas of concern are then suggested which, although related have not yet attracted extensive debate in the existing literature. It is argued that they will require much closer scrutiny in the immediate future: (6) the dangers of ignoring group-level ethical harms; (7) the importance of epistemology in assessing the ethics of Big Data; (8) the changing nature of fiduciary relationships that

  19. Multi-Scale Change Detection Research of Remotely Sensed Big Data in CyberGIS

    NASA Astrophysics Data System (ADS)

    Xing, J.; Sieber, R.

    2015-12-01

    Big remotely sensed data (the heterogeneity of satellite platforms and file formats, along with increasing volumes and velocities) offers new types of analyses. This makes big remotely sensed data a good candidate for CyberGIS, the aim of which is to enable knowledge discovery of big data in the cloud. We apply CyberGIS to feature-based multi-scale land use/cover change (LUCC) detection. There have been attempts to do multi-scale LUCC. However, those studies were done with small data and could not consider the mismatch between multi-scale analysis and computational scale. They have yet to consider the possibilities for scalar research across numerous temporal and spatial scales afforded by big data, especially if we want to advance beyond pixel-based analysis and also reduce preprocessing requirements. We create a geospatial cyberinfrastructure (GCI) to handle multi-spatio-temporal scale change detection. We first clarify different meanings of scale in CyberGIS and LUCC to derive a feature scope layer in the GCI based on Stommel modelling. Our analysis layer contains a multi-scale segmentation-based method based on normalized cut image segmentation and wavelet-based image scaling algorithms. Our computer resource utilization layer uses Wang and Armstrong's (2009) method, mainly for memory, I/O and CPU time. Our case is urban-rural change detection in the Greater Montreal Area (5 time periods, 2006-2012, 100 virtual machines), covering 36,000 km² at resolutions varying from 0.6 m to 38 m. We present a ground-truthed accuracy assessment of a change matrix that is composed of 6 feature classes at 12 different spatio-temporal scales, and the performance of the change detection GCI for multi-scale LUCC study. The GCI allows us to extract and coordinate different types of changes by varying spatio-temporal scales from the big imagery datasets.
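
    The analysis layer above mentions wavelet-based image scaling. As a rough illustration (not the authors' pipeline), the following Python sketch uses the PyWavelets package to build successively coarser approximations of a single image band, one common way to construct a scale pyramid for multi-scale analysis. The synthetic array and the choice of the Haar wavelet are assumptions.

        import numpy as np
        import pywt  # PyWavelets


        def wavelet_pyramid(band, levels=3, wavelet="haar"):
            """Return successively coarser approximations of a 2-D image band.

            Each 2-D discrete wavelet transform roughly halves the resolution;
            only the approximation coefficients are kept at each level.
            """
            pyramid = [band]
            current = band
            for _ in range(levels):
                approx, _details = pywt.dwt2(current, wavelet)
                pyramid.append(approx)
                current = approx
            return pyramid


        # A synthetic 512x512 band stands in for one band of a satellite scene.
        band = np.random.rand(512, 512)
        for level, image in enumerate(wavelet_pyramid(band)):
            print(level, image.shape)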

  20. High School Students as Mentors: Findings from the Big Brothers Big Sisters School-Based Mentoring Impact Study

    ERIC Educational Resources Information Center

    Herrera, Carla; Kauh, Tina J.; Cooney, Siobhan M.; Grossman, Jean Baldwin; McMaken, Jennifer

    2008-01-01

    High schools have recently become a popular source of mentors for school-based mentoring (SBM) programs. The high school Bigs program of Big Brothers Big Sisters of America, for example, currently involves close to 50,000 high-school-aged mentors across the country. While the use of these young mentors has several potential advantages, their age…

  1. Primordial comets: big bang nucleosynthesis, dark matter and life

    NASA Astrophysics Data System (ADS)

    Sheldon, Robert B.

    2015-09-01

    Primordial comets are comets made of Big Bang synthesized materials—water, ammonium, and carbon ices. These are the basic elements for life, so that these comets can be colonized by cyanobacteria that grow and bioengineer it for life dispersal. In addition, should they exist in large enough quantities, they would easily satisfy the qualifications for dark matter: low albedo with low visibility, gravitationally femtolensing, galactic negative viscosity, early galaxy formation seeds, and a self-interaction providing cosmic structure. The major arguments against their existence are the absence of metals (elements heavier than He) in ancient Population III stars, and the stringent requirements put on the Big Bang (BB) baryonic density by the BB nucleosynthesis (BBN) models. We argue that CI chondrites, hyperbolic comets, and carbon-enriched Pop III stars are all evidence for primordial comets. The BBN models provide the greater obstacle, but we argue that they crucially omit the magnetic field in their homogeneous, isotropic, "ideal baryon gas" model. Should large magnetic fields exist, not only would they undermine the 1-D models, but if their magnitude exceeds some critical field/density ratio, then the neutrino interacts with the fields, changing the equilibrium ratio of protons to neutrons. Since BBN models are strongly dependent on this ratio, magnetic fields have the potential to radically change the production of C, N, and O (CNO) to produce primordial comets. Then the universe from the earliest moments is not only seeded for galaxy formation, but it is seeded with the ingredients for life.

  2. Big Data Archives: Replication and synchronizing on a large scale

    NASA Astrophysics Data System (ADS)

    King, T. A.; Walker, R. J.

    2015-12-01

    Modern data archives provide unique challenges to replication and synchronization because of their large size. We collect more digital information today than at any time before, and the volume of data collected is continuously increasing. Some of these data are from unique observations, like those from planetary missions that should be preserved for use by future generations. In addition, data from NASA missions are considered federal records and must be retained. While the data may be stored on resilient hardware (i.e. RAID systems), they also must be protected from local or regional disasters. Meeting this challenge requires creating multiple copies. This task is complicated by the fact that new data are constantly being added, creating what are called "active archives". Having reliable, high performance tools for replicating and synchronizing active archives in a timely fashion is critical to preservation of the data. When archives were smaller, tools like bbcp, rsync and rcp worked fairly well. While these tools are effective, they are not optimized for synchronizing big data archives, and their poor performance at scale led us to develop a new tool designed specifically for big data archives. It combines the best features of git, bbcp, rsync and rcp. We call this tool "Mimic" and we discuss the design of the tool, performance comparisons and its use at NASA's Planetary Plasma Interactions (PPI) Node of the Planetary Data System (PDS).
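
    Mimic's internals are not described in the abstract, so the sketch below only illustrates the generic manifest-and-checksum comparison that incremental synchronizers of this family perform; the function names and the use of SHA-256 are assumptions for illustration, not Mimic's design.

```python
# Illustrative sketch of a manifest-and-checksum comparison, the kind of step
# rsync-like synchronizers perform. This is NOT the Mimic implementation; all
# names and the SHA-256 choice are assumptions.
import hashlib
from pathlib import Path

def build_manifest(root):
    """Map each file's relative path to (size, sha256) under a directory tree."""
    manifest = {}
    for path in Path(root).rglob("*"):
        if path.is_file():
            # Reads whole files; fine for a sketch, not for terabyte archives.
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            manifest[str(path.relative_to(root))] = (path.stat().st_size, digest)
    return manifest

def files_to_replicate(source_manifest, mirror_manifest):
    """Names of files that are new or changed at the source and need copying."""
    return [name for name, meta in source_manifest.items()
            if mirror_manifest.get(name) != meta]
```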

  3. Surface reclamation of the Big Lake oil field

    SciTech Connect

    Weathers, M.L. ); Moore, K.R. ); Ford, D.L. ); Curlee, C.K. )

    1994-03-01

    Since the discovery of 1 Santa Rita in 1923, millions of barrels of salt water have been produced along with 135 million bbl of oil from the Big Lake oil field in Reagan County, Texas. Until the early 1960s, the accepted disposal method for the produced water was surface discharge to a large evaporation pond north of the field. Produced water was allowed to flow from wells to the pond via natural topographic drainage. This practice resulted in 2000 ac of eroded, barren landscape, characterized by highly saline soils incapable of supporting vegetation. In 1989, the University of Texas System, the U.S. Soil Conservation Service, and Marathon Oil Company, which acquired Big Lake field in 1962, initiated an experimental project to reclaim the affected land and restore rangeland productivity. An underground drainage system, consisting of 125,000 ft of buried drainage conduit and eight collection sumps, was installed over 205 ac of the affected area. Earthen terraces were constructed to capture and hold rain water to facilitate downward percolation and leaching of salts from the soil profile. Salts leached from the soil are captured by the drainage system and pumped to injection wells for disposal. The excellent revegetation that has occurred over the test area after three years of operations is encouraging and has shown the need for expanding and enhancing the existing system with supplemental water from fresh water wells, application of soil-amending agents, additional terracing, and selective planting with salt-tolerant species.

  4. The Interplay of "Big Five" Personality Factors and Metaphorical Schemas: A Pilot Study with 20 Lung Transplant Recipients

    ERIC Educational Resources Information Center

    Goetzmann, Lutz; Moser, Karin S.; Vetsch, Esther; Grieder, Erhard; Klaghofer, Richard; Naef, Rahel; Russi, Erich W.; Boehler, Annette; Buddeberg, Claus

    2007-01-01

    The aim of the present study was to investigate the interplay between personality factors and metaphorical schemas. The "Big Five" personality factors of 20 patients after lung transplantation were examined with the NEO-FFI. Patients were questioned about their social network, and self- and body-image. The interviews were assessed with metaphor…

  5. Focusing on the big picture.

    PubMed

    Chen, Ingfei

    2003-09-10

    As a postdoc in cognitive neuroscience who's also a neurology fellow, Adam Gazzaley is a meld of basic science expertise and clinical experience: He studies brain aging in people by using functional magnetic resonance imaging at the University of California (UC), Berkeley, and he also sees patients at UC San Francisco's Memory and Aging Center. The 34-year-old native New Yorker dives with equal fervor into scientific research and nature photography, two lenses for viewing a single world of discovery. Growing up in Queens, Gazzaley knew from age 7 that he wanted to become a scientist, and as a teenager, he commuted long hours to attend the Bronx High School of Science. He earned an M.D.-Ph.D. from the Mount Sinai School of Medicine in New York City. Gazzaley's hobby as a shutterbug periodically takes him on backpacking trips to document the beauty of the great outdoors. He sells fine-art prints of his photographs to individuals, hospitals, and clinics through his company, Wanderings Inc.

  6. SETI as a part of Big History

    NASA Astrophysics Data System (ADS)

    Maccone, Claudio

    2014-08-01

    Big History is an emerging academic discipline which examines history scientifically from the Big Bang to the present. It uses a multidisciplinary approach based on combining numerous disciplines from science and the humanities, and explores human existence in the context of this bigger picture. It is taught at some universities. In a series of recent papers ([11] through [15] and [17] through [18]) and in a book [16], we developed a new mathematical model embracing Darwinian Evolution (RNA to Humans, see, in particular, [17]) and Human History (Aztecs to USA, see [16]), and then we extrapolated even that into the future up to ten million years (see [18]), the minimum time required for a civilization to expand to the whole Milky Way (Fermi paradox). In this paper, we further extend that model in the past so as to let it start at the Big Bang (13.8 billion years ago) thus merging Big History, Evolution on Earth and SETI (the modern Search for ExtraTerrestrial Intelligence) into a single body of knowledge of a statistical type. Our idea is that the Geometric Brownian Motion (GBM), so far used as the key stochastic process of financial mathematics (Black-Scholes models and related 1997 Nobel Prize in Economics!) may be successfully applied to the whole of Big History. In particular, in this paper we derive Big History Theory based on GBMs: just as the GBM is the “movie” unfolding in time, so the Statistical Drake Equation is its “still picture”, static in time, and the GBM is the time-extension of the Drake Equation. Darwinian Evolution on Earth may be easily described as an increasing GBM in the number of living species on Earth over the last 3.5 billion years. The first of them was RNA 3.5 billion years ago, and now 50
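
    As a minimal illustration of the Geometric Brownian Motion the abstract builds on (not the paper's Big History model), the sketch below simulates one GBM path N(t) = N0 exp((mu - sigma^2/2) t + sigma W(t)); the drift, volatility and horizon are arbitrary placeholder values.

```python
# Minimal sketch of a Geometric Brownian Motion path; drift, volatility and
# horizon are arbitrary placeholders, not values from the paper.
import numpy as np

def simulate_gbm(n0, mu, sigma, t_max, steps, seed=None):
    """One path of N(t) = n0 * exp((mu - sigma**2 / 2) * t + sigma * W(t))."""
    rng = np.random.default_rng(seed)
    dt = t_max / steps
    increments = rng.normal(0.0, np.sqrt(dt), size=steps)
    w = np.concatenate(([0.0], np.cumsum(increments)))   # Brownian path W(t)
    t = np.linspace(0.0, t_max, steps + 1)
    return t, n0 * np.exp((mu - 0.5 * sigma**2) * t + sigma * w)

t, n = simulate_gbm(n0=1.0, mu=0.05, sigma=0.2, t_max=10.0, steps=1000, seed=42)
print(n[-1])   # value of the process at the end of the simulated horizon
```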

  7. Fixing the Big Bang Theory's Lithium Problem

    NASA Astrophysics Data System (ADS)

    Kohler, Susanna

    2017-02-01

    How did our universe come into being? The Big Bang theory is a widely accepted and highly successful cosmological model of the universe, but it does introduce one puzzle: the cosmological lithium problem. Have scientists now found a solution? Too Much Lithium: In the Big Bang theory, the universe expanded rapidly from a very high-density and high-temperature state dominated by radiation. This theory has been validated again and again: the discovery of the cosmic microwave background radiation and observations of the large-scale structure of the universe both beautifully support the Big Bang theory, for instance. But one pesky trouble-spot remains: the abundance of lithium. [Figure caption: The arrows show the primary reactions involved in Big Bang nucleosynthesis, and their flux ratios, as predicted by the authors' model, are given on the right. Synthesizing primordial elements is complicated! Hou et al. 2017] According to Big Bang nucleosynthesis theory, primordial nucleosynthesis ran wild during the first half hour of the universe's existence. This produced most of the universe's helium and small amounts of other light nuclides, including deuterium and lithium. But while predictions match the observed primordial deuterium and helium abundances, Big Bang nucleosynthesis theory overpredicts the abundance of primordial lithium by about a factor of three. This inconsistency is known as the cosmological lithium problem, and attempts to resolve it using conventional astrophysics and nuclear physics over the past few decades have not been successful. In a recent publication led by Suqing Hou (Institute of Modern Physics, Chinese Academy of Sciences) and advisor Jianjun He (Institute of Modern Physics and National Astronomical Observatories, Chinese Academy of Sciences), however, a team of scientists has proposed an elegant solution to this problem. [Figure caption: Time and temperature evolution of the abundances of primordial light elements during the beginning of the universe. The authors' model (dotted lines

  8. Big Data - What is it and why it matters.

    PubMed

    Tattersall, Andy; Grant, Maria J

    2016-06-01

    Big data, like MOOCs, altmetrics and open access, is a term that has been commonplace in the library community for some time. Yet, despite its prevalence, many in the library and information sector remain unsure of the relationship between big data and their roles. This editorial explores what big data could mean for the day-to-day practice of health library and information workers, presenting examples of big data in action, considering the ethics of accessing big data sets and the potential for new roles for library and information workers.

  9. Methodological challenges and analytic opportunities for modeling and interpreting Big Healthcare Data.

    PubMed

    Dinov, Ivo D

    2016-01-01

    Managing, processing and understanding big healthcare data is challenging, costly and demanding. Without a robust fundamental theory for representation, analysis and inference, a roadmap for uniform handling and analyzing of such complex data remains elusive. In this article, we outline various big data challenges, opportunities, modeling methods and software techniques for blending complex healthcare data, advanced analytic tools, and distributed scientific computing. Using imaging, genetic and healthcare data we provide examples of processing heterogeneous datasets using distributed cloud services, automated and semi-automated classification techniques, and open-science protocols. Despite substantial advances, new innovative technologies need to be developed that enhance, scale and optimize the management and processing of large, complex and heterogeneous data. Stakeholder investments in data acquisition, research and development, computational infrastructure and education will be critical to realize the huge potential of big data, to reap the expected information benefits and to build lasting knowledge assets. Multi-faceted proprietary, open-source, and community developments will be essential to enable broad, reliable, sustainable and efficient data-driven discovery and analytics. Big data will affect every sector of the economy and their hallmark will be 'team science'.

  10. WE-H-BRB-00: Big Data in Radiation Oncology.

    PubMed

    Benedict, Stanley

    2016-06-01

    Big Data in Radiation Oncology: (1) Overview of the NIH 2015 Big Data Workshop, (2) Where do we stand in the applications of big data in radiation oncology?, and (3) Learning Health Systems for Radiation Oncology: Needs and Challenges for Future Success. The overriding goal of this trio of panel presentations is to improve awareness of the wide-ranging opportunities for big data to impact patient quality of care and to enhance the potential for research and collaboration with NIH and a host of new big data initiatives. This presentation will also summarize the Big Data workshop that was held at the NIH Campus on August 13-14, 2015, and sponsored by AAPM, ASTRO, and NIH. The workshop included discussion of current Big Data cancer registry initiatives, safety and incident reporting systems, and other strategies that will have the greatest impact on radiation oncology research, quality assurance, safety, and outcomes analysis.

  11. Water quality time series for Big Melen stream (Turkey): its decomposition analysis and comparison to upstream.

    PubMed

    Karakaya, N; Evrendilek, F

    2010-06-01

    Big Melen stream is one of the major water resources providing 0.268 [corrected] km³ year⁻¹ of drinking and municipal water for Istanbul. Monthly time series data between 1991 and 2004 for 25 chemical, biological, and physical water properties of Big Melen stream were separated into linear trend, seasonality, and error components using additive decomposition models. A water quality index (WQI) derived from 17 water quality variables was used to compare Aksu upstream and Big Melen downstream water quality. Twenty-six additive decomposition models of water quality time series data including WQI had R² values ranging from 88% for log(water temperature) (P ≤ 0.001) to 3% for log(total dissolved solids) (P ≤ 0.026). Linear trend models revealed that total hardness, calcium concentration, and log(nitrite concentration) had the highest rate of increase over time. Tukey's multiple comparison pointed to significant decreases in 17 water quality variables including WQI of Big Melen downstream relative to those of Aksu upstream (P ≤ 0.001). Monitoring changes in water quality on the basis of watersheds through WQI and decomposition analysis of time series data paves the way for an adaptive management process of water resources that can be tailored in response to effectiveness and dynamics of management practices.
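
    As a hedged illustration of the additive decomposition used in the study, the sketch below splits a synthetic monthly series into trend, seasonal and residual components with statsmodels; the series is a stand-in, not the Big Melen data.

```python
# Illustrative additive decomposition (trend + seasonality + error) of a
# synthetic monthly series with statsmodels; not the Big Melen measurements.
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import seasonal_decompose

months = pd.date_range("1991-01-01", periods=168, freq="MS")     # 14 years, monthly
values = (15
          + 8 * np.sin(2 * np.pi * np.arange(168) / 12)          # seasonal cycle
          + 0.01 * np.arange(168)                                # slow linear trend
          + np.random.normal(0, 1, 168))                         # noise
series = pd.Series(values, index=months)

result = seasonal_decompose(series, model="additive", period=12)
print(result.trend.dropna().head())       # estimated trend component
print(result.seasonal.head(12))           # one full seasonal cycle
```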

  12. Anticipated Changes in Conducting Scientific Data-Analysis Research in the Big-Data Era

    NASA Astrophysics Data System (ADS)

    Kuo, Kwo-Sen; Seablom, Michael; Clune, Thomas; Ramachandran, Rahul

    2014-05-01

    A Big-Data environment is one that is capable of orchestrating quick-turnaround analyses involving large volumes of data for numerous simultaneous users. Based on our experiences with a prototype Big-Data analysis environment, we anticipate some important changes in research behaviors and processes while conducting scientific data-analysis research in the near future as such Big-Data environments become the mainstream. The first anticipated change will be the reduced effort and difficulty in most parts of the data management process. A Big-Data analysis environment is likely to house most of the data required for a particular research discipline along with appropriate analysis capabilities. This will reduce the need for researchers to download local copies of data. In turn, this also reduces the need for compute and storage procurement by individual researchers or groups, as well as associated maintenance and management afterwards. It is almost certain that Big-Data environments will require a different "programming language" to fully exploit the latent potential. In addition, the process of extending the environment to provide new analysis capabilities will likely be more involved than, say, compiling a piece of new or revised code. We thus anticipate that researchers will require support from dedicated organizations associated with the environment that are composed of professional software engineers and data scientists. A major benefit will likely be that such extensions are of higher-quality and broader applicability than ad hoc changes by physical scientists. Another anticipated significant change is improved collaboration among the researchers using the same environment. Since the environment is homogeneous within itself, many barriers to collaboration are minimized or eliminated. For example, data and analysis algorithms can be seamlessly shared, reused and re-purposed. In conclusion, we will be able to achieve a new level of scientific productivity in the

  13. Anticipated Changes in Conducting Scientific Data-Analysis Research in the Big-Data Era

    NASA Technical Reports Server (NTRS)

    Kuo, Kwo-Sen; Seablom, Michael; Clune, Thomas; Ramachandran, Rahul

    2014-01-01

    A Big-Data environment is one that is capable of orchestrating quick-turnaround analyses involving large volumes of data for numerous simultaneous users. Based on our experiences with a prototype Big-Data analysis environment, we anticipate some important changes in research behaviors and processes while conducting scientific data-analysis research in the near future as such Big-Data environments become the mainstream. The first anticipated change will be the reduced effort and difficulty in most parts of the data management process. A Big-Data analysis environment is likely to house most of the data required for a particular research discipline along with appropriate analysis capabilities. This will reduce the need for researchers to download local copies of data. In turn, this also reduces the need for compute and storage procurement by individual researchers or groups, as well as associated maintenance and management afterwards. It is almost certain that Big-Data environments will require a different "programming language" to fully exploit the latent potential. In addition, the process of extending the environment to provide new analysis capabilities will likely be more involved than, say, compiling a piece of new or revised code. We thus anticipate that researchers will require support from dedicated organizations associated with the environment that are composed of professional software engineers and data scientists. A major benefit will likely be that such extensions are of higher-quality and broader applicability than ad hoc changes by physical scientists. Another anticipated significant change is improved collaboration among the researchers using the same environment. Since the environment is homogeneous within itself, many barriers to collaboration are minimized or eliminated. For example, data and analysis algorithms can be seamlessly shared, reused and re-purposed. In conclusion, we will be able to achieve a new level of scientific productivity in the Big

  14. Big Data Analytics for Disaster Preparedness and Response of Mobile Communication Infrastructure during Natural Hazards

    NASA Astrophysics Data System (ADS)

    Zhong, L.; Takano, K.; Ji, Y.; Yamada, S.

    2015-12-01

    The disruption of telecommunications is one of the most critical disasters during natural hazards. With the rapid expansion of mobile communications, the mobile communication infrastructure plays a fundamental role in disaster response and recovery activities. For this reason, its disruption leads to loss of life and property due to information delays and errors. Therefore, disaster preparedness and response of the mobile communication infrastructure itself is quite important. In many experienced disasters, the disruption of mobile communication networks was caused by network congestion and subsequent long-term power outages. To reduce this disruption, knowledge of communication demands during disasters is necessary, and big data analytics provides a promising way to predict those demands by analyzing the large volume of operational data of mobile users in a large-scale mobile network. Under the US-Japan collaborative project on 'Big Data and Disaster Research (BDD)' supported by the Japan Science and Technology Agency (JST) and the National Science Foundation (NSF), we are investigating the application of big data techniques to the disaster preparedness and response of mobile communication infrastructure. Specifically, in this research, we exploit the large amount of operational information of mobile users to predict communication needs at different times and locations. By incorporating other data, such as the shake distribution of an estimated major earthquake and the power outage map, we are able to predict where people are stranded and unable to confirm their safety or ask for help due to network disruption. In addition, this result could help network operators assess the vulnerability of their infrastructure and make suitable decisions for disaster preparedness and response. In this presentation, we are going to introduce the

  15. Big Data Analytics in Chemical Engineering.

    PubMed

    Chiang, Leo; Lu, Bo; Castillo, Ivan

    2017-02-27

    Big data analytics is the journey to turn data into insights for more informed business and operational decisions. As the chemical engineering community is collecting more data (volume) from different sources (variety), this journey becomes more challenging in terms of using the right data and the right tools (analytics) to make the right decisions in real time (velocity). This article highlights recent big data advancements in five industries, including chemicals, energy, semiconductors, pharmaceuticals, and food, and then discusses technical, platform, and culture challenges. To reach the next milestone in multiplying successes to the enterprise level, government, academia, and industry need to collaboratively focus on workforce development and innovation. Expected final online publication date for the Annual Review of Chemical and Biomolecular Engineering Volume 8 is June 7, 2017. Please see http://www.annualreviews.org/page/journal/pubdates for revised estimates.

  16. The discovery value of "Big Science".

    PubMed

    Esparza, José; Yamada, Tadataka

    2007-04-16

    The increasing complexity of biomedical research is leading to the exploration of new models for large-scale collaborative research. This Big Science approach, however, has created anxieties and potential tensions between investigator-driven research, and research guided by a more organized, collaborative effort. Another potential tension exists between research conducted purely in search of new knowledge and research aimed at finding solutions. We argue that big biomedicine--the work of coordinated multidisciplinary groups that use the latest technologies to solve complex problems--can be an important way to harness the creativity of individual investigators, stimulate innovation, and supply the infrastructure, experimental systems, and resources needed to solve the urgent health problems confronted by our global society. We discuss this using the example of the Global HIV Vaccine Enterprise.

  17. Big Data and Deep data in scanning and electron microscopies: functionality from multidimensional data sets

    SciTech Connect

    Belianinov, Alex; Vasudevan, Rama K; Strelcov, Evgheni; Steed, Chad A; Yang, Sang Mo; Tselev, Alexander; Jesse, Stephen; Biegalski, Michael D; Shipman, Galen M; Symons, Christopher T; Borisevich, Albina Y; Archibald, Richard K; Kalinin, Sergei

    2015-01-01

    The development of electron and scanning probe microscopies in the second half of the twentieth century has produced spectacular images of the internal structure and composition of matter at nanometer, molecular, and atomic resolution. Largely, this progress was enabled by computer-assisted methods of microscope operation, data acquisition and analysis. Progress in imaging technologies at the beginning of the twenty-first century has opened the proverbial floodgates of high-veracity information on structure and functionality. High-resolution imaging now provides information on atomic positions with picometer precision, allowing for quantitative measurements of individual bond lengths and angles. Functional imaging often leads to multidimensional data sets containing partial or full information on properties of interest, acquired as a function of multiple parameters (time, temperature, or other external stimuli). Here, we review several recent applications of big and deep data analysis methods to visualize, compress, and translate such imaging data into physically and chemically relevant information.

  18. Singularities in big-bang cosmology

    NASA Astrophysics Data System (ADS)

    Penrose, R.

    1988-03-01

    A review of the history of the development of the big bang theory is presented, including the nature of singularities in black holes and their contribution to the study of the origin of the universe. Various models of the origin of the universe, the question of cosmic censorship, and the possible effects of gravitational collapse are examined. The relationship between considerations of quantum gravity and the structure of quantum theory is discussed.

  19. Funding big research with small money.

    PubMed

    Hickey, Joanne V; Koithan, Mary; Unruh, Lynn; Lundmark, Vicki

    2014-06-01

    This department highlights change management strategies that may be successful in strategically planning and executing organizational change initiatives. With the goal of presenting practical approaches helpful to nurse leaders advancing organizational change, content includes evidence-based projects, tools, and resources that mobilize and sustain organizational change initiatives. In this article, the guest authors introduce crowdsourcing as a strategy for funding big research with small money.

  20. Big Data Challenges for Large Radio Arrays

    NASA Technical Reports Server (NTRS)

    Jones, Dayton L.; Wagstaff, Kiri; Thompson, David; D'Addario, Larry; Navarro, Robert; Mattmann, Chris; Majid, Walid; Lazio, Joseph; Preston, Robert; Rebbapragada, Umaa

    2012-01-01

    Future large radio astronomy arrays, particularly the Square Kilometre Array (SKA), will be able to generate data at rates far higher than can be analyzed or stored affordably with current practices. This is, by definition, a "big data" problem, and requires an end-to-end solution if future radio arrays are to reach their full scientific potential. Similar data processing, transport, storage, and management challenges face next-generation facilities in many other fields.

  1. Meaningfully Integrating Big Earth Science Data

    NASA Astrophysics Data System (ADS)

    Pebesma, E. J.; Stasch, C.

    2014-12-01

    After the technical hurdles of dealing with big earth observation data have been taken, large challenges remain in avoiding operations that are not meaningful. Examples of this are summing things that should not be summed, or interpolating phenomena that should not be interpolated. We propose a description of data at the level of their meaning, to allow for notifying data users when meaningless operations are being executed. We present a prototypical implementation in R.
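
    As one hedged illustration of the idea (the authors' prototype is in R and its interface is not given here), the Python sketch below attaches a minimal semantic tag to a variable and warns when a non-meaningful operation, such as summing an intensive quantity, is requested; the class and tag names are hypothetical.

```python
# Hypothetical sketch: tag a variable with minimal semantics and warn when an
# operation is not meaningful for it (the authors' prototype is in R).
import warnings
import numpy as np

class TaggedVariable:
    def __init__(self, values, summable):
        self.values = np.asarray(values, dtype=float)
        self.summable = summable      # e.g. rainfall totals: yes; temperature: no

    def total(self):
        if not self.summable:
            warnings.warn("Summing an intensive quantity is not meaningful.")
        return self.values.sum()

temperature = TaggedVariable([14.2, 15.1, 13.8], summable=False)
temperature.total()   # emits a warning instead of silently returning a number
```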

  2. Livermore Big Trees Park: 1998 summary results

    SciTech Connect

    Gallegos, G; MacQueen, D; Surano, K

    1999-08-13

    This report summarizes work conducted in 1998 by the Lawrence Livermore National Laboratory (LLNL) to determine the extent and origin of plutonium at concentrations above background levels at Big Trees Park in the city of Livermore. This summary includes the project background and sections that explain the sampling, radiochemical and data analysis, and data interpretation. This report is a summary report only and is not intended as a rigorous technical or statistical analysis of the data.

  3. The Big Idea. Dynamic Stakeholder Management

    DTIC Science & Technology

    2014-12-01

    Defense AT&L, November–December 2014, p. 8. The Big Idea: Dynamic Stakeholder Management. Lt. Col. Franklin D. Gaillard II, USAF. Frank Gaillard, Ph.D., ...information systems at Global Campus, Troy University.

  4. Can big business save health care?

    PubMed

    Dunn, Philip

    2007-01-01

    Corporate America has decided to stop bellyaching about the cost and quality of the health care it helps buy for its employees. Now it's taking concrete action. Large employers such as Wal-Mart, Oracle, Cisco, BP America and many, many others are pressuring providers to meet performance standards, adopt information technology and transform the efficiency of their operations. Big Business wants value for its buck, and it's now putting money where its mouth is.

  5. Big Bend National Park, TX, USA, Mexico

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The Sierra del Carmen of Mexico, across the Rio Grande River from Big Bend National Park, TX, (28.5N, 104.0W) is centered in this photo. The Rio Grande River bisects the scene; Mexico to the east, USA to the west. The thousand ft. Boquillas limestone cliff on the Mexican side of the river changes colors from white to pink to lavender at sunset. This severely eroded sedimentary landscape was once an ancient seabed later overlaid with volcanic activity.

  6. Big bounce from spin and torsion

    NASA Astrophysics Data System (ADS)

    Popławski, Nikodem J.

    2012-04-01

    The Einstein-Cartan-Sciama-Kibble theory of gravity naturally extends general relativity to account for the intrinsic spin of matter. Spacetime torsion, generated by spin of Dirac fields, induces gravitational repulsion in fermionic matter at extremely high densities and prevents the formation of singularities. Accordingly, the big bang is replaced by a bounce that occurred when the energy density ε ∝ gT⁴ was on the order of n²/m_Pl² (in natural units), where n ∝ gT³ is the fermion number density and g is the number of thermal degrees of freedom. If the early Universe contained only the known standard-model particles (g ≈ 100), then the energy density at the big bounce was about 15 times larger than the Planck energy. The minimum scale factor of the Universe (at the bounce) was about 10³² times smaller than its present value, giving ≈ 50 μm. If more fermions existed in the early Universe, then the spin-torsion coupling causes a bounce at a lower energy and larger scale factor. Recent observations of high-energy photons from gamma-ray bursts indicate that spacetime may behave classically even at scales below the Planck length, supporting the classical spin-torsion mechanism of the big bounce. Such a classical bounce prevents the matter in the contracting Universe from reaching the conditions at which a quantum bounce could possibly occur.

  7. Adapting bioinformatics curricula for big data

    PubMed Central

    Greene, Anna C.; Giffin, Kristine A.; Greene, Casey S.

    2016-01-01

    Modern technologies are capable of generating enormous amounts of data that measure complex biological systems. Computational biologists and bioinformatics scientists are increasingly being asked to use these data to reveal key systems-level properties. We review the extent to which curricula are changing in the era of big data. We identify key competencies that scientists dealing with big data are expected to possess across fields, and we use this information to propose courses to meet these growing needs. While bioinformatics programs have traditionally trained students in data-intensive science, we identify areas of particular biological, computational and statistical emphasis important for this era that can be incorporated into existing curricula. For each area, we propose a course structured around these topics, which can be adapted in whole or in parts into existing curricula. In summary, specific challenges associated with big data provide an important opportunity to update existing curricula, but we do not foresee a wholesale redesign of bioinformatics training programs. PMID:25829469

  8. Statistical methods and computing for big data

    PubMed Central

    Wang, Chun; Chen, Ming-Hui; Schifano, Elizabeth; Wu, Jing

    2016-01-01

    Big data are data on a massive scale in terms of volume, intensity, and complexity that exceed the capacity of standard analytic tools. They present opportunities as well as challenges to statisticians. The role of computational statisticians in scientific discovery from big data analyses has been under-recognized even by peer statisticians. This article summarizes recent methodological and software developments in statistics that address the big data challenges. Methodologies are grouped into three classes: subsampling-based, divide and conquer, and online updating for stream data. As a new contribution, the online updating approach is extended to variable selection with commonly used criteria, and their performances are assessed in a simulation study with stream data. Software packages are summarized with focuses on the open source R and R packages, covering recent tools that help break the barriers of computer memory and computing power. Some of the tools are illustrated in a case study with a logistic regression for the chance of airline delay. PMID:27695593
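
    As a hedged illustration of the divide-and-conquer class of methods surveyed here (simple coefficient averaging is only one, rather crude, combination rule), the sketch below fits logistic regressions on blocks of a synthetic dataset and averages the estimates; it is not the article's airline-delay case study.

```python
# Divide-and-conquer sketch on synthetic data: fit logistic regressions on
# blocks and average the coefficients (a deliberately simple combination rule).
# Not the article's airline-delay case study.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(100_000, 5))
true_beta = np.array([1.0, -2.0, 0.5, 0.0, 1.5])
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-X @ true_beta)))

block_coefficients = []
for X_block, y_block in zip(np.array_split(X, 20), np.array_split(y, 20)):
    # Large C means effectively unregularized maximum-likelihood fits per block.
    model = LogisticRegression(C=1e6, max_iter=1000).fit(X_block, y_block)
    block_coefficients.append(model.coef_.ravel())

print(np.mean(block_coefficients, axis=0))   # close to true_beta in this easy case
```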

  9. Big Data Analytics for Prostate Radiotherapy.

    PubMed

    Coates, James; Souhami, Luis; El Naqa, Issam

    2016-01-01

    Radiation therapy is a first-line treatment option for localized prostate cancer, and radiation-induced normal tissue damage is often the main limiting factor for modern radiotherapy regimens. Conversely, under-dosing of target volumes in an attempt to spare adjacent healthy tissues limits the likelihood of achieving local, long-term control. Thus, the ability to generate personalized data-driven risk profiles for radiotherapy outcomes would provide valuable prognostic information to help guide both clinicians and patients alike. Big data applied to radiation oncology promises to deliver better understanding of outcomes by harvesting and integrating heterogeneous data types, including patient-specific clinical parameters, treatment-related dose-volume metrics, and biological risk factors. When taken together, such variables make up the basis for a multi-dimensional space (the "RadoncSpace") in which the presented modeling techniques search in order to identify significant predictors. Herein, we review outcome modeling and big data-mining techniques for both tumor control and radiotherapy-induced normal tissue effects. We apply many of the presented modeling approaches to a cohort of hypofractionated prostate cancer patients, taking into account different data types and a large heterogeneous mix of physical and biological parameters. Cross-validation techniques are also reviewed for the refinement of the proposed framework architecture and checking individual model performance. We conclude by considering advanced modeling techniques that borrow concepts from big data analytics, such as machine learning and artificial intelligence, before discussing the potential future impact of systems radiobiology approaches.

  10. Vertical landscraping, a big regionalism for Dubai.

    PubMed

    Wilson, Matthew

    2010-01-01

    Dubai's ecologic and economic complications are exacerbated by six years of accelerated expansion, a fixed top-down approach to urbanism and the construction of iconic single-phase mega-projects. With recent construction delays, project cancellations and growing landscape issues, Dubai's tower typologies have been unresponsive to changing environmental, socio-cultural and economic patterns (BBC, 2009; Gillet, 2009; Lewis, 2009). In this essay, a theory of "Big Regionalism" guides an argument for an economically and ecologically linked tower typology called the Condenser. This phased "box-to-tower" typology is part of a greater Landscape Urbanist strategy called Vertical Landscraping. Within this strategy, the Condenser's role is to densify the city, facilitating the creation of ecologic voids that order the urban region. Delineating "Big Regional" principles, the Condenser provides a time-based, global-local urban growth approach that weaves Bigness into a series of urban-regional, economic and ecological relationships, builds upon the environmental performance of the city's regional architecture and planning, promotes a continuity of Dubai's urban history, and responds to its landscape issues while condensing development. These speculations permit consideration of the overlooked opportunities embedded within Dubai's mega-projects and their long-term impact on the urban morphology.

  11. Addition goes where the big numbers are: evidence for a reversed operational momentum effect.

    PubMed

    Pinhas, Michal; Shaki, Samuel; Fischer, Martin H

    2015-08-01

    Number processing evokes spatial biases, both when dealing with single digits and in more complex mental calculations. Here we investigated whether these two biases have a common origin, by examining their flexibility. Participants pointed to the locations of arithmetic results on a visually presented line with an inverted, right-to-left number arrangement. We found directionally opposite spatial biases for mental arithmetic and for a parity task administered both before and after the arithmetic task. We discuss implications of this dissociation in our results for the task-dependent cognitive representation of numbers.

  12. 76 FR 29786 - Environmental Impact Statement for the Big Cypress National Preserve Addition, Florida

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-23

    ... Floodplain Statement of Findings for the General Management Plan/Wilderness Study/ Off-Road Vehicle... availability of the Record of Decision (ROD) and Floodplain Statement of Findings for the General Management... Floodplain Statement of Findings. ADDRESSES: The ROD is available online at...

  13. Using Data and Big Ideas: Teaching Distribution as an Instance of Repeated Addition. CRESST Report 734

    ERIC Educational Resources Information Center

    Vendlinski, Terry P.; Howard, Keith E.; Hemberg, Bryan C.; Vinyard, Laura; Martel, Annabel; Kyriacou, Elizabeth; Casper, Jennifer; Chai, Yourim; Phelan, Julia C.; Baker, Eva L.

    2008-01-01

    The inability of students to become proficient in algebra seems to be widespread in American schools. One of the reasons often cited for this inability is that instruction seldom builds on prior knowledge. Research suggests that teacher effectiveness is the most critical controllable variable in improving student achievement. This report details a…

  14. EDITORIAL: Big challenges and nanosolutions Big challenges and nanosolutions

    NASA Astrophysics Data System (ADS)

    Demming, Anna

    2011-07-01

    Population increases have triggered a number of concerns over the impact of human activity on the global environment. In addition these anxieties are exacerbated by the trend towards high levels of energy consumption and waste generation in developed nations. Pollutants that figure highly in environmental debate include greenhouse gases from fuel combustion and waste decomposition [1] and nitrogen from fertilisers [2]. In fact, human activity is transforming the nitrogen cycle at a record pace [3], and the pressure on available natural resources is mounting. As a collaboration of researchers in Saudi Arabia and the US explain in this issue, 26 countries across the world do not have sufficient water resources to sustain agriculture and economic development, and approximately one billion people lack access to safe drinking water [4]. They also point out a number of ways the potential of nanoscience and technology can be harnessed to tackle the problem. The key to managing pollutants is their detection. The biodegradation of waste in land fill sites can generate a build up of a number of green house and other gases. Olfactometry using the human expertise of a trained test panel is not a viable option for continuous monitoring of potentially odourless gases on industrial scales with any valid objectivity. Researchers in Italy have fabricated forest-like structures of carbon nanotubes loaded with metal nanoparticles and unmodified nanotubes on low-cost iron-coated alumina substrates [1]. The structure was able to detect NO2 in a multicomponent gas mixture of CO2, CH4, H2, NH3, CO and NO2 with sensitivity better than one part per million. Nanostructures exhibit a number of properties that lend themselves to sensing applications. They often have unique electrical properties that are readily affected by their environment. Such features were exploited by researchers in China who created nanoporous structures in ZnO sheets that can detect formaldehyde and ammonia, the

  15. Optical Investigation of a Connecting-rod Big End Bearing Model Under Dynamic Loads

    NASA Astrophysics Data System (ADS)

    Optasanu, V.; Bonneau, D.

    A new experimental device used for optical investigations of the transient elastohydrodynamic behaviour of connecting-rod big end bearing models is presented. The photoelasticity method and the digital image correlation method are used to visualise isochromatic fringe patterns and displacement fields, respectively. Validation of the recording methods in the dynamic regime is made. An isochromatic fringe pattern of the whole bearing is reconstructed using images taken from different regions of the model. An example of displacement visualisation at the cap/body interface of the bearing is presented.

  16. Quark mass variation constraints from Big Bang nucleosynthesis

    NASA Astrophysics Data System (ADS)

    Bedaque, Paulo F.; Luu, Thomas; Platter, Lucas

    2011-04-01

    We study the impact on the primordial abundances of light elements created by a variation of the quark masses at the time of Big Bang nucleosynthesis (BBN). In order to navigate through the particle and nuclear physics required to connect quark masses to binding energies and reaction rates in a model-independent way, we use lattice QCD data and a hierarchy of effective field theories. We find that the measured ⁴He abundances put a bound of -1% ≲ δm_q/m_q ≲ 0.7% on a possible variation of quark masses. The effect of quark mass variations on the deuterium abundances can be largely compensated by changes of the baryon-to-photon ratio η. Including bounds on the variation of η coming from WMAP results and adding some additional assumptions further narrows the range of allowed values of δm_q/m_q.

  17. Big Data Visual Analytics for Exploratory Earth System Simulation Analysis

    SciTech Connect

    Steed, Chad A.; Ricciuto, Daniel M.; Shipman, Galen M.; Smith, Brian E.; Thornton, Peter E.; Wang, Dali; Shi, Xiaoying; Williams, Dean N.

    2013-12-01

    Rapid increases in high performance computing are feeding the development of larger and more complex data sets in climate research, which sets the stage for so-called big data analysis challenges. However, conventional climate analysis techniques are inadequate in dealing with the complexities of today's data. In this paper, we describe and demonstrate a visual analytics system, called the Exploratory Data analysis ENvironment (EDEN), with specific application to the analysis of complex earth system simulation data sets. EDEN represents the type of interactive visual analysis tools that are necessary to transform data into insight, thereby improving critical comprehension of earth system processes. In addition to providing an overview of EDEN, we describe real-world studies using both point ensembles and global Community Land Model Version 4 (CLM4) simulations.

  18. Fast algorithm for relaxation processes in big-data systems

    NASA Astrophysics Data System (ADS)

    Hwang, S.; Lee, D.-S.; Kahng, B.

    2014-10-01

    Relaxation processes driven by a Laplacian matrix can be found in many real-world big-data systems, for example, in search engines on the World Wide Web and the dynamic load-balancing protocols in mesh networks. To numerically implement such processes, a fast-running algorithm for the calculation of the pseudoinverse of the Laplacian matrix is essential. Here we propose an algorithm which computes quickly and efficiently the pseudoinverse of Markov chain generator matrices satisfying the detailed-balance condition, a general class of matrices including the Laplacian. The algorithm utilizes the renormalization of the Gaussian integral. In addition to its applicability to a wide range of problems, the algorithm outperforms other algorithms in its ability to compute within a manageable computing time arbitrary elements of the pseudoinverse of a matrix of size millions by millions. Therefore our algorithm can be used very widely in analyzing the relaxation processes occurring on large-scale networked systems.
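
    The paper's fast renormalization-based algorithm is not reproduced here; as a baseline sketch of the object being computed, the code below forms the dense Moore-Penrose pseudoinverse of a small graph Laplacian with NumPy and uses it for a commute-time style quantity.

```python
# Baseline sketch: dense pseudoinverse of a small graph Laplacian with NumPy.
# The paper's contribution is a much faster algorithm for huge matrices; this
# only shows the object being computed and one standard use of it.
import numpy as np

adjacency = np.array([[0, 1, 1, 0],
                      [1, 0, 1, 0],
                      [1, 1, 0, 1],
                      [0, 0, 1, 0]], dtype=float)
laplacian = np.diag(adjacency.sum(axis=1)) - adjacency

laplacian_pinv = np.linalg.pinv(laplacian)        # Moore-Penrose pseudoinverse

def commute_time(i, j, l_pinv, n_edges):
    """Expected round-trip time of a random walk between nodes i and j."""
    return 2 * n_edges * (l_pinv[i, i] + l_pinv[j, j] - 2 * l_pinv[i, j])

print(commute_time(0, 3, laplacian_pinv, n_edges=adjacency.sum() / 2))
```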

  19. Next-generation sequencing: big data meets high performance computing.

    PubMed

    Schmidt, Bertil; Hildebrandt, Andreas

    2017-02-02

    The progress of next-generation sequencing has a major impact on medical and genomic research. This high-throughput technology can now produce billions of short DNA or RNA fragments in excess of a few terabytes of data in a single run. This leads to massive datasets used by a wide range of applications including personalized cancer treatment and precision medicine. In addition to the hugely increased throughput, the cost of using high-throughput technologies has been dramatically decreasing. A low sequencing cost of around US$1000 per genome has now rendered large population-scale projects feasible. However, to make effective use of the produced data, the design of big data algorithms and their efficient implementation on modern high performance computing systems is required.

  20. Big bad data: law, public health, and biomedical databases.

    PubMed

    Hoffman, Sharona; Podgurski, Andy

    2013-03-01

    The accelerating adoption of electronic health record (EHR) systems will have far-reaching implications for public health research and surveillance, which in turn could lead to changes in public policy, statutes, and regulations. The public health benefits of EHR use can be significant. However, researchers and analysts who rely on EHR data must proceed with caution and understand the potential limitations of EHRs. Because of clinicians' workloads, poor user-interface design, and other factors, EHR data can be erroneous, miscoded, fragmented, and incomplete. In addition, public health findings can be tainted by the problems of selection bias, confounding bias, and measurement bias. These flaws may become all the more troubling and important in an era of electronic "big data," in which a massive amount of information is processed automatically, without human checks. Thus, we conclude the paper by outlining several regulatory and other interventions to address data analysis difficulties that could result in invalid conclusions and unsound public health policies.

  1. Measurements of Radiative Capture Cross Sections at Big Bang Energies

    NASA Astrophysics Data System (ADS)

    Tanaka, Masaomi; Fukuda, Mitsunori; Tanaka, Yutaro; Du, Hang; Ohnishi, Kousuke; Yagi, Shoichi; Sugihara, Takanobu; Hori, Taichi; Nakamura, Shoken; Yanagihara, Rikuto; Matsuta, Kensaku; Mihara, Mototsugu; Nishimura, Daiki; Iwakiri, Shuichi; Kambayashi, Shohei; Kunimatsu, Shota; Sakakibara, Hikaru; Yamaoka, Shintaro

    We measured d(p, γ)³He cross sections at E_CM = 0.12, 0.19, 0.44, and 0.57 MeV. In this energy region, available experimental values are systematically smaller than the recent calculation, so additional experiments are desired for understanding Big Bang Nucleosynthesis. The experiment was performed by bombarding the D₂ gas target with proton beams from the 5 MV Van de Graaff accelerator at Osaka University. The experimental d(p, γ)³He cross sections of the present study are systematically larger than previous data. On the other hand, recent theoretical results by Marcucci et al. are in good agreement with the present experimental results.

  2. Quark mass variation constraints from Big Bang nucleosynthesis

    SciTech Connect

    Bedaque, P; Luu, T; Platter, L

    2010-12-13

    We study the impact on the primordial abundances of light elements created by a variation of the quark masses at the time of Big Bang nucleosynthesis (BBN). In order to navigate through the particle and nuclear physics required to connect quark masses to binding energies and reaction rates in a model-independent way, we use lattice QCD data and a hierarchy of effective field theories. We find that the measured ⁴He abundances put a bound of -1% ≲ δm_q/m_q ≲ 0.7%. The effect of quark mass variations on the deuterium abundances can be largely compensated by changes of the baryon-to-photon ratio η. Including the bounds on the variation of η coming from WMAP results and some additional assumptions narrows the range of allowed values of δm_q/m_q somewhat.

  3. Building Simulation Modelers are we big-data ready?

    SciTech Connect

    Sanyal, Jibonananda; New, Joshua Ryan

    2014-01-01

    Recent advances in computing and sensor technologies have pushed the amount of data we collect or generate to limits previously unheard of. Sub-minute resolution data from dozens of channels is becoming increasingly common and is expected to increase with the prevalence of non-intrusive load monitoring. Experts are running larger building simulation experiments and are faced with an increasingly complex data set to analyze and derive meaningful insight. This paper focuses on the data management challenges that building modeling experts may face in data collected from a large array of sensors, or generated from running a large number of building energy/performance simulations. The paper highlights the technical difficulties that were encountered and overcome in order to run 3.5 million EnergyPlus simulations on supercomputers and generating over 200 TBs of simulation output. This extreme case involved development of technologies and insights that will be beneficial to modelers in the immediate future. The paper discusses different database technologies (including relational databases, columnar storage, and schema-less Hadoop) in order to contrast the advantages and disadvantages of employing each for storage of EnergyPlus output. Scalability, analysis requirements, and the adaptability of these database technologies are discussed. Additionally, unique attributes of EnergyPlus output are highlighted which make data-entry non-trivial for multiple simulations. Practical experience regarding cost-effective strategies for big-data storage is provided. The paper also discusses network performance issues when transferring large amounts of data across a network to different computing devices. Practical issues involving lag, bandwidth, and methods for synchronizing or transferring logical portions of the data are presented. A cornerstone of big-data is its use for analytics; data is useless unless information can be meaningfully derived from it. In addition to technical

  4. Combining Human Computing and Machine Learning to Make Sense of Big (Aerial) Data for Disaster Response.

    PubMed

    Ofli, Ferda; Meier, Patrick; Imran, Muhammad; Castillo, Carlos; Tuia, Devis; Rey, Nicolas; Briant, Julien; Millet, Pauline; Reinhard, Friedrich; Parkan, Matthew; Joost, Stéphane

    2016-03-01

    Aerial imagery captured via unmanned aerial vehicles (UAVs) is playing an increasingly important role in disaster response. Unlike satellite imagery, aerial imagery can be captured and processed within hours rather than days. In addition, the spatial resolution of aerial imagery is an order of magnitude higher than the imagery produced by the most sophisticated commercial satellites today. Both the United States Federal Emergency Management Agency (FEMA) and the European Commission's Joint Research Center (JRC) have noted that aerial imagery will inevitably present a big data challenge. The purpose of this article is to get ahead of this future challenge by proposing a hybrid crowdsourcing and real-time machine learning solution to rapidly process large volumes of aerial data for disaster response in a time-sensitive manner. Crowdsourcing can be used to annotate features of interest in aerial images (such as damaged shelters and roads blocked by debris). These human-annotated features can then be used to train a supervised machine learning system to learn to recognize such features in new unseen images. In this article, we describe how this hybrid solution for image analysis can be implemented as a module (i.e., Aerial Clicker) to extend an existing platform called Artificial Intelligence for Disaster Response (AIDR), which has already been deployed to classify microblog messages during disasters using its Text Clicker module and in response to Cyclone Pam, a category 5 cyclone that devastated Vanuatu in March 2015. The hybrid solution we present can be applied to both aerial and satellite imagery and has applications beyond disaster response such as wildlife protection, human rights, and archeological exploration. As a proof of concept, we recently piloted this solution using very high-resolution aerial photographs of a wildlife reserve in Namibia to support rangers with their wildlife conservation efforts (SAVMAP project, http://lasig.epfl.ch/savmap ). The
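
    As a hedged sketch of the generic crowd-to-classifier pattern described above (not the AIDR/Aerial Clicker codebase), the code below trains a supervised model on synthetic "crowd-annotated" feature vectors and scores unseen tiles; real systems would use image descriptors or a neural network rather than random stand-in features.

```python
# Generic crowd-to-classifier sketch with synthetic stand-in features; not the
# AIDR / Aerial Clicker code. Real pipelines would extract image descriptors
# or use a neural network instead of random vectors.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)
crowd_features = rng.normal(size=(500, 32))      # features of crowd-annotated tiles
crowd_labels = rng.integers(0, 2, size=500)      # 1 = tile tagged "damaged shelter"

classifier = RandomForestClassifier(n_estimators=100, random_state=0)
classifier.fit(crowd_features, crowd_labels)

new_tiles = rng.normal(size=(10, 32))            # unseen aerial image tiles
print(classifier.predict_proba(new_tiles)[:, 1]) # predicted damage likelihood
```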

  5. When small is better than BIG

    SciTech Connect

    McDaniel, Hunter; Beard, Matthew C; Wheeler, Lance M; Pietryga, Jeffrey M

    2013-07-18

    Representing the Center for Advanced Solar Photophysics (CASP), this document is one of the entries in the Ten Hundred and One Word Challenge and was awarded “Overall Winner Runner-up and People’s Choice Winner.” As part of the challenge, the 46 Energy Frontier Research Centers were invited to represent their science in images, cartoons, photos, words and original paintings, but any descriptions or words could only use the 1000 most commonly used words in the English language, with the addition of one word important to each of the EFRCs and the mission of DOE: energy. The mission of CASP is to explore and exploit the unique physics of nanostructured materials to boost the efficiency of solar energy conversion through novel light-matter interactions, controlled excited-state dynamics, and engineered carrier-carrier coupling.

  6. Big Soda Lake (Nevada). 2. Pelagic sulfate reduction

    USGS Publications Warehouse

    Smith, Richard L.; Oremland, Ronald S.

    1987-01-01

    The epilimnion of hypersaline, alkaline, meromictic Big Soda Lake contains an average 58 mmol sulfate liter⁻¹ and 0.4 µmol dissolved iron liter⁻¹. The monimolimnion, which is permanently anoxic, has a sulfide concentration ranging seasonally from 4 to 7 mmol liter⁻¹. Depth profiles of sulfate reduction in the monimolimnion, assayed with a ³⁵S tracer technique and in situ incubations, demonstrated that sulfate reduction occurs within the water column of this extreme environment. The average rate of reduction in the monimolimnion was 3 µmol sulfate liter⁻¹ d⁻¹ in May compared to 0.9 in October. These values are comparable to rates of sulfate reduction reported for anoxic waters of more moderate environments. Sulfate reduction also occurred in the anoxic zone of the mixolimnion, though at significantly lower rates (0.025–0.090 µmol liter⁻¹ d⁻¹ at 25 m). Additions of FeS (1.0 mmol liter⁻¹) doubled the endogenous rate of sulfate reduction in the monimolimnion, while MnS and kaolinite had no effect. These results suggest that sulfate reduction in Big Soda Lake is iron limited and controlled by seasonal variables other than temperature. Estimates of the organic carbon mineralized by sulfate reduction exceed measured fluxes of particulate organic carbon sinking from the mixolimnion. Thus, additional sources of electron donors (other than those derived from the sinking of pelagic autotrophs) may also fuel monimolimnetic sulfate reduction in the lake.

  7. Additive and subtractive transparent depth displays

    NASA Astrophysics Data System (ADS)

    Kooi, Frank L.; Toet, Alexander

    2003-09-01

    Image fusion is the generally preferred method to combine two or more images for visual display on a single screen. We demonstrate that perceptual image separation may be preferable over perceptual image fusion for the combined display of enhanced and synthetic imagery. In this context image separation refers to the simultaneous presentation of images on different depth planes of a single display. Image separation allows the user to recognize the source of the information that is displayed. This can be important because synthetic images are more liable to flaws. We have examined methods to optimize perceptual image separation. A true depth difference between enhanced and synthetic imagery works quite well. A standard stereoscopic display based on convergence is less suitable since the two images tend to interfere: the image behind is masked (occluded) by the image in front, which results in poor viewing comfort. This effect places 3D systems based on 3D glasses, as well as most autostereoscopic displays, at a serious disadvantage. A 3D display based on additive or subtractive transparency is acceptable: both the perceptual separation and the viewing comfort are good, but the color of objects depends on the color in the other depth layer(s). A combined additive and subtractive transparent display eliminates this disadvantage and is most suitable for the combined display of enhanced and synthetic imagery. We suggest that the development of such a display system is of a greater practical value than increasing the number of depth planes in autostereoscopic displays.

  8. Occurrence and transport of nitrogen in the Big Sunflower River, northwestern Mississippi, October 2009-June 2011

    USGS Publications Warehouse

    Barlow, Jeannie R.B.; Coupe, Richard H.

    2014-01-01

    The Big Sunflower River Basin, located within the Yazoo River Basin, is subject to large annual inputs of nitrogen from agriculture, atmospheric deposition, and point sources. Understanding how nutrients are transported in, and downstream from, the Big Sunflower River is key to quantifying their eutrophying effects on the Gulf. Recent results from two Spatially Referenced Regressions on Watershed attributes (SPARROW models), which include the Big Sunflower River, indicate minimal losses of nitrogen in stream reaches typical of the main channels of major river systems. If SPARROW assumptions of relatively conservative transport of nitrogen are correct and surface-water losses through the bed of the Big Sunflower River are negligible, then options for managing nutrient loads to the Gulf of Mexico may be limited. Simply put, if every pound of nitrogen entering the Delta is eventually delivered to the Gulf, then the only effective nutrient management option in the Delta is to reduce inputs. If, on the other hand, it can be shown that processes within river channels of the Mississippi Delta act to reduce the mass of nitrogen in transport, other hydrologic approaches may be designed to further limit nitrogen transport. Direct validation of existing SPARROW models for the Delta is a first step in assessing the assumptions underlying those models. In order to characterize spatial and temporal variability of nitrogen in the Big Sunflower River Basin, water samples were collected at four U.S. Geological Survey gaging stations located on the Big Sunflower River between October 1, 2009, and June 30, 2011. Nitrogen concentrations were generally highest at each site during the spring of the 2010 water year and the fall and winter of the 2011 water year. Additionally, the dominant form of nitrogen varied between sites. For example, in samples collected from the most upstream site (Clarksdale), the concentration of organic nitrogen was generally higher than the concentrations of

  9. Classification Algorithms for Big Data Analysis, a Map Reduce Approach

    NASA Astrophysics Data System (ADS)

    Ayma, V. A.; Ferreira, R. S.; Happ, P.; Oliveira, D.; Feitosa, R.; Costa, G.; Plaza, A.; Gamba, P.

    2015-03-01

    For many years the scientific community has been concerned with how to increase the accuracy of different classification methods, and major achievements have been made so far. Besides this issue, the increasing amount of data that is being generated every day by remote sensors raises more challenges to be overcome. In this work, a tool within the scope of the InterIMAGE Cloud Platform (ICP), which is an open-source, distributed framework for automatic image interpretation, is presented. The tool, named ICP: Data Mining Package, is able to perform supervised classification procedures on huge amounts of data, usually referred to as big data, on a distributed infrastructure using Hadoop MapReduce. The tool has four classification algorithms implemented, taken from WEKA's machine learning library, namely: Decision Trees, Naïve Bayes, Random Forest and Support Vector Machines (SVM). The results of an experimental analysis using an SVM classifier on data sets of different sizes for different cluster configurations demonstrate the potential of the tool, as well as aspects that affect its performance.
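
    As a rough, hypothetical sketch of the map/reduce classification pattern this abstract describes (not the ICP: Data Mining Package itself, and with scikit-learn standing in for WEKA and Hadoop), each "mapper" fits an SVM on one partition of the data and the "reducer" combines the partial models by majority vote:

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.svm import LinearSVC

    def map_train(partition):
        # "Map" step: fit an SVM on one data partition.
        X, y = partition
        return LinearSVC(dual=False).fit(X, y)

    def reduce_predict(models, X):
        # "Reduce" step: majority vote over the per-partition models.
        votes = np.stack([m.predict(X) for m in models])
        return np.apply_along_axis(lambda col: np.bincount(col).argmax(), 0, votes)

    X, y = make_classification(n_samples=6000, n_features=20, random_state=0)
    partitions = [(X[i::3], y[i::3]) for i in range(3)]   # stand-in for HDFS splits
    models = [map_train(p) for p in partitions]           # would run on the mappers
    accuracy = (reduce_predict(models, X) == y).mean()
    print(f"ensemble accuracy on the training data: {accuracy:.3f}")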

  10. Big Data Analytics for Scanning Transmission Electron Microscopy Ptychography

    PubMed Central

    Jesse, S.; Chi, M.; Belianinov, A.; Beekman, C.; Kalinin, S. V.; Borisevich, A. Y.; Lupini, A. R.

    2016-01-01

    Electron microscopy is undergoing a transition; from the model of producing only a few micrographs, through the current state where many images and spectra can be digitally recorded, to a new mode where very large volumes of data (movies, ptychographic and multi-dimensional series) can be rapidly obtained. Here, we discuss the application of so-called “big-data” methods to high dimensional microscopy data, using unsupervised multivariate statistical techniques, in order to explore salient image features in a specific example of BiFeO3 domains. Remarkably, k-means clustering reveals domain differentiation despite the fact that the algorithm is purely statistical in nature and does not require any prior information regarding the material, any coexisting phases, or any differentiating structures. While this is a somewhat trivial case, this example signifies the extraction of useful physical and structural information without any prior bias regarding the sample or the instrumental modality. Further interpretation of these types of results may still require human intervention. However, the open nature of this algorithm and its wide availability, enable broad collaborations and exploratory work necessary to enable efficient data analysis in electron microscopy. PMID:27211523

  11. Big Data Analytics for Scanning Transmission Electron Microscopy Ptychography

    SciTech Connect

    Jesse, S.; Chi, M.; Belianinov, A.; Beekman, C.; Kalinin, S. V.; Borisevich, A. Y.; Lupini, A. R.

    2016-05-23

    Electron microscopy is undergoing a transition; from the model of producing only a few micrographs, through the current state where many images and spectra can be digitally recorded, to a new mode where very large volumes of data (movies, ptychographic and multi-dimensional series) can be rapidly obtained. In this paper, we discuss the application of so-called “big-data” methods to high dimensional microscopy data, using unsupervised multivariate statistical techniques, in order to explore salient image features in a specific example of BiFeO3 domains. Remarkably, k-means clustering reveals domain differentiation despite the fact that the algorithm is purely statistical in nature and does not require any prior information regarding the material, any coexisting phases, or any differentiating structures. While this is a somewhat trivial case, this example signifies the extraction of useful physical and structural information without any prior bias regarding the sample or the instrumental modality. Further interpretation of these types of results may still require human intervention. Finally, however, the open nature of this algorithm and its wide availability, enable broad collaborations and exploratory work necessary to enable efficient data analysis in electron microscopy.

  12. Big Data Analytics for Scanning Transmission Electron Microscopy Ptychography

    DOE PAGES

    Jesse, S.; Chi, M.; Belianinov, A.; ...

    2016-05-23

    Electron microscopy is undergoing a transition; from the model of producing only a few micrographs, through the current state where many images and spectra can be digitally recorded, to a new mode where very large volumes of data (movies, ptychographic and multi-dimensional series) can be rapidly obtained. In this paper, we discuss the application of so-called “big-data” methods to high dimensional microscopy data, using unsupervised multivariate statistical techniques, in order to explore salient image features in a specific example of BiFeO3 domains. Remarkably, k-means clustering reveals domain differentiation despite the fact that the algorithm is purely statistical in nature and does not require any prior information regarding the material, any coexisting phases, or any differentiating structures. While this is a somewhat trivial case, this example signifies the extraction of useful physical and structural information without any prior bias regarding the sample or the instrumental modality. Further interpretation of these types of results may still require human intervention. Finally, however, the open nature of this algorithm and its wide availability, enable broad collaborations and exploratory work necessary to enable efficient data analysis in electron microscopy.

  13. Big Data Analytics for Scanning Transmission Electron Microscopy Ptychography.

    PubMed

    Jesse, S; Chi, M; Belianinov, A; Beekman, C; Kalinin, S V; Borisevich, A Y; Lupini, A R

    2016-05-23

    Electron microscopy is undergoing a transition; from the model of producing only a few micrographs, through the current state where many images and spectra can be digitally recorded, to a new mode where very large volumes of data (movies, ptychographic and multi-dimensional series) can be rapidly obtained. Here, we discuss the application of so-called "big-data" methods to high dimensional microscopy data, using unsupervised multivariate statistical techniques, in order to explore salient image features in a specific example of BiFeO3 domains. Remarkably, k-means clustering reveals domain differentiation despite the fact that the algorithm is purely statistical in nature and does not require any prior information regarding the material, any coexisting phases, or any differentiating structures. While this is a somewhat trivial case, this example signifies the extraction of useful physical and structural information without any prior bias regarding the sample or the instrumental modality. Further interpretation of these types of results may still require human intervention. However, the open nature of this algorithm and its wide availability, enable broad collaborations and exploratory work necessary to enable efficient data analysis in electron microscopy.

  14. Big Data Analytics for Scanning Transmission Electron Microscopy Ptychography

    NASA Astrophysics Data System (ADS)

    Jesse, S.; Chi, M.; Belianinov, A.; Beekman, C.; Kalinin, S. V.; Borisevich, A. Y.; Lupini, A. R.

    2016-05-01

    Electron microscopy is undergoing a transition; from the model of producing only a few micrographs, through the current state where many images and spectra can be digitally recorded, to a new mode where very large volumes of data (movies, ptychographic and multi-dimensional series) can be rapidly obtained. Here, we discuss the application of so-called “big-data” methods to high dimensional microscopy data, using unsupervised multivariate statistical techniques, in order to explore salient image features in a specific example of BiFeO3 domains. Remarkably, k-means clustering reveals domain differentiation despite the fact that the algorithm is purely statistical in nature and does not require any prior information regarding the material, any coexisting phases, or any differentiating structures. While this is a somewhat trivial case, this example signifies the extraction of useful physical and structural information without any prior bias regarding the sample or the instrumental modality. Further interpretation of these types of results may still require human intervention. However, the open nature of this algorithm and its wide availability, enable broad collaborations and exploratory work necessary to enable efficient data analysis in electron microscopy.
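
    The records above all describe the same unsupervised analysis. The following minimal sketch shows the core idea: apply k-means to per-pixel feature vectors of a multi-dimensional microscopy dataset without any prior information about the material. The synthetic data, image size, and number of clusters are illustrative assumptions, and scikit-learn stands in for the authors' own tooling.

    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(0)
    h, w, channels = 64, 64, 16            # e.g. diffraction features per probe position
    data = rng.normal(size=(h, w, channels))
    data[:, : w // 2, :8] += 2.0           # a fake "domain" with a distinct signature

    features = data.reshape(-1, channels)  # one feature vector per pixel
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)
    domain_map = labels.reshape(h, w)      # cluster index per probe position
    print(domain_map[:4, :8])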

  15. "small problems, Big Trouble": An Art and Science Collaborative Exhibition Reflecting Seemingly small problems Leading to Big Threats

    NASA Astrophysics Data System (ADS)

    Waller, J. L.; Brey, J. A.

    2014-12-01

    disasters continues to inspire new chapters in their "Layers: Places in Peril" exhibit! A slide show includes images of paintings for "small problems, Big Trouble". Brey and Waller will lead a discussion on their process of incorporating broader collaboration with geoscientists and others in an educational art exhibition.

  16. Relevance of eHealth standards for big data interoperability in radiology and beyond.

    PubMed

    Marcheschi, Paolo

    2016-11-04

    The aim of this paper is to report on the implementation of radiology and related information technology standards to feed big data repositories and so to be able to create a solid substrate on which to operate with analysis software. Digital Imaging and Communications in Medicine (DICOM) and Health Level 7 (HL7) are the major standards for radiology and medical information technology. They define formats and protocols to transmit medical images, signals, and patient data inside and outside hospital facilities. These standards can be implemented, but big data expectations are stimulating a new approach that simplifies data collection and interoperability and seeks to reduce the time to full implementation inside health organizations. Virtual Medical Record, DICOM Structured Reporting and HL7 Fast Healthcare Interoperability Resources (FHIR) are changing the way medical data are shared among organizations, and they will be the keys to big data interoperability. Until we find simple and comprehensive methods to store and disseminate detailed information on the patient's health, we will not be able to get optimum results from the analysis of those data.

  17. Additive manufacturing of polymer-derived ceramics

    NASA Astrophysics Data System (ADS)

    Eckel, Zak C.; Zhou, Chaoyin; Martin, John H.; Jacobsen, Alan J.; Carter, William B.; Schaedler, Tobias A.

    2016-01-01

    The extremely high melting point of many ceramics adds challenges to additive manufacturing as compared with metals and polymers. Because ceramics cannot be cast or machined easily, three-dimensional (3D) printing enables a big leap in geometrical flexibility. We report preceramic monomers that are cured with ultraviolet light in a stereolithography 3D printer or through a patterned mask, forming 3D polymer structures that can have complex shape and cellular architecture. These polymer structures can be pyrolyzed to a ceramic with uniform shrinkage and virtually no porosity. Silicon oxycarbide microlattice and honeycomb cellular materials fabricated with this approach exhibit higher strength than ceramic foams of similar density. Additive manufacturing of such materials is of interest for propulsion components, thermal protection systems, porous burners, microelectromechanical systems, and electronic device packaging.

  18. Additive manufacturing of polymer-derived ceramics.

    PubMed

    Eckel, Zak C; Zhou, Chaoyin; Martin, John H; Jacobsen, Alan J; Carter, William B; Schaedler, Tobias A

    2016-01-01

    The extremely high melting point of many ceramics adds challenges to additive manufacturing as compared with metals and polymers. Because ceramics cannot be cast or machined easily, three-dimensional (3D) printing enables a big leap in geometrical flexibility. We report preceramic monomers that are cured with ultraviolet light in a stereolithography 3D printer or through a patterned mask, forming 3D polymer structures that can have complex shape and cellular architecture. These polymer structures can be pyrolyzed to a ceramic with uniform shrinkage and virtually no porosity. Silicon oxycarbide microlattice and honeycomb cellular materials fabricated with this approach exhibit higher strength than ceramic foams of similar density. Additive manufacturing of such materials is of interest for propulsion components, thermal protection systems, porous burners, microelectromechanical systems, and electronic device packaging.

  19. Reclaiming 'Big Nurse': a feminist critique of Ken Kesey's portrayal of Nurse Ratched in One Flew Over the Cuckoo's Nest.

    PubMed

    Darbyshire, P

    1995-12-01

    Nurse Ratched or 'Big Nurse' in Ken Kesey's counter-culture novel One Flew Over the Cuckoo's Nest is one of popular culture's most arresting and memorable images of the nurse. She is, however, deemed to be remarkable primarily for her malice and authoritarianism. This paper argues that such a purely realist reading fails to fully appreciate the significance of the character of Nurse Ratched. A feminist critique of the novel contends that the importance of 'Big Nurse' is less related to how realistic/unrealistic or good/bad she is as a nurse. Nurse Ratched is important because she exemplifies all that traditional masculinity abhors in women, and particularly in strong women in positions of power and influence. This paper explores the stereotype of 'Big Nurse' and argues that Kesey's vision of her ultimate 'conquest' is not a progressive allegory of 'individual freedom', but a reactionary misogyny which would deny women any function other than that of sexual trophy.

  20. Biobanking with Big Data: A Need for Developing "Big Data Metrics".

    PubMed

    Zisis, Kozlakidis

    2016-10-01

    The term "big data" has often been used as an all-encompassing phrase for research that involves the use of large-scale data sets. However, the use of the term does little to signify the underlying complexity of definitions, of data sets, and of the requirements that need to be taken into consideration for sustainable research and the estimation of downstream impact. In particular, "big data" is frequently connected with biobanks and biobank networks as the institutions involved in tissue preservation are increasingly and perhaps unavoidably linked to the de facto preservation of information.

  1. Big Data, Big Problems: Incorporating Mission, Values, and Culture in Provider Affiliations.

    PubMed

    Shaha, Steven H; Sayeed, Zain; Anoushiravani, Afshin A; El-Othmani, Mouhanad M; Saleh, Khaled J

    2016-10-01

    This article explores how integration of data from clinical registries and electronic health records produces a quality impact within orthopedic practices. Data are differentiated from information, and several types of data that are collected and used in orthopedic outcome measurement are defined. Furthermore, the concept of comparative effectiveness and its impact on orthopedic clinical research are assessed. This article places emphasis on how the concept of big data produces health care challenges balanced with benefits that may be faced by patients and orthopedic surgeons. Finally, essential characteristics of an electronic health record that interlinks musculoskeletal care and big data initiatives are reviewed.

  2. The challenge of big data in public health: an opportunity for visual analytics.

    PubMed

    Ola, Oluwakemi; Sedig, Kamran

    2014-01-01

    Public health (PH) data can generally be characterized as big data. The efficient and effective use of this data determines the extent to which PH stakeholders can sufficiently address societal health concerns as they engage in a variety of work activities. As stakeholders interact with data, they engage in various cognitive activities such as analytical reasoning, decision-making, interpreting, and problem solving. Performing these activities with big data is a challenge for the unaided mind as stakeholders encounter obstacles relating to the data's volume, variety, velocity, and veracity. Such being the case, computer-based information tools are needed to support PH stakeholders. Unfortunately, while existing computational tools are beneficial in addressing certain work activities, they fall short in supporting cognitive activities that involve working with large, heterogeneous, and complex bodies of data. This paper presents visual analytics (VA) tools, a nascent category of computational tools that integrate data analytics with interactive visualizations, to facilitate the performance of cognitive activities involving big data. Historically, PH has lagged behind other sectors in embracing new computational technology. In this paper, we discuss the role that VA tools can play in addressing the challenges presented by big data. In doing so, we demonstrate the potential benefit of incorporating VA tools into PH practice, in addition to highlighting the need for further systematic and focused research.

  3. A Demonstration of Big Data Technology for Data Intensive Earth Science (Invited)

    NASA Astrophysics Data System (ADS)

    Kuo, K.; Clune, T.; Ramachandran, R.; Rushing, J.; Fekete, G.; Lin, A.; Doan, K.; Oloso, A. O.; Duffy, D.

    2013-12-01

    Big Data technologies exhibit great potential to change the way we conduct scientific investigations, especially analysis of voluminous and diverse data sets. Obviously, not all Big Data technologies are applicable to all aspects of scientific data analysis. Our NASA Earth Science Technology Office (ESTO) Advanced Information Systems Technology (AIST) project, Automated Event Service (AES), pioneers the exploration of Big Data technologies for data intensive Earth science. Since Earth science data are largely stored and manipulated in the form of multidimensional arrays, the project first evaluates array performance of several candidate Big Data technologies, including MapReduce (Hadoop), SciDB, and a custom-built Polaris system, which have one important feature in common: a shared-nothing architecture. The evaluation finds SciDB to be the most promising. In this presentation, we demonstrate SciDB using a couple of use cases, each operating on a distinct data set in the regular latitude-longitude grid. The first use case is the discovery and identification of blizzards using NASA's Modern-Era Retrospective analysis for Research and Applications (MERRA) data sets. The other finds diurnal signals in the same 8-year period using SSMI data from three different instruments with different equator crossing times by correlating their retrieved parameters. In addition, the AES project is also developing a collaborative component to enable the sharing of event queries and results. Preliminary capabilities will be presented as well.

  4. 10 Aspects of the Big Five in the Personality Inventory for DSM-5

    PubMed Central

    DeYoung, Colin G.; Carey, Bridget E.; Krueger, Robert F.; Ross, Scott R.

    2015-01-01

    DSM-5 includes a dimensional model of personality pathology, operationalized in the Personality Inventory for DSM-5 (PID-5), with 25 facets grouped into five higher-order factors resembling the Big Five personality dimensions. The present study tested how well these 25 facets could be integrated with the 10-factor structure of traits within the Big Five that is operationalized by the Big Five Aspect Scales (BFAS). In two healthy adult samples, 10-factor solutions largely confirmed our hypothesis that each of the 10 BFAS scales would be the highest loading BFAS scale on one and only one factor. Varying numbers of PID-5 scales were additional markers of each factor, and the overall factor structure in the first sample was well replicated in the second. Our results allow Cybernetic Big Five Theory (CB5T) to be brought to bear on manifestations of personality disorder, because CB5T offers mechanistic explanations of the 10 factors measured by the BFAS. Future research, therefore, may begin to test hypotheses derived from CB5T regarding the mechanisms that are dysfunctional in specific personality disorders. PMID:27032017

  5. Addressing the Big-Earth-Data Variety Challenge with the Hierarchical Triangular Mesh

    NASA Technical Reports Server (NTRS)

    Rilee, Michael L.; Kuo, Kwo-Sen; Clune, Thomas; Oloso, Amidu; Brown, Paul G.; Yu, Honfeng

    2016-01-01

    We have implemented an updated Hierarchical Triangular Mesh (HTM) as the basis for a unified data model and an indexing scheme for geoscience data to address the variety challenge of Big Earth Data. We observe that, in the absence of variety, the volume challenge of Big Data is relatively easily addressable with parallel processing. The more important challenge in achieving optimal value with a Big Data solution for Earth Science (ES) data analysis, however, is being able to achieve good scalability with variety. With HTM unifying at least the three popular data models, i.e. Grid, Swath, and Point, used by current ES data products, data preparation time for integrative analysis of diverse datasets can be drastically reduced and better variety scaling can be achieved. In addition, since HTM is also an indexing scheme, when it is used to index all ES datasets, data placement alignment (or co-location) on the shared nothing architecture, which most Big Data systems are based on, is guaranteed and better performance is ensured. Moreover, our updated HTM encoding turns most geospatial set operations into integer interval operations, gaining further performance advantages.
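
    The claim that HTM indexing turns most geospatial set operations into integer interval operations can be illustrated with a small, hypothetical example (the interval values below are made up, and this is not the authors' implementation): once two datasets are expressed as sorted index ranges, finding their spatial overlap reduces to interval intersection.

    def intersect_intervals(a, b):
        # Intersect two sorted, disjoint lists of half-open integer intervals.
        out, i, j = [], 0, 0
        while i < len(a) and j < len(b):
            lo = max(a[i][0], b[j][0])
            hi = min(a[i][1], b[j][1])
            if lo < hi:
                out.append((lo, hi))
            if a[i][1] < b[j][1]:   # advance whichever interval ends first
                i += 1
            else:
                j += 1
        return out

    grid_cover = [(100, 180), (220, 300)]    # hypothetical index ranges covering a grid dataset
    swath_cover = [(150, 260)]               # hypothetical index ranges covering a swath granule
    print(intersect_intervals(grid_cover, swath_cover))   # -> [(150, 180), (220, 260)]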

  6. Ten aspects of the Big Five in the Personality Inventory for DSM-5.

    PubMed

    DeYoung, Colin G; Carey, Bridget E; Krueger, Robert F; Ross, Scott R

    2016-04-01

    Diagnostic and Statistical Manual of Mental Disorders (5th ed.; DSM-5) includes a dimensional model of personality pathology, operationalized in the Personality Inventory for DSM-5 (PID-5), with 25 facets grouped into 5 higher order factors resembling the Big Five personality dimensions. The present study tested how well these 25 facets could be integrated with the 10-factor structure of traits within the Big Five that is operationalized by the Big Five Aspect Scales (BFAS). In 2 healthy adult samples, 10-factor solutions largely confirmed our hypothesis that each of the 10 BFAS would be the highest loading BFAS on 1 and only 1 factor. Varying numbers of PID-5 scales were additional markers of each factor, and the overall factor structure in the first sample was well replicated in the second. Our results allow Cybernetic Big Five Theory (CB5T) to be brought to bear on manifestations of personality disorder, because CB5T offers mechanistic explanations of the 10 factors measured by the BFAS. Future research, therefore, may begin to test hypotheses derived from CB5T regarding the mechanisms that are dysfunctional in specific personality disorders.

  7. Using the ACR CT accreditation phantom for routine image quality assurance on both CT and CBCT imaging systems in a radiotherapy environment.

    PubMed

    Hobson, Maritza A; Soisson, Emilie T; Davis, Stephen D; Parker, William

    2014-07-08

    Image-guided radiation therapy using cone-beam computed tomography (CBCT) is becoming routine practice in modern radiation therapy. The purpose of this work was to develop an imaging QA program for CT and CBCT units in our department, based on the American College of Radiology (ACR) CT accreditation phantom. The phantom has four testing modules, permitting one to test CT number accuracy, slice width, low contrast resolution, image uniformity, in-plane distance accuracy, and high-contrast resolution reproducibly with suggested window/levels for image analysis. Additional tests for contrast-to-noise ratio (CNR) and noise were added using the polyethylene and acrylic plugs. Baseline values were obtained from CT simulator images acquired on a Philips Brilliance Big Bore CT simulator and CBCT images acquired on three Varian CBCTs for the imaging protocols most used clinically. Images were then acquired quarterly over a period of two years. Images were exported via DICOM and analyzed manually using OsiriX. Baseline values were used to ensure that image quality remained consistent quarterly, and baselines were reset at any major maintenance or recalibration. Analysis of CT simulator images showed that image quality was within ACR guidelines for all tested scanning protocols. All three CBCT systems were unable to distinguish the low-contrast resolution plugs and had the same high-contrast resolution over all imaging protocols. Analysis of CBCT results over time determined a range of values that could be used to establish quantitative tolerance levels for image quality deterioration. While appropriate for the helical CT, the ACR phantom and guidelines could be modified to be more useful in evaluating CBCT systems. In addition, the observed values for the CT simulator were well within ACR tolerances.

  8. [Big data and their perspectives in radiation therapy].

    PubMed

    Guihard, Sébastien; Thariat, Juliette; Clavier, Jean-Baptiste

    2017-02-01

    The concept of big data indicates a change of scale in the use of data and data aggregation into large databases through improved computer technology. One of the current challenges in the creation of big data in the context of radiation therapy is the transformation of routine care items into dark data, i.e. data not yet collected, and the fusion of databases collecting different types of information (dose-volume histograms and toxicity data for example). Processes and infrastructures devoted to big data collection should not impact negatively on the doctor-patient relationship, the general process of care or the quality of the data collected. The use of big data requires a collective effort of physicians, physicists, software manufacturers and health authorities to create, organize and exploit big data in radiotherapy and, beyond, oncology. Big data involve a new culture to build an appropriate infrastructure legally and ethically. Processes and issues are discussed in this article.

  9. Data management by using R: big data clinical research series.

    PubMed

    Zhang, Zhongheng

    2015-11-01

    Electronic medical record (EMR) systems have been widely used in clinical practice. Compared with traditional handwritten records, the EMR makes big data clinical research feasible. The most important feature of big data research is its real-world setting. Furthermore, big data research can provide all aspects of information related to healthcare. However, big data research requires some skills in data management, which, however, are often lacking in the curriculum of medical education. This greatly hinders doctors from testing their clinical hypotheses by using the EMR. To bridge this gap, a series of articles introducing data management techniques is put forward to guide clinicians toward big data clinical research. The present educational article first introduces some basic knowledge of the R language, followed by data management skills for creating new variables, recoding variables and renaming variables. These are very basic skills that may be used in every big data research project.
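
    The article itself teaches these steps in R; as a purely illustrative analogue in Python with pandas, the three basic data-management tasks mentioned (creating, recoding, and renaming variables) might look like the following. The column names and values are hypothetical.

    import pandas as pd

    emr = pd.DataFrame({"pat_id": [1, 2, 3],
                        "weight_kg": [80, 65, 72],
                        "height_cm": [175, 160, 168]})

    # 1. Create a new variable (body-mass index).
    emr["bmi"] = emr["weight_kg"] / (emr["height_cm"] / 100) ** 2

    # 2. Recode a continuous variable into categories.
    emr["bmi_class"] = pd.cut(emr["bmi"], bins=[0, 18.5, 25, 30, 100],
                              labels=["under", "normal", "over", "obese"])

    # 3. Rename a variable.
    emr = emr.rename(columns={"pat_id": "patient_id"})
    print(emr)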

  10. Floods in the Big Creek basin, Linn County, Iowa

    USGS Publications Warehouse

    Heinitz, Albert J.

    1977-01-01

    Flood information for the Big Creek basin in Linn County, Iowa, should be of use to those concerned with the design of bridges and other structures on the flood plains of the streams. Water-surface profiles for the flood of May 1974 are given for Big Creek and its major tributaries, East Big, Crabapple, Elbow, and Abbe Creeks. The May 1974 flood was at least a 50-year flood on East Big Creek and along certain reaches of Big and Abbe Creeks. Also included for Big Creek are a profile of the December 1971 medium-stage flow and a partial profile for the minor flood of July 1971. Profiles for the low-water condition of October 26, 1972, are shown for all reaches. Water-surface profiles for the 25- and 50-year floods are estimated in relation to the May 1974 flood.

  11. Parallel and Scalable Big Data Analysis in the Earth Sciences with JuML

    NASA Astrophysics Data System (ADS)

    Goetz, M.

    2015-12-01

    The use of a rapidly increasing number of sensors with ever better resolution across a wide variety of earth observation projects continuously contributes to the availability of 'big data' in the earth sciences. Not only the volume, velocity, and variety of the datasets pose increasing challenges for their analysis, but also the complexity of the datasets (e.g. the high number of dimensions in hyper-spectral images) requires data algorithms that are able to scale. This contribution will provide insights into the Juelich Machine Learning Library (JuML) and its contents, which have been actively used in several scientific use cases in the earth sciences. We discuss and categorize challenges related to 'big data' analysis and outline parallel algorithmic solutions driven by those use cases.

  12. 'Big data' in pharmaceutical science: challenges and opportunities.

    PubMed

    Dossetter, Al G; Ecker, Gerhard; Laverty, Hugh; Overington, John

    2014-05-01

    Future Medicinal Chemistry invited a selection of experts to express their views on the current impact of big data in drug discovery and design, as well as speculate on future developments in the field. The topics discussed include the challenges of implementing big data technologies, maintaining the quality and privacy of data sets, and how the industry will need to adapt to welcome the big data era. Their enlightening responses provide a snapshot of the many and varied contributions being made by big data to the advancement of pharmaceutical science.

  13. [Contemplation on the application of big data in clinical medicine].

    PubMed

    Lian, Lei

    2015-01-01

    Medicine is another area where big data is being used. The link between clinical treatment and outcome is the key step when applying big data in medicine. In the era of big data, it is critical to collect complete outcome data. Patient follow-up, comprehensive integration of data resources, quality control and standardized data management are the predominant approaches to avoid missing data and data islands. Therefore, establishment of systematic patient follow-up protocols and prospective data management strategies are important aspects of big data in medicine.

  14. Differential Privacy Preserving in Big Data Analytics for Connected Health.

    PubMed

    Lin, Chi; Song, Zihao; Song, Houbing; Zhou, Yanhong; Wang, Yi; Wu, Guowei

    2016-04-01

    In Body Area Networks (BANs), big data collected by wearable sensors usually contain sensitive information, which must be appropriately protected. Previous methods neglected the privacy protection issue, leading to privacy exposure. In this paper, a differential privacy protection scheme for big data in body sensor networks is developed. Compared with previous methods, this scheme provides privacy protection with higher availability and reliability. We introduce the concept of dynamic noise thresholds, which makes our scheme more suitable for processing big data. Experimental results demonstrate that, even when the attacker has full background knowledge, the proposed scheme can still provide enough interference to big sensitive data so as to preserve privacy.
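
    As background for the scheme described above, the baseline mechanism of differential privacy adds noise calibrated to a query's sensitivity and a privacy budget epsilon. The sketch below shows only this textbook Laplace mechanism, not the paper's dynamic noise thresholds; the sensor readings and bounds are invented.

    import numpy as np

    def laplace_mechanism(true_value, sensitivity, epsilon, rng=None):
        # Return a differentially private estimate of a numeric query result.
        rng = rng or np.random.default_rng()
        return true_value + rng.laplace(0.0, sensitivity / epsilon)

    heart_rates = np.array([62, 75, 88, 70, 95])   # hypothetical wearable-sensor readings
    true_mean = heart_rates.mean()
    # Sensitivity of the mean of n readings bounded in [40, 180] is (180 - 40) / n.
    sensitivity = (180 - 40) / len(heart_rates)
    print(laplace_mechanism(true_mean, sensitivity, epsilon=0.5))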

  15. Big Impacts and Transient Oceans on Titan

    NASA Technical Reports Server (NTRS)

    Zahnle, K. J.; Korycansky, D. G.; Nixon, C. A.

    2014-01-01

    We have studied the thermal consequences of very big impacts on Titan [1]. Titan's thick atmosphere and volatile-rich surface cause it to respond to big impacts in a somewhat Earth-like manner. Here we construct a simple globally-averaged model that tracks the flow of energy through the environment in the weeks, years, and millennia after a big comet strikes Titan. The model Titan is endowed with 1.4 bars of N2 and 0.07 bars of CH4, methane lakes, a water ice crust, and enough methane underground to saturate the regolith to the surface. We assume that half of the impact energy is immediately available to the atmosphere and surface while the other half is buried at the site of the crater and is unavailable on time scales of interest. The atmosphere and surface are treated as isothermal. We make the simplifying assumptions that the crust is everywhere as methane saturated as it was at the Huygens landing site, that the concentration of methane in the regolith is the same as it is at the surface, and that the crust is made of water ice. Heat flow into and out of the crust is approximated by step-functions. If the impact is great enough, ice melts. The meltwater oceans cool to the atmosphere conductively through an ice lid while at the base melting their way into the interior, driven down in part through Rayleigh-Taylor instabilities between the dense water and the warm ice. Topography, CO2, and hydrocarbons other than methane are ignored. Methane and ethane clathrate hydrates are discussed quantitatively but not fully incorporated into the model.

  16. Big Data Analytics for Prostate Radiotherapy

    PubMed Central

    Coates, James; Souhami, Luis; El Naqa, Issam

    2016-01-01

    Radiation therapy is a first-line treatment option for localized prostate cancer, and radiation-induced normal tissue damage is often the main limiting factor for modern radiotherapy regimens. Conversely, under-dosing of target volumes in an attempt to spare adjacent healthy tissues limits the likelihood of achieving local, long-term control. Thus, the ability to generate personalized data-driven risk profiles for radiotherapy outcomes would provide valuable prognostic information to help guide both clinicians and patients alike. Big data applied to radiation oncology promises to deliver better understanding of outcomes by harvesting and integrating heterogeneous data types, including patient-specific clinical parameters, treatment-related dose–volume metrics, and biological risk factors. When taken together, such variables make up the basis for a multi-dimensional space (the “RadoncSpace”) in which the presented modeling techniques search in order to identify significant predictors. Herein, we review outcome modeling and big data-mining techniques for both tumor control and radiotherapy-induced normal tissue effects. We apply many of the presented modeling approaches to a cohort of hypofractionated prostate cancer patients, taking into account different data types and a large heterogeneous mix of physical and biological parameters. Cross-validation techniques are also reviewed for the refinement of the proposed framework architecture and checking individual model performance. We conclude by considering advanced modeling techniques that borrow concepts from big data analytics, such as machine learning and artificial intelligence, before discussing the potential future impact of systems radiobiology approaches. PMID:27379211
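
    The cross-validation step mentioned in this review can be illustrated with a short, hypothetical example: score an outcome model on held-out folds of a synthetic cohort so that model selection is not driven by overfitting. The covariates and outcome below are random placeholders, not real dose-volume or biological data, and scikit-learn is assumed.

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(42)
    X = rng.normal(size=(120, 6))       # stand-ins for dose-volume metrics and biomarkers
    y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=120) > 0).astype(int)  # toxicity yes/no

    scores = cross_val_score(LogisticRegression(), X, y, cv=5, scoring="roc_auc")
    print("mean cross-validated AUC:", round(scores.mean(), 3))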

  17. Probing the Big Bang with LEP

    NASA Technical Reports Server (NTRS)

    Schramm, David N.

    1990-01-01

    It is shown that LEP probes the Big Bang in two significant ways: (1) nucleosynthesis, and (2) dark matter constraints. In the first case, LEP verifies the cosmological standard model prediction on the number of neutrino types, thus strengthening the conclusion that the cosmological baryon density is approximately 6 percent of the critical value. In the second case, LEP shows that the remaining non-baryonic cosmological matter must be somewhat more massive and/or more weakly interacting than the favorite non-baryonic dark matter candidates of a few years ago.

  18. Nuclear Receptors, RXR, and the Big Bang.

    PubMed

    Evans, Ronald M; Mangelsdorf, David J

    2014-03-27

    Isolation of genes encoding the receptors for steroids, retinoids, vitamin D, and thyroid hormone and their structural and functional analysis revealed an evolutionarily conserved template for nuclear hormone receptors. This discovery sparked identification of numerous genes encoding related proteins, termed orphan receptors. Characterization of these orphan receptors and, in particular, of the retinoid X receptor (RXR) positioned nuclear receptors at the epicenter of the "Big Bang" of molecular endocrinology. This Review provides a personal perspective on nuclear receptors and explores their integrated and coordinated signaling networks that are essential for multicellular life, highlighting the RXR heterodimer and its associated ligands and transcriptional mechanism.

  19. Probing the Big Bang with LEP

    SciTech Connect

    Schramm, D.N. (Fermi National Accelerator Lab., Batavia, IL)

    1990-06-01

    It is shown that LEP probes the Big Bang in two significant ways: (1) nucleosynthesis and (2) dark matter constraints. In the first case, LEP verifies the cosmological standard model prediction on the number of neutrino types, thus strengthening the conclusion that the cosmological baryon density is approximately 6% of the critical value. In the second case, LEP shows that the remaining non-baryonic cosmological matter must be somewhat more massive and/or more weakly interacting than the favorite non-baryonic dark matter candidates of a few years ago. 59 refs., 4 figs., 2 tabs.

  20. Nuclear Receptors, RXR & the Big Bang

    PubMed Central

    Evans, Ronald M.; Mangelsdorf, David J.

    2014-01-01

    Summary Isolation of genes encoding the receptors for steroids, retinoids, vitamin D and thyroid hormone, and their structural and functional analysis revealed an evolutionarily conserved template for nuclear hormone receptors. This discovery sparked identification of numerous genes encoding related proteins, termed orphan receptors. Characterization of these orphan receptors, and in particular of the retinoid X receptor (RXR), positioned nuclear receptors at the epicenter of the “Big Bang” of molecular endocrinology. This review provides a personal perspective on nuclear receptors and explores their integrated and coordinated signaling networks that are essential for multi-cellular life, highlighting the RXR heterodimer and its associated ligands and transcriptional mechanism. PMID:24679540

  1. The Big Bang and Cosmic Inflation

    NASA Astrophysics Data System (ADS)

    Guth, Alan H.

    2014-03-01

    A summary is given of the key developments of cosmology in the 20th century, from the work of Albert Einstein to the emergence of the generally accepted hot big bang model. The successes of this model are reviewed, but emphasis is placed on the questions that the model leaves unanswered. The remainder of the paper describes the inflationary universe model, which provides plausible answers to a number of these questions. It also offers a possible explanation for the origin of essentially all the matter and energy in the observed universe.

  2. Nanobiotech in big pharma: a business perspective.

    PubMed

    Würmseher, Martin; Firmin, Lea

    2017-03-01

    Since the early 2000s, numerous publications have presented major scientific opportunities that can be achieved through integrating insights from the area of nanotech into biotech (nanobiotech). This paper aims to explore the economic significance that nanobiotech has gained in the established pharmaceutical industry (big pharma). The empirical investigation draws on patent data as well as product revenue data; and to put the results into perspective, the amounts are compared with the established/traditional biotech sector. The results indicate that the new technology still plays only a minor role - at least from a commercial perspective.

  3. The Next Big Thing - Eric Haseltine

    ScienceCinema

    Eric Haseltine

    2016-07-12

    Eric Haseltine, Haseltine Partners president and former chief of Walt Disney Imagineering, presented "The Next Big Thing" on Sept. 11 at ORNL. He described the four "early warning signs" that a scientific breakthrough is imminent, and then suggested practical ways to turn these insights into breakthrough innovations. Haseltine is former director of research at the National Security Agency and associate director for science and technology for the director of National Intelligence, former executive vice president of Walt Disney Imagineering and director of engineering for Hughes Aircraft. He has 15 patents in optics, special effects and electronic media, and more than 100 publications in science and technical journals, the web and Discover Magazine.

  4. The New Big Science at the NSLS

    NASA Astrophysics Data System (ADS)

    Crease, Robert

    2016-03-01

    The term ``New Big Science'' refers to a phase shift in the kind of large-scale science that was carried out throughout the U.S. National Laboratory system, when large-scale materials science accelerators rather than high-energy physics accelerators became marquee projects at most major basic research laboratories in the post-Cold War era, accompanied by important changes in the character and culture of the research ecosystem at these laboratories. This talk explores some aspects of this phase shift at BNL's National Synchrotron Light Source.

  5. [Research with big data: the European perspective].

    PubMed

    Bender, Stefan; Elias, P

    2015-08-01

    The article examines the impact that legislative developments in the European Union have had, still have and are continuing to have on cross-border access to microdata for research purposes. Therefore, we describe two competing aims: the tension between the ambitions of the EU to create a European Research Area within which research communities gain access to and share data across national boundaries; and the desire within the EU to establish a harmonious legislative framework that provides protection from the misuse of personal information. We attempt to examine which new developments at the EU level will have an impact upon research plans and the challenges researchers face when analysing big data.

  6. Pre - big bang inflation requires fine tuning

    SciTech Connect

    Turner, Michael S.; Weinberg, Erick J.

    1997-10-01

    The pre-big-bang cosmology inspired by superstring theories has been suggested as an alternative to slow-roll inflation. We analyze, in both the Jordan and Einstein frames, the effect of spatial curvature on this scenario and show that too much curvature --- of either sign --- reduces the duration of the inflationary era to such an extent that the flatness and horizon problems are not solved. Hence, a fine-tuning of initial conditions is required to obtain enough inflation to solve the cosmological problems.

  7. The Next Big Thing - Eric Haseltine

    SciTech Connect

    Eric Haseltine

    2009-09-16

    Eric Haseltine, Haseltine Partners president and former chief of Walt Disney Imagineering, presented "The Next Big Thing" on Sept. 11 at ORNL. He described the four "early warning signs" that a scientific breakthrough is imminent, and then suggested practical ways to turn these insights into breakthrough innovations. Haseltine is former director of research at the National Security Agency and associate director for science and technology for the director of National Intelligence, former executive vice president of Walt Disney Imagineering and director of engineering for Hughes Aircraft. He has 15 patents in optics, special effects and electronic media, and more than 100 publications in science and technical journals, the web and Discover Magazine.

  8. Results of High Velocity Tests at Tampa Electric Company's Big Bend 4 FGD System.

    SciTech Connect

    DeKraker, D.P.

    1997-10-15

    Tests were conducted at the Big Bend Station to determine the feasibility of scrubbing gas from an additional boiler in the existing FGD system. Testing was accomplished by increasing the gas flow from the D absorber tower and measuring the performance of this module. Key performance aspects evaluated during the testing include mist eliminator performance, SO2 removal efficiency, oxidation of absorbed SO2, and limestone utilization.

  9. Feature Extraction in Sequential Multimedia Images: with Applications in Satellite Images and On-line Videos

    NASA Astrophysics Data System (ADS)

    Liang, Yu-Li

    Multimedia data is increasingly important in scientific discovery and people's daily lives. The content of massive multimedia is often diverse and noisy, and motion between frames is sometimes crucial in analyzing those data. Among all formats, still images and videos are the most commonly used. Images are compact in size but do not contain motion information. Videos record motion but are sometimes too big to be analyzed. Sequential images, which are sets of continuous images with a low frame rate, stand out because they are smaller than videos and still maintain motion information. This thesis investigates features in different types of noisy sequential images and proposes solutions that intelligently combine multiple features to successfully retrieve visual information from on-line videos and cloudy satellite images. The first task is detecting supraglacial lakes above the ice sheet in sequential satellite images. The dynamics of supraglacial lakes on the Greenland ice sheet deeply affect glacier movement, which is directly related to sea level rise and global environmental change. Detecting lakes above ice is hampered by diverse image quality and unexpected clouds. A new method is proposed to efficiently extract prominent lake candidates with irregular shapes, heterogeneous backgrounds, and in cloudy images. The proposed system fully automates the procedure and tracks lakes with high accuracy. We further cooperated with geoscientists to examine the tracked lakes, which led to new scientific findings. The second task is detecting obscene content in on-line video chat services, such as Chatroulette, that randomly match pairs of users in video chat sessions. A big problem encountered in such systems is the presence of flashers and obscene content. Because of the variety of obscene content and the unstable quality of video captured by home web-cameras, detecting misbehaving users is a highly challenging task. We propose SafeVchat, which is the first solution that achieves satisfactory

  10. Seed bank and big sagebrush plant community composition in a range margin for big sagebrush

    USGS Publications Warehouse

    Martyn, Trace E.; Bradford, John B.; Schlaepfer, Daniel R.; Burke, Ingrid C.; Laurenroth, William K.

    2016-01-01

    The potential influence of seed bank composition on range shifts of species due to climate change is unclear. Seed banks can provide a means of both species persistence in an area and local range expansion in the case of increasing habitat suitability, as may occur under future climate change. However, a mismatch between the seed bank and the established plant community may represent an obstacle to persistence and expansion. In big sagebrush (Artemisia tridentata) plant communities in Montana, USA, we compared the seed bank to the established plant community. There was less than a 20% similarity in the relative abundance of species between the established plant community and the seed bank. This difference was primarily driven by an overrepresentation of native annual forbs and an underrepresentation of big sagebrush in the seed bank compared to the established plant community. Even though we expect an increase in habitat suitability for big sagebrush under future climate conditions at our sites, the current mismatch between the plant community and the seed bank could impede big sagebrush range expansion into increasingly suitable habitat in the future.

  11. Making big sense from big data in toxicology by read-across.

    PubMed

    Hartung, Thomas

    2016-01-01

    Modern information technologies have made big data available in safety sciences, i.e., extremely large data sets that may be analyzed only computationally to reveal patterns, trends and associations. This happens by (1) compilation of large sets of existing data, e.g., as a result of the European REACH regulation, (2) the use of omics technologies and (3) systematic robotized testing in a high-throughput manner. All three approaches and some other high-content technologies leave us with big data--the challenge is now to make big sense of these data. Read-across, i.e., the local similarity-based intrapolation of properties, is gaining momentum with increasing data availability and consensus on how to process and report it. It is predominantly applied to in vivo test data as a gap-filling approach, but can similarly complement other incomplete datasets. Big data are first of all repositories for finding similar substances and ensure that the available data is fully exploited. High-content and high-throughput approaches similarly require focusing on clusters, in this case formed by underlying mechanisms such as pathways of toxicity. The closely connected properties, i.e., structural and biological similarity, create the confidence needed for predictions of toxic properties. Here, a new web-based tool under development called REACH-across, which aims to support and automate structure-based read-across, is presented among others.
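
    A generic, hypothetical sketch of the read-across idea (not the REACH-across tool): predict a property of a data-poor substance from its most similar, data-rich analogues. The binary fingerprints and toxicity labels below are invented for illustration.

    import numpy as np

    def tanimoto(a, b):
        # Similarity of two binary fingerprints.
        both = np.logical_and(a, b).sum()
        either = np.logical_or(a, b).sum()
        return both / either if either else 0.0

    fingerprints = np.array([[1, 1, 0, 1, 0],   # substances with known toxicity
                             [1, 0, 0, 1, 0],
                             [0, 0, 1, 0, 1]])
    toxic = np.array([1, 1, 0])                 # 1 = toxic, 0 = non-toxic (made up)

    query = np.array([1, 1, 0, 0, 0])           # the data-poor substance
    sims = np.array([tanimoto(query, fp) for fp in fingerprints])
    nearest = sims.argsort()[-2:]               # the two most similar analogues
    print("read-across prediction:", int(round(toxic[nearest].mean())))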

  12. Big science and big administration. Confronting the governance, financial and legal challenges of FuturICT

    NASA Astrophysics Data System (ADS)

    Smart, J.; Scott, M.; McCarthy, J. B.; Tan, K. T.; Argyrakis, P.; Bishop, S.; Conte, R.; Havlin, S.; San Miguel, M.; Stauffacher, D.

    2012-11-01

    This paper considers the issues around managing large scientific projects, and draws conclusions for the governance and management of FuturICT, based on previous experience of Big Science projects, such as CERN and ATLAS. We also consider the legal and ethical issues of the FuturICT project as the funding instrument moves from the Seventh Framework Programme to Horizon 2020.

  13. Standard big bang nucleosynthesis and primordial CNO abundances after Planck

    SciTech Connect

    Coc, Alain

    2014-10-01

    Primordial or big bang nucleosynthesis (BBN) is one of the three historical strong evidences for the big bang model. The recent results by the Planck satellite mission have slightly changed the estimate of the baryonic density compared to the previous WMAP analysis. This article updates the BBN predictions for the light elements using the cosmological parameters determined by Planck, as well as an improvement of the nuclear network and new spectroscopic observations. There is a slight lowering of the primordial Li/H abundance; however, this lithium value still remains typically 3 times larger than its observed spectroscopic abundance in halo stars of the Galaxy. Given the importance of this “lithium problem”, we trace the small changes in its BBN calculated abundance following updates of the baryonic density, neutron lifetime and networks. In addition, for the first time, we provide confidence limits for the production of ⁶Li, ⁹Be, ¹¹B and CNO, resulting from our extensive Monte Carlo calculation with our extended network. A specific focus is cast on CNO primordial production. Considering uncertainties on the nuclear rates around the CNO formation, we obtain CNO/H ≈ (5-30)×10⁻¹⁵. We further improve this estimate by analyzing correlations between yields and reaction rates and identify new influential reaction rates. These uncertain rates, if simultaneously varied, could lead to a significant increase of CNO production: CNO/H ∼ 10⁻¹³. This result is important for the study of population III star formation during the dark ages.

  14. Additive Similarity Trees

    ERIC Educational Resources Information Center

    Sattath, Shmuel; Tversky, Amos

    1977-01-01

    Tree representations of similarity data are investigated. Hierarchical clustering is critically examined, and a more general procedure, called the additive tree, is presented. The additive tree representation is then compared to multidimensional scaling. (Author/JKS)

  15. Bad Colourmaps Can Hide Big Structures

    NASA Astrophysics Data System (ADS)

    Kovesi, Peter

    2014-05-01

    Colourmaps are often selected with little awareness of the perceptual distortions they might introduce. A colourmap can be thought of as a line or curve drawn through a three-dimensional colour space. Individual data values are mapped to positions along this line which, in turn, allows them to be mapped to a colour. For a colourmap to be effective, it is important that the perceptual contrast that occurs as one moves along the line in the colour space is close to uniform. Many colourmaps are designed as piecewise linear paths through RGB space. This is a poor colour space to use because it is not perceptually uniform. Accordingly, many colourmaps supplied by vendors have uneven perceptual contrast over their range. They may include points of locally high colour contrast, leading you to think there might be some anomaly in your data when there is none. Conversely, colourmaps may also have flat spots of low perceptual contrast that prevent you from seeing features in your data. In some cases it is possible for structures having a magnitude of 10% of the full data range to be completely hidden by a flat spot in the colourmap. The deficiencies of many colourmaps can be revealed using a simple test image consisting of a high-frequency sine wave superimposed on a ramp function. The amplitude of the sine wave is modulated from a maximum value at the top of the image to zero at the bottom. Ideally the sine wave should be uniformly visible across the image at all points on the ramp. For many colourmaps this will not be the case. At the very bottom of the image, where the sine wave amplitude has been modulated to 0, we just have a linear ramp, which simply reproduces the colourmap. Given that the underlying data is a featureless ramp, the colourmap should not induce the perception of any features across the bottom of the test image. Good colourmaps are difficult to design. A greyscale colourmap is generally a safe choice but is not always what is desired. For non
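
    The test image described above is straightforward to reproduce; the sketch below (assuming numpy and matplotlib, with illustrative image dimensions, sine wavelength and amplitude) builds the ramp-plus-modulated-sine pattern so that any candidate colourmap can be inspected for flat spots.

        # Minimal sketch (not Kovesi's own code) of the colourmap test image: a
        # high-frequency sine wave added to a horizontal ramp, with the sine amplitude
        # tapering from a maximum at the top row to zero at the bottom row.
        import numpy as np
        import matplotlib.pyplot as plt

        width, height = 512, 256
        wavelength = 8          # sine period in pixels (illustrative value)
        max_amplitude = 0.05    # sine amplitude as a fraction of the ramp range (illustrative)

        x = np.arange(width)
        ramp = x / (width - 1)                       # 0..1, left to right
        sine = np.sin(2 * np.pi * x / wavelength)    # high-frequency modulation

        # Amplitude envelope: 1 at the top row, 0 at the bottom row.
        envelope = np.linspace(1.0, 0.0, height)[:, None]

        test_image = ramp[None, :] + max_amplitude * envelope * sine[None, :]

        # Render with a colourmap under test; flat spots in the map will hide the sine.
        plt.imshow(test_image, cmap="jet", origin="upper")
        plt.axis("off")
        plt.show()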

  16. Frequent arousals from winter torpor in Rafinesque's big-eared bat (Corynorhinus rafinesquii).

    PubMed

    Johnson, Joseph S; Lacki, Michael J; Thomas, Steven C; Grider, John F

    2012-01-01

    Extensive use of torpor is a common winter survival strategy among bats; however, data comparing various torpor behaviors among species are scarce. Winter torpor behaviors are likely to vary among species with different physiologies and species inhabiting different regional climates. Understanding these differences may be important in identifying differing susceptibilities of species to white-nose syndrome (WNS) in North America. We fitted 24 Rafinesque's big-eared bats (Corynorhinus rafinesquii) with temperature-sensitive radio-transmitters, and monitored 128 PIT-tagged big-eared bats, during the winter months of 2010 to 2012. We tested the hypothesis that Rafinesque's big-eared bats use torpor less often than values reported for other North American cave-hibernators. Additionally, we tested the hypothesis that Rafinesque's big-eared bats arouse on winter nights more suitable for nocturnal foraging. Radio-tagged bats used short (2.4 d ± 0.3 (SE)), shallow (13.9°C ± 0.6) torpor bouts and switched roosts every 4.1 d ± 0.6. Probability of arousal from torpor increased linearly with ambient temperature at sunset (P<0.0001), and 83% (n=86) of arousals occurred within 1 hr of sunset. Activity of PIT-tagged bats at an artificial maternity/hibernaculum roost between November and March was positively correlated with ambient temperature at sunset (P<0.0001), with males more active at the roost than females. These data show Rafinesque's big-eared bat is a shallow hibernator and is relatively active during winter. We hypothesize that winter activity patterns provide Corynorhinus species with an ecological and physiological defense against the fungus causing WNS, and that these bats may be better suited to withstand fungal infection than other cave-hibernating bat species in eastern North America.

  17. Frequent Arousals from Winter Torpor in Rafinesque’s Big-Eared Bat (Corynorhinus rafinesquii)

    PubMed Central

    Johnson, Joseph S.; Lacki, Michael J.; Thomas, Steven C.; Grider, John F.

    2012-01-01

    Extensive use of torpor is a common winter survival strategy among bats; however, data comparing various torpor behaviors among species are scarce. Winter torpor behaviors are likely to vary among species with different physiologies and species inhabiting different regional climates. Understanding these differences may be important in identifying differing susceptibilities of species to white-nose syndrome (WNS) in North America. We fitted 24 Rafinesque’s big-eared bats (Corynorhinus rafinesquii) with temperature-sensitive radio-transmitters, and monitored 128 PIT-tagged big-eared bats, during the winter months of 2010 to 2012. We tested the hypothesis that Rafinesque’s big-eared bats use torpor less often than values reported for other North American cave-hibernators. Additionally, we tested the hypothesis that Rafinesque’s big-eared bats arouse on winter nights more suitable for nocturnal foraging. Radio-tagged bats used short (2.4 d ± 0.3 (SE)), shallow (13.9°C ± 0.6) torpor bouts and switched roosts every 4.1 d ± 0.6. Probability of arousal from torpor increased linearly with ambient temperature at sunset (P<0.0001), and 83% (n = 86) of arousals occurred within 1 hr of sunset. Activity of PIT-tagged bats at an artificial maternity/hibernaculum roost between November and March was positively correlated with ambient temperature at sunset (P<0.0001), with males more active at the roost than females. These data show Rafinesque’s big-eared bat is a shallow hibernator and is relatively active during winter. We hypothesize that winter activity patterns provide Corynorhinus species with an ecological and physiological defense against the fungus causing WNS, and that these bats may be better suited to withstand fungal infection than other cave-hibernating bat species in eastern North America. PMID:23185427

  18. Using 'big data' to validate claims made in the pharmaceutical approval process.

    PubMed

    Wasser, Thomas; Haynes, Kevin; Barron, John; Cziraky, Mark

    2015-01-01

    Big Data in the healthcare setting refers to the storage, assimilation, and analysis of large quantities of information regarding patient care. These data can be collected and stored in a wide variety of ways, including electronic medical records collected at the patient bedside or medical records that are coded and passed to insurance companies for reimbursement. When these data are processed, it is possible to validate claims as a part of the regulatory review process regarding the anticipated performance of medications and devices. In order to properly analyze claims by manufacturers and others, claims need to be expressed in terms that are testable within a timeframe that is useful and meaningful to formulary committees. Claims for the comparative benefits and costs, including budget impact, of products and devices need to be expressed in measurable terms, ideally in the context of submission or validation protocols. Claims should be either consistent with accessible Big Data or able to support observational studies where Big Data identifies target populations. Protocols should identify, in disaggregated terms, key variables that would lead to direct or proxy validation. Once these variables are identified, Big Data can be used to query massive quantities of data in the validation process. Research can be passive or active in nature: passive, where the data are collected retrospectively; active, where the researcher prospectively looks for indicators of co-morbid conditions, side-effects or adverse events, testing these indicators to determine whether claims fall within the desired ranges set forth by the manufacturer. Additionally, Big Data can be used to assess the effectiveness of therapy through health insurance records. This, for example, could indicate that disease or co-morbid conditions cease to be treated. Understanding the basic strengths and weaknesses of Big Data in the claim validation process provides a glimpse of the value that this research
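
    As a hypothetical illustration of the passive approach, the sketch below checks whether the adverse-event rate observed in retrospective claims records falls within a manufacturer's claimed range; the record fields and the claimed range are invented for illustration and do not correspond to any actual submission protocol.

        # Hypothetical sketch of "passive" claim validation against retrospective claims
        # data: check whether the observed adverse-event rate in a treated cohort falls
        # within the range stated by the manufacturer (fields and range are illustrative).
        from dataclasses import dataclass
        from typing import List

        @dataclass
        class ClaimRecord:
            patient_id: str
            treated: bool
            adverse_event: bool

        def validate_rate_claim(records: List[ClaimRecord],
                                claimed_low: float, claimed_high: float) -> bool:
            """Return True if the observed adverse-event rate among treated patients
            lies within the claimed range."""
            treated = [r for r in records if r.treated]
            if not treated:
                raise ValueError("no treated patients in the dataset")
            observed_rate = sum(r.adverse_event for r in treated) / len(treated)
            return claimed_low <= observed_rate <= claimed_high

        # Toy cohort: 1 adverse event among 4 treated patients (rate 0.25).
        cohort = [
            ClaimRecord("p1", True, False),
            ClaimRecord("p2", True, True),
            ClaimRecord("p3", True, False),
            ClaimRecord("p4", True, False),
            ClaimRecord("p5", False, False),
        ]
        print(validate_rate_claim(cohort, claimed_low=0.10, claimed_high=0.30))  # True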

  19. Modeling canopy-level productivity: is the "big-leaf" simplification acceptable?

    NASA Astrophysics Data System (ADS)

    Sprintsin, M.; Chen, J. M.

    2009-05-01

    The "big-leaf" approach to calculating the carbon balance of plant canopies assumes that canopy carbon fluxes have the same relative responses to the environment as any single unshaded leaf in the upper canopy. Widely used light use efficiency models are essentially simplified versions of the big-leaf model. Despite its wide acceptance, subsequent developments in the modeling of leaf photosynthesis and measurements of canopy physiology have brought into question the assumptions behind this approach showing that big leaf approximation is inadequate for simulating canopy photosynthesis because of the additional leaf internal control on carbon assimilation and because of the non-linear response of photosynthesis on leaf nitrogen and absorbed light, and changes in leaf microenvironment with canopy depth. To avoid this problem a sunlit/shaded leaf separation approach, within which the vegetation is treated as two big leaves under different illumination conditions, is gradually replacing the "big-leaf" strategy, for applications at local and regional scales. Such separation is now widely accepted as a more accurate and physiologically based approach for modeling canopy photosynthesis. Here we compare both strategies for Gross Primary Production (GPP) modeling using the Boreal Ecosystem Productivity Simulator (BEPS) at local (tower footprint) scale for different land cover types spread over North America: two broadleaf forests (Harvard, Massachusetts and Missouri Ozark, Missouri); two coniferous forests (Howland, Maine and Old Black Spruce, Saskatchewan); Lost Creek shrubland site (Wisconsin) and Mer Bleue petland (Ontario). BEPS calculates carbon fixation by scaling Farquhar's leaf biochemical model up to canopy level with stomatal conductance estimated by a modified version of the Ball-Woodrow-Berry model. The "big-leaf" approach was parameterized using derived leaf level parameters scaled up to canopy level by means of Leaf Area Index. The influence of sunlit

  20. High energy neutrinos from big bang particles.

    NASA Astrophysics Data System (ADS)

    Berezinskij, V. S.

    1992-10-01

    The production of high energy neutrinos by big bang particles is reviewed. The big bang particles are divided into two categories: dark matter particles (DMP) and the exotic relics whose mass density can be smaller than the critical one. For the case of DMP the neutralino and the gravitino are considered. High energy neutrinos can be produced due to the capture of the neutralinos in the earth and the sun, with the subsequent annihilation of these particles there. If R-parity is weakly violated, the neutralino decay can be a source of high energy neutrinos. The gravitino as DMP is unobservable directly, unless R-parity is violated and the gravitino decays. For thermal exotic relics a very general conclusion is reached: the detectable neutrino flux can be produced only by long-lived particles with τ_x > t_0, where t_0 is the age of the universe. Very large neutrino fluxes can be produced by superheavy metastable relics in the particular cosmological scenario where violent entropy production occurs.